It is increasingly recognised that computations in the brain can be understood through the theory of dynamical systems, as realised by the activity of large neural populations. Moreover, several works have observed that the dominant dynamical patterns of computation are highly preserved across animals performing similar tasks. In my talk, I will argue that these preserved dynamical patterns arise from invariances (conserved quantities and symmetries) in population dynamics. I will then describe our efforts to mathematically formalise and computationally capture these invariances from the geometry of neural population activity. Specifically, in the first part of my talk I will discuss vector field descriptions of neural dynamics, highlighting MARBLE, a geometric deep learning method that finds consistent latent representations across neural recordings. In the second part, I will highlight ongoing work to formulate a data-driven, predictive model for learning invariances.