Poster in Workshop: Symmetry and Geometry in Neural Representations

Connecting Neural Models Latent Geometries with Relative Geodesic Representations

Hanlin Yu · Berfin Inal · Marco Fumero

Keywords: [ representation learning ] [ representation alignment ] [ latent space geometry ]


Abstract:

Neural models learn representations of data that lie on low-dimensional manifolds. Multiple factors, including stochasticity in the training process, may induce different representations, even when the same task is learned on the same data. However, when a latent structure is shared between different representational spaces, recent works have shown that it is possible to model a transformation between them. In this work, we show how, by leveraging the differential-geometric structure of the latent spaces of neural models, it is possible to precisely capture the transformations between model latent spaces. We validate our method experimentally on autoencoder models and real pretrained foundation vision models across diverse architectures, initializations, and tasks.
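The abstract gives no implementation details, so the following is only an illustrative sketch of the general idea of a relative geodesic representation, not the authors' method. All names are hypothetical, and geodesic distances on the latent manifold are approximated here by shortest paths on a k-nearest-neighbor graph, a common stand-in for intrinsic distances. If two latent spaces differ by an isometry, their geodesic distances to a shared set of anchor points coincide, making the relative representations directly comparable:

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import dijkstra
from scipy.spatial.distance import cdist

def knn_graph(X, k=5):
    # Weighted kNN graph: edge (i, j) carries the Euclidean distance,
    # used as a discrete approximation of the latent manifold.
    D = cdist(X, X)
    idx = np.argsort(D, axis=1)[:, 1:k + 1]  # k nearest neighbors, self excluded
    rows = np.repeat(np.arange(len(X)), k)
    cols = idx.ravel()
    return csr_matrix((D[rows, cols], (rows, cols)), shape=D.shape)

def relative_geodesic_rep(X, anchors, k=5):
    # Each point is represented by its approximate geodesic distance
    # (graph shortest path) to every anchor point.
    G = knn_graph(X, k)
    return dijkstra(G, directed=False, indices=anchors).T  # (n_points, n_anchors)

# Toy check: a second "model" whose latent space is an isometry of the first.
rng = np.random.default_rng(0)
Z1 = rng.normal(size=(200, 2))
Q, _ = np.linalg.qr(rng.normal(size=(2, 2)))  # random orthogonal map
Z2 = Z1 @ Q
anchors = [0, 1, 2, 3]
R1 = relative_geodesic_rep(Z1, anchors)
R2 = relative_geodesic_rep(Z2, anchors)
print(np.allclose(R1, R2, atol=1e-6))  # relative reps agree under the isometry
```

The anchors play the role of a shared frame of reference: because geodesic distances are intrinsic to the manifold, the resulting representation is invariant to isometric re-embeddings of the latent space, which is what makes representations from independently trained models comparable.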
