Poster in Workshop: Symmetry and Geometry in Neural Representations

An Informational Parsimony Perspective on Symmetry-Based Structure Extraction

Hippolyte Charvin · Nicola Catenacci Volpi · Daniel Polani

Keywords: [ Geometric Complexity ] [ Probabilistic symmetries ] [ Information Bottleneck ] [ Equivariance ]


Abstract:

Extraction of structure, in particular of group symmetries, is increasingly crucial to understanding and building intelligent models. Notably, some information-theoretic models of complexity-constrained learning have been argued to induce invariance extraction. Here, we formalise these arguments from a group-theoretic perspective and extend them to the study of more general probabilistic symmetries through dedicated structure-preserving compressions. More precisely, we consider compressions that are optimal under the constraint of preserving the divergence from a given exponential family, yielding a novel generalisation of the Information Bottleneck framework. Through appropriate choices of exponential families, we fully characterise (in the discrete, full-support case) channel invariance, channel equivariance and distribution invariance under permutation. Allowing imperfect divergence preservation then leads to principled definitions of "soft symmetries", where the "coarseness" corresponds to the degree of compression of the system. In simple synthetic experiments, we demonstrate that our method successively recovers, at increasingly compressed "resolutions", nested but increasingly perturbed equivariances, with new equivariances emerging at bifurcation points of the distortion parameter. Our framework opens a new path towards the extraction of generalised probabilistic symmetries.
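The abstract does not spell out the authors' algorithm, but it presents their method as a generalisation of the Information Bottleneck (IB). As background, the sketch below implements the standard discrete IB self-consistent iterations (the framework being generalised), on a toy joint distribution whose rows exhibit a channel symmetry: inputs with identical conditionals p(y|x) are mapped to the same compressed cluster. All names (`information_bottleneck`, `p_xy`, `beta`) are illustrative, not from the paper.

```python
import numpy as np

def information_bottleneck(p_xy, n_t, beta, n_iter=200, seed=0):
    """Standard discrete Information Bottleneck via self-consistent iterations.

    p_xy : (n_x, n_y) joint distribution over source X and relevance variable Y.
    n_t  : cardinality of the bottleneck variable T.
    beta : trade-off parameter (larger beta preserves more relevant information).
    Returns the learned stochastic encoder q(t|x) of shape (n_x, n_t).
    """
    rng = np.random.default_rng(seed)
    n_x, n_y = p_xy.shape
    p_x = p_xy.sum(axis=1)                       # marginal p(x)
    p_y_given_x = p_xy / p_x[:, None]            # conditional p(y|x)

    # random initial encoder q(t|x), rows normalised
    q_t_given_x = rng.random((n_x, n_t))
    q_t_given_x /= q_t_given_x.sum(axis=1, keepdims=True)

    for _ in range(n_iter):
        q_t = p_x @ q_t_given_x                  # marginal q(t)
        q_xt = q_t_given_x * p_x[:, None]        # joint q(x, t)
        q_y_given_t = (q_xt.T @ p_y_given_x) / q_t[:, None]  # decoder q(y|t)
        # KL(p(y|x) || q(y|t)) for every (x, t) pair
        log_ratio = (np.log(p_y_given_x[:, None, :] + 1e-12)
                     - np.log(q_y_given_t[None, :, :] + 1e-12))
        kl = (p_y_given_x[:, None, :] * log_ratio).sum(axis=2)
        # encoder update: q(t|x) ∝ q(t) exp(-beta * KL)
        q_t_given_x = q_t[None, :] * np.exp(-beta * kl)
        q_t_given_x /= q_t_given_x.sum(axis=1, keepdims=True)
    return q_t_given_x
```

Because the encoder update depends on x only through p(y|x), inputs related by a symmetry (identical conditionals) receive identical encoder rows after one iteration, so compression automatically groups symmetric inputs; the paper's divergence-preserving compressions refine this picture for more general probabilistic symmetries.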
