Workshop: Differential Geometry meets Deep Learning (DiffGeo4DL)
Invited Talk 2: Gauge Theory in Geometric Deep Learning
Taco Cohen
It is often said that differential geometry is in essence the study of connections on a principal bundle. These notions were discovered independently in gauge theory in physics, and over the last few years it has become clear that they also provide a very general and systematic way to model convolutional neural networks on homogeneous spaces and general manifolds. Specifically, representation spaces in these networks are described as fields of geometric quantities on a manifold (i.e., sections of associated vector bundles). These quantities can only be expressed numerically after making an arbitrary choice of frame / gauge (a section of a principal bundle). Network layers map between representation spaces, and should be equivariant to symmetry transformations. In this talk I will discuss two results that have a bearing on geometric deep learning research. First, we discuss the “convolution is all you need” theorem, which states that any linear equivariant map between homogeneous representation spaces is a generalized convolution. Second, in the case of gauge symmetry (when all frames should be considered equivalent), we show that defining a non-trivial equivariant linear map between representation spaces requires the introduction of a principal connection, which defines parallel transport. We will not assume familiarity with bundles or gauge theory, and will use examples relevant to neural networks to illustrate the ideas.
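For illustration, here is a minimal numpy sketch (added for this write-up, not code from the talk) of the simplest instance of the first result: on the roto-translation group p4, the equivariant linear map from planar images to group-valued feature fields is a lifting correlation that applies all four 90-degree rotations of a single filter. Rotating the input then rotates each output plane and cyclically permutes the rotation channel, which the script checks numerically. The names rotate and lift_corr are illustrative, not from any library.

```python
import numpy as np

def rotate(x, r):
    # Rotate a 2D array by r * 90 degrees (counterclockwise).
    return np.rot90(x, r)

def lift_corr(image, kernel):
    """Lifting correlation from the plane to the group p4:
    correlate the image with all four rotations of the kernel,
    producing one output plane per rotation r in C4."""
    H, W = image.shape
    kH, kW = kernel.shape
    out = np.zeros((4, H - kH + 1, W - kW + 1))
    for r in range(4):
        k_r = rotate(kernel, r)
        for i in range(out.shape[1]):
            for j in range(out.shape[2]):
                out[r, i, j] = np.sum(image[i:i + kH, j:j + kW] * k_r)
    return out

rng = np.random.default_rng(0)
img = rng.normal(size=(8, 8))
ker = rng.normal(size=(3, 3))

f = lift_corr(img, ker)                 # feature field on p4
f_rot = lift_corr(rotate(img, 1), ker)  # same map, rotated input

# Equivariance: rotating the input rotates each output plane and
# cyclically shifts the rotation channel by one step.
expected = np.stack([rotate(f[(r - 1) % 4], 1) for r in range(4)])
print(np.allclose(f_rot, expected))     # True
```

This is the sense in which “convolution is all you need”: once the symmetry group and the input and output representations are fixed, the weight-sharing pattern above is forced by the equivariance constraint, and any linear equivariant map must take this generalized convolutional form.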