Poster in NeurIPS 2024 Workshop: Machine Learning and the Physical Sciences
Clifford Flows
Francesco Alesiani · Takashi Maruyama
Geometric machine learning incorporates geometric priors when modeling physical systems, such as particle or molecular systems. Clifford Algebra extends Euclidean vector space by introducing algebraic structure and thus represents an appealing tool for modeling geometric features. An example of such a model is the Clifford neural network, an equivariant neural network based on Clifford Algebra. When modeling distributions over geometric objects using Clifford Algebra, we need to define how these distributions transform. We therefore introduce probability density functions over Clifford algebras and their transformations based on gradients of functions defined over Clifford Algebra. Here we show that the gradient of functions between Clifford algebras on Euclidean spaces induces the canonical gradient of the functions restricted to the base vector spaces. This ensures that the gradient of Clifford neural networks coincides with the one obtained through widely adopted automatic differentiation modules such as Autograd. We empirically evaluate the benefit of the gradient of Clifford neural networks and the transformation of distributions over Clifford Algebra for the problem of sampling from distributions in scientific discovery.
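A minimal sketch of the consistency claim, not the authors' implementation: we hand-roll the algebra Cl(2,0) with multivectors as coefficient tuples on the basis (1, e1, e2, e12), embed a Euclidean vector as a grade-1 element, and check that the Autograd gradient of a function written through the geometric product matches the canonical Euclidean gradient of the restricted function. The representation and function choice here are illustrative assumptions.

```python
# Illustrative sketch only: Cl(2,0) multivectors as (scalar, e1, e2, e12) coefficients.
import torch

def geometric_product(a, b):
    """Geometric product in Cl(2,0); a and b are length-4 coefficient tensors."""
    s1, x1, y1, b1 = a
    s2, x2, y2, b2 = b
    return torch.stack([
        s1 * s2 + x1 * x2 + y1 * y2 - b1 * b2,  # scalar part
        s1 * x2 + x1 * s2 - y1 * b2 + b1 * y2,  # e1 part
        s1 * y2 + y1 * s2 + x1 * b2 - b1 * x2,  # e2 part
        s1 * b2 + b1 * s2 + x1 * y2 - y1 * x2,  # e12 (bivector) part
    ])

def embed_vector(v):
    """Embed a Euclidean vector v in R^2 as a grade-1 multivector."""
    zero = torch.zeros((), dtype=v.dtype)
    return torch.stack([zero, v[0], v[1], zero])

v = torch.tensor([1.5, -0.7], requires_grad=True)
x = embed_vector(v)
# Scalar part of x * x equals |v|^2 when x is a pure vector in Cl(2,0).
f = geometric_product(x, x)[0]
f.backward()

print(v.grad)   # gradient obtained via Autograd through the Clifford construction
print(2 * v)    # canonical Euclidean gradient of |v|^2, restricted to the base space
```

Under these assumptions, the two printed gradients coincide, mirroring the statement that the gradient of functions between Clifford algebras induces the canonical gradient on the base vector space.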