Poster
in
Workshop: NeurIPS 2024 Workshop: Machine Learning and the Physical Sciences

Higher-order cumulants in diffusion models

Gert Aarts · Diaa Eddin Habibi · Lingxiao Wang · Kai Zhou


Abstract:

To analyse how diffusion models learn correlations beyond Gaussian ones, we study the behaviour of higher-order cumulants under both the forward and the backward process. We present explicit expressions for the moment- and cumulant-generating functionals, in terms of the distribution of the initial data and properties of the forward process. We show analytically that higher-order cumulants are conserved under pure diffusion, i.e., in models without drift, during the forward process, and that the endpoint of the forward process therefore retains non-trivial correlations. We demonstrate that since these correlations are encoded in the score function, higher-order cumulants are learnt quickly in the backward process, even when starting from a normal prior. We confirm our analytical results in an exactly solvable toy model and in scalar lattice field theory.
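The conservation statement can be illustrated with a minimal numerical sketch (not from the paper's code): adding independent Gaussian noise, as in a drift-free forward process $X_t = X_0 + \sqrt{t}\,\eta$, shifts only the first two cumulants, so cumulants of order three and higher at the endpoint match those of the initial data. The choice of an exponential initial distribution here is purely illustrative.

```python
# Hypothetical sketch: cumulants of order >= 3 are conserved under pure
# diffusion (no drift), X_t = X_0 + sqrt(t) * eta with eta ~ N(0, 1).
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Non-Gaussian initial data: Exp(1) has third cumulant kappa_3 = 2.
x0 = rng.exponential(scale=1.0, size=n)

# Drift-free forward process: add independent Gaussian noise.
t = 1.0
xt = x0 + np.sqrt(t) * rng.standard_normal(n)

def kappa3(x):
    """Third cumulant, equal to the third central moment."""
    return np.mean((x - x.mean()) ** 3)

# kappa_3 is (statistically) identical before and after diffusion,
# while the variance grows by t.
print(kappa3(x0), kappa3(xt))
```

Since Gaussian noise has vanishing cumulants beyond the second and cumulants are additive for independent sums, both printed values sit near 2 up to sampling error, which is the sense in which the forward-process endpoint retains non-trivial correlations.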