Invited Talk in Workshop: Mathematics of Modern Machine Learning (M3L)

A Theoretical Perspective on Hardness of Sampling and Learning from Samples in High Dimensions

Lenka Zdeborová

Sat 14 Dec 9:45 a.m. PST — 10:30 a.m. PST

Abstract:

Recent advancements in generative modelling, including flow-based, diffusion-based, and autoregressive networks, have achieved remarkable success in data generation. However, understanding their performance and limitations, particularly in high-dimensional settings, remains an open challenge. This talk explores the intersection of generative models and statistical physics, leveraging insights from spin-glass theory and denoising frameworks.

We first examine the efficiency of generative models compared to classical methods such as Monte Carlo sampling and Langevin dynamics for sampling from complex distributions, focusing on phase transitions that impact sampling performance. Next, we analyze denoising autoencoders in high dimensions, providing closed-form results that reveal their advantage over simpler architectures. Finally, we study the training of flow-based generative models from limited samples, presenting sharp theoretical characterizations of their learning dynamics.
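
To make the classical baseline concrete, here is a minimal sketch of unadjusted Langevin dynamics sampling from a simple target. The two-mode Gaussian mixture, step size, and iteration count are illustrative assumptions for this sketch, not details from the talk.

import numpy as np

# Hypothetical target: an equal-weight mixture of two unit-variance
# Gaussians in d dimensions (an illustrative choice, not from the talk).
d = 2
means = np.array([[-3.0] * d, [3.0] * d])

def score(x):
    """Gradient of log p(x) for the equal-weight, unit-variance mixture:
    a responsibility-weighted sum of the per-component gradients (mu_k - x)."""
    diffs = means - x                          # (2, d), one row per component
    logw = -0.5 * np.sum((x - means) ** 2, axis=1)
    w = np.exp(logw - logw.max())
    w /= w.sum()                               # posterior responsibilities
    return w @ diffs

def langevin_sample(n_steps=5000, eta=1e-2, rng=None):
    """Unadjusted Langevin dynamics:
    x <- x + eta * score(x) + sqrt(2 * eta) * standard Gaussian noise."""
    rng = rng or np.random.default_rng(0)
    x = rng.standard_normal(d)
    for _ in range(n_steps):
        x = x + eta * score(x) + np.sqrt(2 * eta) * rng.standard_normal(d)
    return x

samples = np.array(
    [langevin_sample(rng=np.random.default_rng(s)) for s in range(100)]
)
print("mean |coordinate| over 100 chains:", np.abs(samples).mean())

With well-separated modes, each chain tends to stay near the mode closest to its initialization and rarely crosses between them; this kind of metastable slowdown is the flavor of sampling hardness that the phase-transition analysis in the talk makes precise.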

Talk based on:
- Ghio, Dandi, Krzakala, Zdeborová. Sampling with flows, diffusion and autoregressive neural networks: A spin-glass perspective. arXiv:2308.14085, PNAS 2024.
- Cui, Zdeborová. High-dimensional Asymptotics of Denoising Autoencoders. arXiv:2305.11041, NeurIPS 2023 (spotlight).
- Cui, Vanden-Eijnden, Krzakala, Zdeborová. Analysis of learning a flow-based generative model from limited sample complexity. arXiv:2310.03575, ICLR 2024.
