Poster in Workshop: NeurIPS 2022 Workshop on Score-Based Methods
Convergence of score-based generative modeling for general data distributions
Holden Lee · Jianfeng Lu · Yixin Tan
Abstract:
We give polynomial convergence guarantees for denoising diffusion models that do not rely on the data distribution satisfying functional inequalities or strong smoothness assumptions. Assuming an $L^2$-accurate score estimate, we obtain Wasserstein distance guarantees for any data distribution with bounded support or sufficiently decaying tails, as well as TV guarantees for distributions satisfying additional smoothness assumptions.
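To illustrate the setting the abstract refers to, here is a minimal, hypothetical sketch (not the paper's algorithm or proofs) of score-based sampling: an Ornstein-Uhlenbeck forward process is reversed by Euler-Maruyama discretization of the time-reversed SDE, using a score function. For a 1-D Gaussian target the score of each forward marginal is available in closed form, so we substitute it for the $L^2$-accurate learned estimate the theory assumes. All names and parameter values below are illustrative choices, not from the paper.

```python
import numpy as np

def reverse_diffusion_sample(mu=2.0, sigma=0.5, T=5.0, n_steps=500,
                             n_samples=20000, seed=0):
    """Sample approximately from N(mu, sigma^2) by reversing an
    Ornstein-Uhlenbeck forward process dx = -x dt + sqrt(2) dW.
    Illustrative sketch only; a trained score network would replace
    the closed-form score below."""
    rng = np.random.default_rng(seed)
    dt = T / n_steps

    def score(x, t):
        # Exact score of the forward marginal at time t:
        # x_t ~ N(e^{-t} mu, e^{-2t} sigma^2 + 1 - e^{-2t}).
        m = np.exp(-t) * mu
        v = np.exp(-2.0 * t) * sigma**2 + 1.0 - np.exp(-2.0 * t)
        return -(x - m) / v

    # Initialize from the (approximate) stationary law N(0, 1).
    x = rng.standard_normal(n_samples)
    for k in range(n_steps):
        t = T - k * dt  # remaining forward time
        # Time-reversed SDE drift: x + 2 * score(x, t).
        drift = x + 2.0 * score(x, t)
        x = x + drift * dt + np.sqrt(2.0 * dt) * rng.standard_normal(n_samples)
    return x
```

The convergence results in the paper bound how errors from the score estimate, the discretization step, and the finite horizon $T$ propagate to the sampled distribution; in this toy run one can check that the empirical mean and standard deviation of the output approach $\mu$ and $\sigma$.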