Poster in NeurIPS 2024 Workshop: Machine Learning and the Physical Sciences
Diffusion-Based Inverse Solver on Function Spaces With Applications to PDEs
Abbas Mammadov · Julius Berner · Kamyar Azizzadenesheli · Jong Chul Ye · Animashree Anandkumar
We present a novel framework for solving function-space inverse problems using diffusion-based generative models. Unlike traditional methods, which often discretize the domain and operate on fixed grids, our approach is discretization-agnostic, allowing for flexibility during sampling and generalization across different resolutions. Building on function-space diffusion models with neural operator architectures, we adapt the denoising process of pre-trained diffusion models to efficiently sample from posterior distributions in function spaces. This framework can be applied to a variety of problems, such as denoising tasks or the recovery of initial conditions and coefficient functions in PDE-based inverse problems like Darcy flow. To the best of our knowledge, this is the first diffusion-based plug-and-play solver for inverse problems that operates in a discretization-agnostic manner, providing a new perspective on inverse problems with functional data, as they typically arise in the context of PDEs.
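The abstract does not spell out the sampling procedure, but the plug-and-play recipe it alludes to (guiding the reverse process of a pre-trained denoiser with a measurement-consistency gradient, in the spirit of diffusion posterior sampling) can be sketched as below. This is a minimal illustrative sketch under stated assumptions, not the authors' exact algorithm: the names `dps_sample`, `denoiser`, and `forward_op`, the linear time schedule, and the toy denoiser are all hypothetical placeholders, and the function-space aspect appears only in that the denoiser is queried on an arbitrary discretization grid.

```python
import torch

def dps_sample(denoiser, forward_op, y, grid, n_steps=50, step_size=1.0):
    """Sketch of plug-and-play posterior sampling with diffusion guidance.

    denoiser(x_t, t, grid): stand-in for a pre-trained function-space denoiser
        (e.g. a neural operator), evaluated on an arbitrary grid (hypothetical API).
    forward_op(x): measurement operator A of the inverse problem y = A(x) + noise.
    """
    x = torch.randn(grid.shape[0])                 # start from noise on the chosen grid
    ts = torch.linspace(1.0, 0.0, n_steps + 1)     # simple linear time schedule
    for i in range(n_steps):
        t, t_next = ts[i], ts[i + 1]
        x = x.detach().requires_grad_(True)
        x0_hat = denoiser(x, t, grid)              # estimate of the clean function
        # data-consistency guidance: gradient of the measurement misfit w.r.t. x_t
        residual = ((forward_op(x0_hat) - y) ** 2).sum()
        grad = torch.autograd.grad(residual, x)[0]
        with torch.no_grad():
            # unconditional reverse step toward the denoised estimate ...
            x = x + (t - t_next) / t * (x0_hat - x)
            # ... nudged toward measurement consistency
            x = x - step_size * grad
    return x.detach()

if __name__ == "__main__":
    grid = torch.linspace(0, 1, 128).unsqueeze(-1)   # arbitrary 1D discretization
    denoiser = lambda x, t, grid: x / (1 + t)        # toy stand-in for a trained denoiser
    forward_op = lambda x: x[::4]                    # e.g. sparse pointwise observations
    y = forward_op(torch.sin(2 * torch.pi * grid[:, 0]))
    x_rec = dps_sample(denoiser, forward_op, y, grid)
    print(x_rec.shape)                               # same grid size; any resolution works
```

Because the denoiser in this sketch only sees the grid at sampling time, the same loop can in principle be run at a different resolution without retraining, which is the discretization-agnostic property the abstract emphasizes.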