Poster in Workshop: D3S3: Data-driven and Differentiable Simulations, Surrogates, and Solvers
Projected Low-Rank Gradient in Diffusion-based Models for Inverse Problems
Rayhan Zirvi · Bahareh Tolooshams · Animashree Anandkumar
Keywords: [ Inverse problems ] [ Diffusion models ] [ Robustness ]
Recent advancements in diffusion models have demonstrated their potential as powerful learned data priors for solving inverse problems. A popular Bayesian approach leverages diffusion sampling steps to induce a data prior, generating images from noise while incorporating measurement gradient updates at each step to impose data consistency. However, diffusion models exhibit high sensitivity to the measurement gradient step size and struggle to keep the sampling process on the data manifold, leading to degraded performance and artifacts in the sampled posterior. We propose a Projected Low-Rank Gradient (PLoRG) method that approximates the data manifold structure to enhance the performance and robustness of diffusion models in solving inverse problems. Our approach leverages singular value decomposition to approximate the measurement gradient in a lower-rank subspace defined by the current state, effectively preserving the manifold structure and filtering out artifact-inducing components. In addition to superior robustness, we show that PLoRG improves the performance of diffusion models on a range of linear and nonlinear inverse problems, especially inherently challenging ones such as phase retrieval.
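To illustrate the core idea of projecting the measurement gradient onto a low-rank subspace defined by the current state, below is a minimal sketch, not the authors' released implementation: it assumes the state and gradient are single-channel images treated as matrices, that the subspace is spanned by the top-`rank` singular vectors of the current state, and that the function and variable names (`projected_low_rank_gradient`, `x_t`, `meas_grad`, `rank`) are hypothetical.

```python
import torch

def projected_low_rank_gradient(x_t: torch.Tensor,
                                meas_grad: torch.Tensor,
                                rank: int) -> torch.Tensor:
    """Project a measurement gradient onto a low-rank subspace
    derived from the current diffusion state (illustrative sketch).

    Assumes x_t and meas_grad are H x W matrices; the subspace is
    spanned by the top-`rank` singular vectors of x_t.
    """
    # SVD of the current state defines the candidate low-rank subspace.
    U, S, Vh = torch.linalg.svd(x_t, full_matrices=False)
    U_r = U[:, :rank]        # top-`rank` left singular vectors (H x r)
    V_r = Vh[:rank, :].T     # top-`rank` right singular vectors (W x r)

    # Project the gradient onto the column and row subspaces of x_t,
    # discarding components outside the low-rank approximation
    # (the hypothesized artifact-inducing directions).
    return U_r @ (U_r.T @ meas_grad @ V_r) @ V_r.T


# Hypothetical usage inside one posterior-sampling step:
# x_next = x_prev - step_size * projected_low_rank_gradient(x_prev, grad, rank=20)
```

In this sketch, the projection acts as a filter: gradient energy aligned with the dominant singular directions of the current state is retained, while the remainder is removed before the data-consistency update, which is one way to read the manifold-preserving behavior described in the abstract.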