Spotlight Poster

EigenVI: score-based variational inference with orthogonal function expansions

Diana Cai · Chirag Modi · Charles Margossian · Robert Gower · David Blei · Lawrence Saul

Fri 13 Dec 4:30 p.m. PST — 7:30 p.m. PST

Abstract: We develop EigenVI, a new approach for black-box variational inference (BBVI). EigenVI fits a novel class of variational approximations based on orthogonal function expansions. For distributions over $\mathbb{R}^D$, the lowest order term in these expansions provides a Gaussian variational approximation, while higher-order terms provide a systematic way to model non-Gaussianity. These variational approximations are flexible enough to model complex distributions (multimodal, asymmetric), but they are simple enough that one can calculate their low-order moments and draw samples from them. Further, by choosing different families of orthogonal functions, EigenVI can model different types of random variables (e.g., real-valued, nonnegative, bounded). To fit the approximation, EigenVI matches score functions by minimizing a Fisher divergence. Notably, this optimization reduces to solving a minimum eigenvalue problem, so that EigenVI effectively sidesteps the iterative gradient-based optimizations that are required for many other BBVI algorithms. (Gradient-based methods can be sensitive to learning rates, termination criteria, and other tunable hyperparameters.) We study EigenVI on a variety of target distributions, including a benchmark suite of Bayesian models from posteriordb. Compared to existing methods for BBVI, EigenVI is more accurate.
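The abstract's central computational claim, that minimizing the Fisher divergence reduces to a minimum eigenvalue problem, can be made concrete in one dimension. The sketch below is an illustrative reconstruction, not the authors' code. It assumes the expansion enters the density as a squared sum, $q(x) = (\sum_k \alpha_k \varphi_k(x))^2$, over orthonormal Hermite functions $\varphi_k$, so that $q$ is nonnegative and integrates to one whenever $\sum_k \alpha_k^2 = 1$. Under that assumption, writing $f = \sum_k \alpha_k \varphi_k$ gives $\int q\,|\nabla\log q - \nabla\log p|^2\,dx = \int |2f' - f\,\nabla\log p|^2\,dx$, which is a quadratic form $\alpha^\top M \alpha$ in the coefficients; the constrained minimizer is the eigenvector of $M$ with smallest eigenvalue. The two-component Gaussian mixture target, the Gaussian proposal used for importance sampling, the sample size, and the expansion order are all arbitrary illustrative choices.

import numpy as np

def hermite_basis(x, K):
    # Orthonormal Hermite functions phi_0..phi_{K-1} on the real line and
    # their derivatives, evaluated at points x; both have shape (K, len(x)).
    phi = np.zeros((K + 1, x.size))  # one extra row: phi_K is needed below
    phi[0] = np.pi ** -0.25 * np.exp(-0.5 * x ** 2)
    for k in range(K):  # recurrence: phi_{k+1} = sqrt(2/(k+1)) x phi_k - sqrt(k/(k+1)) phi_{k-1}
        prev = phi[k - 1] if k > 0 else 0.0
        phi[k + 1] = np.sqrt(2.0 / (k + 1)) * x * phi[k] - np.sqrt(k / (k + 1)) * prev
    dphi = np.zeros((K, x.size))
    for k in range(K):  # identity: phi_k' = sqrt(k/2) phi_{k-1} - sqrt((k+1)/2) phi_{k+1}
        lower = np.sqrt(k / 2.0) * phi[k - 1] if k > 0 else 0.0
        dphi[k] = lower - np.sqrt((k + 1) / 2.0) * phi[k + 1]
    return phi[:K], dphi

def mixture_score(x, w=(0.6, 0.4), mu=(-1.5, 1.8), sig=(0.5, 0.7)):
    # Score d/dx log p(x) of a two-component Gaussian mixture target
    # (a hypothetical multimodal target chosen for illustration).
    comp = np.stack([wi / si * np.exp(-0.5 * ((x - mi) / si) ** 2)
                     for wi, mi, si in zip(w, mu, sig)])
    grad = np.stack([-(x - mi) / si ** 2 for mi, si in zip(mu, sig)])
    return (grad * comp).sum(axis=0) / comp.sum(axis=0)

rng = np.random.default_rng(0)
N, K, tau = 20_000, 12, 3.0                # sample size, expansion order, proposal scale
x = rng.normal(0.0, tau, size=N)           # proposal pi = N(0, tau^2)
inv_pi = tau * np.sqrt(2.0 * np.pi) * np.exp(0.5 * (x / tau) ** 2)  # 1 / pi(x)

phi, dphi = hermite_basis(x, K)
u = 2.0 * dphi - phi * mixture_score(x)    # u_k(x) = 2 phi_k'(x) - phi_k(x) * score(x)
M = (u * inv_pi) @ u.T / N                 # Monte Carlo estimate of the (K, K) quadratic form

eigvals, eigvecs = np.linalg.eigh(M)       # symmetric eigendecomposition
alpha = eigvecs[:, 0]                      # coefficients = minimum-eigenvalue eigenvector

# q(x) = (sum_k alpha_k phi_k(x))^2 is automatically normalized, since the
# basis is orthonormal and eigh returns a unit-norm alpha; inspect the fit:
grid = np.linspace(-5.0, 5.0, 401)
q_grid = (alpha @ hermite_basis(grid, K)[0]) ** 2

Because the quadratic form is assembled once and the eigenvector is computed in closed form, there are no learning rates or stopping criteria to tune, which is the sense in which the abstract says EigenVI sidesteps iterative gradient-based optimization.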
