

Spotlight Poster

Particle Semi-Implicit Variational Inference

Jen Ning Lim · Adam Johansen

East Exhibit Hall A-C #4106
Fri 13 Dec 11 a.m. PST — 2 p.m. PST

Abstract:

Semi-implicit variational inference (SIVI) enriches the expressiveness of variational families by utilizing a kernel and a mixing distribution to hierarchically define the variational distribution. Existing SIVI methods parameterize the mixing distribution using implicit distributions, leading to intractable variational densities. As a result, directly maximizing the evidence lower bound (ELBO) is not possible, so they resort to one of the following: optimizing bounds on the ELBO, employing costly inner-loop Markov chain Monte Carlo runs, or solving minimax objectives. In this paper, we propose a novel method for SIVI called Particle Variational Inference (PVI) which employs empirical measures to approximate the optimal mixing distributions characterized as the minimizer of a free energy functional. PVI arises naturally as a particle approximation of a Euclidean–Wasserstein gradient flow and, unlike prior works, it directly optimizes the ELBO whilst making no parametric assumption about the mixing distribution. Our empirical results demonstrate that PVI performs favourably compared to other SIVI methods across various tasks. Moreover, we provide a theoretical analysis of the behaviour of the gradient flow of a related free energy functional: establishing the existence and uniqueness of solutions as well as propagation of chaos results.
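The sketch below illustrates, in JAX, the kind of particle-based update the abstract describes: an empirical mixing measure over particles defines the semi-implicit variational density, and the particles are moved by gradient steps on a Monte Carlo ELBO estimate. The Gaussian kernel, toy target, step size, and plain gradient-ascent update are illustrative assumptions, not the authors' algorithm, which additionally shapes the mixing distribution through a free energy functional.

```python
# Conceptual sketch only: a particle approximation of a semi-implicit
# variational family, with particles updated by gradient ascent on a
# one-sample reparameterised ELBO estimate. All choices below (Gaussian
# kernel, toy target, step size) are simplifying assumptions.
import jax
import jax.numpy as jnp

key = jax.random.PRNGKey(0)
dim, n_particles, sigma = 2, 64, 0.3

def log_target(x):
    # Unnormalised toy target: a standard Gaussian in `dim` dimensions.
    return -0.5 * jnp.sum(x ** 2)

def log_q(x, particles):
    # Semi-implicit density with an empirical mixing measure:
    # q(x) = (1/N) * sum_i N(x; z_i, sigma^2 I).
    log_comps = (-0.5 * jnp.sum((x - particles) ** 2, axis=-1) / sigma ** 2
                 - 0.5 * dim * jnp.log(2 * jnp.pi * sigma ** 2))
    return jax.scipy.special.logsumexp(log_comps) - jnp.log(particles.shape[0])

def neg_elbo(particles, key):
    # Pick a mixture component uniformly, sample x = z_i + sigma * eps,
    # and evaluate log p(x) - log q(x); minimising this maximises the ELBO.
    k1, k2 = jax.random.split(key)
    i = jax.random.randint(k1, (), 0, particles.shape[0])
    x = particles[i] + sigma * jax.random.normal(k2, (dim,))
    return -(log_target(x) - log_q(x, particles))

particles = jax.random.normal(key, (n_particles, dim))
step = 1e-2
for t in range(1000):
    key, sub = jax.random.split(key)
    grads = jax.grad(neg_elbo)(particles, sub)
    particles = particles - step * grads  # gradient step on the particles
```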
