

Poster in Workshop: Bayesian Decision-making and Uncertainty: from probabilistic and spatiotemporal modeling to sequential experiment design

Inverse-Free Sparse Variational Gaussian Processes

Stefano Cortinovis · Stefanos Eleftheriadis · Laurence Aitchison · James Hensman · Mark van der Wilk

Keywords: [ Gaussian Processes ] [ Variational Inference ] [ Natural Gradients ]


Abstract: Gaussian processes (GPs) are a powerful prior over functions, but performing inference with them requires inverting or decomposing the kernel matrix, making them poorly suited to modern hardware. To address this, variational bounds have been proposed that require only matrix multiplications, at the cost of introducing an additional variational parameter $\mathbf T \in \mathbb{R}^{M\times M}$. However, in practice, optimising $\mathbf{T}$ with typical deep learning optimisers is challenging, limiting the practical utility of these bounds. In this work, we solve this by introducing a preconditioner for a variational parameter in the bound, a tailored update for $\mathbf T$ based on natural gradients, and a stopping criterion to determine the number of updates. This yields an inverse-free method on par with existing approaches on a per-iteration basis, with low-precision computation and wall-clock speedups as the next step.
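As a rough illustration of how a matmul-only parameter $\mathbf T$ can stand in for a matrix inverse: for symmetric positive-definite $\mathbf K$ and any symmetric $\mathbf T$, the inequality $\mathbf K^{-1} \succeq 2\mathbf T - \mathbf T \mathbf K \mathbf T$ holds, with equality at $\mathbf T = \mathbf K^{-1}$, so iterating the inverse-free update $\mathbf T \leftarrow 2\mathbf T - \mathbf T \mathbf K \mathbf T$ (the Newton–Schulz iteration) drives $\mathbf T$ towards $\mathbf K^{-1}$, and the residual $\|\mathbf I - \mathbf K \mathbf T\|$ gives a natural stopping criterion. The sketch below is an assumption about the general mechanism, not the paper's exact preconditioner or natural-gradient update; the function name `inverse_free_updates` is hypothetical.

```python
import numpy as np

def inverse_free_updates(K, T0=None, tol=1e-8, max_iters=50):
    """Matmul-only updates driving T towards K^{-1} (Newton-Schulz iteration).

    Illustrative sketch only: the paper's tailored natural-gradient update
    for T may differ. It relies on the PSD inequality
    K^{-1} >= 2T - T K T, which is tight at T = K^{-1}.
    """
    M = K.shape[0]
    I = np.eye(M)
    if T0 is None:
        # Safe initialisation: ||I - K T0||_2 < 1 guarantees convergence.
        T0 = I / np.linalg.norm(K, ord=2)
    T = T0
    for it in range(max_iters):
        R = I - K @ T                 # residual; zero iff T = K^{-1}
        if np.linalg.norm(R) < tol:   # stopping criterion on the residual
            break
        T = T + T @ R                 # equivalent to T <- 2T - T K T
    return T, it

# Toy usage on a small kernel-like positive-definite matrix.
rng = np.random.default_rng(0)
X = rng.standard_normal((5, 5))
K = X @ X.T + 5 * np.eye(5)
T, iters = inverse_free_updates(K)
print(iters, np.linalg.norm(np.eye(5) - K @ T))
```

Every step above is a matrix multiplication, so the whole loop runs efficiently on accelerators without any factorisation; the stopping criterion controls how many such updates are spent per outer optimisation step.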
