

Poster

The Functional Neural Process

Christos Louizos · Xiahan Shi · Klamer Schutte · Max Welling

East Exhibition Hall B, C #52

Keywords: [ Deep Learning ] [ Hierar ] [ Algorithms -> Relational Learning ] [ Algorithms -> Uncertainty Estimation ] [ Probabilistic Methods ]


Abstract:

We present a new family of exchangeable stochastic processes, the Functional Neural Processes (FNPs). FNPs model distributions over functions by learning a graph of dependencies on top of latent representations of the points in the given dataset. In doing so, they define a Bayesian model without explicitly positing a prior distribution over latent global parameters; they instead adopt priors over the relational structure of the given dataset, a much simpler task. We show how we can learn such models from data, demonstrate that they are scalable to large datasets through mini-batch optimization, and describe how we can make predictions for new points via their posterior predictive distribution. We experimentally evaluate FNPs on the tasks of toy regression and image classification and show that, compared to baselines that employ global latent parameters, they offer both competitive predictions and more robust uncertainty estimates.
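The mechanism the abstract describes can be made concrete with a small sketch. The following NumPy snippet is a minimal illustration, not the authors' implementation: it substitutes a fixed random feature map for the learned encoder, uses an RBF kernel on the latents to parameterize Bernoulli edge probabilities for the dependency graph, and replaces the paper's learned predictive distribution with a simple average over sampled parent points. All names here (`encode`, `edge_probs`, `scale`) are hypothetical simplifications.

```
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression: a small reference set R and two new target points.
x_ref = np.linspace(-2.0, 2.0, 20)[:, None]
y_ref = np.sin(3.0 * x_ref) + 0.1 * rng.standard_normal(x_ref.shape)
x_new = np.array([[0.5], [1.8]])

# Hypothetical stand-in for the learned encoder: a fixed random feature
# map producing the latent representation u(x) of each point.
W = rng.standard_normal((1, 16))
b = rng.standard_normal(16)

def encode(x):
    return np.tanh(x @ W + b)

def edge_probs(u_tgt, u_ref, scale=1.0):
    # RBF kernel on the latents parameterizes Bernoulli edge probabilities
    # for the dependency graph: points with nearby latents are likely parents.
    d2 = ((u_tgt[:, None, :] - u_ref[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * scale ** 2))

u_ref_latent, u_new_latent = encode(x_ref), encode(x_new)
p = edge_probs(u_new_latent, u_ref_latent)

# Monte Carlo posterior predictive: sample dependency graphs and, for each
# graph, predict each target as the mean of its sampled parents' outputs.
# (A target that draws no parents falls back to a zero prediction here.)
preds = []
for _ in range(500):
    A = rng.random(p.shape) < p                        # sample bipartite graph
    w = A / np.maximum(A.sum(axis=1, keepdims=True), 1)
    preds.append(w @ y_ref)
preds = np.stack(preds)

for xi, m, s in zip(x_new.ravel(), preds.mean(0).ravel(), preds.std(0).ravel()):
    print(f"x = {xi:+.2f}: predictive mean {m:+.3f}, std {s:.3f}")
```

The Monte Carlo loop approximates a posterior predictive by averaging over sampled dependency graphs; a target whose latent sits far from every reference latent draws few parents, so its sampled predictions vary more. In the actual FNP the encoder, graph, and predictive distribution are all learned jointly from data, as the abstract notes.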
