

Poster in Workshop: Bayesian Decision-making and Uncertainty: from probabilistic and spatiotemporal modeling to sequential experiment design

Data-Efficient Variational Mutual Information Estimation via Bayesian Self-Consistency

Desi R Ivanova · Marvin Schmitt · Stefan Radev

Keywords: [ Mutual Information ] [ Variational Inference ] [ Bayesian experimental design ] [ amortized inference ]


Abstract:

Mutual information (MI) is a central quantity of interest in information theory and machine learning, but estimating it accurately and efficiently remains challenging. In this paper, we propose a novel approach that exploits Bayesian self-consistency to improve the data efficiency of variational MI estimators. Our method incorporates a principled variance penalty that encourages consistency in marginal likelihood estimates, ultimately leading to more accurate MI estimation and posterior approximation with fewer gradient steps. We demonstrate the effectiveness of our method on two tasks: (1) MI estimation for correlated Gaussian distributions; and (2) Bayesian experimental design for the Michaelis-Menten model. Our results show that the self-consistent estimator converges faster while producing higher-quality MI and posterior estimates than baseline estimators.
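The abstract gives no implementation details, so the following is only a minimal sketch of the general idea under stated assumptions: a variational (Barber-Agakov-style) lower bound on MI trained jointly with a variance penalty that encourages the implied log marginal-likelihood estimates, log p(y | θ') + log p(θ') − log q(θ' | y), to agree across posterior samples θ' for the same y. The toy Gaussian model and all names (GaussianPosterior, lambda_sc, k_sc) are illustrative assumptions, not the authors' code or their exact objective.

```python
# Minimal sketch (not the authors' implementation): variational MI lower bound
# plus a Bayesian self-consistency variance penalty, on a 1-D Gaussian toy model.
import torch
import torch.nn as nn
import torch.distributions as D

class GaussianPosterior(nn.Module):
    """Amortized posterior q(theta | y): Gaussian with mean/log-std predicted from y."""
    def __init__(self, hidden=64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(1, hidden), nn.ReLU(), nn.Linear(hidden, 2))

    def forward(self, y):
        mean, log_std = self.net(y).chunk(2, dim=-1)
        return D.Normal(mean, log_std.exp())

def loss_fn(q_posterior, prior, likelihood_fn, theta, y, lambda_sc=1.0, k_sc=8):
    """Negative Barber-Agakov MI bound plus a self-consistency variance penalty."""
    q = q_posterior(y)

    # Barber-Agakov lower bound: I(theta; y) >= E_{p(theta, y)}[log q(theta | y)] + H[p(theta)]
    ba_bound = q.log_prob(theta).sum(-1).mean() + prior.entropy().sum()

    # Self-consistency: for a fixed y, log p(y | theta') + log p(theta') - log q(theta' | y)
    # estimates log p(y) and should not depend on theta'; penalize its variance over
    # k_sc posterior samples theta'.
    theta_prime = q.rsample((k_sc,))                       # (k_sc, batch, 1)
    log_evidence = (likelihood_fn(theta_prime).log_prob(y).sum(-1)
                    + prior.log_prob(theta_prime).sum(-1)
                    - q.log_prob(theta_prime).sum(-1))     # (k_sc, batch)
    sc_penalty = log_evidence.var(dim=0).mean()

    return -ba_bound + lambda_sc * sc_penalty

# Toy model (assumed for illustration): theta ~ N(0, 1), y | theta ~ N(theta, 0.5)
prior = D.Normal(torch.zeros(1), torch.ones(1))
likelihood_fn = lambda theta: D.Normal(theta, 0.5)

q_posterior = GaussianPosterior()
opt = torch.optim.Adam(q_posterior.parameters(), lr=1e-3)

for step in range(500):
    theta = prior.sample((256,))              # sample from the prior
    y = likelihood_fn(theta).sample()         # simulate data from the likelihood
    loss = loss_fn(q_posterior, prior, likelihood_fn, theta, y)
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Setting lambda_sc to zero recovers a plain variational MI estimator; the penalty only adds extra posterior samples per batch, which is where the claimed data efficiency would come from in this reading of the abstract.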
