Poster

Breaking the Bandwidth Barrier: Geometrical Adaptive Entropy Estimation

Weihao Gao · Sewoong Oh · Pramod Viswanath

Area 5+6+7+8 #64

Keywords: [ Information Theory ] [ (Other) Statistics ]


Abstract: Estimators of information-theoretic measures such as entropy and mutual information from samples are a basic workhorse for many downstream applications in modern data science. State-of-the-art approaches have been either geometric (nearest-neighbor (NN) based) or kernel-based (with a bandwidth chosen to be data-independent and vanishing sublinearly in the sample size). In this paper we combine both these approaches to design new estimators of entropy and mutual information that strongly outperform all state-of-the-art methods. Our estimator uses a bandwidth choice of fixed $k$-NN distances; such a choice is both data-dependent and linearly vanishing in the sample size, and necessitates a bias-cancellation term that is universal and independent of the underlying distribution. As a byproduct, we obtain a unified way of obtaining both kernel and NN estimators. The corresponding theoretical contribution relating the geometry of NN distances to asymptotic order statistics is of independent mathematical interest.
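To make the geometric (NN-based) ingredient concrete, the sketch below implements the classical Kozachenko–Leonenko $k$-NN differential entropy estimator that this line of work builds on: the $k$-th nearest-neighbor distance plays the role of a data-dependent bandwidth, and the digamma terms supply the universal, distribution-independent bias correction. This is a minimal illustration of the underlying idea, not the paper's proposed estimator; the function name `kl_entropy` and the brute-force distance computation are choices made here for brevity.

```python
import numpy as np
from scipy.special import digamma, gammaln

def kl_entropy(samples, k=3):
    """Kozachenko-Leonenko k-NN estimate of differential entropy (in nats).

    samples: (n, d) array of i.i.d. draws from a continuous distribution.
    k: which nearest neighbor to use as the local, data-dependent bandwidth.
    """
    n, d = samples.shape
    # Pairwise Euclidean distances (brute force, O(n^2 d); fine for a sketch).
    dists = np.linalg.norm(samples[:, None, :] - samples[None, :, :], axis=-1)
    # After sorting each row, column 0 is the self-distance (0),
    # so column k holds the distance to the k-th nearest neighbor.
    eps = np.sort(dists, axis=1)[:, k]
    # log volume of the unit ball in R^d: pi^(d/2) / Gamma(d/2 + 1).
    log_cd = (d / 2) * np.log(np.pi) - gammaln(d / 2 + 1)
    # digamma(n) - digamma(k) is the universal bias-correction term.
    return digamma(n) - digamma(k) + log_cd + d * np.mean(np.log(eps))
```

For example, on samples from a standard 1-D Gaussian the estimate should approach the true entropy $\tfrac{1}{2}\log(2\pi e) \approx 1.419$ nats as the sample size grows.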
