

Poster

Entropy testing and its application to testing Bayesian networks

Clément L Canonne · Qiping Yang

West Ballroom A-D #5707
Wed 11 Dec 4:30 p.m. PST — 7:30 p.m. PST

Abstract: This paper studies the problem of \emph{entropy identity testing}: given sample access to a distribution $p$ and a fully described distribution $q$ (both discrete distributions over a domain of size $k$), and the promise that either $p = q$ or $|H(p) - H(q)| \geqslant \varepsilon$, where $H(\cdot)$ denotes the Shannon entropy, a tester needs to distinguish between the two cases with high probability. We establish a near-optimal sample complexity bound of $\tilde{\Theta}(\sqrt{k}/\varepsilon + 1/\varepsilon^2)$ for this problem, and show how to apply it to the problem of identity testing for in-degree-$d$, $n$-dimensional Bayesian networks, obtaining an upper bound of $\tilde{O}\left( 2^{d/2} n/\varepsilon^2 + n^2/\varepsilon^4 \right)$. This improves on the sample complexity bound of $\tilde{O}(2^{d/2} n^2/\varepsilon^4)$ from Canonne, Diakonikolas, Kane, and Stewart (2020), which required an additional assumption on the structure of the (unknown) Bayesian network.
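
For concreteness, the promise problem can be written out as follows. This is a minimal formalization, assuming the domain is $[k] = \{1, \dots, k\}$, entropy is taken with the natural logarithm, and the standard success probability of $2/3$ stands in for "high probability"; these conventions are not fixed by the abstract itself.

\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
Given i.i.d.\ samples from an unknown distribution $p$ over $[k]$ and a full
description of $q$ over $[k]$, define the Shannon entropy
\[
  H(p) = -\sum_{i=1}^{k} p(i) \ln p(i).
\]
A tester for entropy identity testing must, with probability at least $2/3$,
output \textsf{accept} whenever $p = q$ and \textsf{reject} whenever
$|H(p) - H(q)| \geqslant \varepsilon$, using
$\tilde{\Theta}\!\bigl(\sqrt{k}/\varepsilon + 1/\varepsilon^2\bigr)$ samples.
\end{document}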
