

Poster

Generalization Bounds via Conditional $f$-Information

Ziqiao Wang · Yongyi Mao

Thu 12 Dec 11 a.m. PST — 2 p.m. PST

Abstract: In this work, we introduce novel information-theoretic generalization bounds using the conditional $f$-information framework, an extension of the traditional conditional mutual information (MI) framework. We provide a generic approach to deriving generalization bounds via $f$-information in the supersample setting, applicable to both bounded and unbounded loss functions. Unlike previous MI-based bounds, our proof strategy does not rely on upper bounding the cumulant-generating function (CGF) in the variational formula of MI. Instead, we set the CGF (or its upper bound) to zero by carefully choosing the measurable function in the variational formula. Although some of our techniques are inspired by recent advances in the coin-betting framework, our results are independent of any prior findings from regret guarantees of online gambling algorithms. Additionally, our newly derived MI-based bound recovers many previous results and improves our understanding of their potential limitations. Finally, we empirically compare various $f$-information measures for generalization, demonstrating the improvement of our new bounds over previous ones.
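For background, the variational formula the abstract refers to is the standard convex-duality (Legendre–Fenchel) representation of an $f$-divergence; the displays below are a general-context sketch of this well-known machinery, not the paper's specific bounds. In the supersample (conditional MI) setting of Steinke and Zakynthinou, the relevant quantity is the information between the learned hypothesis $W$ and the selector variables $U$ given the supersample $\tilde{Z}$, i.e., $I(W; U \mid \tilde{Z})$ in the MI case. For any convex $f$ with conjugate $f^*(y) = \sup_x \{xy - f(x)\}$ and any measurable function $g$,

\[
D_f(P \,\|\, Q) \;\ge\; \mathbb{E}_P[g(X)] - \mathbb{E}_Q[f^*(g(X))].
\]

Taking $f(x) = x \log x$ and optimizing over constant shifts of $g$ recovers the Donsker–Varadhan formula for the KL divergence underlying MI,

\[
D_{\mathrm{KL}}(P \,\|\, Q) \;=\; \sup_{g} \Big\{ \mathbb{E}_P[g(X)] - \log \mathbb{E}_Q\big[e^{g(X)}\big] \Big\},
\]

in which $\log \mathbb{E}_Q[e^{g(X)}]$ plays the role of the CGF. Prior MI-based generalization bounds typically proceed by upper bounding this CGF term; the strategy described in the abstract instead selects $g$ so that the conjugate (CGF) term vanishes, leaving a bound of the form $D_f(P \,\|\, Q) \ge \mathbb{E}_P[g(X)]$ directly.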
