

Poster

Information-theoretic Generalization Analysis for Expected Calibration Error

Futoshi Futami · Masahiro Fujisawa

Wed 11 Dec 11 a.m. PST — 2 p.m. PST

Abstract:

While the expected calibration error (ECE), which employs binning, is widely adopted to evaluate the calibration performance of machine learning models, theoretical understanding of its estimation bias is limited. In this paper, we present the first comprehensive analysis of the estimation bias in the two common binning strategies, uniform-mass and uniform-width binning. Our analysis establishes upper bounds on the bias, achieving an improved convergence rate. Moreover, our bounds reveal, for the first time, the optimal number of bins that minimizes the estimation bias. We further extend our bias analysis to generalization error analysis based on the information-theoretic approach, deriving upper bounds that enable numerical evaluation of how small the ECE is for unknown data. Experiments using deep learning models show that our bounds are non-vacuous thanks to this information-theoretic generalization analysis approach.
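To make the object of study concrete, the binned ECE estimator the abstract refers to can be sketched as follows. This is a minimal illustration of the standard estimator with the two binning strategies mentioned (uniform-width and uniform-mass), not the authors' analysis or code; the function name, signature, and the choice of 15 bins are illustrative assumptions.

```python
import numpy as np

def binned_ece(confidences, correct, n_bins=15, strategy="width"):
    """Estimate the expected calibration error (ECE) via binning.

    strategy="width": uniform-width bins partitioning [0, 1];
    strategy="mass":  uniform-mass bins (roughly equal samples per bin,
                      via empirical quantiles of the confidences).
    Returns sum over bins of (bin mass) * |bin accuracy - bin confidence|.
    """
    confidences = np.asarray(confidences, dtype=float)
    correct = np.asarray(correct, dtype=float)
    if strategy == "width":
        edges = np.linspace(0.0, 1.0, n_bins + 1)
    elif strategy == "mass":
        edges = np.quantile(confidences, np.linspace(0.0, 1.0, n_bins + 1))
    else:
        raise ValueError("strategy must be 'width' or 'mass'")
    # Assign each sample to a bin; clip so confidence == edges[-1]
    # falls into the last bin rather than past it.
    idx = np.clip(np.searchsorted(edges, confidences, side="right") - 1,
                  0, n_bins - 1)
    err = 0.0
    for b in range(n_bins):
        mask = idx == b
        if mask.any():
            # |average accuracy - average confidence|, weighted by bin mass
            err += mask.mean() * abs(correct[mask].mean()
                                     - confidences[mask].mean())
    return err
```

The estimation bias analyzed in the paper arises because this plug-in estimate, computed from a finite sample, systematically deviates from the population ECE, and the number of bins `n_bins` trades off discretization error against per-bin estimation noise.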
