

Poster

Better Uncertainty Calibration via Proper Scores for Classification and Beyond

Sebastian Gruber · Florian Buettner

Hall J (level 1) #429

Keywords: [ Predictive Uncertainty ] [ Classification ] [ Calibration ] [ Regression ]


Abstract:

With model trustworthiness being crucial for sensitive real-world applications, practitioners are increasingly focused on improving the uncertainty calibration of deep neural networks. Calibration errors are designed to quantify the reliability of probabilistic predictions, but their estimators are usually biased and inconsistent. In this work, we introduce the framework of proper calibration errors, which relates every calibration error to a proper score and provides a respective upper bound with optimal estimation properties. This relationship can be used to reliably quantify a model's calibration improvement. We theoretically and empirically demonstrate the shortcomings of commonly used estimators compared to our approach. Due to the wide applicability of proper scores, this gives a natural extension of recalibration beyond classification.
