

Poster

LaSCal: Label-Shift Calibration without target labels

Teodora Popordanoska · Gorjan Radevski · Tinne Tuytelaars · Matthew Blaschko

Fri 13 Dec 4:30 p.m. PST — 7:30 p.m. PST

Abstract:

When machine learning systems face dataset shift, model calibration plays a pivotal role in ensuring their reliability. Calibration error (CE) provides insight into the alignment between the predicted confidence scores and the classifier accuracy. While prior works have delved into the implications of dataset shift on calibration, existing CE estimators either (i) assume access to labeled data from the target domain, often unavailable in practice, or (ii) are derived under a covariate shift assumption. In this work, we propose a novel, label-free, consistent CE estimator under label shift, characterized by changes in the marginal label distribution p(Y) while the conditional distribution p(X|Y) remains constant between source and target. Furthermore, we introduce a novel calibration method, called LaSCal, which uses the estimator in conjunction with a post-hoc calibration strategy to perform unsupervised calibration on the target distribution. Our thorough empirical analysis demonstrates the effectiveness and reliability of the proposed approach across different modalities, model architectures, and label shift intensities.
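To make the label-shift setting concrete, below is a minimal sketch of one way a calibration error could be estimated on the target domain without target labels. This is not the paper's LaSCal estimator: it assumes the importance weights w(y) = p_t(y)/p_s(y) are estimated with black-box shift estimation (BBSE, a standard label-shift technique), and then reweights a binned ECE computed on labeled source data. All function names are illustrative.

```python
# Sketch: importance-weighted ECE under label shift (NOT the paper's method).
# Assumption: weights w(y) = p_t(y)/p_s(y) come from BBSE-style estimation,
# i.e. solving C w = mu_t, where C is the source joint histogram of
# (predicted, true) labels and mu_t is the target prediction histogram.
import numpy as np

def estimate_label_shift_weights(probs_src, y_src, probs_tgt, n_classes):
    """BBSE-style weights w[y] = p_t(y) / p_s(y) from unlabeled target data."""
    pred_src = probs_src.argmax(axis=1)
    pred_tgt = probs_tgt.argmax(axis=1)
    # C[i, j] = P_s(pred = i, true = j), estimated on labeled source data.
    C = np.zeros((n_classes, n_classes))
    for p, y in zip(pred_src, y_src):
        C[p, y] += 1.0
    C /= len(y_src)
    # mu_t[i] = P_t(pred = i), estimated on unlabeled target data.
    mu_t = np.bincount(pred_tgt, minlength=n_classes) / len(pred_tgt)
    # Under label shift, mu_t = C @ w; solve for w in least squares.
    w, *_ = np.linalg.lstsq(C, mu_t, rcond=None)
    return np.clip(w, 0.0, None)  # weights must be non-negative

def weighted_ece(probs_src, y_src, weights, n_bins=15):
    """Binned ECE on labeled source data, reweighted by w(y) toward the target."""
    conf = probs_src.max(axis=1)
    correct = (probs_src.argmax(axis=1) == y_src).astype(float)
    w = weights[y_src]  # per-sample label-shift weight
    bins = np.linspace(0.0, 1.0, n_bins + 1)
    ece, total = 0.0, w.sum()
    for lo, hi in zip(bins[:-1], bins[1:]):
        mask = (conf > lo) & (conf <= hi)
        if not mask.any() or w[mask].sum() == 0:
            continue
        wb = w[mask]
        # Weighted |accuracy - confidence| gap in this bin.
        ece += wb.sum() / total * abs(
            np.average(correct[mask], weights=wb)
            - np.average(conf[mask], weights=wb)
        )
    return ece
```

The paper's estimator is proven consistent and is paired with post-hoc calibration (LaSCal); the sketch above only illustrates the general reweighting idea that label shift makes possible without target labels.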
