

Poster
in
Workshop: Statistical Frontiers in LLMs and Foundation Models

Obtaining Conformal Prediction-like guarantees by standard concentration: an observation

Emmanouil Seferis

Keywords: [ Learning Theory ] [ Conformal Prediction ] [ Measure Concentration ]

Sat 14 Dec 3:45 p.m. PST — 4:30 p.m. PST

Abstract:

Conformal Prediction (CP) has recently been used successfully as a distribution-free method for providing statistical guarantees on the performance of black-box Machine Learning (ML) models. For example, it can construct prediction sets that are guaranteed to contain the ground-truth class with a pre-specified probability, such as 90%. This work started with a question: is CP the only way to obtain such guarantees? Interestingly, we find that similar bounds can be obtained by standard concentration arguments, albeit with a penalty in sample complexity (quadratic rather than linear in the inverse error, as for CP). On the other hand, our simple derivation covers not only standard classification-based CP, but also Conformal Risk Control, a recent and technically nontrivial extension of CP. Overall, we believe the intersection between CP and classical concentration / learning theory is a promising direction for future exploration.
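To make the guarantee described above concrete, the following is a minimal sketch of split conformal prediction for classification on synthetic data. All names and the simulated classifier are illustrative assumptions, not taken from the paper: calibration scores are `1 - p(true class)`, and the prediction set for a test point contains every class whose score falls below the conformal quantile, targeting 90% coverage.

```python
import numpy as np

rng = np.random.default_rng(0)
n_cal, n_test, K = 2000, 2000, 5  # calibration size, test size, number of classes

def simulate(n):
    """Hypothetical classifier: softmax scores where the true class is likelier."""
    y = rng.integers(0, K, size=n)
    logits = rng.normal(size=(n, K))
    logits[np.arange(n), y] += 2.0  # boost the logit of the true class
    p = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
    return p, y

p_cal, y_cal = simulate(n_cal)
p_test, y_test = simulate(n_test)

alpha = 0.1  # target miscoverage: sets should cover the truth ~90% of the time
# Nonconformity score: 1 minus the softmax probability of the true class.
scores = 1.0 - p_cal[np.arange(n_cal), y_cal]
# Conformal quantile with the standard (n+1)/n finite-sample correction.
q = np.quantile(scores, np.ceil((n_cal + 1) * (1 - alpha)) / n_cal)

# Prediction set: all classes whose nonconformity score is at most q.
sets = (1.0 - p_test) <= q
coverage = sets[np.arange(n_test), y_test].mean()
print(f"empirical coverage: {coverage:.3f}")
```

With exchangeable calibration and test data, the empirical coverage concentrates near the 90% target; the concentration-based alternative discussed in the abstract would instead bound the coverage estimate via, e.g., Hoeffding's inequality, at the cost of a quadratic dependence on the inverse error.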
