

Poster

Conformalized Credal Set Predictors

Alireza Javanmardi · David Stutz · Eyke Hüllermeier

Wed 11 Dec 11 a.m. PST — 2 p.m. PST

Abstract:

Credal sets are sets of probability distributions that are considered as candidates for an imprecisely known ground-truth distribution. In machine learning, they have recently attracted attention as an appealing formalism for uncertainty representation, in particular due to their ability to capture both aleatoric and epistemic uncertainty in a prediction. However, designing methods for learning credal set predictors remains a challenging problem. In this paper, we make use of conformal prediction for this purpose. More specifically, we propose a method for predicting credal sets in the classification task, given training data labeled by probability distributions. Since our method inherits the coverage guarantees of conformal prediction, our conformal credal sets are guaranteed to be valid with high probability, without any assumptions on the model or the data distribution. We demonstrate the applicability of our method to ambiguous classification tasks for uncertainty quantification.
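To illustrate the general idea, the sketch below shows split conformal prediction applied to data with probability-distribution labels. It is not the authors' exact construction: the total-variation nonconformity score, the ball-shaped credal set, and all function names (`tv_distance`, `calibrate`, `credal_set_membership`, `predict_proba`) are illustrative assumptions chosen to show how a conformal coverage guarantee can yield a set of candidate distributions.

```python
import numpy as np


def tv_distance(p, q):
    """Total-variation distance between discrete distributions (rowwise)."""
    return 0.5 * np.abs(p - q).sum(axis=-1)


def conformal_quantile(scores, alpha):
    """Finite-sample-corrected (1 - alpha) quantile of calibration scores."""
    n = len(scores)
    level = min(np.ceil((n + 1) * (1 - alpha)) / n, 1.0)
    return np.quantile(scores, level, method="higher")


def calibrate(predict_proba, X_cal, P_cal, alpha=0.1):
    """Compute the conformal threshold from a held-out calibration set.

    predict_proba: model mapping inputs (n, d) to predicted class distributions (n, K).
    P_cal:         calibration labels given as probability vectors, shape (n, K).
    """
    scores = tv_distance(predict_proba(X_cal), P_cal)
    return conformal_quantile(scores, alpha)


def credal_set_membership(predict_proba, x, q, threshold):
    """Check whether a candidate distribution q lies in the credal set for input x.

    Here the credal set is the TV-ball of radius `threshold` around the model's
    predicted distribution; by the standard split-conformal argument it contains
    the unknown ground-truth distribution with probability at least 1 - alpha.
    """
    p_hat = predict_proba(x[None, :])[0]
    return tv_distance(p_hat[None, :], q[None, :])[0] <= threshold
```

In practice one would calibrate once (`threshold = calibrate(model.predict_proba, X_cal, P_cal, alpha)`) and then report, for each test input, the resulting ball in the probability simplex as the predicted credal set.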
