

Poster in Workshop: Bayesian Decision-making and Uncertainty: from probabilistic and spatiotemporal modeling to sequential experiment design

Decision-Driven Calibration for Cost-Sensitive Uncertainty Quantification

Gregory Canal · Vladimir Leung · John Guerrerio · Philip Sage · I-Jeng Wang

Keywords: [ temperature scaling ] [ decision-driven calibration ] [ uncertainty quantification ]


Abstract:

In recent years, the ability of artificial intelligence (AI) systems to quantify their uncertainty has become paramount to building trustworthy AI. In standard uncertainty quantification (UQ), AI uncertainty is calibrated so that the confidence of a model's predictions matches the statistics of the underlying data distribution. However, this form of calibration does not account for the direct influence of UQ on the subsequent actions taken by a downstream decision-maker. Here we demonstrate an alternative, decision-driven method of UQ calibration that explicitly minimizes the costs incurred by downstream decisions. After formulating decision-driven calibration as an optimization program with respect to a known decision-maker, we show in a simulated search-and-rescue scenario how decision-driven temperature scaling can lead to lower incurred decision costs.
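The abstract does not include the authors' formulation, so the following is only a minimal sketch of what decision-driven temperature scaling could look like under some simplifying assumptions: a finite action set, a fixed cost matrix `C[a, y]` giving the cost of action `a` when the true class is `y`, and a known decision-maker that picks the action minimizing expected cost under the calibrated probabilities. All names here (`decision_cost`, `decision_driven_temperature`, `cost_matrix`) are hypothetical, not from the paper.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over the class axis.
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def decision_cost(logits, labels, T, cost_matrix):
    """Average realized cost when a cost-minimizing decision-maker
    acts on temperature-scaled probabilities softmax(logits / T).

    Assumed setup (not from the paper): cost_matrix[a, y] is the cost
    of taking action a when the true class is y.
    """
    probs = softmax(logits / T)
    # Expected cost of each action under the calibrated probabilities:
    # E_p[cost of a] = sum_y p(y) * C[a, y].
    expected_costs = probs @ cost_matrix.T      # shape (n, n_actions)
    actions = expected_costs.argmin(axis=1)     # decision-maker's choices
    return cost_matrix[actions, labels].mean()  # cost actually incurred

def decision_driven_temperature(logits, labels, cost_matrix,
                                temps=np.geomspace(0.05, 20.0, 200)):
    """Grid search for the temperature minimizing incurred decision cost
    on held-out validation data (a stand-in for the paper's optimization
    program, whose exact form the abstract does not specify)."""
    costs = [decision_cost(logits, labels, T, cost_matrix) for T in temps]
    return temps[int(np.argmin(costs))]

# Hypothetical search-and-rescue-style costs (2 actions x 2 true states):
# rows = actions (dispatch, hold); columns = truth (survivor, no survivor).
C = np.array([[1.0, 1.0],    # dispatching always costs 1 (resources)
              [10.0, 0.0]])  # holding costs 10 if a survivor is missed
```

A grid search is used here because the decision-maker's argmin makes the incurred cost piecewise constant and non-differentiable in the temperature, unlike standard temperature scaling, which minimizes a smooth negative log-likelihood; contrasting the two objectives is precisely the point of decision-driven calibration as described in the abstract.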
