

Poster in Workshop: Statistical Frontiers in LLMs and Foundation Models

Conformal Reasoning: Uncertainty Estimation in Interactive Environments

Eric Frankel · Stella Li · Lillian Ratliff · Yulia Tsvetkov · Sewoong Oh · Pang Wei Koh

Sat 14 Dec, noon to 12:45 p.m. PST

Abstract:

We introduce conformal reasoning, a principled method for models in interactive environments to reason about their uncertainty and decide whether to seek out more information or to return a prediction. The challenge with standard conformal prediction, a popular statistical framework for uncertainty estimation that constructs prediction sets with formal coverage guarantees, is that it relies on a fixed set of calibration data points. In interactive environments, however, calibration trajectories require termination criteria determined a priori, which introduces heuristic bias and/or a circular dependency that breaks the assumptions needed for the coverage guarantees. We address this issue by building on techniques from adaptive conformal inference, which is traditionally used in online settings to account for distribution shift. On two real-world tasks, medical diagnosis and embodied question answering, we show that conformal reasoning empirically achieves its theoretical coverage guarantees, in contrast with standard conformal prediction approaches that can significantly over- or under-cover, while improving exploration efficiency by approximately 20% on both tasks.
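As background for the adaptive conformal inference technique the abstract builds on, the Python sketch below shows the standard ACI feedback update of Gibbs and Candes (2021) applied to split-conformal prediction sets. This is general background only, not the authors' conformal reasoning algorithm: the function names, the step size gamma, the exponential toy scores, and the dictionary-of-candidate-scores representation are all illustrative assumptions.

    import numpy as np

    def adaptive_alpha_update(alpha_t, target_alpha, covered, gamma=0.01):
        # Adaptive conformal inference (Gibbs & Candes, 2021):
        #   alpha_{t+1} = alpha_t + gamma * (target_alpha - err_t),
        # where err_t = 1 if the prediction set missed the truth, else 0.
        err_t = 0.0 if covered else 1.0
        return alpha_t + gamma * (target_alpha - err_t)

    def prediction_set(cal_scores, candidate_scores, alpha_t):
        # Split-conformal set: keep every candidate whose nonconformity score
        # is at most the (1 - alpha_t) empirical quantile of calibration scores.
        q_level = float(np.clip(1.0 - alpha_t, 0.0, 1.0))  # alpha_t may drift outside [0, 1]
        q_hat = np.quantile(cal_scores, q_level, method="higher")
        return {label for label, s in candidate_scores.items() if s <= q_hat}

    # Toy online loop with synthetic scores: each realized coverage error feeds
    # back into alpha_t, which is what lets ACI track a shifting distribution.
    rng = np.random.default_rng(0)
    alpha_t, target_alpha = 0.1, 0.1
    cal_scores = rng.exponential(size=500)
    for _ in range(1000):
        candidate_scores = {label: rng.exponential() for label in ("A", "B", "C")}
        pred = prediction_set(cal_scores, candidate_scores, alpha_t)
        alpha_t = adaptive_alpha_update(alpha_t, target_alpha, covered=("A" in pred))

The step size gamma trades off how quickly alpha_t adapts against how noisy the realized coverage signal is; Gibbs and Candes show that the long-run empirical miscoverage of this update averages to target_alpha even under distribution shift, which is the property that makes it a natural starting point when a fixed calibration set is unavailable.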
