

Poster

Linear Uncertainty Quantification of Graphical Model Inference

Chenghua Guo · Han Yu · Jiaxin Liu · Chao Chen · Qi Li · Sihong Xie · Xi Zhang

Wed 11 Dec 11 a.m. PST — 2 p.m. PST

Abstract:

Uncertainty Quantification (UQ) is vital for decision makers as it offers insights into the reliability of data and models, enabling more informed and risk-aware decision-making. Graphical models, known for their ability to represent data with complex dependencies, are widely used across various domains. Existing sampling-based UQ methods are unbiased but cannot guarantee convergence and are time-consuming on large-scale graphs. Fast UQ methods for graphical models offer closed-form solutions and convergence guarantees, but they suffer from biased underestimation of uncertainty. We propose LinUProp, a UQ method that uses a novel linear propagation of uncertainty, modeling uncertainty among related nodes additively rather than multiplicatively, to achieve linear scalability, guaranteed convergence, and unbiased closed-form solutions. Theoretically, we decompose the expected prediction error of the graphical model and prove that the uncertainty computed by LinUProp is the generalized variance component of this decomposition. Experimentally, we demonstrate that LinUProp is consistent with the sampling-based method while offering linear scalability and fast convergence. Moreover, LinUProp outperforms competitors in uncertainty-based active learning on four real-world graph datasets, achieving higher accuracy with a lower labeling budget.
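To make the "additive rather than multiplicative" idea concrete, below is a minimal sketch of what a linear (additive) uncertainty propagation over a graph might look like. The abstract does not give LinUProp's actual update rule, so the damping factor, symmetric normalization, and damped fixed-point iteration here are illustrative assumptions, not the paper's method; the key point is that each step is a sparse matrix-vector product, so cost scales linearly in the number of edges and the damped iteration converges to a closed-form fixed point.

```python
# Illustrative sketch of additive (linear) uncertainty propagation on a graph.
# NOTE: the exact LinUProp update is not specified in the abstract; the
# parameters `alpha`, the normalization, and the update form are assumptions.
import numpy as np

def propagate_uncertainty(adj, u0, alpha=0.5, tol=1e-8, max_iter=1000):
    """Spread per-node prior uncertainty u0 over the graph additively.

    Each iteration combines a weighted sum of neighbor uncertainties with the
    node's own prior, so the per-step cost is one sparse matrix-vector product.
    """
    deg = adj.sum(axis=1)
    d_inv_sqrt = np.where(deg > 0, 1.0 / np.sqrt(deg), 0.0)
    # Symmetrically normalized adjacency: spectral radius <= 1, so the damped
    # iteration below converges for any 0 < alpha < 1.
    a_norm = d_inv_sqrt[:, None] * adj * d_inv_sqrt[None, :]

    u = u0.copy()
    for _ in range(max_iter):
        u_next = alpha * (a_norm @ u) + (1.0 - alpha) * u0
        if np.max(np.abs(u_next - u)) < tol:
            break
        u = u_next
    return u

# Toy usage: a 4-node path graph with high prior uncertainty at one end.
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
u0 = np.array([1.0, 0.1, 0.1, 0.1])
print(propagate_uncertainty(adj, u0))
```

Under these assumptions the fixed point also has a closed form, u* = (1 - alpha)(I - alpha * A_norm)^{-1} u0, which is one way a propagation scheme of this kind can admit the closed-form, guaranteed-convergence solution the abstract describes; the precise form used by LinUProp is given in the paper itself.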
