Poster in Workshop: Bayesian Decision-making and Uncertainty: from probabilistic and spatiotemporal modeling to sequential experiment design
Efficient Local Unlearning for Gaussian Processes with Out-of-Distribution Data
Juliusz Ziomek · Ilija Bogunovic
Keywords: [ Gaussian process ] [ unlearning ]
Gaussian Processes (GPs) offer robust uncertainty estimates crucial for data-efficient applications such as Black-box Optimization or Model Predictive Control. However, when the underlying function changes, previously gathered data can mislead predictions and degrade performance. Instead of indiscriminately removing all data points (or a large fraction of them) after detecting a change, the goal is to efficiently identify and remove only the obsolete data points, a process we refer to as unlearning in GPs. Leveraging the model's uncertainty estimates, we transform the unlearning problem into one of maximizing the posterior variance (nearly reverting it to its GP prior value) at detected change points by selectively removing the most informative training points. Although solving this problem exactly is NP-hard, we propose an efficient algorithm that approximates the optimal solution at a substantially reduced computational cost. The algorithm relies on novel fast reverse update equations for GP models, enabling linear-time sequential computation of the posterior variance function after training points are removed. We evaluate our unlearning procedure across various tasks, including Model Predictive Control, Transfer Bayesian Optimization, and Time-Varying Bayesian Optimization. Our approach offers a comprehensive solution for handling out-of-distribution issues in GP modeling, significantly outperforming baseline methods.
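The abstract describes the method only at a high level. As a purely illustrative sketch (not the authors' algorithm, and without their fast reverse update equations), the snippet below shows how one could greedily remove training points from a GP with a squared-exponential kernel so as to maximize the posterior variance at a detected change point, using the standard block-inverse downdate to avoid refactorizing the kernel matrix after each removal. All names (`rbf_kernel`, `greedy_unlearn`, `budget`, `x_change`) and parameter values are assumptions made for the example.

```python
import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel matrix between two sets of 1-D inputs."""
    sq_dists = (X1[:, None] - X2[None, :]) ** 2
    return variance * np.exp(-0.5 * sq_dists / lengthscale ** 2)

def posterior_variance(x_star, X_train, K_inv, lengthscale=1.0, variance=1.0):
    """GP posterior variance at test inputs x_star, given training inputs
    X_train and the precomputed inverse of (K + noise * I)."""
    k_star = rbf_kernel(X_train, x_star, lengthscale, variance)   # (n, m)
    k_ss = rbf_kernel(x_star, x_star, lengthscale, variance)      # (m, m)
    return np.diag(k_ss - k_star.T @ K_inv @ k_star)

def downdate_inverse(K_inv, i):
    """Inverse of the kernel matrix with row/column i removed, obtained from
    the full inverse via the block-inverse identity (no refactorization)."""
    idx = np.delete(np.arange(K_inv.shape[0]), i)
    A = K_inv[np.ix_(idx, idx)]
    b = K_inv[idx, i]
    d = K_inv[i, i]
    return A - np.outer(b, b) / d

def greedy_unlearn(X_train, x_change, budget=3, noise=1e-2, lengthscale=1.0):
    """Greedily remove up to `budget` training points so as to maximize the
    GP posterior variance at the detected change point(s) x_change."""
    n = len(X_train)
    K_inv = np.linalg.inv(rbf_kernel(X_train, X_train, lengthscale)
                          + noise * np.eye(n))
    keep = list(range(n))
    for _ in range(budget):
        best_j, best_var, best_inv = None, -np.inf, None
        for j in range(len(keep)):
            # Candidate model with the j-th retained point removed.
            cand_inv = downdate_inverse(K_inv, j)
            cand_idx = np.delete(keep, j)
            var = posterior_variance(x_change, X_train[cand_idx],
                                     cand_inv, lengthscale).sum()
            if var > best_var:
                best_j, best_var, best_inv = j, var, cand_inv
        keep.pop(best_j)
        K_inv = best_inv
    return keep  # indices of the training points retained after unlearning

# Example usage with synthetic inputs and a hypothetical change point:
X_train = np.random.uniform(0.0, 5.0, size=30)
x_change = np.array([2.5])
kept_indices = greedy_unlearn(X_train, x_change, budget=5)
```

In this naive form each greedy step still scans every remaining point, and each candidate downdate costs O(n²); the linear-time sequential computation of the posterior variance claimed in the abstract is what the paper's reverse update equations would provide and is not reproduced here.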