

Poster

Noise-Aware Differentially Private Regression via Meta-Learning

Ossi Räisä · Stratis Markou · Matthew Ashman · Wessel Bruinsma · Marlon Tobaben · Antti Honkela · Richard Turner

Thu 12 Dec 11 a.m. PST — 2 p.m. PST

Abstract:

Many high-stakes applications require machine learning models that protect user privacy and provide well-calibrated, accurate predictions. While Differential Privacy (DP) is the gold standard for protecting user privacy, standard DP mechanisms typically impair performance significantly. One approach to mitigating this issue is pre-training models on simulated data before DP learning on the private data. In this work we go a step further, using simulated data to train a meta-learning model that combines the Convolutional Conditional Neural Process (ConvCNP) with an improved version of the functional DP mechanism of Hall et al. (2013), yielding the DPConvCNP. The DPConvCNP learns from simulated data how to map private data to a DP predictive model in a single forward pass, and then provides accurate, well-calibrated predictions. We compare the DPConvCNP with a DP Gaussian Process (GP) baseline with carefully tuned hyperparameters. The DPConvCNP outperforms the GP baseline, especially on non-Gaussian data, yet is much faster at test time and requires less tuning.
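To make the idea of combining a ConvCNP with a functional DP mechanism concrete, the sketch below shows one plausible way the pieces could fit together: the context set is mapped to a gridded functional embedding (the ConvCNP's density and data channels), each point's contribution is clipped, and Gaussian noise is added to the embedding before it is passed to the downstream CNN. This is a minimal illustration, not the paper's mechanism; the function names (`setconv_embedding`, `dp_setconv_embedding`), the clipping scheme, and the sensitivity bound are assumptions made for the example, whereas the actual DPConvCNP uses an improved version of Hall et al.'s (2013) functional mechanism with noise calibrated in the associated RKHS.

```python
import numpy as np

def setconv_embedding(x_ctx, y_ctx, x_grid, lengthscale=0.1):
    """Map a 1D context set to density and data channels on a fixed grid,
    the functional embedding used by a ConvCNP (illustrative version)."""
    # RBF weights between grid locations and context inputs.
    w = np.exp(-0.5 * ((x_grid[:, None] - x_ctx[None, :]) / lengthscale) ** 2)
    density = w.sum(axis=1)   # density channel: how much context mass is nearby
    signal = w @ y_ctx        # data channel: smoothed context outputs
    return density, signal

def dp_setconv_embedding(x_ctx, y_ctx, x_grid, epsilon, delta,
                         y_clip=1.0, lengthscale=0.1, rng=None):
    """Hypothetical DP version of the embedding: clip each context point's
    influence, then add Gaussian noise scaled to an assumed sensitivity."""
    rng = np.random.default_rng() if rng is None else rng
    # Bound each point's contribution by clipping its output value.
    y_clipped = np.clip(y_ctx, -y_clip, y_clip)
    density, signal = setconv_embedding(x_ctx, y_clipped, x_grid, lengthscale)
    # Illustrative sensitivity bound: adding/removing one point changes the
    # (density, signal) channels by at most sqrt(1 + y_clip^2) per grid cell.
    sensitivity = np.sqrt(1.0 + y_clip ** 2)
    # Standard Gaussian-mechanism noise scale for (epsilon, delta)-DP.
    sigma = sensitivity * np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon
    density_dp = density + rng.normal(0.0, sigma, size=density.shape)
    signal_dp = signal + rng.normal(0.0, sigma, size=signal.shape)
    # The noised channels would then be fed to the ConvCNP's CNN decoder.
    return density_dp, signal_dp

# Usage sketch on toy data.
x_ctx = np.array([-0.5, 0.0, 0.7])
y_ctx = np.array([0.3, -1.2, 0.8])
x_grid = np.linspace(-1.0, 1.0, 64)
density_dp, signal_dp = dp_setconv_embedding(x_ctx, y_ctx, x_grid,
                                             epsilon=1.0, delta=1e-3)
```

Because the noise is injected into the functional embedding rather than into gradients or parameters, a single forward pass through the trained network already yields a DP predictive distribution, which is what allows test-time prediction to be fast and tuning-light relative to the DP GP baseline.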
