Poster in Workshop: Federated Learning: Recent Advances and New Challenges
The Interpolated MVU Mechanism For Communication-efficient Private Federated Learning
Chuan Guo · Kamalika Chaudhuri · Pierre Stock · Mike Rabbat
Abstract:
We consider private federated learning (FL), in which a server aggregates differentially private gradient updates from a large number of clients in order to train a machine learning model. The central challenge is balancing privacy against both the classification accuracy of the learned model and the amount of communication between the clients and the server. In this work, we build on a recently proposed method for communication-efficient private FL, the MVU mechanism, by introducing a new interpolation procedure that admits a more efficient privacy analysis. The result is the new Interpolated MVU mechanism, which provides state-of-the-art results for communication-efficient private FL on a variety of datasets.
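To make the communication-efficiency aspect concrete, the sketch below shows unbiased stochastic (randomized-rounding) quantization of a clipped gradient coordinate to a few bits, which is the compression component underlying mechanisms of this kind. This is an illustrative sketch only: the actual MVU mechanism additionally optimizes the output sampling distribution so that the quantized message itself satisfies differential privacy with minimum variance, which is not implemented here. All function names and parameters are hypothetical.

```python
import numpy as np

def stochastic_quantize(x, bits=3, clip=1.0, rng=None):
    """Unbiased stochastic quantization of clipped values to 2**bits levels.

    Illustrative sketch (not the MVU mechanism itself): the MVU mechanism
    further randomizes the quantized output to provide differential privacy;
    here we show only the unbiased randomized-rounding compression step.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.clip(np.asarray(x, dtype=float), -clip, clip)
    levels = 2 ** bits - 1
    # Map [-clip, clip] onto the integer grid [0, levels].
    scaled = (x + clip) / (2 * clip) * levels
    low = np.floor(scaled)
    # Round up with probability equal to the fractional part -> unbiased.
    q = low + (rng.random(np.shape(x)) < (scaled - low))
    return q.astype(int)

def dequantize(q, bits=3, clip=1.0):
    """Map integer codes back to [-clip, clip]; E[dequantize(q)] = x."""
    levels = 2 ** bits - 1
    return np.asarray(q, dtype=float) / levels * (2 * clip) - clip
```

Each coordinate is transmitted as a `bits`-bit integer rather than a 32-bit float, and averaging many dequantized client messages recovers the true mean gradient in expectation.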