Poster
Personalized Federated Learning via Feature Distribution Adaptation
Connor Mclaughlin · Lili Su
Federated learning (FL) is a distributed learning framework that leverages commonalities between distributed client datasets to train a global model. Under heterogeneous clients, however, FL can fail to produce stable training results. Personalized federated learning (PFL) seeks to address this by learning individual models tailored to each client. One approach is to decompose model training into shared representation learning and personalized classifier training. Nonetheless, previous works struggle to navigate the bias-variance trade-off in classifier learning, either relying solely on limited local datasets or introducing costly techniques to improve generalization. In this work, we frame global representation learning as a generative modeling task, with the representation trained under a classifier based on the global feature distribution. We then propose an algorithm (pFedFDA) that efficiently generates personalized models by adapting global generative classifiers to their local feature distributions. Through extensive computer vision benchmarks, we demonstrate that our method can adjust to complex distribution shifts, with significant improvements over the current state of the art in data-scarce settings. Our source code will be made publicly available.
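To make the abstract's core idea concrete, below is a minimal, hedged sketch of a generative (Gaussian) classifier over a shared feature space, personalized by interpolating global and local class-conditional feature statistics. The interpolation coefficient `beta`, the shared-covariance assumption, the synthetic data, and all function names are illustrative assumptions for exposition only, not the authors' exact pFedFDA procedure.

```python
# Sketch: Gaussian generative classifier on features, personalized by blending
# global and local feature statistics (assumed form; not the paper's algorithm).
import numpy as np


def estimate_feature_stats(features, labels, num_classes):
    """Per-class feature means and a pooled (shared) covariance estimate."""
    d = features.shape[1]
    means = np.zeros((num_classes, d))
    cov = np.zeros((d, d))
    for c in range(num_classes):
        z_c = features[labels == c]
        if len(z_c) > 1:
            means[c] = z_c.mean(axis=0)
            cov += np.cov(z_c, rowvar=False) * (len(z_c) - 1)
        elif len(z_c) == 1:
            means[c] = z_c[0]
    cov /= max(len(features) - num_classes, 1)
    return means, cov


def adapt_statistics(global_stats, local_stats, beta=0.5):
    """Blend global and local statistics; beta=1 is purely local (assumed rule)."""
    g_means, g_cov = global_stats
    l_means, l_cov = local_stats
    means = beta * l_means + (1.0 - beta) * g_means
    cov = beta * l_cov + (1.0 - beta) * g_cov
    return means, cov


def gaussian_classifier_scores(features, means, cov, priors):
    """Linear discriminant scores under class-conditional Gaussians with shared covariance."""
    inv = np.linalg.pinv(cov)
    lin = features @ inv @ means.T                      # z^T Sigma^{-1} mu_c
    quad = 0.5 * np.einsum("cd,de,ce->c", means, inv, means)
    return lin - quad + np.log(priors + 1e-12)


if __name__ == "__main__":
    # Synthetic stand-in for extractor features; the "local" client is small and shifted.
    rng = np.random.default_rng(0)
    num_classes, d = 3, 8
    shifts = rng.normal(scale=3.0, size=(num_classes, d))

    def sample(n, scale=1.0):
        y = rng.integers(0, num_classes, n)
        return shifts[y] * scale + rng.normal(size=(n, d)), y

    global_z, global_y = sample(3000)
    local_z, local_y = sample(40, scale=1.3)
    test_z, test_y = sample(500, scale=1.3)

    g_stats = estimate_feature_stats(global_z, global_y, num_classes)
    l_stats = estimate_feature_stats(local_z, local_y, num_classes)
    means, cov = adapt_statistics(g_stats, l_stats, beta=0.5)
    priors = np.bincount(local_y, minlength=num_classes) / len(local_y)
    preds = gaussian_classifier_scores(test_z, means, cov, priors).argmax(axis=1)
    print("personalized accuracy:", (preds == test_y).mean())
```

The design choice illustrated here is that the local client never fits a full classifier from scratch; it only re-estimates (and blends) low-dimensional feature statistics, which is how a generative-classifier view can trade bias against variance when local data are scarce.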