Poster in Workshop: 5th Workshop on Self-Supervised Learning: Theory and Practice
Decoupling Vertical Federated Learning using Local Self-Supervision
Avi Amalanshu · Yash Sirvi · David Inouye
Vertical Federated Learning (VFL) enables collaborative learning between clients who hold disjoint features of common entities. However, standard VFL lacks fault tolerance: each participant and connection is a single point of failure. Prior attempts to make VFL fault tolerant focus on the scenario of "straggling clients," typically assuming that all messages eventually arrive or that the number of late messages is bounded. To handle the more general problem of arbitrary crashes, we propose Decoupled VFL (DVFL). DVFL tolerates faults during training by decoupling training between communication rounds using local unsupervised objectives. By further decoupling label supervision from aggregation, DVFL also enables redundant aggregators. As secondary benefits, DVFL can enhance data efficiency and security against gradient-based attacks. In this work, we implement DVFL for split neural networks with a self-supervised autoencoder loss. It performs comparably to VFL on a split-MNIST task and degrades more gracefully under faults than our best VFL-based method. We also discuss its gradient privacy and demonstrate its data efficiency.
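To make the decoupling concrete, the PyTorch sketch below illustrates the core idea under stated assumptions; it is not the authors' implementation. Each client updates its feature encoder with a purely local autoencoder reconstruction loss, so no gradient messages from the aggregator are required and a crashed peer simply contributes nothing that round. All class names, dimensions, and hyperparameters (DVFLClient, in_dim=392, emb_dim=32, the Adam learning rate) are illustrative assumptions.

import torch
import torch.nn as nn

class DVFLClient(nn.Module):
    """One party's local model: an encoder trained via a self-supervised
    autoencoder loss, decoupled from the aggregator's supervised loss."""

    def __init__(self, in_dim: int, emb_dim: int):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, 64), nn.ReLU(),
                                     nn.Linear(64, emb_dim))
        self.decoder = nn.Sequential(nn.Linear(emb_dim, 64), nn.ReLU(),
                                     nn.Linear(64, in_dim))

    def local_step(self, x: torch.Tensor, opt: torch.optim.Optimizer) -> float:
        # One self-supervised update: reconstruct the client's own features.
        # Gradients stay local; nothing is backpropagated from the server.
        opt.zero_grad()
        z = self.encoder(x)
        loss = nn.functional.mse_loss(self.decoder(z), x)
        loss.backward()
        opt.step()
        return loss.item()

# Each client trains independently between communication rounds; a
# (possibly redundant) aggregator later consumes the embeddings
# z = encoder(x) for the supervised task.
client = DVFLClient(in_dim=392, emb_dim=32)  # e.g., half of a split-MNIST image
opt = torch.optim.Adam(client.parameters(), lr=1e-3)
x = torch.randn(8, 392)                      # toy batch of this client's features
print(client.local_step(x, opt))

Because the loss is computed entirely from local data, this sketch also hints at the secondary benefits claimed above: no label-dependent gradients ever leave the aggregator, and unlabeled samples can still drive the encoder updates.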