Oral in Workshop: International Workshop on Federated Foundation Models in Conjunction with NeurIPS 2024 (FL@FM-NeurIPS'24)
EncCluster: Bringing Functional Encryption in Federated Foundational Models
Vasileios Tsouvalas · Samaneh Mohammadi · Ali Balador · Tanir Özçelebi · Francesco Flammini · Nirvana Meratnia
Abstract:
Federated Learning (FL) decentralizes model training by transmitting local model updates to a central server, yet it remains vulnerable to inference attacks during these transmissions. Existing solutions, such as Differential Privacy (DP) and Functional Encryption (FE), often degrade performance or impose significant operational burdens on clients. Meanwhile, the advent of Foundation Models (FMs) has transformed FL with their adaptability and high performance across diverse tasks. However, delivering strong privacy guarantees for these highly parameterized FMs in FL with existing privacy-preserving frameworks amplifies existing challenges and further complicates the efficiency-privacy trade-off. We present EncCluster, a novel method that integrates model compression through weight clustering with decentralized FE and privacy-enhancing data encoding using probabilistic filters, delivering strong privacy guarantees in FL without degrading model performance or placing undue burdens on clients. We perform a comprehensive evaluation, spanning $4$ datasets and $5$ architectures, to demonstrate EncCluster's scalability across encryption levels. Our findings reveal that EncCluster significantly reduces communication costs (below even those of conventional FedAvg) and accelerates encryption by up to $1000\times$ over baselines, while maintaining high model accuracy and strong privacy assurances.
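To make the compression step concrete, the sketch below illustrates weight clustering of a model update with plain 1-D k-means in NumPy. This is our own toy illustration under stated assumptions, not EncCluster's actual implementation: the function names (`cluster_weights`, `reconstruct`) are hypothetical, and the paper's decentralized FE and probabilistic-filter encoding of the clustered update are omitted here. The point it demonstrates is why clustering helps communication: a client would transmit only a small centroid codebook plus per-weight cluster indices instead of the full-precision weight vector.

```python
import numpy as np

def cluster_weights(weights: np.ndarray, n_clusters: int = 16,
                    n_iters: int = 20, seed: int = 0):
    """Quantize a flat weight vector into n_clusters centroids via 1-D k-means.

    Returns (centroids, assignments). Instead of the full weight vector,
    a client would send the centroid codebook (n_clusters floats) plus
    the per-weight indices (log2(n_clusters) bits each) -- hypothetical
    stand-in for EncCluster's weight-clustering stage, before any
    encryption or filter encoding is applied.
    """
    rng = np.random.default_rng(seed)
    # Initialize centroids by sampling distinct existing weight values.
    centroids = rng.choice(weights, size=n_clusters, replace=False)
    for _ in range(n_iters):
        # Assign every weight to its nearest centroid.
        assignments = np.abs(weights[:, None] - centroids[None, :]).argmin(axis=1)
        # Recompute each centroid as the mean of its assigned weights.
        for k in range(n_clusters):
            members = weights[assignments == k]
            if members.size:
                centroids[k] = members.mean()
    return centroids, assignments

def reconstruct(centroids: np.ndarray, assignments: np.ndarray) -> np.ndarray:
    """Server-side decoding: map each index back to its centroid value."""
    return centroids[assignments]

# Toy usage: a 10k-parameter "model update" compressed to a 16-entry codebook.
w = np.random.default_rng(1).normal(size=10_000).astype(np.float32)
codebook, idx = cluster_weights(w, n_clusters=16)
w_hat = reconstruct(codebook, idx)
print("mean abs reconstruction error:", np.abs(w - w_hat).mean())
```

With 16 clusters, each weight costs 4 bits of index instead of 32 bits of float, roughly an 8x reduction before any further encoding; in EncCluster, only this already-compact representation would then pass through FE and probabilistic-filter encoding, which is presumably what keeps the client-side encryption burden low.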