Poster in Workshop: Federated Learning: Recent Advances and New Challenges
DASH: Decentralized CASH for Federated Learning
Md Ibrahim Ibne Alam · Koushik Kar · Theodoros Salonidis · Horst Samulowitz
We present DASH, a decentralized framework that addresses, for the first time, the Combined Algorithm Selection and Hyperparameter Optimization (CASH) problem in Federated Learning (FL) settings. DASH generates algorithm-hyperparameter (Alg-HP) pairs using existing centralized HPO algorithms; each client then evaluates these pairs individually on its local dataset. The clients transmit their loss values to the server, which aggregates them into a loss signal that guides the selection of the next Alg-HP pair. This approach avoids the communication cost of evaluating candidate pairs through communication-intensive FL training, which is performed only once the final Alg-HP pair has been selected. Thus, DASH allows sophisticated HPO algorithms to run at the FL server, while clients perform only simple local model training and evaluation on their individual datasets. We provide a theoretical analysis of the loss attained by DASH relative to a fully centralized solution (with access to all client datasets), and show that the regret depends on the dissimilarity between the clients' datasets, a consequence of the FL restriction that client datasets remain private. Experimental studies on several datasets show that DASH performs favorably against several baselines and closely approximates centralized CASH performance.
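The search loop described above can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: all function names are hypothetical, random search stands in for the centralized HPO algorithm at the server, and each client's local evaluation is abstracted as a toy loss with a client-specific optimum (modeling dataset dissimilarity).

```python
import random

def propose_candidate(rng):
    """Server-side HPO step: pick an algorithm and a hyperparameter value.
    A real deployment would plug in a centralized HPO method here; random
    search is used only as a stand-in."""
    alg = rng.choice(["alg_a", "alg_b"])  # hypothetical algorithm names
    hp = rng.uniform(0.0, 1.0)
    return alg, hp

def local_loss(client_optimum, alg, hp):
    """Client-side evaluation on the local dataset, abstracted as a toy
    loss: squared distance of hp from a client-specific optimum, plus a
    per-algorithm offset."""
    offset = 0.0 if alg == "alg_a" else 0.1
    return (hp - client_optimum) ** 2 + offset

def dash_search(client_optima, rounds=50, seed=0):
    """Run the decentralized CASH loop: the server proposes Alg-HP pairs,
    clients report scalar losses, and the server aggregates them into a
    loss signal that selects the best pair. Only losses travel to the
    server; no data or model updates are exchanged during the search."""
    rng = random.Random(seed)
    best_pair, best_loss = None, float("inf")
    for _ in range(rounds):
        pair = propose_candidate(rng)
        losses = [local_loss(opt, *pair) for opt in client_optima]
        aggregate = sum(losses) / len(losses)  # server-side loss signal
        if aggregate < best_loss:
            best_pair, best_loss = pair, aggregate
    return best_pair, best_loss

# Three clients with dissimilar local optima (dataset heterogeneity);
# FL training would be run only on the pair this search returns.
pair, loss = dash_search([0.2, 0.5, 0.8])
```

In this toy setting the aggregate loss can never fall below the variance of the clients' optima, mirroring the abstract's point that the regret relative to a centralized solution grows with dataset dissimilarity.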