Poster in Workshop: Optimization for ML Workshop
Consensus Based Optimization Accelerates Gradient Descent
Anagha Satish · Ricardo Baptista · Franca Hoffmann
We propose a novel algorithm for integrating gradient information into Consensus Based Optimization (CBO), a recently proposed multi-particle gradient-free optimization method. During each iteration, a subset of particles is updated using local gradient information, while the others are updated using a traditional CBO step. We introduce a method for selecting this subset and investigate its empirical performance. The algorithm combines gradient and gradient-free optimization to encourage exploration of the state space while maintaining fast convergence. We investigate the tradeoff between accuracy and computational cost as the number of gradient evaluations is varied. When applied to classification tasks in machine learning, the proposed algorithm attains accuracy similar to that of ensemble gradient methods based on Gradient Descent or Adam, at a reduced computational cost.
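To make the hybrid update concrete, here is a minimal sketch of one such iteration: a chosen subset of particles takes a local gradient descent step while the remaining particles follow a standard (anisotropic) CBO step toward the consensus point. This is only an illustration under assumed parameter values and an assumed random subset-selection rule; it is not the authors' exact scheme.

```python
# Illustrative sketch, not the authors' method: one hybrid CBO/gradient iteration.
import numpy as np

def hybrid_cbo_step(X, f, grad_f, grad_idx,
                    alpha=30.0, lam=1.0, sigma=0.7, dt=0.01, eta=0.05):
    """X: (N, d) particle positions; grad_idx: indices updated by gradient descent.
    All parameter values here are assumptions for illustration."""
    # Consensus point: softmax-weighted average, concentrating on low-f particles.
    fvals = np.array([f(x) for x in X])
    w = np.exp(-alpha * (fvals - fvals.min()))          # shift for numerical stability
    m = (w[:, None] * X).sum(axis=0) / w.sum()

    X_new = X.copy()
    for i in range(len(X)):
        if i in grad_idx:
            # Gradient subset: plain local gradient step.
            X_new[i] = X[i] - eta * grad_f(X[i])
        else:
            # CBO subset: drift toward consensus plus anisotropic exploration noise.
            diff = X[i] - m
            noise = np.random.randn(X.shape[1])
            X_new[i] = X[i] - lam * dt * diff + sigma * np.sqrt(dt) * diff * noise
    return X_new

# Example: minimize a shifted quadratic with 20 particles, 5 of them using gradients.
f = lambda x: np.sum((x - 1.0) ** 2)
grad_f = lambda x: 2.0 * (x - 1.0)
X = np.random.randn(20, 2)
for _ in range(200):
    grad_idx = set(np.random.choice(len(X), size=5, replace=False))  # assumed subset rule
    X = hybrid_cbo_step(X, f, grad_f, grad_idx)
print("best f:", min(f(x) for x in X))
```

In this sketch, the number of indices in `grad_idx` plays the role of the gradient-evaluation budget discussed in the abstract: more gradient particles speed up local convergence but increase per-iteration cost, while the CBO particles retain the method's exploratory behavior.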