Poster in Workshop: OPT 2021: Optimization for Machine Learning
Optimization with Adaptive Step Size Selection from a Dynamical Systems Perspective
Neha Wadia · Michael Jordan · Michael Muehlebach
Abstract:
We investigate how adaptive step size methods from numerical analysis can be used to speed up optimization routines. In contrast to line search strategies, the proposed methods recycle gradient evaluations that are already available. Thus, neither evaluations of the objective function nor additional gradient evaluations are required, and the computational complexity per iteration remains at $\mathcal{O}(d)$, where $d$ is the problem dimension. On strongly convex functions, our results show a consistent reduction in the number of iterations, by up to a factor of $2$ for the gradient method and $1.4$ for the heavy ball method, across a range of condition numbers.
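To illustrate the kind of scheme the abstract describes, here is a minimal Python sketch of a gradient method whose step size is adapted using only the gradients that the method computes anyway, so the per-iteration cost stays at $\mathcal{O}(d)$ with no objective evaluations. The adaptation rule shown (growing or shrinking the step based on the agreement between consecutive gradients) is an assumed illustrative heuristic, not the authors' specific method; the growth/shrink factors `up` and `down` are likewise hypothetical.

```python
import numpy as np

def adaptive_gradient_descent(grad, x0, h0=0.1, n_iters=100,
                              up=1.1, down=0.5):
    """Gradient descent with a heuristic adaptive step size.

    Illustrative sketch only, not the paper's scheme: the step size
    is adapted from the agreement between consecutive gradients,
    which are computed anyway, so each iteration needs exactly one
    gradient evaluation and O(d) extra work.
    """
    x = np.asarray(x0, dtype=float)
    h = h0
    g_prev = grad(x)
    for _ in range(n_iters):
        x = x - h * g_prev
        g = grad(x)  # the only gradient evaluation this iteration
        # Cosine similarity of consecutive gradients (O(d) work).
        denom = np.linalg.norm(g) * np.linalg.norm(g_prev)
        cos = g @ g_prev / denom if denom > 0 else 0.0
        # Aligned gradients -> grow the step; disagreement -> shrink it.
        h = h * up if cos > 0 else h * down
        g_prev = g
    return x

# Example: strongly convex quadratic f(x) = 0.5 * x^T A x.
A = np.diag([1.0, 10.0])  # condition number 10
x_min = adaptive_gradient_descent(lambda x: A @ x, x0=[5.0, 5.0])
```

Because the adaptation reuses gradients rather than probing the objective, it avoids the extra function evaluations that a backtracking line search would incur at each step.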