Poster in Workshop: OPT 2023: Optimization for Machine Learning
Stochastic FISTA Step Search Algorithm for Convex Optimization
Trang H. Tran · Lam Nguyen · Katya Scheinberg
Abstract:
In this paper, we propose an accelerated stochastic step search algorithm that combines an accelerated method with a fully adaptive step size parameter for convex problems (Scheinberg et al., 2014) with the stochastic step search analysis of (Paquette and Scheinberg, 2020). Under appropriate conditions on the accuracy of the gradient and function-value estimates, our algorithm achieves an expected iteration complexity of $\mathcal{O}(1/\sqrt{\epsilon})$ to reach an $\epsilon$-accurate solution satisfying $f(x) - f_* \leq \epsilon$. This complexity matches that of the deterministic Nesterov accelerated gradient and FISTA algorithms (Nesterov, 1983; Beck and Teboulle, 2009). This paper continues the line of work on stochastic adaptive algorithms studied in (Berahas et al., 2021; Blanchet et al., 2019; Paquette and Scheinberg, 2020) and is the first to develop an accelerated gradient descent type algorithm in this domain.
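The abstract does not spell out the algorithm, but the deterministic skeleton it builds on is FISTA with a backtracking (step search) test on the step size. The following is a minimal sketch of that deterministic skeleton only; it is not the paper's stochastic method. The function names, the shrink factor `eta`, and the use of exact `f` and `grad` values (where the paper allows stochastic estimates) are assumptions made for illustration.

```python
import numpy as np

def fista_step_search(f, grad, x0, L0=1.0, eta=2.0, max_iter=1000):
    """Sketch of deterministic FISTA with backtracking step search.

    Assumptions (not from the paper): exact f and grad evaluations,
    smooth unconstrained objective, shrink factor eta = 2.
    """
    x = x0.copy()          # current iterate
    y = x0.copy()          # extrapolated (momentum) point
    t = 1.0                # momentum parameter
    L = L0                 # current estimate of the Lipschitz constant
    for _ in range(max_iter):
        g = grad(y)
        # Step search: increase L (shrink step 1/L) until the candidate
        # point satisfies the quadratic upper-bound condition at y.
        while True:
            x_new = y - g / L
            d = x_new - y
            if f(x_new) <= f(y) + g @ d + 0.5 * L * (d @ d):
                break
            L *= eta
        # Nesterov momentum update (Beck and Teboulle, 2009).
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)
        x, t = x_new, t_new
    return x

# Usage: minimize the smooth convex least-squares objective 0.5||Ax - b||^2.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)
f = lambda x: 0.5 * np.dot(A @ x - b, A @ x - b)
grad = lambda x: A.T @ (A @ x - b)
x_hat = fista_step_search(f, grad, np.zeros(5))
```

The stochastic version analyzed in the paper replaces the exact `f` and `grad` calls with estimates of controlled accuracy, and the step search test accepts or rejects steps based on those estimates.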