Talk in Workshop: OPT2020: Optimization for Machine Learning
Contributed Video: Distributed Proximal Splitting Algorithms with Rates and Acceleration, Laurent Condat
Abstract:
We propose new generic distributed proximal splitting algorithms, well suited for large-scale convex nonsmooth optimization. We derive sublinear and linear convergence results with new nonergodic rates, as well as new accelerated versions of the algorithms, using varying stepsizes.
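To give a concrete feel for proximal splitting, here is a minimal sketch of a classical forward-backward (proximal gradient) iteration on a lasso-type problem. This is an illustrative textbook scheme, not the distributed algorithms or the varying-stepsize acceleration proposed in the talk; the problem instance and stepsize choice are assumptions for demonstration.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximity operator of t * ||.||_1 (soft-thresholding).
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def forward_backward(A, b, lam, n_iter=200):
    """Minimize (1/2)||A x - b||^2 + lam * ||x||_1 by forward-backward splitting.

    Illustrative only: a generic proximal splitting scheme, not the
    paper's distributed algorithms. A constant stepsize in (0, 2/L) is
    used here; the talk's methods employ varying stepsizes to obtain
    acceleration.
    """
    L = np.linalg.norm(A, 2) ** 2  # Lipschitz constant of the gradient of the smooth term
    gamma = 1.8 / L                # constant stepsize in (0, 2/L) guarantees convergence
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)                            # forward (gradient) step
        x = soft_threshold(x - gamma * grad, gamma * lam)   # backward (proximal) step
    return x
```

The splitting idea is that the smooth term is handled by an explicit gradient step while the nonsmooth term is handled implicitly through its proximity operator, which here has a closed form (soft-thresholding).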