Poster
Escaping from saddle points on Riemannian manifolds
Yue Sun · Nicolas Flammarion · Maryam Fazel
East Exhibition Hall B, C #199
Keywords: [ Optimization ]
Abstract:
We consider minimizing a nonconvex, smooth function $f$ on a Riemannian manifold $\mathcal{M}$. We show that a perturbed version of Riemannian gradient descent converges to a second-order stationary point of this problem (and hence escapes saddle points on the manifold). While the unconstrained problem is well studied, our result is the first to establish such a convergence rate for nonconvex, manifold-constrained problems.
The convergence rate scales as $1/\epsilon^2$ in the accuracy $\epsilon$, matching a rate previously known only for unconstrained smooth minimization. The rate also depends polynomially on parameters capturing the curvature of the manifold and the smoothness of the function.
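As a rough illustration of the kind of algorithm studied here, the following is a minimal sketch of perturbed Riemannian gradient descent on the unit sphere, assuming a projection-based Riemannian gradient, a normalization retraction, and a random tangent-space perturbation injected whenever the gradient is small. The step size, gradient threshold, and perturbation radius below are illustrative choices, not the constants analyzed in the paper; the leading-eigenvector problem is a standard example of a nonconvex problem on a manifold whose saddle points are non-leading eigenvectors.

```python
import numpy as np

def tangent_project(x, v):
    """Project v onto the tangent space of the unit sphere at x."""
    return v - np.dot(x, v) * x

def retract(x, v):
    """Retraction on the sphere: step in the tangent space, then renormalize."""
    y = x + v
    return y / np.linalg.norm(y)

def perturbed_rgd(grad_f, x0, eta=0.01, eps=1e-3, r=1e-3,
                  max_iter=10_000, seed=0):
    """Sketch of perturbed Riemannian gradient descent on the sphere.

    eta: step size; eps: gradient-norm threshold triggering a perturbation;
    r: perturbation radius. All are illustrative, not the paper's constants.
    """
    rng = np.random.default_rng(seed)
    x = x0 / np.linalg.norm(x0)
    for _ in range(max_iter):
        g = tangent_project(x, grad_f(x))  # Riemannian gradient
        if np.linalg.norm(g) > eps:
            x = retract(x, -eta * g)       # ordinary Riemannian gradient step
        else:
            # Near a first-order stationary point: add a small random
            # tangent perturbation to escape strict saddle points.
            xi = tangent_project(x, rng.normal(size=x.shape))
            xi *= r / np.linalg.norm(xi)
            x = retract(x, -eta * g + xi)
    return x

# Example: leading eigenvector of a symmetric matrix A, i.e. minimizing
# f(x) = -x^T A x over the unit sphere.
n = 5
rng = np.random.default_rng(1)
B = rng.normal(size=(n, n))
A = (B + B.T) / 2
x_star = perturbed_rgd(lambda x: -2 * A @ x, rng.normal(size=n))
print(x_star @ A @ x_star)  # should approach the largest eigenvalue of A
```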