

Poster in Workshop: Optimization for ML Workshop

On the Hardness of Meaningful Local Guarantees in Nonsmooth Nonconvex Optimization

Guy Kornowski · Swati Padmanabhan · Ohad Shamir


Abstract: We study the oracle complexity of nonsmooth nonconvex optimization, with the algorithm allowed access only to local function information. It has been shown by Davis, Drusvyatskiy, and Jiang (2023) that for nonsmooth Lipschitz functions satisfying certain regularity and strictness conditions, perturbed gradient descent converges to local minimizers *asymptotically*. Motivated by this and other recent algorithmic advances in nonconvex nonsmooth optimization, we consider the question of obtaining a non-asymptotic rate of convergence to local minima for this problem class. We provide the following negative answer to this question: Local algorithms acting on regular Lipschitz functions *cannot*, in the worst case, provide meaningful local guarantees in terms of function value in sub-exponential time, even when all near-stationary points are global minima. This sharply contrasts with the smooth setting, for which standard gradient methods are known to do so at a dimension-free rate. Our result complements the rich body of work in the theoretical computer science literature that provides hardness results conditional on conjectures such as $\mathsf{P}\neq\mathsf{NP}$ or cryptographic assumptions, in that ours holds unconditionally, without relying on any such assumptions.
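For intuition about the local-oracle model the abstract refers to, below is a minimal, hypothetical sketch of a perturbed subgradient method in the spirit of the algorithm analyzed by Davis, Drusvyatskiy, and Jiang: at each iteration the algorithm queries only local information (a subgradient at a randomly perturbed point). This is not the construction or algorithm from the paper; the step size, noise scale, iteration budget, and toy objective are all illustrative assumptions.

```python
import numpy as np

def perturbed_subgradient_descent(subgrad, x0, step=1e-2, noise=1e-3,
                                  iters=10_000, rng=None):
    """Sketch of a perturbed (sub)gradient method on a nonsmooth Lipschitz
    function. Each step uses only local information: a subgradient evaluated
    at a randomly perturbed point. Hyperparameters are illustrative only."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        y = x + noise * rng.standard_normal(x.shape)  # random perturbation
        x = x - step * subgrad(y)                     # subgradient step at y
    return x

# Toy nonsmooth Lipschitz objective: f(x) = ||x||_1, with subgradient sign(x).
if __name__ == "__main__":
    f = lambda x: np.abs(x).sum()
    subgrad = lambda x: np.sign(x)
    x_final = perturbed_subgradient_descent(subgrad, x0=np.ones(5))
    print(f"f(x_final) = {f(x_final):.4f}")
```

The hardness result stated above concerns exactly this kind of local method: in the worst case over regular Lipschitz functions, no such algorithm can certify meaningful local guarantees on the function value in sub-exponential time, whereas in the smooth setting gradient descent reaches an $\epsilon$-stationary point in a dimension-free number of iterations.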
