Spotlight Talk in Workshop: Order up! The Benefits of Higher-Order Optimization in Machine Learning
Cubic Regularized Quasi-Newton Methods
Klea Ziu
Abstract:
In this paper, we propose a Cubic Regularized L-BFGS method. The Cubic Regularized Newton method outperforms the classical Newton method in terms of global convergence behavior. Classically, the L-BFGS approximation is applied to Newton's method to make it tractable. We propose a new variant of inexact Cubic Regularized Newton, and then use the L-BFGS approximation as the inexact Hessian within it. This yields better theoretical convergence rates and strong practical performance, especially from starting points where the classical Newton method diverges.
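To make the idea concrete, here is a minimal sketch of one such step: a dense Hessian approximation built from L-BFGS curvature pairs is plugged into a cubic-regularized Newton subproblem, solved here via the standard secular equation with an eigendecomposition. The function names (`lbfgs_hessian`, `cubic_step`), the memory size, and the regularization constant `M` are illustrative assumptions for this sketch, not the authors' implementation.

```python
import numpy as np

def lbfgs_hessian(pairs, n, gamma=1.0):
    """Dense Hessian approximation from L-BFGS curvature pairs (s, y).

    Starts from gamma * I and applies direct BFGS updates. Real L-BFGS
    stores only the pairs and never forms the matrix; this dense form
    is an assumption made for small-scale illustration.
    """
    B = gamma * np.eye(n)
    for s, y in pairs:
        Bs = B @ s
        B = B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)
    return B

def cubic_step(g, H, M):
    """One cubic-regularized Newton step with a (possibly inexact) Hessian H.

    Solves min_s  g^T s + 0.5 s^T H s + (M/6) ||s||^3 via the stationarity
    condition (H + (M r / 2) I) s = -g with r = ||s||, found by bisection
    on r. The degenerate "hard case" (g orthogonal to the bottom
    eigenspace) is ignored in this sketch.
    """
    w, Q = np.linalg.eigh(H)
    gq = Q.T @ g
    # The shift (M r / 2) must keep H + (M r / 2) I positive definite.
    r_lo = max(0.0, -2.0 * w.min() / M) + 1e-12
    norm_s = lambda r: np.linalg.norm(gq / (w + 0.5 * M * r))
    r_hi = r_lo + 1.0
    while norm_s(r_hi) > r_hi:           # grow until the root is bracketed
        r_hi *= 2.0
    for _ in range(100):                 # bisection on phi(r) = ||s(r)|| - r
        r = 0.5 * (r_lo + r_hi)
        if norm_s(r) > r:
            r_lo = r
        else:
            r_hi = r
    return -Q @ (gq / (w + 0.5 * M * r))
```

A toy run on a convex quadratic-plus-quartic objective (hypothetical, chosen so the curvature condition y^T s > 0 holds and the BFGS updates stay well defined):

```python
rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n)); A = A @ A.T
f_grad = lambda x: A @ x + x * (x @ x)   # gradient of 0.5 x'Ax + 0.25 ||x||^4
x = rng.standard_normal(n)
pairs = []
for _ in range(20):
    g = f_grad(x)
    H = lbfgs_hessian(pairs, n) if pairs else np.eye(n)
    s = cubic_step(g, H, M=10.0)
    pairs = (pairs + [(s, f_grad(x + s) - g)])[-5:]   # keep a short memory
    x = x + s
print(np.linalg.norm(f_grad(x)))         # gradient norm should be small
```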