Poster in Workshop: Order up! The Benefits of Higher-Order Optimization in Machine Learning
Improving Levenberg-Marquardt Algorithm for Neural Networks
Omead Pooladzandi · Yiming Zhou
Abstract:
We explore the use of the Levenberg-Marquardt (LM) algorithm for regression (non-linear least squares) and classification (generalized Gauss-Newton) tasks in neural networks. We compare the performance of the LM method with popular first-order algorithms such as SGD and Adam, as well as with second-order algorithms such as L-BFGS, Hessian-Free optimization, and KFAC. We further speed up the LM method with adaptive momentum, a learning-rate line search, and uphill step acceptance.
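For reference, a minimal sketch of the classic Levenberg-Marquardt step the abstract builds on is given below, in generic notation (residuals r, Jacobian J, damping lambda); the damping schedule and the momentum, line-search, and uphill-acceptance modifications described by the authors are not shown here and this is not their exact formulation.

% Classic LM step: given residuals r(\theta) \in \mathbb{R}^m and Jacobian
% J = \partial r / \partial \theta, solve the damped normal equations and update.
\[
    \left( J^\top J + \lambda I \right) \Delta\theta = - J^\top r(\theta),
    \qquad
    \theta \leftarrow \theta + \Delta\theta .
\]
% \lambda > 0 interpolates between a Gauss-Newton step (\lambda \to 0) and a small
% gradient-descent step (\lambda large); it is typically decreased after an accepted
% step and increased after a rejected one.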