

Poster in Workshop: Optimization for ML Workshop

Scalable Second-Order Optimization Algorithms for Minimizing Low-rank Functions

Edward Tansley · Coralia Cartis


Abstract:

We present a random-subspace variant of the cubic regularization algorithm that chooses the size of the subspace adaptively, based on the rank of the projected second-derivative matrix. At each iteration, our variant requires access only to (small-dimensional) projections of the first- and second-order problem derivatives and computes a reduced step inexpensively. The resulting method retains the optimal global rate of convergence of (full-dimensional) cubic regularization while exhibiting improved scalability, both theoretically and numerically, particularly when applied to low-rank functions. In the latter case, our algorithm naturally adapts the subspace size to the true rank of the function, without knowing it a priori.
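To make the idea concrete, below is a minimal Python sketch of one iteration of a random-subspace cubic regularization step. It assumes a Gaussian sketching matrix and a generic solver for the reduced cubic subproblem; the rank-based rule for updating the subspace size is an illustrative heuristic, not the paper's exact adaptation mechanism.

```python
import numpy as np
from scipy.optimize import minimize

def cubic_subproblem(g_hat, H_hat, sigma):
    """Approximately minimize the reduced cubic model
    m(s) = g^T s + 0.5 s^T H s + (sigma/3) ||s||^3 over the subspace."""
    def model(s):
        return (g_hat @ s + 0.5 * s @ H_hat @ s
                + (sigma / 3.0) * np.linalg.norm(s) ** 3)
    # Generic solver stand-in; a dedicated cubic-subproblem solver
    # would be used in practice.
    res = minimize(model, np.zeros_like(g_hat), method="BFGS")
    return res.x

def random_subspace_cr_step(grad, hess, x, ell, sigma, rng):
    """One random-subspace cubic regularization step (illustrative sketch):
    project the derivatives into an ell-dimensional random subspace,
    solve the reduced subproblem there, and adapt ell from the rank
    of the projected Hessian."""
    n = x.size
    S = rng.standard_normal((ell, n)) / np.sqrt(ell)  # Gaussian sketch
    g_hat = S @ grad(x)                               # projected gradient
    H_hat = S @ hess(x) @ S.T                         # projected Hessian
    s_hat = cubic_subproblem(g_hat, H_hat, sigma)
    step = S.T @ s_hat                                # lift step back to R^n
    r = np.linalg.matrix_rank(H_hat)
    # Illustrative heuristic (assumption): a full-rank projection suggests
    # the sketch may be too small to capture the function's rank, so grow
    # it; otherwise shrink toward the observed rank.
    ell_next = min(2 * ell, n) if r == ell else max(r + 1, 1)
    return x + step, ell_next
```

The key cost saving is that only the sketched quantities S @ grad(x) and S @ hess(x) @ S.T are ever formed, so the cubic subproblem is solved in dimension ell rather than n; for a low-rank function, the rank of the projected Hessian stabilizes at the function's true rank, which is what drives the adaptive choice of ell.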
