Poster in Workshop: Optimization for ML Workshop
Applications of fractional calculus in learned optimization
Teodor Szente · James Harrison · Mihai Zanfir · Cristian Sminchisescu
Abstract:
Fractional gradient descent has been studied extensively as an extension of traditional gradient descent in which the first-order derivative is replaced by a fractional-order derivative. This added flexibility can help navigate complex optimization landscapes and may offer advantages on certain classes of problems, particularly those involving nonlinearity and chaotic dynamics. Yet the challenge of tuning the fractional order parameter remains unresolved. In this work, we demonstrate that a neural network can be trained to predict the fractional order of the derivative effectively.
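To make the setting concrete, the sketch below shows one common form of a fractional gradient descent update, using the Caputo-based approximation D^α f(θ) ≈ f'(θ) · |θ − c|^(1−α) / Γ(2−α) with the previous iterate as the expansion point c. This is a minimal illustration of the general technique under those assumptions, not the authors' method; all function and variable names are hypothetical, and in the paper's setting the order α would be produced by a learned network rather than fixed.

```python
import math
import numpy as np

def fractional_gd_step(theta, theta_prev, grad, alpha, lr=0.1, eps=1e-8):
    """One per-coordinate fractional gradient descent step of order alpha in (0, 1].

    Uses the Caputo-based approximation with the previous iterate as the
    expansion point; at alpha = 1 the scale is 1 and vanilla GD is recovered.
    (Illustrative sketch; in learned optimization, alpha would come from a
    trained network rather than being a fixed hyperparameter.)
    """
    scale = np.abs(theta - theta_prev) ** (1.0 - alpha) / math.gamma(2.0 - alpha)
    # eps keeps the step nonzero when consecutive iterates coincide
    return theta - lr * grad * (scale + eps)

# Toy quadratic f(theta) = 0.5 * ||theta||^2, whose gradient is theta itself.
theta_prev = np.array([2.0, -1.5])
theta = np.array([1.8, -1.2])
for _ in range(50):
    grad = theta  # gradient of the toy quadratic
    theta, theta_prev = fractional_gd_step(theta, theta_prev, grad, alpha=0.8), theta
print(theta)  # approaches the minimizer at the origin
```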