Gradient Descent: The Ultimate Optimizer #112

Open · redknightlois opened this issue Feb 15, 2023 · 0 comments
Labels: feature request (Request features)

@redknightlois
https://arxiv.org/abs/1909.13371

Working with any gradient-based machine learning algorithm involves the tedious task of tuning the optimizer’s hyperparameters, such as its step size. Recent work has shown how the step size can itself be optimized alongside the model parameters by manually deriving expressions for “hypergradients” ahead of time. We show how to automatically compute hypergradients with a simple and elegant modification to backpropagation. This allows us to easily apply the method to other optimizers and hyperparameters (e.g. momentum coefficients). We can even recursively apply the method to its own hyper-hyperparameters, and so on ad infinitum. As these towers of optimizers grow taller, they become less sensitive to the initial choice of hyperparameters. We present experiments validating this for MLPs, CNNs, and RNNs. Finally, we provide a simple PyTorch implementation of this algorithm (see people.csail.mit.edu/kach/gradient-descent-the-ultimate-optimizer).

Reference implementation: https://github.com/kach/gradient-descent-the-ultimate-optimizer
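For context on what "manually deriving expressions for hypergradients" means (the baseline that the paper automates via backpropagation), here is a minimal sketch of hypergradient descent on SGD's step size. The toy objective, constants, and variable names below are illustrative assumptions, not taken from the paper or the reference repo:

```python
import torch

# Toy objective L(w) = ||w||^2, minimized by SGD whose step size alpha is
# itself updated by gradient descent on a hand-derived hypergradient.
# Since w_t = w_{t-1} - alpha * g_{t-1}, we get dL(w_t)/dalpha = -g_t . g_{t-1}.
w = torch.randn(10)
alpha = 0.01            # step size (the hyperparameter being learned)
kappa = 1e-4            # step size of the hyper-optimizer that updates alpha
prev_grad = torch.zeros_like(w)

for _ in range(100):
    grad = 2 * w                                    # dL/dw for L(w) = ||w||^2
    hypergrad = -torch.dot(grad, prev_grad).item()  # hand-derived dL/dalpha
    alpha -= kappa * hypergrad                      # descend on the step size itself
    w = w - alpha * grad                            # ordinary SGD step with the new alpha
    prev_grad = grad

print(f"final loss: {w.pow(2).sum().item():.6f}, learned alpha: {alpha:.6f}")
```

The paper's contribution is that the `hypergrad` line is computed automatically by autodiff for any optimizer and hyperparameter, rather than derived by hand as above.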

I had been using this to great effect on some small tasks, but the problem is that it is not very framework-friendly (clearly not a plug-and-play optimizer) and it requires engineering around how it works; see the sketch of the required training loop below. It would be great if you could figure out how to make it more plug-and-play.
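For reference, the training loop the reference implementation's README describes looks roughly like the sketch below (class and method names follow that repo; exact signatures may differ between versions, and the model and fake data are placeholders). It shows why the method is not drop-in: the model must be wrapped, `begin()` called before every step, and `backward(create_graph=True)` used so hypergradients can flow:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
# pip install gradient-descent-the-ultimate-optimizer
from gradient_descent_the_ultimate_optimizer import gdtuo

model = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 128), nn.ReLU(), nn.Linear(128, 10))

# Placeholder data: ten random batches standing in for a real DataLoader.
dataloader = [(torch.randn(32, 1, 28, 28), torch.randint(0, 10, (32,))) for _ in range(10)]

# Adam optimizes the model's weights; its hyperparameters are in turn
# optimized by an SGD hyper-optimizer with step size 1e-5.
optim = gdtuo.Adam(optimizer=gdtuo.SGD(1e-5))
mw = gdtuo.ModuleWrapper(model, optimizer=optim)
mw.initialize()

for x, y in dataloader:
    mw.begin()                        # extra bookkeeping required before every step
    loss = F.cross_entropy(mw.forward(x), y)
    mw.zero_grad()
    loss.backward(create_graph=True)  # create_graph=True is needed for hypergradients
    mw.step()
```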

@kozistr kozistr added the feature request Request features label Feb 15, 2023
@kozistr kozistr self-assigned this Feb 15, 2023