
Implementing other "Optimized Gradient Descent Variants" #152

Open
kwanit1142 opened this issue Mar 18, 2021 · 0 comments
kwanit1142 (Collaborator) commented Mar 18, 2021

Is your feature request related to a problem? Please describe.
In addition to the gradient descent variants already implemented from research papers, there are other, more advanced optimizers that could be added as well.

Describe the solution you'd like
https://ruder.io/optimizing-gradient-descent/ (see this until more references are uploaded)
https://www.kdnuggets.com/2019/06/gradient-descent-algorithms-cheat-sheet.html

Describe alternatives you've considered
Decide whether to add the new classes to Optimizer.py or to implement them in the Optim folder instead.
Explore the options and add further resources to this issue as they are found.
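As a starting point, the update rules in Ruder's overview translate directly into small classes. Below is a minimal NumPy sketch of one such variant, Adam; the class name and `update` method signature are hypothetical, since the interface in Optimizer.py hasn't been decided yet:

```python
import numpy as np

class Adam:
    """Sketch of the Adam optimizer, one of the variants covered in
    Ruder's overview. Names here are illustrative only -- they do not
    reflect the project's actual Optimizer.py conventions."""

    def __init__(self, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
        self.lr = lr
        self.beta1 = beta1
        self.beta2 = beta2
        self.eps = eps
        self.m = None  # first-moment (mean) estimate
        self.v = None  # second-moment (uncentered variance) estimate
        self.t = 0     # timestep, used for bias correction

    def update(self, params, grads):
        if self.m is None:
            self.m = np.zeros_like(params)
            self.v = np.zeros_like(params)
        self.t += 1
        # Exponential moving averages of the gradient and its square
        self.m = self.beta1 * self.m + (1 - self.beta1) * grads
        self.v = self.beta2 * self.v + (1 - self.beta2) * grads ** 2
        # Bias-corrected estimates
        m_hat = self.m / (1 - self.beta1 ** self.t)
        v_hat = self.v / (1 - self.beta2 ** self.t)
        return params - self.lr * m_hat / (np.sqrt(v_hat) + self.eps)

# Toy usage: minimize f(x) = x^2, whose gradient is 2x
x = np.array([5.0])
opt = Adam(lr=0.1)
for _ in range(200):
    x = opt.update(x, 2 * x)
```

Each variant (Momentum, Adagrad, RMSprop, Nadam, ...) would follow the same pattern with a different `update` rule, which is an argument for a shared base class wherever these classes end up living.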
