Issues: google-deepmind/optax
Add a mathematical description of the algorithms
Labels: documentation (Improvements or additions to documentation), good first issue (Good for newcomers)
#757 opened Feb 2, 2024 by vroulet (5 of 19 tasks)
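As a sketch of the kind of description this issue asks for, the update rule of one such algorithm (Adam) could be written as below; the notation is illustrative, not taken from the optax docs.

```latex
% Adam update at step t, with gradient g_t, decay rates \beta_1, \beta_2,
% step size \alpha and numerical constant \varepsilon.
\begin{aligned}
m_t &= \beta_1 m_{t-1} + (1 - \beta_1)\, g_t, \\
v_t &= \beta_2 v_{t-1} + (1 - \beta_2)\, g_t^2, \\
\hat{m}_t &= m_t / (1 - \beta_1^t), \qquad
\hat{v}_t = v_t / (1 - \beta_2^t), \\
\theta_t &= \theta_{t-1} - \alpha\, \hat{m}_t / \bigl(\sqrt{\hat{v}_t} + \varepsilon\bigr).
\end{aligned}
```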
Updates dtype does not need to match params dtype, only grads dtype, a priori
#1098 opened Oct 9, 2024 by vroulet
unitwise_norm fails for 3D convolutions
Labels: enhancement (New feature or request)
#906 opened Apr 5, 2024 by froody
Bug in optax 0.2.4: Adam optimiser does not work in a jax tracer function, but optax 0.2.3 does
#1159 opened Dec 19, 2024 by olive004
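A minimal sketch of the usage pattern the issue describes, an Adam update performed inside a jit-traced function; the toy loss and parameter shapes are assumptions for illustration.

```python
import jax
import jax.numpy as jnp
import optax

params = {"w": jnp.zeros(3)}
opt = optax.adam(learning_rate=1e-2)
opt_state = opt.init(params)

def loss_fn(p, x):
    return jnp.sum((p["w"] * x) ** 2)

@jax.jit  # the optimizer update runs inside a traced function, as in the report
def step(p, state, x):
    grads = jax.grad(loss_fn)(p, x)
    updates, state = opt.update(grads, state, p)
    return optax.apply_updates(p, updates), state

params, opt_state = step(params, opt_state, jnp.ones(3))
```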
Add an example of reading the learning rate from the optimizer state
Labels: good first issue (Good for newcomers)
#312 opened Mar 4, 2022 by rosshemsley
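One way such an example might look is via optax.inject_hyperparams, which makes the learning rate part of the optimizer state; this is a sketch of a possible example, not necessarily the one the issue will add.

```python
import jax.numpy as jnp
import optax

params = {"w": jnp.zeros(3)}

# Wrapping the optimizer factory makes its hyperparameters part of the state.
opt = optax.inject_hyperparams(optax.adam)(learning_rate=1e-3)
state = opt.init(params)

# The injected hyperparameters live in the `hyperparams` dict on the state.
print(state.hyperparams["learning_rate"])  # 0.001 (stored as a jax array)
```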
New Example on creating a custom GradientTransformation
Labels: documentation (Improvements or additions to documentation)
#652 opened Dec 2, 2023 by SauravMaheshkar
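A minimal sketch of what such an example could cover: a custom GradientTransformation that simply scales every update by a constant (the transform itself is illustrative).

```python
import jax
import optax

def scale_by_constant(factor: float) -> optax.GradientTransformation:
    """Illustrative custom transformation: multiply every update by `factor`."""

    def init_fn(params):
        del params  # this transform keeps no state
        return optax.EmptyState()

    def update_fn(updates, state, params=None):
        del params
        updates = jax.tree_util.tree_map(lambda u: factor * u, updates)
        return updates, state

    return optax.GradientTransformation(init_fn, update_fn)

# It composes with built-in transforms like any other GradientTransformation.
opt = optax.chain(scale_by_constant(0.5), optax.sgd(learning_rate=1e-2))
```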
explain what optax.centralize does
Labels: documentation (Improvements or additions to documentation), good first issue (Good for newcomers)
#1080 opened Oct 1, 2024 by fabianp
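The sketch below only shows how the transform is wired in; the comment gives one reading of its behaviour (gradient centralization, i.e. re-centring multi-dimensional gradient tensors around zero mean), which is exactly the kind of statement the issue asks to have documented authoritatively.

```python
import optax

# optax.centralize() is a GradientTransformation, so it composes with chain.
# Reading of its behaviour (to be confirmed by the docs this issue requests):
# gradient tensors with more than one dimension are shifted to zero mean
# before the optimizer step, following the gradient-centralization idea.
opt = optax.chain(
    optax.centralize(),
    optax.adam(learning_rate=1e-3),
)
```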
Allow all optimizer update methods to receive an optional value argument
#1131 opened Nov 9, 2024 by carlosgmartin
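For context, optax already exposes a mechanism for passing extra keyword arguments to update (GradientTransformationExtraArgs). The sketch below shows a transform whose update can consume an optional value keyword, purely as an illustration of the pattern the issue discusses; the transform and its name are made up.

```python
import optax

def scale_by_value_aware() -> optax.GradientTransformationExtraArgs:
    """Illustrative transform whose update can consume an optional `value`."""

    def init_fn(params):
        del params  # no state needed for this sketch
        return optax.EmptyState()

    def update_fn(updates, state, params=None, *, value=None, **extra_args):
        del params, extra_args
        # `value` (e.g. the current loss) could be used here when provided;
        # this sketch just passes the updates through unchanged.
        return updates, state

    return optax.GradientTransformationExtraArgs(init_fn, update_fn)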
How to use rectification with other optimizers?
Labels: enhancement (New feature or request)
#24 opened Feb 15, 2021 by DaniyarM
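One possible answer sketch: the RAdam-style rectified second-moment scaling is available as a standalone transform, so it can be chained with other pieces; whether this combination is what the issue author has in mind is an assumption.

```python
import optax

learning_rate = 1e-3

# Rectified-Adam scaling composed with decoupled weight decay, as a sketch of
# mixing rectification with other transforms (the combination is illustrative).
opt = optax.chain(
    optax.scale_by_radam(),
    optax.add_decayed_weights(1e-4),
    optax.scale(-learning_rate),
)
```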