docs: fix docstring
kozistr committed Nov 23, 2024
1 parent 90d43d4 commit e731a85
Showing 1 changed file with 0 additions and 1 deletion.
1 change: 0 additions & 1 deletion pytorch_optimizer/optimizer/adopt.py
```diff
@@ -14,7 +14,6 @@ class ADOPT(BaseOptimizer):
     :param weight_decay: float. weight decay (L2 penalty).
     :param weight_decouple: bool. the optimizer uses decoupled weight decay as in AdamW.
     :param fixed_decay: bool. fix weight decay.
-    :param r: float. EMA factor. between 0.9 ~ 0.99 is preferred.
     :param adanorm: bool. whether to use the AdaNorm variant.
     :param adam_debias: bool. Only correct the denominator to avoid inflating step sizes early in training.
     :param eps: float. term added to the denominator to improve numerical stability.
```
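For context, the commit removes the `:param r:` entry (an EMA factor) from the ADOPT docstring. Below is a minimal, untested usage sketch assuming the library's top-level `ADOPT` export; the model, learning rate, and other hyperparameter values are illustrative, and the keyword names are taken from the docstring shown in the diff.

```python
import torch
from pytorch_optimizer import ADOPT

# A toy model; any torch.nn.Module works here.
model = torch.nn.Linear(10, 2)

# Keywords mirror the docstring above; note that `r` is no longer
# documented as a parameter after this commit.
optimizer = ADOPT(
    model.parameters(),
    lr=1e-3,               # illustrative value
    weight_decay=1e-2,     # L2 penalty
    weight_decouple=True,  # decoupled weight decay, as in AdamW
)

# One illustrative optimization step.
loss = model(torch.randn(4, 10)).sum()
loss.backward()
optimizer.step()
optimizer.zero_grad()
```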
