
pytorch-optimizer v0.0.5

@kozistr released this on 22 Sep 07:42
3c7a89f
  • Combine AdaBoundW into the AdaBound optimizer via a weight_decouple parameter.
  • Implement the AdaBelief optimizer (see the usage sketch below):
    • Support fp16
    • Support decoupled weight decay (weight_decouple, AdamW scheme)
    • Support a rectified update similar to RAdam
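
A minimal usage sketch of the two changes above. Only the weight_decouple parameter is named in these notes; the pytorch_optimizer import path, the rectify keyword, and the remaining argument names are assumptions for illustration, not taken from this release.

```python
# Sketch only: `weight_decouple` comes from the release notes above;
# the import path and the `rectify` keyword are assumed, not confirmed here.
import torch
from pytorch_optimizer import AdaBound, AdaBelief

model = torch.nn.Linear(10, 2)

# AdaBound with decoupled (AdamW-style) weight decay,
# which covers the behavior of the former separate AdaBoundW optimizer.
optimizer = AdaBound(
    model.parameters(), lr=1e-3, weight_decay=1e-2, weight_decouple=True
)

# AdaBelief with decoupled weight decay and a RAdam-like rectified update
# (the `rectify` flag is a hypothetical name for the rectified-update option).
# optimizer = AdaBelief(
#     model.parameters(), lr=1e-3, weight_decay=1e-2,
#     weight_decouple=True, rectify=True,
# )

# Standard optimizer loop: backward pass, step, reset gradients.
loss = model(torch.randn(4, 10)).sum()
loss.backward()
optimizer.step()
optimizer.zero_grad()
```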