pytorch-optimizer v2.0.0

@kozistr kozistr released this 21 Oct 01:35
c6d64ef

Change Log

  • Refactor the package structure
    • 4 sub-packages
      • pytorch_optimizer.lr_scheduler : lr schedulers
      • pytorch_optimizer.optimizer : optimizers
      • pytorch_optimizer.base : base utils
      • pytorch_optimizer.experimental : any experimental features
    • pytorch_optimizer.adamp -> pytorch_optimizer.optimizer.adamp
    • from pytorch_optimizer import AdamP still works
  • Implement lr schedulers
    • CosineAnnealingWarmupRestarts
  • Implement (experimental) lr schedulers
    • DeBERTaV3-large layer-wise lr scheduler
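The backward-compatible import mentioned above works because the top-level package re-exports the moved classes. A minimal, self-contained sketch of this re-export pattern (using a hypothetical in-memory package named pkg rather than pytorch_optimizer itself):

```python
import sys
import types

# Build a fake "pkg.optimizer.adamp" module hierarchy in memory to
# illustrate the pattern; in the real library this is done by files
# on disk plus re-exports in the package __init__.
pkg = types.ModuleType("pkg")
optimizer = types.ModuleType("pkg.optimizer")
adamp = types.ModuleType("pkg.optimizer.adamp")


class AdamP:  # stand-in for the real optimizer class
    pass


adamp.AdamP = AdamP
optimizer.adamp = adamp
pkg.optimizer = optimizer
pkg.AdamP = AdamP  # the top-level re-export that keeps old imports alive
sys.modules.update(
    {"pkg": pkg, "pkg.optimizer": optimizer, "pkg.optimizer.adamp": adamp}
)

# Both the old shallow path and the new deep path resolve to the same class.
from pkg import AdamP as shallow
from pkg.optimizer.adamp import AdamP as deep

assert shallow is deep
```

The upshot: moving a module one level deeper does not break callers as long as the package's top-level namespace keeps re-exporting the public names.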
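The cosine-annealing-with-warmup-restarts schedule can be sketched as a pure function of the step count: a linear warmup from the minimum to the maximum lr, then a cosine decay back down, restarting each cycle. Function and parameter names here are illustrative, not the library's exact API:

```python
import math


def cosine_annealing_warmup_restarts(step, cycle_steps, warmup_steps,
                                     max_lr, min_lr):
    """Illustrative sketch of one fixed-length cycle, repeated forever."""
    t = step % cycle_steps  # position within the current cycle
    if t < warmup_steps:
        # linear warmup from min_lr up to max_lr
        return min_lr + (max_lr - min_lr) * t / warmup_steps
    # cosine decay from max_lr back down to min_lr
    progress = (t - warmup_steps) / (cycle_steps - warmup_steps)
    return min_lr + (max_lr - min_lr) * (1.0 + math.cos(math.pi * progress)) / 2.0
```

For example, with cycle_steps=100 and warmup_steps=10 the lr starts at min_lr, peaks at max_lr on step 10, and resets to min_lr on step 100.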
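A layer-wise lr scheduler in the DeBERTa style assigns each transformer layer its own learning rate, decaying geometrically from the top layer down so that layers closer to the input are updated more gently. A minimal sketch, with a hypothetical decay factor:

```python
def layerwise_lrs(base_lr, num_layers, decay=0.9):
    """Illustrative sketch: layer num_layers - 1 (the top layer) gets
    base_lr; each layer below it gets the lr multiplied by `decay`."""
    return [base_lr * decay ** (num_layers - 1 - i) for i in range(num_layers)]
```

For example, layerwise_lrs(1e-3, 3, decay=0.5) yields [0.00025, 0.0005, 0.001]: the bottom layer trains at a quarter of the top layer's rate.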

Other changes (bug fixes, small refactors)

  • Fix AGC (to return the parameter)
  • Make room for experimental features (at pytorch_optimizer.experimental)
  • Add base types
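For context on the AGC fix: Adaptive Gradient Clipping scales a gradient down whenever its norm exceeds a fraction of the corresponding parameter's norm, and the bug was that the clipped result was not being returned. A minimal sketch on flat lists of floats (parameter names and defaults are illustrative, not the library's exact API):

```python
import math


def agc(param, grad, clip=0.01, eps=1e-3):
    """Illustrative sketch of Adaptive Gradient Clipping: cap the
    gradient norm at clip * the parameter norm, and *return* the result
    (the fix noted above)."""
    p_norm = max(math.sqrt(sum(p * p for p in param)), eps)
    g_norm = math.sqrt(sum(g * g for g in grad))
    max_norm = clip * p_norm
    if g_norm > max_norm:
        scale = max_norm / (g_norm + 1e-6)  # small constant avoids div-by-zero
        return [g * scale for g in grad]
    return grad  # already within the allowed norm; return unchanged
```

Returning the clipped gradient matters because callers assign the result back before the optimizer step; a function that clips internally but returns nothing silently leaves the gradients unclipped.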