
pytorch-optimizer v3.1.0

Released by @kozistr on 21 Jul at 11:54 · commit d00136f

Change Log

Refactor

  • Refactor the AdamMini optimizer. (#258)
  • Deprecate the optional bitsandbytes dependency. (#258)
  • Move the get_rms and approximate_sq_grad functions into BaseOptimizer for reusability. (#258)
  • Refactor shampoo_utils.py. (#259)
  • Add debias and debias_adam methods to BaseOptimizer; see the sketch after this list. (#261)
  • Refactor optimizers to inherit from BaseOptimizer only, instead of from multiple base classes. (#261)
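
For context on the new debias helpers, here is a minimal sketch of Adam-style bias correction. These notes only name the methods, so the signatures and bodies below are assumptions for illustration, not the library's exact API.

```python
import math

# Sketch only: BaseOptimizer.debias / debias_adam are assumed to compute
# the standard Adam bias-correction terms; check the source for the
# actual signatures.
def debias(beta: float, step: int) -> float:
    # 1 - beta^t, the usual correction for an EMA initialized at zero.
    return 1.0 - math.pow(beta, step)

def debias_adam(beta2: float, step: int) -> float:
    # Adam rescales the second-moment estimate by sqrt(1 - beta2^t).
    return math.sqrt(debias(beta2, step))
```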

Bug

  • Fix several bugs in the AdamMini optimizer; a usage sketch follows. (#257)
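
If you are upgrading for these fixes, the smoke test below may help. It assumes AdamMini is exported at the package top level and, like other Adam-mini implementations, takes the whole model rather than a parameter iterable; verify the constructor against the docs.

```python
import torch
from torch import nn

from pytorch_optimizer import AdamMini  # top-level export (assumed)

model = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))
# Assumption: AdamMini takes the model itself rather than model.parameters(),
# since Adam-mini assigns second-moment state per parameter block.
optimizer = AdamMini(model, lr=1e-3)

x, y = torch.randn(4, 8), torch.randn(4, 1)
loss = nn.functional.mse_loss(model(x), y)
loss.backward()
optimizer.step()
optimizer.zero_grad()
```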

Contributions

Thanks to @sdbds.