
pytorch-optimizer v2.3.0

@kozistr released this 30 Jan 07:42
5df1281

Change Log

Feature

  • re-implement Shampoo Optimizer (#97, related to #93); see the usage sketch after this list
    • layer-wise grafting (none, adagrad, sgd)
    • block partitioner
    • preconditioner
  • remove casting to fp16/bf16 inside step() to keep behavior consistent with the other optimizers. #96
  • change some operations to in-place variants for speed (illustrated below). #96
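
A minimal usage sketch for the re-implemented Shampoo optimizer. It assumes `Shampoo` is exported at the package top level and accepts the usual `params` and `lr` arguments; the grafting and partitioning options mentioned above are configured via keyword arguments whose exact names are not shown here.

```python
import torch
from pytorch_optimizer import Shampoo  # assumed top-level export

model = torch.nn.Linear(16, 4)
criterion = torch.nn.MSELoss()

# Additional keyword arguments (e.g. grafting type, block size) may be
# available; their exact names are an assumption and are omitted here.
optimizer = Shampoo(model.parameters(), lr=1e-3)

x, y = torch.randn(8, 16), torch.randn(8, 4)

optimizer.zero_grad()
loss = criterion(model(x), y)
loss.backward()
optimizer.step()  # block partitioning and preconditioning run inside step()
```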
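
A generic illustration of the in-place change, not the library's actual diff: replacing an out-of-place moving-average update with its in-place counterpart avoids allocating a fresh tensor on every step.

```python
import torch

beta = 0.9
exp_avg = torch.zeros(10)
grad = torch.randn(10)

# out-of-place: allocates a new tensor each step
exp_avg = exp_avg * beta + grad * (1 - beta)

# in-place: updates the existing buffer, avoiding the extra allocation
exp_avg.mul_(beta).add_(grad, alpha=1 - beta)
```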

Fix

  • fix exp_avg_var when amsgrad is True. #96

Refactor

  • change the linter from Pylint to Ruff. #97