pytorch-optimizer v2.1.1
Change Log
Feature
- Support gradient centralization for the Adai optimizer
- Support AdamD debias for the AdaPNM optimizer
- Register custom exceptions (e.g. NoSparseGradientError, NoClosureError, ...)
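The gradient centralization feature above re-centers each weight gradient by subtracting its mean before the update step. As a minimal sketch of the idea only (not the library's actual Adai implementation; `centralize_gradient` is a hypothetical helper name):

```python
def centralize_gradient(grad):
    """Sketch of gradient centralization for a 2-D weight gradient.

    grad: nested list, one inner list per output unit. Each row is
    shifted by its own mean so that the row entries sum to zero.
    """
    return [
        [g - sum(row) / len(row) for g in row]  # subtract the per-row mean
        for row in grad
    ]
```

In the real optimizer this is applied to `param.grad` tensors inside `step()`; the list-based version here only illustrates the arithmetic.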
Documentation
- Add API documentation
Bug
- Fix SAM optimizer