Releases: kozistr/pytorch_optimizer
pytorch-optimizer v0.4.2
refactor: coverage
pytorch-optimizer v0.4.1
refactor: coverage
pytorch-optimizer v0.4.0
- remove `.data` usage
- apply `torch.no_grad()` to all `step()` functions (except `closure()`)
- support `bfloat16` dtype (XLA compatibility)
pytorch-optimizer v0.3.7
feature: LARS optimizer
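The core idea of LARS is a layer-wise learning-rate scaling by the ratio of weight norm to gradient norm. A minimal pure-Python sketch of that trust-ratio computation (the function name and defaults are illustrative, not this library's API):

```python
import math

def lars_trust_ratio(weights, grads, eta=1e-3, weight_decay=0.0, eps=1e-8):
    """Layer-wise LARS scaling: eta * ||w|| / (||g|| + wd * ||w||).

    Hypothetical helper showing the trust-ratio idea behind LARS;
    the real optimizer applies this per layer inside step().
    """
    w_norm = math.sqrt(sum(w * w for w in weights))
    g_norm = math.sqrt(sum(g * g for g in grads))
    if w_norm == 0.0 or g_norm == 0.0:
        # degenerate layer: fall back to the base coefficient
        return eta
    return eta * w_norm / (g_norm + weight_decay * w_norm + eps)
```

Layers with large weights relative to their gradients get proportionally larger steps, which is what makes LARS stable at very large batch sizes.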
pytorch-optimizer v0.3.6
refactor: utils
pytorch-optimizer v0.3.5
- Refactor test modules
- merge fp32 & fp16 recipes into one list
- test cases for `Lookahead`
- test case for no gradient
- fix Ranger21 optimizer to handle parameters with no gradient
- improve stability of `pre_norm` for Lamb optimizer
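The `Lookahead` wrapper tested above keeps a slow copy of the weights and, every `k` inner steps, interpolates it toward the fast weights. A pure-Python sketch of that synchronization step (names are illustrative, not the library's API):

```python
def lookahead_sync(slow, fast, alpha=0.5):
    """One Lookahead synchronization.

    slow <- slow + alpha * (fast - slow), then the fast weights are
    reset to the updated slow weights. Plain lists stand in for
    parameter tensors in this sketch.
    """
    new_slow = [s + alpha * (f - s) for s, f in zip(slow, fast)]
    # fast weights restart from the updated slow weights
    return new_slow, list(new_slow)
```

In the real optimizer this runs once every `k` calls to the inner optimizer's `step()`; between synchronizations only the fast weights move.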
pytorch-optimizer v0.3.4
fix: MADGRAD optimizer
pytorch-optimizer v0.3.3
feature: RaLamb optimizer
pytorch-optimizer v0.3.2
fix: Ranger optimizer
pytorch-optimizer v0.3.1
refactor: deprecated APIs