0.1.7

Pre-release
@marovira released this 01 May 19:53
· 147 commits to master since this release

Updates

  • For iteration training, the global iteration count is now updated correctly. Previously it was incremented in the middle of the training loop, so the progress bar and the log flag passed to the model after each batch were out of sync with the global iteration count. The count is now incremented at the top of the iteration loop.
  • Removes the callback system from the trainer. Given the current implementation, there was nothing the callbacks could do that couldn't be accomplished by overriding the corresponding function in the model or the datamodule.
  • Adds printing wrappers that let the user choose which rank (global or local) the print happens on.
  • Adds a wrapper for torch.distributed.barrier that works in both distributed and non-distributed contexts.
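The iteration-count fix above comes down to where in the loop the counter is bumped. A minimal sketch of the corrected ordering (illustrative only; the function and variable names are not the library's actual API):

```python
def run_iteration_loop(num_iters: int) -> list[int]:
    """Toy loop showing the fixed update order (hypothetical, not Helios code)."""
    global_iter = 0
    seen_by_logger = []
    for _ in range(num_iters):
        global_iter += 1  # incremented at the TOP of the loop, before the batch runs
        # ... forward/backward pass would happen here ...
        # the progress bar and the model's log flag now see the same count
        seen_by_logger.append(global_iter)
    return seen_by_logger
```

Updating the counter before the batch means everything observed during and after that batch agrees on which iteration it belongs to.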
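The last two bullets can be sketched as follows. This is a hedged illustration of the general pattern, assuming hypothetical names `print_on_rank` and `safe_barrier` (not the library's actual API); it uses only standard `torch.distributed` calls:

```python
import os

import torch.distributed as dist


def safe_barrier() -> None:
    """Call torch.distributed.barrier only when a process group is active.

    In a non-distributed run this is a no-op, so the same code path works
    in both contexts.
    """
    if dist.is_available() and dist.is_initialized():
        dist.barrier()


def print_on_rank(*args, rank: int = 0, local: bool = False, **kwargs) -> None:
    """Print only on the chosen rank (global by default, local if requested)."""
    if dist.is_available() and dist.is_initialized():
        # LOCAL_RANK is the conventional env var set by torchrun launchers.
        current = int(os.environ.get("LOCAL_RANK", 0)) if local else dist.get_rank()
    else:
        current = 0  # non-distributed runs behave as rank 0
    if current == rank:
        print(*args, **kwargs)
```

Guarding on `is_available()` and `is_initialized()` is what makes both wrappers safe to call unconditionally, whether or not the process group has been set up.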

Full Changelog

0.1.6...0.1.7