
Releases: davidtvs/pytorch-lr-finder

Release v0.2.1

13 Sep 15:57

Release notes:

  • Fix error message in DataLoaderIter.inputs_labels_from_batch()
  • Fix flat loss when using a validation dataset (#59, #60)
  • Fix issue #57 by determining the batch size from the size of the labels instead of the size of the inputs (#58)
  • Add an optional suggest_lr argument to LRFinder.plot() for plotting a suggested learning rate (#44)
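The suggested learning rate is typically taken at the point where the loss falls fastest with respect to the logarithm of the learning rate. The sketch below illustrates that steepest-gradient heuristic on a recorded (lr, loss) history; the function name is hypothetical and this is not the library's internal code, just the idea behind it.

```python
import math

def steepest_gradient_lr(lrs, losses):
    """Illustrative sketch (hypothetical helper, not library code):
    return the learning rate where the finite-difference gradient of
    loss w.r.t. log(lr) is most negative, i.e. loss drops fastest."""
    best_idx, best_grad = None, float("inf")
    for i in range(1, len(lrs)):
        grad = (losses[i] - losses[i - 1]) / (math.log(lrs[i]) - math.log(lrs[i - 1]))
        if grad < best_grad:
            best_grad, best_idx = grad, i
    return lrs[best_idx]
```

For example, with losses [1.0, 0.9, 0.5, 0.6, 2.0] recorded at learning rates [1e-5, 1e-4, 1e-3, 1e-2, 1e-1], the loss drops fastest between 1e-4 and 1e-3, so 1e-3 is suggested.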

Release v0.2.0

11 Jun 19:43

Release notes:

  • Change the apex installation command from pip install torch-lr-finder -v --global-option="amp" to pip install torch-lr-finder -v --global-option="apex"
  • Handle apex installation for PyTorch < 1.0.0
  • Remove message checking if the apex.amp module is available (#46)
  • Fix learning rate history and learning rate computation in schedulers (#43, #42)
  • Refactor the DataLoader iterator wrapper (#37). An example of its use can be found in examples/lrfinder_cifar10_dataloader_iter
  • Transfer data to cuda with non_blocking=True (#31)
  • Enable batch data contained in a dictionary to be moved to the correct device (#29)
  • Enable generic objects to be moved to the correct device if they have a .to() method (#29)
  • Drop Python 2.x support: the last version with Python 2 support is v0.1.5, which is also available on the torch_lr_finder-v0.1 branch
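The device-transfer changes (#29, #31) boil down to a recursive dispatch: dicts and sequences are traversed, and anything exposing a .to() method (such as a torch.Tensor) is moved with non_blocking=True. A minimal sketch of that dispatch logic, not the library's exact code:

```python
def move_to_device(obj, device, non_blocking=True):
    """Recursively move batch data to a device (illustrative sketch).

    Dicts and list/tuple containers are traversed; any object with a
    .to() method (e.g. a torch.Tensor) is moved with non_blocking=True;
    plain values are returned unchanged."""
    if isinstance(obj, dict):
        return {k: move_to_device(v, device, non_blocking) for k, v in obj.items()}
    if isinstance(obj, (list, tuple)):
        return type(obj)(move_to_device(v, device, non_blocking) for v in obj)
    if hasattr(obj, "to"):
        return obj.to(device, non_blocking=non_blocking)
    return obj  # ints, strings, etc. stay on the host
```

Using non_blocking=True lets host-to-GPU copies overlap with computation when the source tensor is in pinned memory; for plain pageable memory the copy silently falls back to synchronous behavior.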

Extended iterable unpacking fix for Python 2.7

23 Apr 21:29

Fixed extended iterable unpacking for Python 2.7 as proposed by @NaleRaphael in #27.