Indefinite dataloading (#498)
* Add infinite data loading

* Add tests

* Add tests

* Add on_batch to interval checkpoint with test

* Update changelog

* Remove redundant none generator check

* Add fluent methods for infinite loading

* Add docstrings

* Add tests

* Fix inf loaders not working properly

* Patch some warnings

* Fix python2 test

* Fix python2 test

* Fix python2 test

* Fix python2 tests

* Update test

* Spelling
MattPainter01 authored and ethanwharris committed Feb 18, 2019
1 parent 32b4070 commit 1be9c30
Showing 6 changed files with 419 additions and 51 deletions.
3 changes: 3 additions & 0 deletions CHANGELOG.md
@@ -18,6 +18,9 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
- Added SimpleExponentialSimpleExponentialKL
- Added the option for model parameters only saving to Checkpointers.
- Added documentation about serialization.
- Added support for indefinite data loading. Iterators can now run to completion independently of epochs, or be refreshed during an epoch once exhausted.
- Added support for batch intervals in interval checkpointer

### Changed
- Changed the default behaviour of the std metric to compute the sample std, in line with torch.std
- Tqdm precision argument now rounds to decimal places rather than significant figures
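The indefinite data loading entry above comes down to iterators that are refreshed when exhausted instead of ending the epoch. Below is a minimal sketch of that idea only; it is not torchbearer's implementation, and `infinite_batches` is a hypothetical helper written purely for illustration.

```python
import torch
from torch.utils.data import DataLoader, TensorDataset


def infinite_batches(loader):
    """Yield batches forever, re-creating the iterator whenever it is exhausted."""
    iterator = iter(loader)
    while True:
        try:
            yield next(iterator)
        except StopIteration:
            iterator = iter(loader)  # refresh mid-epoch and keep going
            yield next(iterator)


dataset = TensorDataset(torch.rand(10, 3), torch.rand(10, 1))
loader = DataLoader(dataset, batch_size=4)

batches = infinite_batches(loader)
for step in range(7):        # more steps than one pass over the 10 samples
    x, y = next(batches)     # the underlying loader is refreshed transparently
```

In torchbearer itself this behaviour is exposed through the fluent methods mentioned in the commit message rather than a standalone helper like the one above.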
16 changes: 16 additions & 0 deletions tests/callbacks/test_checkpointers.py
@@ -167,6 +167,22 @@ def test_interval_is_more_than_1(self, mock_save_check):

        self.assertTrue(mock_save_check.call_count == 3)

    @patch('torchbearer.callbacks.checkpointers._Checkpointer.save_checkpoint')
    def test_interval_on_batch(self, mock_save_check):
        state = {}
        check = Interval('test_file', period=4, on_batch=True)

        for i in range(13):
            check.on_step_training(state)
            if i == 3:
                self.assertTrue(mock_save_check.call_count == 1)
            elif i == 6:
                self.assertFalse(mock_save_check.call_count == 2)
            elif i == 7:
                self.assertTrue(mock_save_check.call_count == 2)

        check.on_checkpoint(state)
        self.assertTrue(mock_save_check.call_count == 3)

    def test_state_dict(self):
        check = Interval('test')
        check.epochs_since_last_save = 10
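For context, the `on_batch` flag exercised by the new test switches the Interval checkpointer from counting epochs to counting training steps, so a checkpoint is saved every `period` batches. The following is a hedged usage sketch: only `Interval(..., period=4, on_batch=True)` comes from the test above, while the `Trial` wiring is an assumption about the surrounding torchbearer API and is not part of this diff.

```python
import torch
import torchbearer
from torchbearer.callbacks import Interval

model = torch.nn.Linear(3, 1)
optimiser = torch.optim.SGD(model.parameters(), lr=0.01)

# Save a checkpoint every 4 training batches instead of every 4 epochs.
checkpointer = Interval('model.pt', period=4, on_batch=True)

trial = torchbearer.Trial(model, optimiser, torch.nn.MSELoss(),
                          callbacks=[checkpointer])
```

Data wiring and running the trial are omitted here; they are unchanged by this commit.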
