Trainer2 #1285
Conversation
also useful when the order of examples is important and should not be broken.

Args:
Consider adding a "shuffle" (bool) argument, with default value False. If set to True, it takes on the same behavior as ShuffledIterator. There seems to be quite a bit of code duplication between these two classes. If this change is implemented, ShuffledIterator could then be removed.
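The suggested merge can be sketched as follows. This is an illustrative re-implementation of the idea, not Chainer's actual class: a single serial iterator whose `shuffle` flag (default `False`) subsumes the shuffled variant, removing the duplicated code between the two classes.

```python
import random


class SerialIterator:
    """Sketch of a serial minibatch iterator with an optional shuffle flag.

    With shuffle=False it visits examples in dataset order; with
    shuffle=True it behaves like the former ShuffledIterator, drawing
    examples in a random order that is re-drawn every epoch.
    """

    def __init__(self, dataset, batch_size, shuffle=False):
        self.dataset = dataset
        self.batch_size = batch_size
        self.shuffle = shuffle
        self._order = list(range(len(dataset)))
        self._pos = 0
        if shuffle:
            random.shuffle(self._order)

    def __iter__(self):
        return self

    def __next__(self):
        batch = []
        for _ in range(self.batch_size):
            if self._pos >= len(self._order):
                # One epoch finished: reshuffle if requested, then wrap around.
                if self.shuffle:
                    random.shuffle(self._order)
                self._pos = 0
            batch.append(self.dataset[self._order[self._pos]])
            self._pos += 1
        return batch
```

With `shuffle=False` the iteration order is deterministic, which matters when the order of examples carries meaning, as the docstring above notes.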
Thank you, I agree with you. I updated the code and merged these iterators into SerialIterator.
Fix document to pass test
@beam2d I checked ExponentialShift. It seems OK.
Chainer provides some iterators that implement typical strategies to create minibatches by iterating over datasets.
:class:`SerialIterator` is the simplest one, which extracts minibatches in the main thread.
:class:`MultiprocessIterator` is a parallelized version of :class:`ShuffledIterator`. It maintains worker subprocesses to load the next minibatch in parallel.
ShuffledIterator no longer exists.
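The prefetching behavior attributed to MultiprocessIterator above can be sketched in a few lines. This is a hypothetical illustration, not Chainer's implementation: Chainer uses worker subprocesses, while this sketch uses a thread pool for brevity, but the idea is the same: while the trainer consumes batch N, workers are already loading batch N+1.

```python
from concurrent.futures import ThreadPoolExecutor


def load_example(dataset, i):
    # Placeholder for potentially expensive per-example work
    # (e.g. reading and decoding an image file from disk).
    return dataset[i]


class PrefetchIterator:
    """Sketch of a parallel minibatch iterator with one batch of lookahead."""

    def __init__(self, dataset, batch_size, n_workers=2):
        self.dataset = dataset
        self.batch_size = batch_size
        self._pool = ThreadPoolExecutor(max_workers=n_workers)
        self._pos = 0
        self._pending = self._submit()  # start loading the first batch

    def _submit(self):
        indices = [(self._pos + i) % len(self.dataset)
                   for i in range(self.batch_size)]
        self._pos = (self._pos + self.batch_size) % len(self.dataset)
        return [self._pool.submit(load_example, self.dataset, i)
                for i in indices]

    def __iter__(self):
        return self

    def __next__(self):
        batch = [f.result() for f in self._pending]  # wait for current batch
        self._pending = self._submit()  # kick off loading the following batch
        return batch
```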
Use datasets instead of using ptb data directly
LGTM 👍
Fix #914. I wrote an updated version of the training loop abstraction. This includes many improvements based on the feedback on the old version (#958). For example,
I have not updated the tutorial document yet, which should be done before merging it.
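The general shape of a training-loop abstraction like the one this PR introduces can be sketched as below. All names here (`Trainer`, `extend`, `update_fn`) are hypothetical and chosen for illustration; the PR's actual interface may differ.

```python
class Trainer:
    """Minimal sketch of a training-loop abstraction.

    The loop repeatedly draws a minibatch from an iterator, applies an
    update function to it, and then invokes registered extensions
    (e.g. logging, evaluation, snapshotting) after each update.
    """

    def __init__(self, iterator, update_fn, max_iterations):
        self.iterator = iterator
        self.update_fn = update_fn
        self.max_iterations = max_iterations
        self.extensions = []  # callbacks run after every update

    def extend(self, fn):
        self.extensions.append(fn)

    def run(self):
        for step in range(self.max_iterations):
            batch = next(self.iterator)
            self.update_fn(batch)
            for ext in self.extensions:
                ext(step)
```

Separating the update rule and the extensions from the loop itself is what lets users customize training without rewriting the loop by hand.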