Use iterations directly instead of epoch_size #656
I think we already support that. The …
To be clearer, I mean these …
These are optional parameters, which are only needed for consensus purposes in distributed training. For a single machine, num_epoch is used.
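To make the distinction concrete, here is a minimal sketch of why distributed workers may need an agreed-upon epoch_size while a single machine can rely on num_epoch alone. The shard sizes, batch size, and the max-shard convention are all assumptions for illustration, not taken from this thread or from mxnet's actual implementation:

```python
import math

# Each worker holds a different data shard, so "one pass over my shard"
# would end at a different batch count on each worker. Values are assumed.
shard_sizes = [10_000, 9_500, 10_500]  # examples per worker (hypothetical)
batch_size = 128

# Fixing epoch_size to the same batch count on every worker gives all of
# them a common definition of "epoch" to synchronize checkpointing and
# learning-rate schedules against.
epoch_size = math.ceil(max(shard_sizes) / batch_size)
print(epoch_size)  # every worker runs this many batches per epoch
```

On a single machine there is only one data source, so the framework can infer the epoch boundary itself and the user only needs num_epoch.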
The documentation should be more explicit about the usage of user-facing parameters.
Yes, you are absolutely right. The best way is to improve things as we find them, so the project keeps getting better. It would be great if you could open a PR to improve the docstring so people won't get confused later. Thanks!
This example is just a special case of the bigger issue in #657.
Again, it would be great to have your contribution as a PR to help make mxnet better. Even an improvement scoped to just this issue would benefit all of us, and such incremental changes build momentum, with your wisdom and code behind them.
From early experience training DL models (BVLC/caffe#59 (comment)), epoch size, i.e. the number of samples, is most of the time an unnecessary and annoying burden for users to maintain. It is much more natural to count iterations directly and automatically wrap around to the beginning of the dataset when its end is reached. At the very least, epoch_size should become optional and secondary to an iteration count.
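Here is a minimal sketch of the iteration-driven loop being proposed, in plain Python rather than mxnet's API. The names train(), data_iter, and train_step are hypothetical, introduced only for illustration; the point is that the user specifies a total iteration count and the loop restarts the data iterator transparently, so no epoch_size bookkeeping is needed:

```python
def train(data_iter, train_step, num_iterations):
    """Run exactly num_iterations steps, wrapping around the dataset."""
    it = iter(data_iter)
    for _ in range(num_iterations):
        try:
            batch = next(it)
        except StopIteration:
            # End of dataset: go back to the beginning automatically,
            # instead of asking the user to track epoch boundaries.
            it = iter(data_iter)
            batch = next(it)
        train_step(batch)

# Usage: any re-iterable source of batches works, e.g. a list.
train(data_iter=[1, 2, 3],
      train_step=lambda batch: None,  # stand-in for a real update step
      num_iterations=10)
```

Under this design an "epoch" is at most a reporting convenience; the stopping criterion itself is the iteration count.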