Hi, thanks for open-sourcing such an innovative idea! However, I have a question: in gain.py, lines 159 to 161, the minibatches are sampled randomly, which does not guarantee that all of the data is traversed. Why wasn't training designed to iterate through all batches in order, so that each epoch covers the full dataset?
That is just for implementation simplicity. No other reasons.
You can easily change it to epoch-based training that guarantees all samples are used.
Thanks!
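For anyone wanting to make that change, here is a minimal sketch of what epoch-based batching could look like. The function name `epoch_batches` and its parameters are hypothetical, not part of gain.py; the idea is simply to replace random index sampling with a shuffled permutation that is consumed in order, so every sample appears exactly once per epoch.

```python
import numpy as np

def epoch_batches(n_samples, batch_size, rng=None):
    # Hypothetical helper: yields index arrays that together cover
    # every sample exactly once per epoch (the last batch may be smaller).
    rng = rng or np.random.default_rng(0)
    perm = rng.permutation(n_samples)  # shuffle once per epoch
    for start in range(0, n_samples, batch_size):
        yield perm[start:start + batch_size]

# Usage: collect one epoch's batches and verify full coverage.
batches = list(epoch_batches(10, 3))
covered = np.sort(np.concatenate(batches))
assert np.array_equal(covered, np.arange(10))
```

In the training loop, the outer iteration would run over epochs and the inner one over these batches, instead of drawing a fresh random subset at every step.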