Abnormal learning curve bumping at early batches of each epoch during DS2 training. #100
Comments
One weird thing: when I resumed training from a saved model from the above figure, the phenomenon did not appear again.
Sorry, I'll switch to Chinese :( Building on the above: after removing the following lines from batch-shuffle, i.e. discarding the few short samples at the head and the leftover long samples that cannot fill a complete batch, the convergence no longer shows a sudden rise and looks normal:
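To make the change above concrete, here is a minimal, hypothetical sketch of duration-based batch shuffling; the function name, manifest layout, and `clipped` flag are assumptions for illustration and are not the repo's exact `batch_shuffle` code. With `clipped=True`, the short samples before the random offset and the incomplete tail batch are dropped, which is what the comment describes.

```python
import random

def batch_shuffle(manifest, batch_size, clipped=True, rng=random.Random(0)):
    """Hypothetical sketch of duration-based batch shuffling.

    Sorts samples by duration, shifts by a random offset, groups them into
    batches, and shuffles the batches. When `clipped` is True, the short
    samples before the offset and the incomplete tail batch are discarded.
    """
    # Sort so that each batch contains utterances of similar length.
    manifest = sorted(manifest, key=lambda sample: sample["duration"])
    # Random offset so epochs do not always start with the same samples.
    offset = rng.randint(0, batch_size - 1)
    body = manifest[offset:]
    batches = [body[i:i + batch_size] for i in range(0, len(body), batch_size)]
    head, tail = manifest[:offset], []
    if batches and len(batches[-1]) < batch_size:
        tail = batches.pop()  # incomplete last batch of long samples
    rng.shuffle(batches)
    shuffled = [sample for batch in batches for sample in batch]
    if not clipped:
        # Keep the leftover short/long samples instead of discarding them.
        shuffled = head + shuffled + tail
    return shuffled
```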
I've given up trying to reproduce the phenomenon from a pre-trained model. Instead, I've started three from-scratch jobs with three different shuffle methods (for more details, please refer here) on the full LibriSpeech data, in order to reproduce what @qingqing01 observed on a small dataset.
Here are the results for batch size = 32, with all three shuffle methods running into abnormal convergence. Besides, the bumping points are no longer located in the first batches of an epoch (this contradicts what we observed previously). However, when we change the batch size from 32 to 256, the convergence is much more stable and we haven't seen the abnormal phenomenon so far. Larger batches reduce the gradient variance, thus stabilizing the convergence. Conclusion: batch size 32 is too small for stable training; use 256 or larger instead. TODO:
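The "larger batches reduce gradient variance" point can be seen with a toy simulation: averaging B per-sample gradients shrinks the standard deviation of the mini-batch gradient by roughly 1/sqrt(B). This sketch uses synthetic numbers only, not actual DS2 gradients.

```python
import numpy as np

# Toy illustration: the variance of an averaged mini-batch gradient
# shrinks roughly as 1/batch_size, so batch 256 is much smoother than 32.
rng = np.random.default_rng(0)
per_sample_grads = rng.normal(loc=1.0, scale=5.0, size=100_000)

for batch_size in (32, 256):
    usable = (len(per_sample_grads) // batch_size) * batch_size
    batch_means = per_sample_grads[:usable].reshape(-1, batch_size).mean(axis=1)
    print(f"batch_size={batch_size:>3}  std of batch gradient ~= {batch_means.std():.3f}")

# Expected: the std for 256 is about sqrt(256/32) ~= 2.8x smaller than for 32.
```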
Hello, this issue has had no updates for nearly a month, so we will close it today. If you still need to follow up after it is closed, you can reopen it and we will reply within 24 hours. We apologize for any inconvenience caused by the closure. Thank you for your support of PaddlePaddle!
After merging PR #74, we have seen an abnormal learning curve like this:
The figure plots the training cost. Notice that toward the tail of the curve there are many spikes, located exactly at the first batch of each epoch.
Besides, it is not easy to reproduce the phenomenon in a small dataset.
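One simple way to check whether spikes line up with epoch boundaries is to log the per-batch training cost and mark the first batch of each epoch when plotting. This is a hypothetical helper, not part of the trainer's API; `costs` and `batches_per_epoch` are assumed to be collected by the caller.

```python
import matplotlib.pyplot as plt

def plot_training_cost(costs, batches_per_epoch):
    """Plot per-batch training cost with epoch boundaries marked, so that
    spikes at the first batches of an epoch stand out visually."""
    plt.plot(costs, linewidth=0.8, label="training cost")
    for epoch_start in range(0, len(costs), batches_per_epoch):
        # Dotted line at the first batch of each epoch.
        plt.axvline(epoch_start, color="red", linestyle=":", alpha=0.5)
    plt.xlabel("batch")
    plt.ylabel("cost")
    plt.legend()
    plt.show()
```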