I changed the network size from 2 layers to 4 layers and set the batch size to 1. Now, after about 30 epochs, I see perplexity starting to increase, which I think is unexpected.
For the first ~25 epochs it decreases, but then it starts behaving erratically, and the resulting model is worse than it was before epoch 25. Is this a bug?
Here is my TensorBoard plot. As you can see in the bottom graphs, perplexity is fluctuating (bottom right plot). Why is that happening?
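A rising validation perplexity after an initial decrease is the classic signature of overfitting (likely aggravated here by the larger 4-layer model and the noisy gradients of batch size 1), rather than a bug. A common remedy is early stopping: keep the checkpoint with the best validation perplexity and stop once it fails to improve for a few epochs. Below is a minimal, framework-free sketch of that logic; the function name and the example curve are hypothetical, not taken from this repo.

```python
# Hypothetical early-stopping sketch: pick the epoch with the best
# validation perplexity, stopping after `patience` non-improving epochs.

def best_epoch(perplexities, patience=3):
    """Return (index, value) of the checkpoint to keep under early stopping."""
    best_idx, best_val = 0, perplexities[0]
    bad_epochs = 0
    for i, p in enumerate(perplexities[1:], start=1):
        if p < best_val:
            best_idx, best_val = i, p
            bad_epochs = 0  # reset the counter on improvement
        else:
            bad_epochs += 1
            if bad_epochs >= patience:
                break  # stop training: no improvement for `patience` epochs
    return best_idx, best_val

# Hypothetical curve shaped like the plot: decreases, then climbs back up.
curve = [120.0, 80.0, 60.0, 55.0, 54.0, 56.0, 58.0, 61.0, 65.0]
idx, val = best_epoch(curve)
print(idx, val)  # keeps epoch 4 (perplexity 54.0), stops at epoch 7
```

With this in place, the "worse than before epoch 25" behavior no longer matters: you deploy the best checkpoint, not the last one. Increasing the batch size above 1 should also reduce the fluctuation visible in the bottom-right plot.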