
training from scratch, random seed #52

Closed
woshilaixuexide opened this issue Oct 21, 2017 · 1 comment

Comments

@woshilaixuexide

Hi,
Can you explain why we need the random seed? I noticed the random seed for ImageNet classification is set to 34. I also trained a model for face verification; without a random seed it sometimes doesn't converge, but when I set the random seed to 1000 it always converges. Can you explain how to choose the random seed for different training tasks?

@forresti
Owner

This post might help:
https://www.google.com/amp/s/www.researchgate.net/post/Whenever_i_run_my_neural_network_I_get_different_result/amp

If you don't set a seed for the random number generator, it gets a new seed every time you train, so each run produces different results. If you want numerically repeatable results (useful for debugging), then it is necessary to select a seed.
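The effect of seeding can be illustrated with a small NumPy sketch. Here NumPy stands in for the framework's weight initializer, and `init_weights` is a hypothetical helper, not part of this repo:

```python
import numpy as np

def init_weights(seed=None):
    # Seeding the generator makes the "random" draws repeatable.
    rng = np.random.default_rng(seed)
    return rng.normal(0.0, 0.01, size=(3, 3))

# Without a seed, each call draws different weights.
a = init_weights()
b = init_weights()
print(np.array_equal(a, b))  # almost surely False

# With the same seed, the weights are bit-for-bit identical.
c = init_weights(seed=34)
d = init_weights(seed=34)
print(np.array_equal(c, d))  # True
```

The same principle applies to a deep learning framework's seed setting (e.g. the `random_seed` field in a Caffe solver prototxt): one value fixes the initialization and data shuffling order, making a run reproducible.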

DNNs tend to be pretty sensitive to their parameter initialization, and some seeds just end up not converging for some models and datasets. It's a common issue in machine learning in general.

In practice, people often train a few models with different seeds and find one that converges well.
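A minimal sketch of that seed search, where `train_and_eval` is a hypothetical placeholder for a full training run that returns a final validation loss:

```python
import numpy as np

def train_and_eval(seed):
    # Placeholder: a real version would train the model with this seed
    # and return its final validation loss.
    rng = np.random.default_rng(seed)
    return float(rng.uniform(0.5, 2.0))  # stand-in for a real loss value

# Try a few seeds and keep the one whose run converges best.
candidate_seeds = [1, 34, 1000]
results = {s: train_and_eval(s) for s in candidate_seeds}
best_seed = min(results, key=results.get)
```

Because each run is fully determined by its seed, the best run can then be reproduced exactly by retraining with `best_seed`.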
