Option to make the nets more deterministic #26
What do you think about introducing an option to seed the RNG before doing the KFold train/test split? That way the net's predictions and loss details would be deterministic over multiple runs on the same data.
Thanks
Comments
I think right now you can do a …
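The comment above is cut off, but judging from the follow-up further down it seems to suggest seeding NumPy's global RNG yourself before fitting. A minimal self-contained sketch of that idea (the shuffle here stands in for whatever numpy-based shuffling the train/test split performs; the exact internals are an assumption):

```python
import numpy as np

# Seed NumPy's global RNG before anything that shuffles the data.
np.random.seed(42)

# Stand-in for the KFold train/test shuffle inside the library:
indices = np.arange(100)
np.random.shuffle(indices)
print(indices[:5])  # identical on every run thanks to the fixed seed
```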
Sure, yes. I just wanted to check whether you want to add it as an option. I will close this, and you can reopen it if and when you want to.
The default for …
Ah yes, very useful if your data is not independently distributed.
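The exchange above is about shuffling before splitting. A small illustration of why an unshuffled split misbehaves on ordered (non-i.i.d.) data, using scikit-learn's KFold (that nolearn splits this way internally is an assumption; the point is only the shuffle flag):

```python
import numpy as np
from sklearn.model_selection import KFold

# Ordered labels: without shuffling, each fold's validation set is one
# contiguous block and is not representative of the whole dataset.
y = np.repeat([0, 1, 2], 10)

for shuffle in (False, True):
    kf = KFold(n_splits=3, shuffle=shuffle,
               random_state=0 if shuffle else None)
    train_idx, valid_idx = next(kf.split(y))
    print(shuffle, np.unique(y[valid_idx]))
    # False -> [0]      (validation fold contains a single class)
    # True  -> [0 1 2]  (validation fold mixes all classes)
```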
It might be worth tossing a note in the nolearn docs along the lines of: …
Related: #12
Yes, you're right. And that reminds me that I should be working on proper docstrings soon.
From what I can see, there is more randomness than just in the train/test splits. From the tests I have done, there is randomness whenever I include a dropout layer, and I can't make runs identical by doing an np.random.seed(42) beforehand. I have tried tracing it back through the source, and it appears that a seed is set by default in Theano's RandomStreams, which is used in the DropoutLayer; yet I still get changes from run to run when a DropoutLayer is included, and no changes when there is none.
There's an issue for that in Lasagne: Lasagne/Lasagne#6
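For completeness, a hedged sketch of how that stream can be pinned down: Theano's RandomStreams accepts an explicit seed, and later Lasagne releases added lasagne.random.set_rng to replace the module-level RandomState that layers such as DropoutLayer draw their seeds from. Whether your installed versions expose these hooks is an assumption:

```python
import numpy as np
from theano.tensor.shared_randomstreams import RandomStreams

# Theano's shared random streams can be seeded explicitly:
srng = RandomStreams(seed=42)

# Later Lasagne versions let you replace the module-level RandomState that
# DropoutLayer uses to seed its own stream, which makes dropout masks
# reproducible across runs:
import lasagne.random
lasagne.random.set_rng(np.random.RandomState(42))
```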