Pre-processing the training data #45
Comments
One way to do this is to subclass […] I'll be happy to hear any suggestions on making this more dynamic. In #42, I briefly discussed making […]
Thanks for getting back to me. I think your suggestion of subclassing […] An alternative, but more involved, solution could be to add a […]
For my use case, I decided it would be simpler to implement it as described in my post above. Here is a link to the code in case anyone else wants to do something similar: hjweide@7f30634. Any suggestions for improvements are welcome.
The TrainSplit interface has since been added, which should give you a good opportunity to apply the correct scaling.
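The split-then-scale idea discussed above can be sketched as a standalone function: split the data first, compute the mean and standard deviation on the training portion only, and apply them to both portions. This is a minimal illustration, not nolearn's actual TrainSplit implementation; the function name and signature are assumptions for the sketch.

```python
import numpy as np

def scaled_train_split(X, y, eval_size=0.2):
    """Split (X, y) into train/validation, then standardize both
    using statistics computed on the training portion only, so the
    validation set is not contaminated by its own statistics.

    Standalone sketch -- not nolearn's actual TrainSplit interface.
    """
    n_train = int(len(X) * (1 - eval_size))
    X_train, X_valid = X[:n_train], X[n_train:]
    y_train, y_valid = y[:n_train], y[n_train:]

    # Statistics come from the training portion only.
    mean = X_train.mean(axis=0)
    std = X_train.std(axis=0) + 1e-8  # guard against zero variance

    X_train = (X_train - mean) / std
    X_valid = (X_valid - mean) / std

    # Return mean/std as well, so they can be reused at predict time.
    return X_train, X_valid, y_train, y_valid, mean, std
```

Returning the fitted statistics (or the scaler object) is the important part: the same mean and standard deviation must be reapplied when predicting on a held-out test set.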
I want to pre-process my training data by subtracting the mean. I could do this by subtracting the mean from my training data before I pass it to nolearn.lasagne.NeuralNet, but this would contaminate my validation set. Instead, it would be nice if one could pass a StandardScaler to the NeuralNet, which would compute the mean on the training set, apply it to the validation set, and store the StandardScaler for later, when the NeuralNet is used to predict on a held-out test set.
This might be done in the train_loop just after the train_test_split happens.
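The fit-on-train, apply-everywhere pattern described above can be illustrated with a minimal mean-only stand-in for scikit-learn's StandardScaler. The class below is hypothetical and only sketches the interface; in the proposal, fit would be called on the training split (conceptually inside train_loop, after train_test_split) and transform on the validation and test sets.

```python
import numpy as np

class MeanScaler:
    """Hypothetical, mean-only stand-in for sklearn's StandardScaler,
    illustrating fit-on-train / transform-elsewhere."""

    def fit(self, X):
        # Learn the mean from the training data only.
        self.mean_ = X.mean(axis=0)
        return self

    def transform(self, X):
        # Apply the training-set mean to any data (train, valid, test).
        return X - self.mean_
```

Usage mirrors the proposal: `scaler.fit(X_train)` during training, then `scaler.transform(X_valid)` for validation and `scaler.transform(X_test)` at predict time, with the scaler stored on the network in between.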