Support hyperparameter tuning #287
Comments
I have just found this code: https://github.com/kuz/caffe-with-spearmint
/cc @jmancewicz - you've been looking into parameter-sweep stuff, right?
Yes, but I won't be able to look at it for a little while more.
With #708 you can now sweep through many values of learning rate and batch size.
Thank you for the follow-up. I will try to use it.
Closing (enhancement implemented in #708)
Being able to automatically optimize hyperparameters directly in DIGITS would be a great feature. The two obvious methods would be random search or Bayesian optimization.
I've been playing for the past week with using spearmint (Bayesian optimization) with DIGITS. Here is how I do it for now:
I use the `json_dict` method of `ModelJob` to get back the best accuracy computed during training on the validation set. We end up with a bunch of model jobs in DIGITS, and the optimized values for the parameters output by spearmint.
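For reference, here is a minimal sketch of the kind of objective function spearmint can call to drive DIGITS over its REST API. The endpoint paths, form field names, and JSON fields below are assumptions for illustration, not confirmed DIGITS API details; check your DIGITS version's REST documentation for the real names.

```python
"""Sketch of a spearmint objective that trains a model through a DIGITS server.

Assumptions (not confirmed in this thread): the REST endpoints, the form
field names, and the shape of the job's JSON status are all placeholders.
"""
import time
import requests

DIGITS_URL = "http://localhost:5000"  # assumed DIGITS server address


def create_model_job(learning_rate, batch_size):
    """Create a classification model job and return the new job id.

    The endpoint and form fields are placeholders.
    """
    resp = requests.post(
        DIGITS_URL + "/models/images/classification.json",
        data={
            "model_name": "spearmint-lr%g-bs%d" % (learning_rate, batch_size),
            "learning_rate": learning_rate,
            "batch_size": batch_size,
            # ... dataset id, network, number of epochs, etc. would also go here
        },
    )
    resp.raise_for_status()
    return resp.json()["id"]  # "id" field is an assumption


def best_validation_accuracy(job_id):
    """Poll the job's JSON status until training finishes, then return the
    best validation accuracy it reported (field name is assumed)."""
    while True:
        status = requests.get(DIGITS_URL + "/models/%s.json" % job_id).json()
        if status.get("status") in ("Done", "Error", "Abort"):
            break
        time.sleep(30)
    return max(status.get("val_accuracies", [0.0]))


def main(job_id, params):
    """Entry point spearmint calls; parameter values arrive as 1-element arrays.
    Spearmint minimizes, so return the negative accuracy."""
    lr = float(params["learning_rate"][0])
    bs = int(params["batch_size"][0])
    digits_job = create_model_job(lr, bs)
    return -best_validation_accuracy(digits_job)
```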
I'm now wondering what would be the best way to integrate random search or spearmint support into DIGITS. I imagine it could be in the form of a special `ModelJob` with training tasks generated by spearmint, for instance. What do you think?
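For comparison, plain random search would need no external dependency at all. A minimal sketch, assuming a hypothetical `train_and_score(params)` callable (e.g. a wrapper that creates a DIGITS model job and returns its validation accuracy); the sampling ranges are only illustrative:

```python
import random


def sample_hyperparameters():
    """Draw one random candidate; ranges are only for illustration."""
    return {
        "learning_rate": 10 ** random.uniform(-4, -1),  # log-uniform in [1e-4, 1e-1]
        "batch_size": random.choice([16, 32, 64, 128]),
    }


def random_search(train_and_score, n_trials=20):
    """train_and_score(params) -> validation accuracy; returns the best pair."""
    best_params, best_acc = None, float("-inf")
    for _ in range(n_trials):
        params = sample_hyperparameters()
        acc = train_and_score(params)
        if acc > best_acc:
            best_params, best_acc = params, acc
    return best_params, best_acc
```

Log-uniform sampling of the learning rate is the usual choice because reasonable values span several orders of magnitude.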