investigate hyperparameter optimization suggestions by Jurriaan and Berend and Carlos #34
Comments
I also found this blogpost very useful:
We can also look into Optunity; it supports, for example, TPE and other optimizers.
Ah, sorry, but I see that Optunity uses hyperopt under the hood for TPE, so we might run into the same problems as with hyperas (#35).
Optunity also offers a CMA-ES optimizer. According to 'Algorithms for Hyper-Parameter Optimization' by James Bergstra et al.: "CMA-ES is a state-of-the-art gradient-free evolutionary algorithm for optimization on continuous domains, which has been shown to outperform the Gaussian search EDA. Notice that such a gradient-free approach allows non-differentiable kernels for the GP regression." I struggle to digest this. Does this mean that it can handle non-real-valued hyperparameters, as we want, or is a non-differentiable kernel something different?
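To make "gradient-free evolutionary optimization on continuous domains" concrete: the optimizer only ever evaluates the objective, never its gradient, so the objective may be non-smooth. A minimal sketch of the idea (a toy (1+1) evolution strategy with a 1/5-style step-size rule, not the actual CMA-ES algorithm, and with a made-up sphere objective):

```python
import random

def one_plus_one_es(f, x0, sigma=0.5, iters=200, seed=0):
    """Toy (1+1) evolution strategy: mutate the current point, keep the
    child if it is no worse, and adapt the step size multiplicatively.
    Only f(x) is evaluated -- no gradient of f is ever needed."""
    rng = random.Random(seed)
    x, fx = list(x0), f(x0)
    for _ in range(iters):
        child = [xi + rng.gauss(0, sigma) for xi in x]
        fc = f(child)
        if fc <= fx:
            x, fx = child, fc
            sigma *= 1.5   # success: widen the search
        else:
            sigma *= 0.9   # failure: shrink the step size
    return x, fx

# Hypothetical objective: the sphere function, minimum at the origin.
best_x, best_f = one_plus_one_es(lambda v: sum(t * t for t in v), [3.0, -2.0])
```

Note that this still assumes continuous (real-valued) hyperparameters; it does not by itself answer the question about categorical ones.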
Rescale is a commercial service for training deep networks in the cloud, supporting Keras, Torch, and others. Part of the service is Keras hyperparameter optimization: https://blog.rescale.com/deep-neural-network-hyper-parameter-optimization/ It may be good to know that these services exist.
In that blogpost they use SMAC, which trains random forests on the results and, according to Alice Zheng's blog, handles categorical variables better.
Another interesting blogpost: http://www.argmin.net/2016/06/20/hypertuning/ (see also the comments below it). It seems that TPE and SMAC are the only algorithms really suited to the type of problem we have: mixed categorical, discrete, and continuous hyperparameters.
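Whichever optimizer we pick, we will need to express such a mixed space. Plain random search handles it trivially and is a useful baseline; a minimal sketch with a hypothetical search space and a stand-in objective (all names, ranges, and scores here are made up for illustration, not our actual model):

```python
import math
import random

rng = random.Random(42)

def sample_config():
    """Draw one configuration from a hypothetical mixed search space:
    one categorical, one discrete, one continuous (log-uniform) parameter."""
    return {
        "optimizer": rng.choice(["sgd", "adam", "rmsprop"]),  # categorical
        "n_layers": rng.randint(1, 4),                        # discrete
        "learning_rate": 10 ** rng.uniform(-4, -1),           # continuous, log scale
    }

def score(cfg):
    """Stand-in objective (lower is better); a real run would train
    and validate a model here instead."""
    penalty = {"sgd": 0.3, "adam": 0.0, "rmsprop": 0.1}[cfg["optimizer"]]
    return (penalty
            + 0.05 * abs(cfg["n_layers"] - 2)
            + abs(math.log10(cfg["learning_rate"]) + 3))

# Random search: evaluate 100 draws and keep the best.
best = min((sample_config() for _ in range(100)), key=score)
```

TPE and SMAC improve on this by modeling past evaluations to propose the next configuration, but the space definition looks much the same (e.g. hyperopt's `hp.choice`, `hp.quniform`, `hp.loguniform`).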