For those who are struggling to find positions for many optimized parameters #140
I have the following parameters in my model:
After Hyperas finishes finding the best hyperparameters, it prints the resulting parameter info. I understand everything up to 'Dropout_1', but after that I don't understand 'Dropout_2': 1 and 'add': 1. Besides the output layer, I have only added three dense layers (the first dense layer, which takes the input, is fixed), yet I am getting dropout parameters for four layers, with 'Dropout_2' being 1. I am probably missing some caveat and am therefore hoping someone could look at it and help me out. Thanks a lot in advance.
I also get 'Dropout_2': 1. What does that mean? That all the neurons in the next layer should be deleted?
I found out that 'Dropout_2' may not actually refer to the second dropout value. You have to check the generated model code in the output to see what 'Dropout_2' is actually referring to.
If you want `optim.minimize` to print the values instead of the indices of the best parameters, pass `eval_space=True` as an extra argument.
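The difference between index and value output can be illustrated without running Hyperas at all: `hp.choice` parameters are reported as indices into their candidate lists, and `eval_space=True` maps each index back to the chosen value. A minimal pure-Python sketch of that mapping (the parameter names and candidate lists below are made up for illustration, not real tuning output):

```python
# Hypothetical best-run result from hyperas: indices into each hp.choice list.
best_indices = {'Dropout': 0, 'Dropout_1': 2, 'Dropout_2': 1, 'add': 1}

# Hypothetical candidate lists, as they would appear in get_space.
choices = {
    'Dropout': [0.25, 0.5, 0.75],
    'Dropout_1': [0.25, 0.5, 0.75],
    'Dropout_2': [0.25, 0.5],   # e.g. a dropout inside a conditional branch
    'add': [False, True],       # e.g. whether to add an extra hidden layer
}

def resolve(indices, choices):
    """Map hp.choice indices back to actual values (what eval_space=True does)."""
    return {name: choices[name][i] for name, i in indices.items()}

print(resolve(best_indices, choices))
# {'Dropout': 0.25, 'Dropout_1': 0.75, 'Dropout_2': 0.5, 'add': True}
```

So 'Dropout_2': 1 in the raw output means "index 1 of that parameter's candidate list", not a dropout rate of 1.0.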
@mingfeisun if a name is specified for a layer or dropout, Hyperas could use that name as a prefix for the parameter. That would make things more tractable when you have a lot of parameters. Currently it doesn't take this into account.
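The automatic suffixing that produces names like 'Dropout_2' can be mimicked in a few lines. This is an illustrative sketch of the naming behavior, not Hyperas's actual implementation: each `{{...}}` expression of the same kind gets the bare name first, then numbered suffixes in order of appearance. Note that under such a scheme, 'Dropout_2' is simply the third dropout-like expression encountered in the template, which may sit inside a conditional branch rather than belong to a third actual layer:

```python
from collections import defaultdict

def assign_names(kinds):
    """Assign hyperas-style names: first 'Dropout', then 'Dropout_1', 'Dropout_2', ..."""
    counts = defaultdict(int)
    names = []
    for kind in kinds:
        n = counts[kind]
        names.append(kind if n == 0 else f"{kind}_{n}")
        counts[kind] += 1
    return names

# Three dropout expressions plus one 'add extra layer?' choice in the template:
print(assign_names(['Dropout', 'Dropout', 'Dropout', 'add']))
# ['Dropout', 'Dropout_1', 'Dropout_2', 'add']
```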
How to find the position correspondences for the randomly named optimized parameters?

During the tuning process, there are some important outputs that help you locate them.

First, check the definition of the `get_space` function in the `Hyperas search space` section of the tuning output. It lists the names of all parameters being tuned.

Second, check the `Resulting replaced keras model` section of the tuning output. There you will see that parts of your model code have been replaced by `space['Dropout']`, `space['Dropout_1']`, and so on. These are the corresponding positions for the optimized parameters. At this point it's easy to fill in the optimized values. Hope this helps.
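To make the second step concrete, here is a hypothetical `Resulting replaced keras model` snippet (illustrative only, not real Hyperas output) and a small helper that extracts the `space[...]` keys in order of appearance; each key marks the exact position where one optimized value belongs:

```python
import re

# Hypothetical generated model code, as Hyperas might print it.
replaced_model = """
model.add(Dense(512, activation='relu'))
model.add(Dropout(space['Dropout']))
model.add(Dense(256, activation='relu'))
model.add(Dropout(space['Dropout_1']))
if space['add']:
    model.add(Dense(128, activation='relu'))
    model.add(Dropout(space['Dropout_2']))
"""

# Each space[...] key is the position of one optimized parameter.
keys = re.findall(r"space\['(\w+)'\]", replaced_model)
print(keys)
# ['Dropout', 'Dropout_1', 'add', 'Dropout_2']
```

Here the mystery is resolved: 'Dropout_2' belongs to a conditionally added layer, and 'add' controls whether that layer exists at all.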