Conditional hyperparameter tuning bug #66
To clarify, you would like the Trial summary to only show HyperParameters that were used in the Python code for this particular trial? I think this is possible but will require careful thought. In general, the Oracle will attempt to provide a value for any HyperParameter it has seen so far; there's no way for the Oracle to know which of those values your code will actually use. We do have a concept of explicitly conditional hyperparameters (https://github.com/keras-team/keras-tuner/blob/master/kerastuner/engine/hyperparameters.py#L393), but right now that is not reflected in the Trial summary.
Yes, my expectation was that the trial summary would show all the hyperparameter settings used for that trial. It's fine if too many hp values are shown (e.g. if num_layers == 4 and units_[0-10] were shown), but when too few are shown I get an incomplete view of that model (e.g. if num_layers == 4 but only units_[0-1] are shown; I would also like to see units_[2-3]). In the meantime my fix is to call print(model.summary()) in my build function to verify the model architecture, but it's much more verbose than the trial summary. Would the num_layers and units_# hps discussed above be considered conditional hps? Or is there another mechanism to explicitly define the conditional relationship between hps? Thanks again.
Thanks, will look into this. The mechanism for specifying conditional hyperparameters is:

```python
a = hp.Int('a', 0, 10)
with hp.conditional_scope('a', 5):
    b = hp.Int('b', 0, 10)
```

With that syntax, b is only considered active when a is 5.
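For instance, here is a minimal sketch (assuming the conditional_scope API above, with names taken from this thread) of how the num_fc_layers/units_i relationship could be made explicitly conditional:

```python
import tensorflow as tf
import kerastuner as kt

def build_model(hp):
    model = tf.keras.Sequential()
    max_layers = 4
    num_layers = hp.Int('num_fc_layers', 1, max_layers)
    for i in range(max_layers):
        # units_i is only registered as active when num_fc_layers > i,
        # i.e. when num_fc_layers is one of i+1, ..., max_layers.
        with hp.conditional_scope('num_fc_layers',
                                  list(range(i + 1, max_layers + 1))):
            if i < num_layers:
                model.add(tf.keras.layers.Dense(
                    units=hp.Int('units_' + str(i), 10, 80, step=10),
                    activation='relu'))
    return model
```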
Thank you!
All HyperParameters should be shown now; can you please try with the master branch? In the future, for display purposes, we may hide hyperparameter values that were set but never accessed during a trial, but I'm closing this issue for now as that's more of an enhancement.
Hi @omalleyt12, thank you. I tried the master branch with a build function containing:

```python
for i in range(hp.Int('num_fc_layers', 1, 4)):
```

Previously I saw an incomplete list of "units_" hyperparameters printed out; now I see no "units_" printed out at all. [Trial complete]
This isn't a major showstopper for me at this point, but I wanted to follow up to let you know. Thanks again.
@rcmagic1 Thanks for letting me know. Could you provide a minimal reproduction, including the code you're running? When I run the examples below I see the expected values (the first example takes on the default value for 'num_fc_layers', so it only has 'units_0'; the second example has all HPs):

```python
import tensorflow as tf
import kerastuner as kt

def build_model(hp):
    model = tf.keras.Sequential()
    for i in range(hp.Int('num_fc_layers', 1, 4)):
        model.add(tf.keras.layers.Dense(
            units=hp.Int('units_' + str(i),
                         min_value=10,
                         max_value=80,
                         step=10),
            activation='relu'))

hp = kt.HyperParameters()
build_model(hp)
print(hp.values)

hp = kt.HyperParameters()
hp.Fixed('num_fc_layers', 4)
build_model(hp)
print(hp.values)
```
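For reference, a rough sketch of what those two prints should produce, assuming keras-tuner's defaults (hp.Int falls back to its min_value when no default is given, so the exact numbers here are assumptions):

```python
# First print: 'num_fc_layers' takes its default (the min, 1),
# so only one units_ value is registered:
# {'num_fc_layers': 1, 'units_0': 10}

# Second print: 'num_fc_layers' is fixed to 4, so all four
# units_ values are registered:
# {'num_fc_layers': 4, 'units_0': 10, 'units_1': 10,
#  'units_2': 10, 'units_3': 10}
```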
Certainly. Running my code, I'll get results like this; note the missing num_units_1 and num_units_2 when num_fc_layers=2: [Trial complete]
@rcmagic1 Thanks for the repro! Ah ok, yep, found a bug. This issue should be fixed by #113.
Great! Let me know when the fix is available and I'll validate it. Thanks again.
@rcmagic1 Thanks! The fix is available now; please let me know if the HyperParameters look right.
I'm using Keras-Tuner to run trials on a multi-layer NN with a variable number of layers and units within each layer, similar to the example in the README (essentially the build_model function shown in the repro above).
The "units_#" hyperpameter should be conditional upon "num_layer" hyperparameter. E.g.if "num_layers=2" then I should see "units_0" and "units_1". However in my testing I'm not seeing proper correlation (num_layers doesn't match the number of units_# hyperparameter values set). Instead I see something like the following:
[Trial summary]
or
[Trial summary]
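A hypothetical illustration of the mismatch (the values below are invented for illustration, not from an actual run):

```python
# [Trial summary]
#  |-num_layers: 3
#  |-units_0: 40
#  |-units_1: 20
# (units_2 is missing even though num_layers = 3)
```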
This effectively makes the summary of hyperparameters used in a trial useless.
I did some debugging of the code but haven't found the culprit yet.
I'm using "randomsearch" tuner and wrapped my model build in HyperModel class (rather than function method).
Could someone please take a look? Thank you.