results_summary() not showing hyperparameter values #121

Closed
curiousily opened this issue Oct 19, 2019 · 3 comments · Fixed by #145
Labels: bug

curiousily commented Oct 19, 2019

Hi,

I am using keras-tuner at commit 7f6b00f45c6e0b0debaf183fa5f9dcef824fb02f.

I ran the RandomSearch tuner in a Google Colab notebook. Calling results_summary() gives me the following output:

|-Results in test_dir/tune_nn
|-Showing 10 best trials
|-Objective: Objective(name='val_accuracy', direction='max') Score: 0.8007448315620422
|-Objective: Objective(name='val_accuracy', direction='max') Score: 0.7988826632499695
|-Objective: Objective(name='val_accuracy', direction='max') Score: 0.774674117565155
|-Objective: Objective(name='val_accuracy', direction='max') Score: 0.77094966173172
|-Objective: Objective(name='val_accuracy', direction='max') Score: 0.5977653861045837
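
For context, here is roughly how the tuner was set up (a minimal sketch; the model definition and data names below are stand-ins, not from my actual notebook):

from tensorflow import keras
from kerastuner.tuners import RandomSearch

def build_model(hp):
    # Illustrative search space; the real one is not shown in this issue.
    model = keras.Sequential([
        keras.layers.Dense(hp.Int('units', 32, 512, step=32),
                           activation='relu'),
        keras.layers.Dense(1, activation='sigmoid'),
    ])
    model.compile(
        optimizer=keras.optimizers.Adam(
            hp.Choice('learning_rate', values=[1e-2, 1e-3, 1e-4])),
        loss='binary_crossentropy',
        metrics=['accuracy'])
    return model

tuner = RandomSearch(
    build_model,
    objective='val_accuracy',
    max_trials=5,
    directory='test_dir',
    project_name='tune_nn')

tuner.search(x_train, y_train, epochs=10,
             validation_data=(x_val, y_val))
tuner.results_summary()  # prints the scores above, without hyperparameters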

This comment suggests that results_summary() should also display the hyperparameters along with their values. Is this the expected behavior?

Currently, I have to extract the hyperparameters and their values with:

tuner.oracle.get_best_trials(num_trials=1)[0].hyperparameters.values
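
which returns a plain dict mapping hyperparameter names to values, for example (the names and values here are illustrative; they depend on the search space):

best_trial = tuner.oracle.get_best_trials(num_trials=1)[0]
print(best_trial.hyperparameters.values)
# e.g. {'units': 128, 'learning_rate': 0.001}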

Thanks!

omalleyt12 (Collaborator) commented Oct 21, 2019

@curiousily Thanks for the issue (and great blog post, by the way!)

For display, what do you think about something like this?

|-Results in test_dir/tune_nn
|-Showing 10 best trials
|-Trial ac344: Objective(name='val_accuracy', direction='max') Score: 0.8007448315620422
| -- learning_rate: 0.001
| -- layers: 3
| -- units: 100
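
Under the hood, each trial could be rendered with something like this (a hypothetical sketch; the attribute names are assumptions about the Trial object, not final API):

def display_trial(trial, objective):
    # One header line per trial, then one line per hyperparameter value.
    print('|-Trial {}: {} Score: {}'.format(
        trial.trial_id, objective, trial.score))
    for name, value in trial.hyperparameters.values.items():
        print('| -- {}: {}'.format(name, value))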

Also, we should probably expose a convenience method for tuner.oracle.get_best_trials(num_trials=1)[0].hyperparameters.values, maybe tuner.get_best_hyperparameters(num_trials=1). Then users could recreate their best model from scratch by calling:

hps = tuner.get_best_hyperparameters()[0]
# Weights not trained here, as opposed to `tuner.get_best_models()` which includes weights
best_model = tuner.hypermodel.build(hps)
best_model.fit(...)
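
The wrapper itself could be a thin delegation to the oracle, along these lines (a sketch of the idea, not necessarily the code that will land):

def get_best_hyperparameters(self, num_trials=1):
    # Delegate to the oracle; return HyperParameters objects so the
    # result can be passed straight to self.hypermodel.build().
    return [trial.hyperparameters
            for trial in self.oracle.get_best_trials(num_trials)]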

What do you think?

MinuteswithMetrics commented Oct 24, 2019

@omalleyt12 I think the display and the way of calling the best model are right on track.

omalleyt12 (Collaborator) commented Oct 28, 2019

@curiousily @MinuteswithMetrics great, sounds good.

I added a convenience method, tuner.get_best_hyperparameters. I'll work on updating the display next.
