
How do I reduce the verbosity of logs during the trials? #221

Closed
sibyjackgrove opened this issue Jan 22, 2020 · 12 comments

@sibyjackgrove

Is there any argument that can be passed to tuner.search() to control the logs produced at the end of each trial (similar to verbose in Keras model.fit())? Right now it prints all the hyperparameters in addition to other information.

@jaimeperezsanchez

If you set verbose=0, only the hyperparameters and the score of each trial will appear.

@sibyjackgrove
Author

Yes, I am aware of that. verbose=0 will only suppress the logs from Keras model.fit(). Is there a similar argument for Keras Tuner? I want to suppress the hyperparameter logs at each trial.

@omalleyt12
Contributor

@sibyjackgrove Thanks for the issue!

This is currently not possible but it makes sense to me that we should have an option for this.

IMO we should unify on the verbose argument (if it is present) to control this. I'll keep this thread updated on the status of the feature.
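For illustration, the kind of gating a unified verbose option implies can be sketched in plain Python. The class and method names below are hypothetical stand-ins, not Keras Tuner's actual internals:

```python
# Hypothetical sketch: a display helper whose per-trial output is gated by
# a `verbose` level, in the spirit of Keras model.fit(). Not real
# Keras Tuner code.

class TrialDisplay:
    def __init__(self, verbose=1):
        self.verbose = verbose

    def on_trial_end(self, trial_id, score, hyperparameters):
        """Return the log text that would be printed, or None when silenced."""
        if self.verbose == 0:
            return None  # verbose=0 suppresses trial logs entirely
        lines = [f"Trial {trial_id} complete - score: {score}"]
        if self.verbose > 1:
            # Full hyperparameter dump only at high verbosity.
            lines += [f"  {k}: {v}" for k, v in hyperparameters.items()]
        text = "\n".join(lines)
        print(text)
        return text

quiet = TrialDisplay(verbose=0)
assert quiet.on_trial_end("t0", 0.91, {"units": 64}) is None
```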

@omalleyt12 omalleyt12 self-assigned this Jan 23, 2020
@omalleyt12 omalleyt12 added the enhancement New feature or request label Jan 23, 2020
@sibyjackgrove
Author

Thank you for the update @omalleyt12 !

@romanovzky

I would vouch for this feature as well. Not only is the current verbosity too extensive, it also lacks some information that would be useful, for example the trial number, the execution time so far, the execution time of that trial, etc.
I also don't understand what "best_step" means, as it comes out as 0 for every trial (in both Hyperband and RandomSearch, at least).

@ben-arnao
Contributor

ben-arnao commented Jan 28, 2020

I believe the output comes from the on_trial_end() method in base_tuner.py:

self._display.on_trial_end(self.oracle.get_trial(trial.trial_id))

I removed this and put in my own code to print out what I want.

There are two additional modifications I've added to help me personally: a score_history array on the base tuner (so I can plot my progress, see how many iterations have gone by, and see whether my search has found a new best config).

Also a get_init_points() method on the bayesian.py oracle, so that if I auto-calculate init points with space * 3, I can retrieve this value to see when my search switches from random diversification to Bayesian tuning.
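The customization described above can be sketched in standalone Python. This only mimics the base_tuner.py hook for illustration; it is not a working Keras Tuner subclass:

```python
# Standalone sketch of the pattern: replace the default trial-end display
# call with custom bookkeeping (a score_history list) and print only on
# improvement. Class and method names mirror base_tuner.py loosely.

class BaseTunerSketch:
    def __init__(self):
        self.score_history = []  # one entry per completed trial

    def on_trial_end(self, trial_id, score):
        # Instead of self._display.on_trial_end(...), do custom logging.
        self.score_history.append(score)
        if score == max(self.score_history):
            print(f"Trial {trial_id}: new best score {score:.4f}")

tuner = BaseTunerSketch()
for i, s in enumerate([0.71, 0.69, 0.83]):
    tuner.on_trial_end(i, s)
# tuner.score_history can now be plotted to track search progress.
```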

@haifeng-jin
Member

We will need this verbose argument in AutoKeras, too.
It will be based on this feature on Keras Tuner.

@Aviously
Contributor

Specifying verbose=0 in tuner.search() now also suppresses the output from Keras Tuner (since #312).

@juliukvi

> Specifying verbose=0 in tuner.search() now also suppresses the output from Keras Tuner (since #312).

I still get weird formatting and colored output after each trial.

@haifeng-jin
Member

I assume the issue is resolved with the latest release of Keras Tuner 1.0.2, so I am closing it.
Please let me know if it isn't.

@nicrie

nicrie commented Mar 4, 2022

Using Keras Tuner 1.1.0

  File "build-model.py", line 119, in train_log_reg
    tuner.search(X_train, y_train, verbose=0)
TypeError: search() got an unexpected keyword argument 'verbose'

From what I read here, version 1.0.2 should have added this parameter, no? Any idea what I'm doing wrong?

@martiw1

martiw1 commented Apr 13, 2022

Setting verbose=0 in Tuner.search() reduces the console output, but all of the logs are still created. How can we suppress creation of the large checkpoint.data-* files, which can reach sizes of 1 GB?
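The thread doesn't confirm a built-in flag for disabling these checkpoint files. As a stdlib workaround sketch for reclaiming disk space after a search, assuming the usual directory layout of trial_* subfolders containing checkpoint* files (adjust the glob pattern for your setup):

```python
import glob
import os

def delete_trial_checkpoints(tuner_dir):
    """Remove checkpoint* files left in each trial_* subdirectory.

    Assumes a <tuner_dir>/trial_*/checkpoint* layout on disk; this is a
    post-hoc cleanup sketch, not a Keras Tuner feature.
    """
    removed = []
    for path in glob.glob(os.path.join(tuner_dir, "trial_*", "checkpoint*")):
        os.remove(path)
        removed.append(path)
    return removed
```

Run it on the tuner's directory after the search completes, once you no longer need to reload trial weights.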
