
[Feature Request] Documentation for n_evaluations flag should be improved #100

Closed
jkterry1 opened this issue May 31, 2021 · 1 comment
Labels: enhancement (New feature or request)

Comments

@jkterry1 (Contributor)

Right now, all that's said is:

parser.add_argument("--n-evaluations", help="Number of evaluations for hyperparameter optimization", type=int, default=20)

The problem is that this doesn't tell you what number is actually being referred to without diving into the code. It could plausibly be the number of times a given hyperparameter set is evaluated and tested, the number of evaluation points along a training curve, or a few other things.
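
For concreteness, here is a sketch of one possible rewording of the help string. It assumes the flag sets the number of evaluation points collected during each optimization trial, which is only one of the readings listed above and would need to be checked against the code before being adopted:

import argparse

parser = argparse.ArgumentParser()
# Hypothetical clarified help text; the exact meaning must be confirmed against the source.
parser.add_argument(
    "--n-evaluations",
    help="Number of times the agent is evaluated during a single hyperparameter "
         "optimization trial (evaluation points spread over that trial's training run)",
    type=int,
    default=20,
)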

araffin added the enhancement label on Jun 2, 2021
araffin (Member) commented on Jun 2, 2021

Hello,
I do agree on that point and would welcome a PR that clarifies its role ;)

jkterry1 added a commit to jkterry1/MCMES that referenced this issue on Jun 2, 2021
jkterry1 mentioned this issue on Jun 2, 2021