add ml_model_settings parameter #434
Conversation
Force-pushed from a2aa1e2 to b9e1b8d.
Force-pushed from b9e1b8d to 4a94a78.
Nice PR, thanks! Just some minor questions.
I agree that having a mechanism that dumps the current parameter settings automatically would be nice. However, the user can do this themselves at the end of e.g. the training loop. We might want to think about providing e.g. a decorator for this purpose that wraps e.g. the training function of the model instance. But imho, this isn't critical for this PR. (Though it also shouldn't be super hard, so if you have some spare time feel free to add it ;) )
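The decorator idea above could look roughly like the following sketch. All names here (`dump_parameters`, the output path, the `parameters` attribute) are illustrative assumptions, not the project's actual API:

```python
import functools
import json


def dump_parameters(path="ml_parameters.json"):
    """Hypothetical decorator sketch: after the wrapped training function
    finishes, dump the instance's ``parameters`` dict to a JSON file so the
    exact settings of a training run can be looked up later."""
    def decorator(train_func):
        @functools.wraps(train_func)
        def wrapper(self, *args, **kwargs):
            result = train_func(self, *args, **kwargs)
            # write the current parameter settings next to the training output
            with open(path, "w") as f:
                json.dump(getattr(self, "parameters", {}), f, indent=2, default=str)
            return result
        return wrapper
    return decorator
```

A model author would then only need to decorate their training method, e.g. `@dump_parameters(path=...)` on `train`, instead of remembering to dump the settings manually at the end of the loop.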
LGTM, thanks!
This PR adds the `ml_model_settings` parameter that writes key-value pairs directly into the `self.parameters` attribute. Example:

These parameters can then be changed on the command line.

Open questions:

1.) At the moment, `ml_model_settings` is not implemented for the `MLModelsMixin`, therefore we can only use the "default" model (or create a new model via `derive`) when e.g. creating histograms. We could add an `ml_models_settings` parameter to this mixin, but I'm not sure if this would be used much.

2.) Since the output of the model is now hashed, we might want to also automatically produce an output of the parameters or the `parameters_repr` such that ML trainings can always be reproduced if necessary.
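To illustrate the mechanism described above, here is a minimal sketch of how comma-separated `key=value` settings from the command line could be parsed and written into a model's `parameters` attribute. The function and class names are hypothetical stand-ins, not the actual implementation in this PR:

```python
def parse_ml_model_settings(settings_str):
    """Parse a comma-separated "key=value" string, e.g. as it might be
    passed on the command line, into a dict of parameter settings.
    Values are kept as strings; casting is left to the model."""
    params = {}
    for item in settings_str.split(","):
        if not item:
            continue
        key, _, value = item.partition("=")
        params[key.strip()] = value.strip()
    return params


class MLModel:
    """Hypothetical model base: key-value pairs handed over via
    ``ml_model_settings`` end up directly in ``self.parameters``."""

    def __init__(self, ml_model_settings=None):
        self.parameters = {}
        if ml_model_settings:
            self.parameters.update(ml_model_settings)
```

With this, something like `--ml-model-settings "epochs=10,lr=0.01"` would translate into `parameters == {"epochs": "10", "lr": "0.01"}` on the model instance.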