
Test methods should use parameter checkpoint with best return #203

Open
jkterry1 opened this issue Jan 14, 2021 · 1 comment
Labels: enhancement (New feature or request)
Milestone: Future Work

@jkterry1 (Collaborator)

Right now, test is applied to the parameters at the end of training. Sometimes, these can be much worse than the peak parameters, which are what you'd use for any benchmark anyway. Ideally, test would automatically use the best checkpoint for you.

@cpnota cpnota added the enhancement New feature or request label Jan 14, 2021
@cpnota (Owner) commented Jan 14, 2021

This is the "DeepMind"-style testing. I've been thinking about the best way to handle this; there are a few options. I'm still not happy with the experiments package as a whole, so I may roll this into future refactoring.

If you need this urgently, you should be able to implement a version by writing your own variant of run_experiment that alternates between experiment.train(), experiment.test(), and experiment.save(), keeping the checkpoint with the best test return.
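A minimal sketch of that train/test loop, assuming a hypothetical experiment object with train(), test(), and a params attribute (these names are illustrative, not the library's actual API; ToyExperiment is a stub used only to make the sketch self-contained):

```python
import copy

class ToyExperiment:
    """Stand-in for an experiment object; returns scripted test returns."""
    def __init__(self, returns):
        self._returns = iter(returns)
        self.params = None
        self._step = 0

    def train(self, episodes):
        # pretend to train: just advance a step counter and "update" params
        self._step += episodes
        self.params = {"step": self._step}

    def test(self):
        # mean return of the evaluation episodes (scripted here)
        return next(self._returns)

def run_with_best_checkpoint(experiment, rounds, episodes_per_round):
    """Alternate training and testing, keeping the peak-return parameters."""
    best_return = float("-inf")
    best_params = None
    for _ in range(rounds):
        experiment.train(episodes_per_round)
        ret = experiment.test()
        if ret > best_return:
            # checkpoint the parameters at the new peak
            best_return = ret
            best_params = copy.deepcopy(experiment.params)
    # restore the peak parameters before any final benchmark evaluation
    experiment.params = best_params
    return best_return, best_params
```

With a real experiment, the deepcopy/restore steps would be replaced by calls like experiment.save() and a corresponding load, but the control flow is the same.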

@cpnota cpnota added this to the Future Work milestone Feb 11, 2024