running regression tests #44
This means that they should provide more data, i.e. trajectory files (or at least a small portion of them) and their own output, right?
Yes, even just a couple of frames. I am not sure it is a good idea, though...
It is a good idea in the spirit of promoting reproducibility, which does not mean only testing whether an input file is still compatible with the current version of PLUMED...
This is true, but even running a regression test will not prove reproducibility. I am not sure this is important; what really matters is fostering data availability. The fact that an input still works may be useful for taking inspiration from working examples, and it is already very nice to see how different people write their inputs.
On a different note, I think the regression suite of PLUMED itself should be improved, to decrease the running time and increase the coverage.
I think we could consider the possibility of running regression tests on Travis, maybe with a timer to make sure they run fast enough. Clearly they should be designed ad hoc, and not just reuse the production plumed.dat with its 100 MB trajectory.
A possible syntax could be
It overlaps a bit with our own regtests, though, so I am not sure it is a good idea. But it could be more suitable for members of the consortium who are not really developing new features (and so rarely touch the GitHub code) but would like to make sure their input files can not only be read (we already test that) but also continue to give the same results.
Any comment?
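As a rough sketch, such a CI regression check could boil down to rerunning a small input and diffing the result against a stored reference. Everything here is an assumption, not an agreed syntax: the file names (`colvar.reference`, `colvar.new`) are hypothetical, and the real run step would be something like `plumed driver` on a few-frame trajectory, here replaced by a placeholder so the sketch is self-contained.

```shell
#!/bin/sh
set -e

# In a real test this step would be a short PLUMED run on a tiny trajectory,
# possibly bounded by a timer, e.g.:
#   timeout 60 plumed driver --plumed plumed.dat --ixyz traj-small.xyz
# Here we just fake the output so the sketch runs anywhere (assumption).
printf '1 0.5\n2 0.7\n' > colvar.reference
printf '1 0.5\n2 0.7\n' > colvar.new

# Compare the fresh output against the stored reference; a mismatch fails CI.
if diff -q colvar.reference colvar.new > /dev/null; then
    echo "regression test passed"
else
    echo "regression test FAILED"
    exit 1
fi
```

A nonzero exit status on mismatch is what makes this usable in CI: Travis marks the build red as soon as any reference file drifts.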