Add pytest slow test filter #333

Merged (21 commits, from slow-tests into main, Apr 11, 2023)
Conversation

@IAlibay (Member) commented Apr 10, 2023

Pretty much doing what's in the pytest docs + some fun actions trickery
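The "what's in the pytest docs" recipe referenced here is, roughly, the following `conftest.py` sketch. The `--runslow` option and `slow` marker names match this PR; the rest is the standard pattern from the pytest docs, not the PR's exact code:

```python
# conftest.py — sketch of the pytest-docs recipe for skipping slow tests
# unless --runslow is passed on the command line.
import pytest


def pytest_addoption(parser):
    # register the --runslow command line flag
    parser.addoption(
        "--runslow", action="store_true", default=False,
        help="run tests marked as slow",
    )


def pytest_configure(config):
    # register the "slow" marker so pytest doesn't warn about it
    config.addinivalue_line("markers", "slow: mark test as slow to run")


def pytest_collection_modifyitems(config, items):
    if config.getoption("--runslow"):
        return  # --runslow given: run everything, skip nothing
    skip_slow = pytest.mark.skip(reason="need --runslow option to run")
    for item in items:
        if "slow" in item.keywords:
            item.add_marker(skip_slow)
```

Tests opt in with `@pytest.mark.slow`, and `pytest --runslow` re-enables them.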

codecov bot commented Apr 10, 2023

Codecov Report

Patch coverage: 93.33% and project coverage change: -1.26 ⚠️

Comparison is base (52dba26) 91.82% compared to head (8835a9d) 90.56%.

Additional details and impacted files
@@            Coverage Diff             @@
##             main     #333      +/-   ##
==========================================
- Coverage   91.82%   90.56%   -1.26%     
==========================================
  Files          92       92              
  Lines        5712     5723      +11     
==========================================
- Hits         5245     5183      -62     
- Misses        467      540      +73     
Impacted Files Coverage Δ
openfe/protocols/openmm_rfe/equil_rfe_methods.py 89.81% <ø> (-0.19%) ⬇️
openfe/tests/conftest.py 98.85% <92.30%> (-1.15%) ⬇️
...tests/protocols/test_openmm_equil_rfe_protocols.py 86.39% <100.00%> (-13.61%) ⬇️

... and 2 files with indirect coverage changes

☔ View full report in Codecov by Sentry.

@@ -63,8 +63,10 @@ jobs:
micromamba list

- name: "Run tests"
env:
slow: ${{ fromJSON('{"false":"","true":"--runslow"}')[github.event_name == 'schedule'] }}
Contributor:

I love this

Contributor:

If this works, I am going to steal it since this pattern comes up all the time

IAlibay (Member Author):

I'll admit it's not fully original - it may have been inspired by GitHub forum answers 😅

That being said, if you like this, you'll love this: https://github.com/MDAnalysis/MDAKits/blob/main/.github/workflows/gh-ci-cron.yaml#L44
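To unpack the `fromJSON` trick above: `fromJSON` builds a two-entry map, and the boolean expression `github.event_name == 'schedule'` is stringified to `"true"`/`"false"` when used as the lookup key, so `--runslow` is injected only on scheduled (cron) runs. An annotated sketch (the `run:` line is illustrative, not part of this diff):

```yaml
- name: "Run tests"
  env:
    # index a {"false": "", "true": "--runslow"} map with the stringified
    # boolean: scheduled (cron) runs get "--runslow", everything else "".
    slow: ${{ fromJSON('{"false":"","true":"--runslow"}')[github.event_name == 'schedule'] }}
  run: |
    pytest -v $slow
```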

@@ -133,12 +133,11 @@ def test_dry_run_ligand(benzene_system, toluene_system,
assert sampler.is_periodic


@pytest.mark.parametrize('method', ['repex', 'sams', 'independent'])
def test_dry_run_complex(benzene_complex_system, toluene_complex_system,
IAlibay (Member Author):

so adding this as a slow test does make CI go down to ~a couple of minutes (at least on Linux); however, it does mean not having recorded coverage for the "protein" bits of the Protocol. It's ~1%, so I don't know if we care that much.

Contributor:

pretty sure we can mark individual parameters as slow here, so you could keep repex as always on, then test the other two when slow tests are enabled.
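Marking individual parameters as slow, as suggested, would look roughly like this (a hypothetical sketch; the real test takes the fixture arguments shown in the diff, and the body is elided here):

```python
import pytest


# 'repex' always runs; 'sams' and 'independent' carry the slow marker,
# so they only run when slow tests are enabled (e.g. via --runslow).
@pytest.mark.parametrize('method', [
    'repex',
    pytest.param('sams', marks=pytest.mark.slow),
    pytest.param('independent', marks=pytest.mark.slow),
])
def test_dry_run_complex(method):
    ...  # body elided; see the actual test in the diff
```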

IAlibay (Member Author):

tbh the other two cases (sams and independent) don't cover anything that's not covered in the vacuum and solvent tests, so I'd advocate for removing them and keeping repex.

My main question is whether we want to take the 4 mins bullet for this one repex test (which will give us coverage for protein paths).

Contributor:

depends on your viewpoint of coverage. internally we know these are doing the same thing which is your point, but externally it looks like two different codepaths so we probably ought to test both. I'd probably wrap all three in a slow call?

conftest.py Outdated
@@ -0,0 +1,24 @@
# This code is part of OpenFE and is licensed under the MIT license.
IAlibay (Member Author):

This is ugly, but it's the only way we can appropriately call pytest at the root level directory and have additional options picked up (otherwise we'd have to call pytest openfe.tests and pytest openfecli.tests separately).

Interestingly, this is the recommended approach by pytest: https://docs.pytest.org/en/latest/how-to/writing_plugins.html#plugin-discovery-order-at-tool-startup :/ ... doesn't make it any prettier though.

IAlibay (Member Author):

@mikemhenry does bring up the good point that this will probably not get packaged if it's in root, which would break the idea of doing this anyways. Shall we move it back to openfe.tests?

Contributor:

I vote we move back to openfe.tests so they get packaged. Then we can tell users to run pytest --pyargs openfecli.tests --pyargs openfe.tests -v -n auto and that will test everything. Then if we add more burn in tests that we want people to run, they can add the --runslow arg

Contributor:

And then on our main CI we just need to add openfe/tests to the pytest invocation (if we are not using the pyargs method)

Contributor:

pytest --pyargs invocation won't work with the flag, it seems (this is a known issue). My suggestion is that we either a) turn the flag into --fast, so that by default we already run all the tests, or b) switch to using env variables (boring, but at least it'll likely work?)

@@ -63,8 +63,10 @@ jobs:
micromamba list

- name: "Run tests"
env:
slow: ${{ fromJSON('{"false":"","true":"--runslow"}')[github.event_name != 'schedule'] }}
Contributor:

can stuff like this get a comment? it's not clear what's going on here to me

IAlibay (Member Author):

done

@IAlibay (Member Author) commented Apr 11, 2023

Ok - it now has both the option of using a flag (--runslow), for lazy folks like me who don't like always having to set an env var on/off, and an env var (OFE_SLOW_TESTS) for when using pytest --pyargs invocations or calling tests purely in the root directory (i.e. pytest . at the top-level directory).
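The combined flag-plus-env-var switch described here could be sketched as follows. This is a hypothetical sketch of the shape, not the PR's actual code; the `OFE_SLOW_TESTS` name is from the comment above, the accepted values ("true"/"1") are an assumption:

```python
# conftest.py — sketch: slow tests run if EITHER the --runslow flag is
# passed OR the OFE_SLOW_TESTS env var is set truthily (assumed values).
import os

import pytest


def pytest_addoption(parser):
    parser.addoption(
        "--runslow", action="store_true", default=False,
        help="run tests marked as slow",
    )


def pytest_collection_modifyitems(config, items):
    run_slow = (
        config.getoption("--runslow")
        or os.environ.get("OFE_SLOW_TESTS", "").lower() in ("true", "1")
    )
    if run_slow:
        return  # slow tests enabled, skip nothing
    skip_slow = pytest.mark.skip(
        reason="need --runslow or OFE_SLOW_TESTS=true to run"
    )
    for item in items:
        if "slow" in item.keywords:
            item.add_marker(skip_slow)
```

The env var path is what makes `pytest --pyargs openfe.tests` usable, since the custom flag isn't picked up in that invocation mode.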

@mikemhenry mikemhenry enabled auto-merge (squash) April 11, 2023 18:28
@IAlibay IAlibay disabled auto-merge April 11, 2023 18:41
@IAlibay IAlibay merged commit d175847 into main Apr 11, 2023
@IAlibay IAlibay deleted the slow-tests branch April 11, 2023 18:43