chore(api): Speed up stackup snapshot tests #18700
Conversation
This is so great, thanks a lot for doing this. Kid you not, this will probably save at least 30 hours of running this locally over the next couple of months.
@@ -120,6 +120,8 @@ jobs:
        run: make -C api test-cov
      - name: Ensure assets build
        run: make -C api sdist wheel
      - name: Check labware stacking regression tests
Suggested change: `- name: Check labware stacking regression tests` → `- name: Check stacking regression tests`
Kind of a future-proofing nit, but since we'll probably add gripper stacking tests shortly, can we rephrase this?
api/integration_testing/__init__.py (outdated)
Not that I know better, but is there a reason this testing lives here as opposed to something like `api/tests/integration_testing`, even if it's not managed by pytest?
it's just nice to have everything in `api/tests/` be owned by pytest so you can do `py.test tests` without having to play weird directory exclusion games
I don't think it's worth addressing now (and would probably be worth thinking through more before committing to a change), but since this is so much faster than the current version, I wonder if we can do something like just pass tuples of all the labware/adapters for all their versions.
yeah we definitely should, especially with the ability to do filtering. but let's get this in first
)

print(f"Processing {robot_info}...")
executor = ProcessPoolExecutor()
Ah, this is where the magic happens.
The labware stackup integration tests rule, but they were also very slow, because they ran serially and there are 8,000 of them. By running them in parallel with concurrent.futures.ProcessPoolExecutor, the whole suite takes about a minute instead of 2 hours.
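A minimal sketch of that pattern, for readers skimming the diff; the names here (`check_one_stackup`, `run_all`, the spec string) are illustrative placeholders, not the PR's actual code:

```python
from concurrent.futures import ProcessPoolExecutor, as_completed


def check_one_stackup(spec: str) -> tuple[str, str | None]:
    """Run one labware-stackup snapshot check; return (spec, error or None)."""
    # Stand-in for the real check, which shells out to a simulation and
    # compares its output against a stored snapshot.
    return (spec, None)


def run_all(specs: list[str]) -> list[tuple[str, str]]:
    failures: list[tuple[str, str]] = []
    # Each check runs in its own worker process, so the ~8,000 checks spread
    # across all CPU cores instead of running one after another.
    with ProcessPoolExecutor() as executor:
        futures = [executor.submit(check_one_stackup, spec) for spec in specs]
        for future in as_completed(futures):
            spec, error = future.result()
            if error is not None:
                failures.append((spec, error))
    return failures


if __name__ == "__main__":
    print(run_all(["example_labware_on_example_adapter"]))
```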
This was running under pytest, but it wasn't really _using_ pytest, so it didn't get the upsides (like test collection and parametrization) to balance the downsides (like it being annoying to add command-line flags). By making it a standalone application, adding command-line flags and similar conveniences becomes easy. While we're at it, refactor so it'll be easier to add new kinds of tests that take advantage of the test spec generation and so on.
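And a hypothetical sketch of the standalone-CLI side, under the same caveat: the flag name, helper names, and spec string below are made up for illustration and may not match the PR's real interface:

```python
import argparse
import fnmatch


def generate_test_specs() -> list[str]:
    # Placeholder: the real tool derives its specs from labware/adapter
    # definitions rather than a hard-coded list.
    return ["corning_96_wellplate_on_example_adapter"]


def main() -> int:
    parser = argparse.ArgumentParser(
        description="Run labware stacking regression checks."
    )
    parser.add_argument(
        "--filter",
        default="*",
        help="Only run specs whose name matches this glob, e.g. '*adapter*'.",
    )
    args = parser.parse_args()

    all_specs = generate_test_specs()
    selected = [s for s in all_specs if fnmatch.fnmatch(s, args.filter)]
    print(f"Selected {len(selected)} of {len(all_specs)} specs.")
    # Hand `selected` to the parallel runner (see the sketch above) and use
    # its failure count to decide the exit code.
    return 0


if __name__ == "__main__":
    raise SystemExit(main())
```

Invocation would then look something like passing `--filter '*adapter*'` to run only a subset of the specs.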
d4b3f06 to c42553f (compare)
Codecov Report: All modified and coverable lines are covered by tests ✅

@@            Coverage Diff            @@
##              edge    #18700     +/-   ##
===========================================
  Coverage    24.75%    24.75%
===========================================
  Files         3284      3284
  Lines       285464    285416      -48
  Branches     28663     28655       -8
===========================================
  Hits         70662     70662
+ Misses      214781    214733      -48
  Partials        21        21

Flags with carried forward coverage won't be shown.
Hey, the stacking regression tests seem to take forever and then time out in CI (which makes it hard for me to get stuff merged into `edge`). Do you have an example of them ever completing successfully in CI? Even for this PR, they just time out after reaching a certain point:
https://github.com/Opentrons/opentrons/actions/runs/15826957756/job/44609936201?pr=18700
@SyntaxColoring and @mjhuff added these extremely sick snapshot tests for labware geometry. Only downside was they took forever, but they're awesome and a solid basis for confidently modifying and improving that logic in the future. By parallelizing them - which was really easy since they were already doing subprocess stuff, so there was no shared logic to break - we can cut the test execution time down to something reasonable to run in your inner loop.
Also, there's not much point to this being in pytest, since we're not taking advantage of anything it can do while everything's in one big test (which it has to be to take advantage of the multiprocessing); so pop it out into a new little integration-testing place and make it a command-line application that can now do filtering and such.
to come out of draft