Drop and combine some testing combinations on Azure #1356
Conversation
Brainwave: where test jobs differ only in Python dependencies, we can group them together and let tox do its thing with environments. That way, system-level dependencies (like HDF5) only need to be installed once per group. I've done this for the Windows tests so far, since compiling HDF5 is slowest there. Any reason not to do the same for the other groups?
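A rough sketch of how that grouping might look in azure-pipelines.yml (the image name, helper script, and tox environment list are illustrative assumptions, not the actual configuration in this PR):

```yaml
# Hypothetical sketch only: names and versions are illustrative.
jobs:
- job: Windows
  pool:
    vmImage: 'windows-2019'           # assumed image name
  steps:
  - script: python ci\build_hdf5.py   # assumed helper that compiles HDF5 once
    displayName: Build HDF5
  - script: pip install tox
    displayName: Install tox
  # One HDF5 build is shared by every Python environment in the group;
  # tox creates a virtualenv per entry and installs the Python deps itself.
  - script: tox -e py36,py37,py38
    displayName: Run tests via tox
```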
When I was going through the (Travis) tests, I noticed we were testing quite a few different HDF5 versions. Do we want to move to testing only the latest releases in the 1.8 and 1.10 series? Also, we currently support HDF5 back to 1.8.4 (which we don't test...), so do we want a plan for how far back we support, in the vein of NEP 29? We could probably also drop mindeps as a test series: by limiting our numpy support to the last three minor versions, there's less chance of a breakage there, unlike when we supposedly supported everything back to numpy 1.8 or something.

I'm happy to look at caching. I held off for the original PR because it's new to Azure (I think it's still in beta), and I wanted something that worked, which we can then iterate on.

With @takluyver's brainwave, that sounds excellent!
From what they told us, 1.10.x releases are more like 10.x would be under semantic versioning, so it probably makes sense to test on more than just the latest. The HDF Group says that no one should be using 1.10 before 1.10.3, so I think we can skip testing those... except that, guess what Ubuntu 18.04 and Debian stretch ship? 😞
It definitely makes sense to get something working before worrying about caching. Now that it's working, I think it's worth considering some kind of caching for the jobs that have to compile HDF5, because that's slow, and it should be the same work each time.
With these changes, the time is governed by the Windows builds, which take about 20 minutes. Hopefully the caching in #1357 will cut that down a lot.
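For reference, a minimal sketch of what such caching could look like with Azure's Cache task (the caching feature was still in beta around this time; the cache key and install path below are assumptions, not what #1357 actually does):

```yaml
# Hypothetical sketch: key/path layout assumed, not taken from the PR.
steps:
- task: Cache@2
  inputs:
    key: 'hdf5 | "$(HDF5_VERSION)" | "$(Agent.OS)"'
    path: $(Pipeline.Workspace)/hdf5    # assumed install prefix for the build
    cacheHitVar: HDF5_CACHED
  displayName: Cache compiled HDF5
# Only rebuild when the cache missed; otherwise reuse the restored build.
- script: python ci/build_hdf5.py --prefix $(Pipeline.Workspace)/hdf5
  condition: ne(variables.HDF5_CACHED, 'true')
  displayName: Build HDF5 (cache miss)
```

On a cache hit the restored directory is reused and the slow compile step is skipped entirely, which is what should cut the Windows job time.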
Codecov Report
```diff
@@            Coverage Diff             @@
##           master    #1356      +/-   ##
==========================================
- Coverage   84.35%   82.87%    -1.48%
==========================================
  Files          18       18
  Lines        2096     2096
==========================================
- Hits         1768     1737      -31
- Misses        328      359      +31
```
Continue to review full report at Codecov.
I'll merge this in a day or two if there are no objections. It may not be perfect, but I think it's a good step towards making the Azure setup more practical.
It's certainly good to test on a range of Python & HDF5 versions, but I think getting timely feedback is more valuable than trying the same things in lots of different combinations, and I've been finding the Azure tests quite slow. Linux jobs in particular seem to be a limiting factor, so I hope trimming a few will speed things up.
cc @aragilar