
Drop and combine some testing combinations on Azure #1356

Merged: 5 commits into h5py:master on Oct 2, 2019

Conversation

takluyver (Member)

It's certainly good to test on a range of Python & HDF5 versions, but I think getting timely feedback is more valuable than trying the same things in lots of different combinations, and I've been finding the Azure tests quite slow. Linux jobs in particular seem to be a limiting factor, so I hope trimming a few will speed things up.

cc @aragilar

@takluyver (Member Author)

Brainwave: where test jobs differ only in Python dependencies, we can group them together and let tox do its thing with environments. So system-level dependencies (like HDF5) only need to be installed once per group.

I've done this for the Windows tests so far, as compiling HDF5 is slowest there. Any reasons not to do the same for other groups?
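The grouping idea can be sketched as an Azure Pipelines fragment. Everything here is illustrative: the job name, the build script path, and the tox environment names are assumptions, not h5py's actual configuration.

```yaml
# Illustrative sketch only: one job installs the system-level dependency
# (HDF5) once, then tox fans out over the Python environments inside it.
jobs:
- job: Windows_py36_py37
  pool:
    vmImage: 'vs2017-win2016'
  steps:
  # Slow system-level step, now run once per group instead of once per job
  - script: ci\build_hdf5.bat
    displayName: Build HDF5
  # tox manages the per-Python virtualenvs and dependency sets itself
  - script: |
      pip install tox
      tox -e py36-test,py37-test
    displayName: Run tests for each Python version
```

The trade-off is that the Python versions in a group run sequentially rather than in parallel jobs, which is why it pays off most where the shared setup (compiling HDF5) dominates the runtime.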

@aragilar (Member)

When I was going through the (Travis) tests, I noticed we were testing quite a few different HDF5 versions. Do we want to move to testing only the latest releases of the 1.8 and 1.10 HDF5 series? Also, we currently claim support for HDF5 back to 1.8.4 (which we don't test...); do we want a plan for how far back we support, in the vein of NEP 29? We could probably also drop mindeps as a test series: by limiting our numpy support to the last three minor versions, there's less chance of a breakage there than when we supposedly supported back to 1.8 or so. I'm happy to look at caching; I held off in the original PR because it's new to Azure (I think it's still in beta) and I wanted something that worked, which we can then iterate on.

With @takluyver's brainwave, that sounds excellent!

@takluyver (Member Author)

From what they told us, 1.10.x releases are more like 10.x releases would be under semantic versioning, so it probably makes sense to test more than just the latest. The HDF Group says that no-one should be using 1.10 releases before 1.10.3, so I think we can skip testing those... except that guess what Ubuntu 18.04 and Debian stretch ship? 😞

  • 24 months (like NEP 29 suggests for numpy) would include 1.8.20 and 1.10.2. We'd drop the 1.8 series entirely mid-2020, assuming there are no more releases.
  • 42 months (as it suggests for Python) would include 1.8.17 and all 1.10 releases. We'd drop the 1.8 series at the end of 2021.

It definitely makes sense to get something working first before worrying about caching. Now that it's working, I think it's worth considering some kind of caching for the jobs that have to compile HDF5, because that's slow, and it should be the same work each time.
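One way to cache the compiled HDF5 with Azure Pipelines is its cache task (still in preview around this time). This is a hypothetical sketch: the cache key, path, variable names, and build script are all assumptions, not what #1357 actually does.

```yaml
# Hypothetical sketch of caching a compiled HDF5 between pipeline runs.
steps:
- task: Cache@2
  inputs:
    # Cache key: invalidated when the HDF5 version or OS image changes
    key: 'hdf5 | "$(HDF5_VERSION)" | "$(Agent.OS)"'
    path: $(Pipeline.Workspace)/hdf5
  displayName: Cache compiled HDF5
# The build script would check for an existing install at the cached
# path and skip the (slow) compile when the cache was restored.
- script: ci/build_hdf5.sh "$(Pipeline.Workspace)/hdf5"
  displayName: Build HDF5 if not cached
```

Keying on the HDF5 version and OS means the same compiled tree is reused across runs and only rebuilt when one of those inputs changes, which matches the observation that it should be the same work each time.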

@takluyver takluyver changed the title Drop a few testing combinations on Azure Drop and combine some testing combinations on Azure Sep 27, 2019
@takluyver (Member Author)

With these changes, the time is governed by the Windows builds, which take about 20 minutes. Hopefully the caching in #1357 will cut that down a lot.

@codecov (bot)

codecov bot commented Sep 27, 2019

Codecov Report

Merging #1356 into master will decrease coverage by 1.47%.
The diff coverage is n/a.


@@            Coverage Diff             @@
##           master    #1356      +/-   ##
==========================================
- Coverage   84.35%   82.87%   -1.48%     
==========================================
  Files          18       18              
  Lines        2096     2096              
==========================================
- Hits         1768     1737      -31     
- Misses        328      359      +31
Impacted Files          Coverage   Δ
h5py/_hl/filters.py     80.45%     <0%> (-8.63%) ⬇️
h5py/_hl/files.py       91.03%     <0%> (-4.49%) ⬇️
h5py/_hl/dataset.py     86.37%     <0%> (-1.32%) ⬇️


Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Last update 752f768...69088df.

@takluyver (Member Author)

I'll merge this in a day or two if there are no objections. It may not be perfect, but I think it's a good step to make the Azure setup more practical.

@takluyver takluyver merged commit d4264bc into h5py:master Oct 2, 2019
@takluyver takluyver deleted the sparsify-azure-jobs branch October 2, 2019 07:39
@takluyver takluyver added this to the 3.0 milestone May 2, 2020