
TST: Windows CI Testing using AppVeyor #1057

Closed
wants to merge 44 commits

Conversation

jjhelmus
Contributor

Continuous Integration on Windows using AppVeyor.

@coveralls

Coverage Status

Coverage remained the same when pulling 68569ea on jjhelmus:appveyor into 1c27a97 on scikit-image:master.

@jjhelmus
Contributor Author

I plan on rebasing this before merging, once it is decided what exact setup we want to use. If it is preferred I can make a new branch with just the single commit.

It is possible to test on Python 2.6, 2.7, 3.3 and 3.4, but whenever I have more than three builds in the matrix the fourth build fails. As it stands, Python 2.6, 2.7 and 3.4 are being tested.

Python and the dependencies are provided using conda from the Anaconda repository. Scikit-learn is using the Python.org Python and wheels derived from Christoph Gohlke's distribution; see their pull request for details. Their Rackspace-hosted wheelhouse only provides NumPy and SciPy, which is not sufficient for scikit-image, but that method could be extended to the other dependencies if desired.

All builds are 32-bit. 64-bit builds should be possible using the same method as sklearn (the Python.org interpreter), but again wheels are needed for the dependencies.

Wheel and exe files are created and can be downloaded as build artifacts. Users could download these and try out bleeding-edge skimage without a compiler.

Example build
Example artifacts

@ogrisel

ogrisel commented Jul 11, 2014

FYI, I generated the .whl packages for numpy and scipy by calling wheel convert [package_name].exe on the installers provided by Christoph Gohlke. This is meant as a temporary solution while waiting for numpy and scipy to provide their own official whl packages directly on PyPI. If you need one, I can create a Rackspace Cloud Files API key for the scikit-image maintainers.
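
For illustration, a minimal sketch of automating the same conversion for a directory of installers (the filenames and layout are assumptions, and the wheel package must be installed):

import glob
import subprocess

# Convert every bdist_wininst .exe installer in the current directory into a
# .whl file using the "wheel convert" command described above.
for installer in sorted(glob.glob("*.exe")):
    subprocess.check_call(["wheel", "convert", installer])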

@stefanv
Member

stefanv commented Jul 11, 2014

@jjhelmus This is looking great. Is there any way to get AppVeyor to report back to GitHub like Travis-CI does? Am I correct in assuming that the wheels that are built are only usable with Conda's Python, and not Python.org's version?

@ogrisel Thank you for that offer. @matthew-brett kindly set up an OSX wheel upload for us using his credentials, but for this and other tasks it would be good to have one for skimage itself.

@ogrisel

ogrisel commented Jul 11, 2014

Am I correct in assuming that the wheels that are built are only usable with Conda's Python, and not Python.org's version?

If you use MSVC++ compilers I think they should be binary compatible with Python from python.org but that should be checked.

@ogrisel Thank you for that offer. @matthew-brett kindly set up an OSX wheel upload for us using his credentials, but for this and other tasks it would be good to have one for skimage itself.

I'll send you and @jjhelmus the API key by private message.

@coveralls

Coverage Status

Coverage decreased (-0.02%) when pulling 6402315 on jjhelmus:appveyor into 1c27a97 on scikit-image:master.

@jjhelmus
Contributor Author

@stefanv AppVeyor will report the status back with an orange/green/red ball on the commits when the webhook is enabled for the repository; see the UI on the pull request in my repo. The issue is that when both AppVeyor and Travis CI are enabled, the first service to set the status seems to win. GitHub exposes a combined status in their API, but from the discussion on the topic in the AppVeyor forum it does not sound like this is mapped to the GitHub UI.

I'll be doing some compatibility testing today to check these wheels against different Python builds.

@jjhelmus
Contributor Author

FYI, it looks like the GitHub status is set by the LAST service to report, not the first.

@jjhelmus
Contributor Author

My testing verifies that the exe and binary wheels produced by AppVeyor can be installed and the unit tests pass on 32-bit Python 2.7 builds provided by Python.org, Enthought, and Miniconda.

@stefanv
Member

stefanv commented Jul 12, 2014

That's excellent -- along with the API key from Olivier we have ready-made wheels to upload on release! What are the issues with the 64-bit version?

[I will be traveling for a few days, so I apologise in advance for the slow response.]

@jjhelmus
Contributor Author

@stefanv The issue with running on a 64-bit Python is that the VM only has a 32-bit Python installed, from which a 32-bit conda environment can be bootstrapped but not a 64-bit one. sklearn is downloading and installing the 64-bit Python.org interpreter, which works well. It should be possible to download and install a 32- or 64-bit version of Miniconda and bootstrap from that, ignoring the VM-provided Python entirely. I'll test it out and report back.
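
For reference, a quick way to confirm which architecture a given interpreter is, using the same pointer-size check that appears in the appveyor.yml snippet reviewed further down:

import struct
import sys

# struct.calcsize('P') is the size of a pointer in bytes, so multiplying by 8
# gives the interpreter's architecture in bits, e.g. "32-bit Python 2.7.8".
print("%d-bit Python %s" % (struct.calcsize("P") * 8, sys.version.split()[0]))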

Is there a preference on which Python to use for the CI builds? The two choices are Python.org's Python with wheels from Christoph Gohlke, or Miniconda with conda packages from Continuum. I'm more familiar with using the conda packages, but I can grok the wheels solution too.

The other question is which builds to include in the matrix. Python 2.6, 2.7, 3.3, and 3.4 are all possible, along with 32- and 64-bit variants of all four versions. Assuming we can only test three before timing out, I would propose:

  • Python 2.7 32-bit
  • Python 2.7 64-bit
  • Python 3.4 64-bit

sklearn is using those and a 3.4/32-bit build in their matrix.

@ogrisel

ogrisel commented Jul 14, 2014

Apparently the default AppVeyor VM now has 64-bit Python instead of 32-bit by default. I have updated the sklearn build scripts in my branch, which is going to be merged soon.

@stefanv
Member

stefanv commented Jul 16, 2014

Even building only two, e.g. 2.7/32 and 3.4/64, would highlight most problems, I think.

For building the actual release wheels we should perhaps do something similar to what Matthew did for OSX (https://github.com/scikit-image/scikit-image-wheels). See also https://github.com/scikit-image/scikit-image/pull/1046/files

@stefanv
Member

stefanv commented Jul 16, 2014

As for the wheels vs Conda etc. I have no strong preference, mainly because I am not well informed as to the pros and cons of either. It is important that we build wheels that work with Python.org Python, but that's the only firm requirement as far as I am concerned.

@coveralls

Coverage Status

Coverage remained the same when pulling 534c232 on jjhelmus:appveyor into 6aa0b9e on scikit-image:master.


@coveralls

Coverage Status

Coverage increased (+0.0%) when pulling 0c9f904 on jjhelmus:appveyor into 6aa0b9e on scikit-image:master.

@jjhelmus
Contributor Author

Testing is now performed on Python 2.7 and 3.4, both 32- and 64-bit, with .exe and .whl files downloadable after the build. Currently using Miniconda to install Python, since the Python.org binaries were causing issues with AppVeyor, although @ogrisel might have solved this. NumPy, SciPy and Cython wheel files are installed from the sklearn Rackspace account; pillow, six, networkx and wheel are installed from wheel files on PyPI; and finally nose and decorator are built from source (wheel files could be created and uploaded to Rackspace, but the build time is minimal).

Two tests are failing on all builds, the ones that require matplotlib. Pull request #1060 should fix these.

Links to the individual jobs:

Running all four jobs takes over an hour; I think dropping down to two (32-bit Python 2.7 and 64-bit Python 3.4) would give a good balance between test coverage and run time.

@ogrisel

ogrisel commented Jul 23, 2014

Yeah, the architecture of the Python pre-installed on the base image changed. I fixed my install script to make it more robust:

https://github.com/ogrisel/python-appveyor-demo

@stefanv
Member

stefanv commented Oct 18, 2014

@jjhelmus I'm sorry, we should have merged this long ago. @blink1073 Can you have a look over this PR in light of the recent refactorings you did for Travis-CI?

@blink1073
Member

Will do; I've been meaning to learn AppVeyor.

- "python -c \"import struct; print(struct.calcsize('P') * 8)\""

# Install the build and runtime dependencies of the project.
- "%CMD_IN_ENV% pip install -v -r tools/appveyor/requirements.txt"
Member

I'd suggest pip install wheel, then pip install -v -r requirements.txt --no-index --find-links=.... That way we are only maintaining a single requirements file.

Member

Actually, that won't work, because not all of the requirements are available in the wheelhouse. How about pip install nose wheel, then pip install -v numpy scipy cython --find-links=..., then pip install -v -r requirements.txt? That way we are explicit while staying flexible about the other requirements.

@blink1073
Member

@jjhelmus, are you available to make the changes I suggested? If not, I can "fork" your PR and finish it up.

@jjhelmus
Contributor Author

jjhelmus commented Nov 5, 2014

@blink1073 I'll look into making the changes in the next few days, the weekend at the latest. Given the number of changes to scikit-image since this was submitted, would it make sense to rebase or merge back into the master branch? Or start a new branch?

@stefanv
Member

stefanv commented Nov 5, 2014

A rebase is probably the best place to start.

@matthew-brett
Contributor

Ah no - don't worry about me if you prefer to have all the wheels in the same container - I can live with reading a few more lines of directory listing.

@ogrisel

ogrisel commented Nov 28, 2014

+1 for having all stable wheels in a specific location for all the platforms (Linux, OSX and hopefully soon Windows, even for numpy / scipy with openblas and mingw-w64).

For sklearn we currently have the following container for our windows wheels:

http://windows-wheels.scikit-learn.org/ (it's on our rackspace account)

It is populated from our master and release branches via appveyor.yml using https://github.com/ogrisel/wheelhouse-uploader (which, among other things, computes sha256 digests and puts them in the URLs of recently uploaded artifacts so that pip can detect corrupted downloads).

I will try to work on wheelhouse-uploader to make it possible to upload dev artifacts and release artifacts (from a specific tag) to different containers.
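
As a rough sketch of the digest-in-URL idea (this is not wheelhouse-uploader's actual code, and the wheel filename and base URL below are purely illustrative), pip can verify a download against a #sha256= fragment appended to the link:

import hashlib

def sha256_fragment(path):
    """Return a pip-style '#sha256=<digest>' fragment for a local wheel file."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return "#sha256=" + digest.hexdigest()

# Illustrative only: append the fragment to the artifact's URL so that pip can
# detect a corrupted download when installing from the wheelhouse.
wheel = "scikit_learn-0.15.2-cp27-none-win32.whl"
print("http://windows-wheels.scikit-learn.org/" + wheel + sha256_fragment(wheel))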

@stefanv
Member

stefanv commented Nov 28, 2014

@matthew-brett Here you go!

https://gist.github.com/stefanv/585fc1bb25b1ec30dfab

Usage:

python3 wheels.py Cython

or

python3 wheels.py | grep Cython
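
For anyone without the gist handy, a rough sketch of what such a listing script might do (this is an assumption about the gist's behaviour; the index URL and the very simple parsing are illustrative):

import re
import sys
from urllib.request import urlopen

# Hypothetical reconstruction of a wheel-listing helper: fetch the wheelhouse
# index page and print the wheel filenames, optionally filtered by a package
# name given on the command line. The actual gist may work differently.
INDEX_URL = "http://wheels.scipy.org/"

def list_wheels(pattern=None):
    page = urlopen(INDEX_URL).read().decode("utf-8", "replace")
    for name in sorted(set(re.findall(r"[\w.+-]+\.whl", page))):
        if pattern is None or pattern.lower() in name.lower():
            print(name)

if __name__ == "__main__":
    list_wheels(sys.argv[1] if len(sys.argv) > 1 else None)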

@ogrisel

ogrisel commented Nov 29, 2014

BTW it would be great to host the wheels under a more generic name than just wheels.scikit-image.org. Maybe wheels.scipy.org? @stefanv do you know who has the admin rights for the scipy.org domain?

@blink1073
Member

Yes, thank you @ogrisel, that is exactly what I was advocating for.

@stefanv
Member

stefanv commented Dec 1, 2014

Enthought holds those rights, I'll ping their sysadmin. To which IP should we point wheels.scipy.org?

@jjhelmus
Contributor Author

jjhelmus commented Dec 1, 2014

+1 in favor of having a single place for all types of wheels, be they Linux, OS X or Windows. We might want to make a distinction between the official, blessed wheels created by the projects themselves and those which developers have crafted from various sources for CI testing. In my opinion the latter should not be made highly visible to users, as they should be obtaining packages from the official source (PyPI, etc.), not an unofficial wheelhouse.

From the listing so far I take it we have:

URL                                     | Contents                             | Rackspace Container
http://windows-wheels.scikit-learn.org/ | Official Windows scikit-learn wheels | sklearn-windows-wheels
http://wheels.scikit-image.org          | Unofficial Linux and OS X wheels     | wheels
http://travis-wheels.scikit-image.org   | Unofficial Linux wheels              | travis_wheels
28daf...rackcdn.com                     | Unofficial Windows wheels            | windows_packages

Also we should decide if we want to keep wheel files available at these sites if the same files can be found on PyPI. For example, both Pillow and matplotlib have Windows and OS X wheels available on PyPI for most versions of Python.

@stefanv
Member

stefanv commented Dec 1, 2014

Chip Parker at Enthought agreed to help us register the new entry--I just need an IP.

@ogrisel

ogrisel commented Dec 2, 2014

Also we should decide if we want to keep wheel files available at these sites if the same files can be found on PyPI.

In my case I use the wheels generated by AppVeyor (uploaded to http://windows-wheels.scikit-learn.org/) and by Travis for OSX (via https://travis-ci.org/MacPython/scikit-learn-wheels, uploaded to http://wheels.scikit-image.org/) to semi-automate the release of the artifacts I upload to PyPI.

I would therefore not advertise those containers too much (to end users). But they are a great tool for maintainers making a release, or for developers quickly reproducing a bug on a specific combination of library version and platform without having to recompile everything (especially scipy).

@ogrisel

ogrisel commented Dec 2, 2014

Chip Parker at Enthought agreed to help us register the new entry--I just need an IP.

We could have a CNAME entry pointing to:

a365fff413fe338398b6-1c8a9b3114517dc5fe17b7c3f8c63a43.r19.cf2.rackcdn.com

This is the CNAME target of the current http://wheels.scikit-image.org. This container has OSX wheels for most of the scipy stack (thanks to @matthew-brett) and some linux wheels (for CI only I assume).

@jjhelmus
Contributor Author

jjhelmus commented Dec 2, 2014

I'll copy the Windows wheel files over to the container behind http://wheels.scikit-image.org once I get home this evening, so that all the files for CI testing are in a single place.

@ogrisel

ogrisel commented Dec 2, 2014

I'll copy the Windows wheel files over to the container behind http://wheels.scikit-image.org once I get home this evening, so that all the files for CI testing are in a single place.

You mean those from 28daf...rackcdn.com?

How were those wheels built? I am concerned about the numpy-MKL wheels: we don't have an MKL license that allows redistribution of MKL, AFAIK.

Out of curiosity, which version of BLAS/LAPACK is used by the scipy wheel? Does it depend on numpy-MKL? Carl Kleffner (@carlkl) is making great progress on building Windows wheels for numpy / scipy using openblas and mingw-w64 (gcc 4.9 for Windows, with no runtime dependencies and binary compatibility with MSVC-built Python):

https://bitbucket.org/carlkl/mingw-w64-for-python/downloads

I think this is a good long-term solution for building Windows wheels for numpy / scipy without any licensing issues. However, we should wait for the numpy & scipy developers to make this way of shipping numpy & scipy binaries official before putting them under a container that will appear as http://wheels.scipy.org. There are still a bunch of tests to fix and patches to apply to the two upstream projects, though.

The scikit-learn windows wheels container is a mix of officially released wheels and in-dev branches from CI. The latter should not be advertised. The stable wheels (without the -git version suffix) are already published on PyPI.

@stefanv
Member

stefanv commented Dec 3, 2014

http://wheels.scipy.org should be up

@ogrisel

ogrisel commented Dec 3, 2014

Thanks!

Maybe you should announce it on the numpy & scipy mailing list. At this stage I think it's important to state that this is mostly experimental and mainly useful for project maintainers dealing with CI and releases.

End users should keep using the wheels officially published on PyPI (or use alternative binary installers).

@stefanv
Member

stefanv commented Dec 3, 2014

Annoyingly, most of the packages do not publish Linux wheels. Do you know why that is?

@jjhelmus
Contributor Author

jjhelmus commented Dec 3, 2014

Linux wheels are not allowed on PyPI, as there are some nasty ABI issues which are still being investigated.

@ogrisel

ogrisel commented Dec 3, 2014

Also, if your wheel links against a dynamic library (from a system package, for instance), you need to have the same version of it installed, which may or may not be the case depending on the distribution and its version.

There is some discussion on the distutils mailing list about allowing custom, distro-specific platform tags, but there is no consensus yet.

@blink1073
Member

If it is for developers only, perhaps the name should be different? Leave PyPI for "official" releases and use something like travis-wheels.scipy.org for us maintainers.

@stefanv
Member

stefanv commented Dec 4, 2014

How does conda get around this problem?

@ogrisel

ogrisel commented Dec 4, 2014

How does conda get around this problem?

They ship a whole standalone distro and have no dependency on the system. I wonder if they embed their own copy of the libc.

@ogrisel

ogrisel commented Dec 4, 2014

If it is for developers only, perhaps the name should be different? Leave PyPI for "official" releases and use something like travis-wheels.scipy.org for us maintainers.

I think this is fine as long as it's not advertised as the default way to install those packages on the scipy download page.

@jjhelmus
Contributor Author

jjhelmus commented Dec 4, 2014

As @ogrisel said, conda ships all library dependencies with the various packages and links the .so files against these. The one exception is libc (GLIBC): they do not distribute that, but rather build the packages on a system with an old version of the library (CentOS 5, I believe). Since GLIBC is backwards compatible, these work on most Linux systems; something really old like Ubuntu 10.04 would have issues. When building conda packages on Linux, the build system uses patchelf to point the compiled libraries and binaries at the conda-provided libraries to make them more portable. That said, many of the Linux conda packages on binstar will not work on the majority of Linux systems, either because a conda-provided library is missing or because the package was built on a system with a newer version of GLIBC than the system it is being installed on.
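
To make the failure mode concrete, here is a small illustrative helper (not part of conda or skimage) that shows how the dynamic loader resolves a compiled extension's dependencies on a given Linux system; "not found" lines correspond to the missing-library or too-new-GLIBC failures described above:

import subprocess
import sys

def linked_libraries(ext_module_path):
    """Print how the dynamic loader resolves each dependency of a compiled
    extension (.so). Lines reported as 'not found' are libraries the target
    system cannot provide. Linux-only, illustrative sketch."""
    output = subprocess.check_output(["ldd", ext_module_path])
    for line in output.decode().splitlines():
        print(line.strip())

if __name__ == "__main__":
    linked_libraries(sys.argv[1])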

@ogrisel

ogrisel commented Dec 5, 2014

I wonder if it would be possible to extend @matthew-brett's delocate to make it work for linux wheels as well. I am no expert in dynamic linking though.

@matthew-brett
Contributor

On Friday, December 5, 2014, Olivier Grisel wrote:

I wonder if it would be possible to extend @matthew-brett's delocate (https://github.com/matthew-brett/delocate) to make it work for linux wheels as well. I am no expert in dynamic linking though.

No fundamental reason why not as far as I know.

@ogrisel

ogrisel commented Dec 8, 2014

No fundamental reason why not as far as I know.

If that's the case, then building the Linux wheels on an oldish glibc like Travis CI's Ubuntu Precise and then "delocating" libopenblas.so and other runtime dependencies might do the trick to get wheels that run on most Linux versions.
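
As a very rough sketch of that "delocate for Linux" idea (a hypothetical helper, not delocate's actual API; handling of transitive dependencies and symbol versions is omitted):

import shutil
import subprocess
from pathlib import Path

def vendor_shared_lib(unpacked_wheel_dir, ext_module, shared_lib):
    """Copy a runtime dependency (e.g. libopenblas.so) into an unpacked wheel
    and point the extension module at it via a relative RPATH. Requires the
    patchelf tool; hypothetical sketch of the approach discussed above."""
    libs_dir = Path(unpacked_wheel_dir) / ".libs"
    libs_dir.mkdir(parents=True, exist_ok=True)
    shutil.copy2(shared_lib, libs_dir)  # bundle the .so inside the wheel tree
    # Make the loader look next to the extension module first ($ORIGIN/.libs).
    subprocess.check_call(["patchelf", "--set-rpath", "$ORIGIN/.libs",
                           str(ext_module)])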

@matthew-brett
Contributor

If that's the case, then building the Linux wheels on an oldish glibc like Travis CI's Ubuntu Precise and then "delocating" libopenblas.so and other runtime dependencies might do the trick to get wheels that run on most Linux versions.

I will scan delocate and read a little to see how much work that would be, and get back to you.

@stefanv
Member

stefanv commented Dec 9, 2014

Thanks, @matthew-brett!

@blink1073
Member

Closing in favor of #1320, thanks @jjhelmus!

@blink1073 closed this Jan 2, 2015