TST: Windows CI Testing using AppVeyor #1057
Conversation
I plan on rebasing this before merging, once it is decided what exact setup we want to use. If preferred, I can make a new branch with just the single commit. It is possible to test on Python 2.6, 2.7, 3.3 and 3.4, but whenever I have more than three builds in the matrix, the fourth build fails. As it stands, Python 2.6, 2.7 and 3.4 are being tested. Python and the dependencies are provided using conda from the Anaconda repository. Scikit-learn uses the Python.org Python and wheels derived from Christoph Gohlke's distribution; see their pull request for details. Their Rackspace-hosted wheelhouse only provides NumPy and SciPy, which is not sufficient for scikit-image, but this method could be followed for the other dependencies if desired. All builds are 32-bit. 64-bit builds should be possible using the same method as sklearn (the Python.org installer), but again wheels are needed for the dependencies. Wheel and exe files are created and can be downloaded as build artifacts, so users could download these and try out bleeding-edge skimage without a compiler.
FYI I generated the [...]
@jjhelmus This is looking great. Is there any way to get AppVeyor to report back to GitHub like Travis-CI does? Am I correct in assuming that the wheels that are built are only usable with Conda's Python, and not Python.org's version? @ogrisel Thank you for that offer. @matthew-brett kindly set up an OSX wheel upload for us using his credentials, but for this and other tasks it would be good to have one for skimage itself.
If you use MSVC++ compilers I think they should be binary compatible with Python from python.org but that should be checked.
I'll send you and @jjhelmus the API key by private message.
@stefanv AppVeyor will report back on the status with an orange/green/red ball on the commits when the webhook is enabled for the repository; see the UI on the pull request in my repo. The issue is that when both AppVeyor and Travis CI are enabled, the first service to set the status seems to win. GitHub exposes a combined status in their API, but from the discussion on the topic in the AppVeyor forum it does not sound like this is mapped to the GitHub UI. I'll be doing some compatibility testing today, checking these wheels against different Python builds.
FYI, it looks like the GitHub status is set to the LAST service that reports, not the first.
My testing verifies that the exe and binary wheels produced by AppVeyor can be installed and the unit tests pass on 32-bit Python 2.7 builds provided by Python.org, Enthought, and Miniconda.
That's excellent! Along with the API key from Olivier, we have ready-made [...]. I will be traveling for a few days, so I apologise for the slow response.
@stefanv The issue with running on a 64-bit Python is that the VM only has a 32-bit Python installed, from which a 32-bit conda environment can be bootstrapped but not a 64-bit one. sklearn downloads and installs the 64-bit Python.org interpreter, which works well. It should be possible to download and install a 32/64-bit version of Miniconda and bootstrap from that, ignoring the VM-provided Python entirely. I'll test it out and report back. Is there a preference on which Python to use for the CI / builds? The two choices are Python.org's python with wheels from Christoph Gohlke, or Miniconda with conda packages from Continuum. I'm more familiar with using the conda packages, but I can grok the wheels solution too. The other question is which builds to include in the matrix. Python 2.6, 2.7, 3.3, and 3.4 are all possible, along with 32/64-bit for all four versions. Assuming we can only test three before timing out, I would propose:
sklearn is using those and a 3.4/32-bit build in their matrix.
Apparently the AppVeyor default VM now has 64-bit Python instead of 32-bit by default. I have updated the sklearn build scripts in my branch, which is going to be merged soon.
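A minimal appveyor.yml sketch of the Miniconda bootstrap approach discussed above, downloading an installer rather than relying on the VM's pre-installed Python. The installer URL, install paths, and package list here are illustrative assumptions, not the actual configuration used:

```yaml
environment:
  matrix:
    - PYTHON_VERSION: "2.7"
      PYTHON_ARCH: "32"
    - PYTHON_VERSION: "3.4"
      PYTHON_ARCH: "64"

install:
  # Hypothetical installer URL; pick the 32- or 64-bit build as needed.
  - ps: Start-FileDownload "http://repo.continuum.io/miniconda/Miniconda-latest-Windows-x86_64.exe" C:\miniconda.exe
  # Silent install to a known path, then put conda on PATH.
  - "C:\\miniconda.exe /S /D=C:\\Py"
  - "SET PATH=C:\\Py;C:\\Py\\Scripts;%PATH%"
  # Create a fresh environment with the compiled dependencies.
  - "conda create -q -y -n testenv python=%PYTHON_VERSION% numpy scipy cython"
  - "activate testenv"
```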
Even building only two, e.g. 2.7/32 and 3.4/64, would highlight most problems. For building the actual release wheels we should perhaps do something more comprehensive.
As for the wheels vs Conda etc., I have no strong preference, mainly because [...]
Testing is now performed on Python 2.7 and 3.4, both 32- and 64-bit, with .exe and .whl files being downloadable after the build. Currently Miniconda is used to install Python, since the Python.org binaries were causing issues with AppVeyor, although @ogrisel might have solved this. NumPy, SciPy and Cython wheel files are installed from the sklearn Rackspace account; pillow, six, networkx and wheel are installed from wheel files on PyPI; and finally nose and decorator are built from source (wheel files could be created and uploaded to Rackspace, but the build time is minimal). Two tests are failing on all builds, those that require matplotlib; pull request #1060 should fix these. Links to the individual jobs: Running all four jobs takes over an hour; I think dropping down to two (32-bit Python 2.7 and 64-bit Python 3.4) should give a good balance between test coverage and run time.
Yeah, the architecture of the Python pre-installed on the base image changed. I fixed my install script to make it more robust:
@jjhelmus I'm sorry, we should have merged this long ago. @blink1073 Can you have a look over this PR in light of the recent refactorings you did for Travis-CI?
Will do, I've been meaning to learn appveyor. |
- "python -c \"import struct; print(struct.calcsize('P') * 8)\""

# Install the build and runtime dependencies of the project.
- "%CMD_IN_ENV% pip install -v -r tools/appveyor/requirements.txt"
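The check quoted above prints the interpreter's pointer width; run standalone, it distinguishes 32-bit from 64-bit Python builds:

```python
# struct.calcsize('P') is the size of a C pointer in bytes;
# multiplying by 8 gives the interpreter's word size in bits.
import struct

bits = struct.calcsize('P') * 8
print(bits)  # 32 on a 32-bit Python build, 64 on a 64-bit build
```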
I'd suggest `pip install wheel`, then `pip install -v -r requirements.txt --no-index --find-links=...`. That way we are only maintaining a single requirements file.
Actually, that won't work, because not all of the requirements are available in the wheelhouse. How about `pip install nose wheel`, then `pip install -v numpy scipy cython --find-links=...`, then `pip install -v -r requirements.txt`. That way we are explicit while staying flexible about the other requirements.
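The install sequence suggested above, sketched as an appveyor.yml fragment; `WHEELHOUSE_URL` stands in for the actual Rackspace wheelhouse address, which is an assumption here:

```yaml
install:
  # Tools needed before anything else.
  - "pip install nose wheel"
  # Heavy compiled dependencies come from pre-built wheels only.
  - "pip install -v numpy scipy cython --find-links=%WHEELHOUSE_URL%"
  # Everything else resolves normally from PyPI.
  - "pip install -v -r requirements.txt"
```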
@jjhelmus, are you available to make the changes I suggested? If not, I can "fork" your PR and finish it up.
@blink1073 I'll look into making the changes in the next few days, the weekend at the latest. Given the number of changes to scikit-image since this was submitted, would it make sense to rebase, or to merge master back into this branch? Or start a new branch?
A rebase is probably the best place to start.
Ah no, don't worry about me if you prefer to have all the wheels in the same container; I can live with reading a few more lines of directory listing.
+1 for having all stable wheels in a specific location for all the platforms (Linux, OSX and hopefully soon Windows, even for numpy / scipy with OpenBLAS and mingw-w64). For sklearn we currently have the following container for our Windows wheels: http://windows-wheels.scikit-learn.org/ (it's on our Rackspace account). It is populated from our master and release branches via appveyor.yml using https://github.com/ogrisel/wheelhouse-uploader (which, among other things, computes sha256 digests and puts them in the URLs of recently uploaded artifacts so that pip can detect corrupted downloads). I will try to work on wheelhouse-uploader to make it possible to upload dev artifacts and release artifacts (from a specific tag) to different containers.
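A sketch of what the upload step described above could look like in an appveyor.yml; the container name, artifact folder, and the assumption that credentials come from encrypted environment variables are illustrative, not the actual scikit-learn configuration (check the wheelhouse-uploader README for the exact interface):

```yaml
artifacts:
  # Expose the built wheels as AppVeyor build artifacts.
  - path: dist\*

on_success:
  - "pip install wheelhouse-uploader"
  # Hypothetical container name; credentials assumed to come from
  # encrypted environment variables set in the AppVeyor project.
  - "python -m wheelhouse_uploader upload --local-folder=dist/ windows-wheels"
```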
@matthew-brett Here you go! https://gist.github.com/stefanv/585fc1bb25b1ec30dfab (usage examples are included in the gist).
BTW it would be great to host the wheels under a more generic name than just [...]
Yes, thank you @ogrisel, that is exactly what I was advocating for.
Enthought holds those rights; I'll ping their sysadmin. To which IP should we point wheels.scipy.org?
+1 in favor of having a single place for all types of wheels, be they Linux, OS X or Windows. We might want to make a distinction between the official, blessed wheels created by the projects themselves and those which developers have crafted from various sources for CI testing. In my opinion the latter should not be made highly visible to users, as they should be obtaining packages from the official source (PyPI, etc.), not an unofficial wheelhouse. From the listing so far I take it we have:
Also, we should decide whether we want to keep wheel files available at these sites if the same files can be found on PyPI. For example, both Pillow and matplotlib have Windows and OS X wheels available on PyPI for most versions of Python.
Chip Parker at Enthought agreed to help us register the new entry; I just need an IP.
In my case I use the wheels generated by AppVeyor (uploaded to http://windows-wheels.scikit-learn.org/) and by Travis (for OSX, using https://travis-ci.org/MacPython/scikit-learn-wheels, uploaded to http://wheels.scikit-image.org/) to semi-automate the release of the artifacts I upload to PyPI. I would therefore not advertise those containers too much to end users. But it's a great tool for maintainers making a release, or for developers to quickly reproduce a bug on a specific combination of library version and platform without having to recompile everything (especially for scipy).
We could have a CNAME entry pointing to: a365fff413fe338398b6-1c8a9b3114517dc5fe17b7c3f8c63a43.r19.cf2.rackcdn.com This is the CNAME target of the current http://wheels.scikit-image.org. This container has OSX wheels for most of the scipy stack (thanks to @matthew-brett) and some Linux wheels (for CI only, I assume).
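In zone-file terms, the proposal amounts to a single record along these lines (the record syntax is shown for illustration only):

```
wheels.scipy.org.  IN  CNAME  a365fff413fe338398b6-1c8a9b3114517dc5fe17b7c3f8c63a43.r19.cf2.rackcdn.com.
```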
I'll copy the Windows wheel files over to the container behind http://wheels.scikit-image.org once I get home this evening, so that all the files for CI testing are in a single place.
You mean those from 28daf...rackcdn.com? How were those wheels built? I am concerned about the numpy-MKL wheels: we don't have an MKL license that allows MKL redistribution, AFAIK. Out of curiosity, which version of BLAS/LAPACK is used by the scipy wheel? Does it depend on numpy-MKL? Carl Kleffner (@carlkl) is making great progress on building Windows wheels for numpy / scipy using OpenBLAS and mingw-w64 (gcc 4.9 for Windows, without any runtime dependencies and binary compatible with MSVC-built Python): https://bitbucket.org/carlkl/mingw-w64-for-python/downloads I think this is a good long-term solution for building Windows wheels for numpy / scipy without any licensing issues. However, we should wait for the numpy & scipy developers to make this way of shipping numpy & scipy binaries official before putting them under a container that will appear as http://wheels.scipy.org. There are still a bunch of tests to fix and patches to apply to the two upstream projects, though. The scikit-learn Windows wheels container is a mix of officially released wheels and in-dev branches from CI. The latter should not be advertised. The stable wheels (without the -git version suffix) are already published on PyPI.
http://wheels.scipy.org should be up
Thanks! Maybe you should announce it on the numpy & scipy mailing lists. At this stage I think it's important to state that this is mostly experimental and mainly useful for project maintainers dealing with CI and releases. End users should keep on using the wheels officially published on PyPI (or use alternative binary installers).
Annoyingly, most of the packages do not publish Linux wheels. Do you know why that is?
Linux wheels are not allowed on PyPI, as there are some nasty ABI issues which are still being investigated.
Also, if your wheel links against a dynamic library (from a system package, for instance), you need to have it installed with the same version, which may or may not be the case depending on the distribution and its version. There is some discussion on the distutils mailing list about allowing custom, distro-specific platform tags, but there is no consensus yet.
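For context, the platform tag pip derives can be inspected via the standard library; note that on Linux it records only the architecture, not the glibc or distro version, which is exactly the ambiguity described above:

```python
# sysconfig.get_platform() yields e.g. 'linux-x86_64' or 'win-amd64';
# it carries no information about glibc version or distribution.
import sysconfig

print(sysconfig.get_platform())
```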
If it is for developers only, perhaps the name should be different? Leave PyPI for "official" releases and use something like [...]
How does conda get around this problem? |
They ship a whole standalone distro and have no dependency on the system. I wonder if they embed their own copy of libc.
I think this is fine as long as it's not advertised as the default way to install those packages on the scipy download page.
As @ogrisel said, conda ships all library dependencies with its packages and links the .so files against them. The one exception is libc (GLIBC): they do not distribute that, but rather build the packages on a system with an old version of the library (CentOS 5, I believe). Since GLIBC is backwards compatible, these work on most Linux systems; something really old like Ubuntu 10.04 would have issues. When building conda packages on Linux, the build system uses patchelf to point the compiled libraries and binaries at the conda-provided libraries to make them more portable. That said, many of the Linux conda packages on binstar will not work on the majority of Linux systems, either due to missing conda-provided libraries or because the package was built on a system with a newer version of GLIBC than the system it is being installed on.
I wonder if it would be possible to extend @matthew-brett's delocate to make it work for Linux wheels as well. I am no expert in dynamic linking, though.
No fundamental reason why not, as far as I know.
If that's the case, then building the Linux wheels on an oldish glibc like Travis CI's Ubuntu Precise and then "delocating" libopenblas.so and the other runtime dependencies might do the trick to get wheels that run on most Linux versions.
Thanks, @matthew-brett!
Continuous Integration on Windows using AppVeyor.