
Binary wheels for Linux/macOS/Windows #42

Open
letmaik opened this issue Dec 28, 2017 · 28 comments
Labels
enhancement New feature or request support Support request opened by outside user/collaborator

Comments


letmaik commented Dec 28, 2017

Currently you only provide binary packages via conda. To support pip install wrf-python more generally, I created a repository that builds such wheels using Travis CI/AppVeyor: https://github.com/letmaik/wrf-python-wheels. It uses https://github.com/matthew-brett/multibuild and follows the setup of https://github.com/MacPython/scipy-wheels, which had similar requirements, e.g. building Fortran code with numpy.distutils.

You can see the build logs for Linux and macOS here:
https://travis-ci.org/letmaik/wrf-python-wheels/builds/322376418
And for Windows including downloadable wheels (see artifacts tab in each job):
https://ci.appveyor.com/project/letmaik/wrf-python-wheels/build/1.0.5

Travis CI doesn't offer free artifact storage, but if you like the general idea, we could ask for permission to use the Rackspace container that scikit-learn uses, as suggested in the README of the multibuild repo:

# Contact Matthew Brett, or the
# scikit-learn team, for permission (and the API key) to upload to
# the Rackspace account used here, or use your own account.

The idea is that whenever you release a new version (or for dev builds), you would trigger a build in the wrf-python-wheels repository through a commit. This then builds and uploads wheels to some storage where you download the wheels, do some tests if you like, and then upload them to PyPI.
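The release flow described above can be sketched as a small helper. This is only an illustration, assuming the wheels have already been downloaded into a local directory and that twine is used for the PyPI upload; the directory layout and function name are hypothetical:

```python
from pathlib import Path


def build_upload_command(wheel_dir):
    """Collect downloaded wheels and assemble a `twine upload` invocation.

    `wheel_dir` is wherever the CI artifacts were downloaded to (layout is
    hypothetical); run the test suite against the wheels before actually
    executing the returned command.
    """
    wheels = sorted(str(p) for p in Path(wheel_dir).glob("*.whl"))
    if not wheels:
        raise FileNotFoundError(f"no wheels found in {wheel_dir}")
    return ["twine", "upload", *wheels]
```

Keeping the command assembly separate from execution makes it easy to inspect (or dry-run) exactly what would be uploaded.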

I'm happy to transfer the repo over to your GitHub org.

@bladwig1

Good stuff, I'm definitely interested! I'll look into this more after the holidays. It would be nice to have binary wheel support for the soon-to-be-released 1.1, which adds OpenMP support but makes the build process a little more complicated (it requires some Fortran preprocessing, and f2py needs to run after the preprocessing step, so I've been creating build scripts to simplify this).


letmaik commented Jan 20, 2018

Note to self: Update the numpy-distutils submodule in https://github.com/letmaik/wrf-python-wheels to point to the official numpy 1.14 release as the DLL library folder changed. See also #39 (comment)
EDIT: Done.


letmaik commented Mar 12, 2018

@bladwig1 Have you given this any more thought? Let me know if you're stuck on anything, happy to help out!

@bladwig1

@letmaik We're going to start making an effort on this soon.

@khallock Can you look into this?

@khallock

@bladwig1 I'll try to take a look at this soon, but I'm focused on the NCL 6.5.0 release at the moment.


letmaik commented Sep 28, 2018

Any news here?


bladwig1 commented Oct 4, 2018

Now that pypi.org allows collaborators, would you be willing to handle this for us?

Since you have most of it done, the only remaining change is to enable the OpenMP multicore support. You can look at the commands in the build_scripts directory for how to do this, or just call the scripts directly.


letmaik commented Oct 10, 2018

I gave it a try and there are still some build issues with OpenMP enabled:

  • macOS: I'm getting "Symbol not found: _GOMP_atomic_end" at package import
  • test_updraft_helicity fails with 45% mismatch on Windows and Linux, possibly macOS too but can't test yet. All other tests pass.
  • Windows: Need to bundle the OpenMP DLL

Other notes:

  • Due to how the manylinux1 setup (using multibuild) works, it wasn't possible to run f2py explicitly as a preprocessing step (numpy isn't installed at that build phase). To my understanding this isn't necessary anyway: I removed the pyf file from setup.py so that numpy.distutils re-generates it, which let me omit the explicit f2py call. Was there a reason for keeping the pyf file in the repo?
  • The idea is to only publish the OpenMP-enabled build to PyPI, does that make sense?

@bladwig1

macOS: I'm getting "Symbol not found: _GOMP_atomic_end" at package import

It can't find the libgomp dynamic library at runtime (the OpenMP runtime library). Ugh, this is going to be a problem on macOS. Previously the Clang compiler for macOS didn't support OpenMP at all (prior to Sierra?). For now, it's probably looking for libgomp wherever gfortran was installed, but that search path isn't a default one. This is going to create a problem: you could point to it with -rpath or DYLD_LIBRARY_PATH, but then every user would need the compiler libraries installed in that directory. Or we could try static linking against it, but I'm not sure whether that would require other dependencies at link time. Or drop support for older macOS, try to link against the Clang version, and hope for the best. Or try bundling the compiler libraries in a manner similar to Windows.

It might be easier to punt on this and use the non-OpenMP gnu_no_omp.sh for macOS for now (or just use the source code as it is by default, which has OpenMP turned off).
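As a quick runtime diagnostic for the missing-libgomp problem, one can ask the dynamic loader which OpenMP runtime (if any) it can resolve. A minimal sketch; 'gomp' is the GNU runtime discussed above, while 'omp' (LLVM) and 'iomp5' (Intel) are extra candidates I'm adding for completeness, not names from this thread:

```python
import ctypes.util


def find_openmp_runtime():
    """Return the first OpenMP runtime library the dynamic loader can
    resolve, or None if none is found."""
    for name in ("gomp", "omp", "iomp5"):
        found = ctypes.util.find_library(name)
        if found:
            return found
    return None
```

If this returns None on a user's machine, an OpenMP-linked extension module will fail to import with exactly the kind of "Symbol not found" error reported above.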

test_updraft_helicity fails with 45% mismatch on Windows and Linux, possibly macOS too but can't test yet. All other tests pass.

I suspect you're building from the 'develop' branch instead of 'master'. I'm currently in the process of fixing several computational bugs that came in, but haven't updated the CI tests yet. There should be a new release coming out in the next week or two, and then you can build from master, since it has the requirements.txt update in there.

Windows: Need to bundle the OpenMP DLL

When built in a conda environment, a directory 'wrf-python/.libs' is automatically created with
libwrf_cons.libwrf_cons.BYEX7ZCX7HSM7VGRMGCD4WTB5X3CO3SQ.gfortran-win_amd64.dll
inside it. Is the behavior different outside of conda? I thought it was numpy distutils that bundled that library, but is numpy.distutils just creating the directory with the library but not including it in a wheel? What needs to change to make this work?
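Whether the .libs directory actually made it into a wheel can be checked without installing anything, since a wheel is just a zip archive. A minimal sketch (the function name is mine):

```python
import zipfile


def bundled_libs(wheel_path):
    """List entries under any '.libs' directory inside a wheel.

    A wheel is a plain zip archive, so this works without installing the
    package; an empty result means no runtime DLLs were bundled.
    """
    with zipfile.ZipFile(wheel_path) as wf:
        return [name for name in wf.namelist() if "/.libs/" in name]
```

Running this against the AppVeyor artifact would settle whether numpy.distutils only created the directory on disk or also packaged it.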

Due to how the manylinux1 setup (using multibuild) works, it wasn't possible to run f2py explicitly as a preprocessing step (numpy isn't installed at that build phase). To my understanding this isn't necessary anyway: I removed the pyf file from setup.py so that numpy.distutils re-generates it, which let me omit the explicit f2py call. Was there a reason for keeping the pyf file in the repo?

In the past, the f2py distutils examples used it, so I assumed it wouldn't be generated as part of setup. This particular pyf lines up with the omp.f90 file that is generated with OpenMP turned off, so that people wouldn't have to do any preprocessing if they just wanted to build the non-OpenMP version. I haven't done any manual edits to the pyf file, so it can be removed (will remove in next release, and update the build scripts).
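For reference, dropping the .pyf from the source list is all that's needed: when an Extension lists only Fortran sources, numpy.distutils runs f2py over them and generates the interface during the build. A hypothetical sketch (module and file names are illustrative, not wrf-python's actual layout):

```python
# Hypothetical numpy.distutils setup sketch: listing only the Fortran
# sources (no .pyf file) makes f2py generate the interface at build time.
from numpy.distutils.core import setup, Extension

ext = Extension(
    name="wrf._wrffortran",              # module name is illustrative
    sources=["fortran/wrffortran.f90"],  # .pyf omitted on purpose
)

setup(name="wrf-python", ext_modules=[ext])
```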

The idea is to only publish the OpenMP-enabled build to PyPI, does that make sense?

That's ideal, but we might not be able to do this on MacOS at this time.


letmaik commented Oct 12, 2018

macOS: I'm getting "Symbol not found: _GOMP_atomic_end" at package import

It can't find the libgomp dynamic library at runtime (the OpenMP runtime library). [...] Or, try bundling the compiler libraries in a manner similar to Windows.

It's a weird one; normally the multibuild tool would take care of embedding any required dependencies. I need to take a look on a macOS system, which may take a while. It's definitely supposed to work, though.

test_updraft_helicity fails with 45% mismatch on Windows and Linux, possibly macOS too but can't test yet. All other tests pass.

I suspect you're building from the 'develop' branch instead of 'master'.

Yes, I was building from 'develop', good to know.

Windows: Need to bundle the OpenMP DLL

When built in a conda environment, a directory 'wrf-python/.libs' is automatically created with
libwrf_cons.libwrf_cons.BYEX7ZCX7HSM7VGRMGCD4WTB5X3CO3SQ.gfortran-win_amd64.dll
inside it. Is the behavior different outside of conda? I thought it was numpy distutils that bundled that library, but is numpy.distutils just creating the directory with the library but not including it in a wheel?

The file you are referring to is just the gfortran compiler runtime. This is the only thing that numpy's distutils can handle out of the box today. It doesn't copy in any other required DLLs like the OpenMP DLL. The multibuild tool only does automatic dependency embedding for Linux and macOS, but not for Windows. You can see that for scipy there is some manual copying involved as well: https://github.com/MacPython/scipy-wheels/blob/master/appveyor.yml#L182.

EDIT: What I said above regarding the OpenMP DLL was incorrect. The libwrf_cons...dll contains the OpenMP runtime parts as well and no extra library is needed. This comes from the mingw environment and is not related to the MSVC OpenMP DLLs. I'm not sure why this was an issue before, but it works now, maybe it was never an issue.

@bladwig1

It's a weird one; normally the multibuild tool would take care of embedding any required dependencies. I need to take a look on a macOS system, which may take a while. It's definitely supposed to work, though.

No problem. Keep fighting the good fight!

The file you are referring to is just the gfortran compiler runtime. This is the only thing that numpy's distutils can handle out of the box today. It doesn't copy in any other required DLLs like the OpenMP DLL. The multibuild tool only does automatic dependency embedding for Linux and macOS, but not for Windows. You can see that for scipy there is some manual copying involved as well: https://github.com/MacPython/scipy-wheels/blob/master/appveyor.yml#L182.

My bad. With the build from conda (conda-forge channel), if you do a 'dumpbin /symbols' on libwrf_cons.libwrf_cons.BYEX7ZCX7HSM7VGRMGCD4WTB5X3CO3SQ.gfortran-win_amd64.dll, the libgomp stuff is in there. I thought those symbols were being exported in the libwrf_cons.* DLL as part of the numpy build magic, but it must be coming from the gfortran library included as part of the m2w64_fortran build tool.


letmaik commented Feb 3, 2019

I had a look at this again and brought https://github.com/letmaik/wrf-python-wheels/tree/letmaik/omp up to date with the reference at https://github.com/MacPython/scipy-wheels. This included dropping Python < 3.5 support for wheel building. Due to dependency issues with pandas, I also dropped 32-bit Linux wheels.

I re-checked the Windows wheels and in fact you were absolutely right: everything related to OpenMP, as in conda, is already bundled in that DLL generated by numpy's distutils. I'm not sure why I thought this was an issue before, or whether I actually tested this. Maybe I was thinking it should redistribute the MSVC OpenMP DLL, but that doesn't apply since this build uses mingw and depends on libgomp.

The tests are all passing as well, for Windows and Linux.

The macOS issue remains; I haven't done any further investigation into it yet.


letmaik commented Feb 3, 2019

OK, turns out the macOS issue was just caused by a missing -fopenmp in the linker flags. This was an oversight, as the shell script that sets up the build contains some macOS-only logic.
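To make the fix concrete: the flag has to appear at link time, not just compile time, otherwise libgomp symbols such as GOMP_atomic_end stay unresolved until import. A hypothetical numpy.distutils sketch (module and file names are illustrative; the actual fix here was made in the wheel-building shell script):

```python
# -fopenmp must be in the *link* flags as well as the compile flags,
# otherwise the extension builds fine but fails at import time with
# "Symbol not found: _GOMP_atomic_end".
from numpy.distutils.core import setup, Extension

ext = Extension(
    name="wrf._wrffortran",                  # illustrative
    sources=["fortran/wrffortran.f90"],
    extra_f90_compile_args=["-fopenmp"],
    extra_link_args=["-fopenmp"],            # the flag that was missing
)

setup(name="wrf-python", ext_modules=[ext])
```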

Since all the wheels are building now, I think it's time to approach the maintainers at https://github.com/MacPython and ask whether https://github.com/letmaik/wrf-python-wheels can be moved to that organisation. That would allow us to use their Rackspace account for publishing wheels whenever a new wrf-python version is released. The wheels can then be downloaded from there and re-uploaded to PyPI. If that sounds good, let me know and I'll take it from there.


bladwig1 commented Feb 5, 2019

Awesome! Thanks for doing this! Do you have a PyPI account so I can add you as a maintainer? Feel free to take this into your own hands.


letmaik commented Feb 5, 2019

I think it would be better if you guys remain maintainers. I can set everything up on the MacPython org and make sure you get access to trigger builds, but then it's up to you to re-upload the wheels to PyPI and make sure nothing broke between versions (which could happen if your Fortran-specific build scripts change; such changes would have to be reflected in the corresponding shell script used for building the wheels).


letmaik commented Feb 5, 2019

I got a response from Matthew Brett, saying it would be better not to move the repo over there since it would slow down our builds (there's a limit per org). Instead we could move it to NCAR and get the encrypted key for the Rackspace account, which I can add to the config files.
If that sounds ok, let me know and I will transfer the repo to the NCAR org. To do that, I need temporary permission to create repos in the NCAR org; otherwise I can't transfer it.

@khallock

@letmaik, would forking letmaik/wrf-python-wheels to the NCAR organization work? Or does the original repository need to be transferred? Alternatively, if @bladwig1 or I had admin privileges on your repo, I think we would be able to do the ownership transfer ourselves.

We can add you as a collaborator on the NCAR/wrf-python-wheels repo once it exists, but membership to the NCAR GitHub organization (and thus the ability to create repos) is automated through the UCAR staff directory system, so I'm not sure how we would go about granting you permission to create the repo yourself.


letmaik commented Feb 11, 2019

@khallock Forking would do the trick, and sounds reasonable. After that you could add me as collaborator on that repository.


letmaik commented Feb 11, 2019

@khallock Thanks, I'm now a collaborator. A couple more things are needed to get started, which only admins can do:

  • Please go to https://travis-ci.com/ (Linux and macOS) and flip the switch for this repository. That should be straightforward.
  • Same for https://ci.appveyor.com (Windows), but here it's a bit trickier if you want to mirror the NCAR organisation as on GitHub and Travis CI. What I typically do is first log in once via the GitHub option to register on the system. After that, log out again and register a new password-based account on AppVeyor called NCAR (unfortunately you have to use a different email address for that). After logging in to that NCAR account, go to https://ci.appveyor.com/team and grant admin permission to your normal user account. Then you can manage the NCAR org from your user account via your GitHub login. Once that's all done, flip the switch for the repository.

Let me know if you need any help with that.


letmaik commented Mar 21, 2019

Are there any problems?


letmaik commented Jun 18, 2019

@khallock It's been 4 months since your last comment. Can you please respond? I'm still interested in helping with this, but silence doesn't help :)

@khallock

Hi @letmaik,

I'm sorry about the silence, things have been hectic around here recently and I haven't had many spare cycles to dedicate to this. I think I've got the NCAR/wrf-python-wheels repo configured on Travis CI, but I still need to see if I can get AppVeyor set up through the NCAR GitHub organization. I'll ask around and try to get back to you with an answer later this week.

Thanks for your help on this, we appreciate it!

@khallock

@letmaik I've got the wrf-python-wheels repo enabled on AppVeyor now. Is there anything else you need me to do to finish setting this up?

https://travis-ci.org/ncar/wrf-python-wheels/builds
https://ci.appveyor.com/project/NCAR/wrf-python-wheels


letmaik commented Jun 25, 2019

@khallock Looks great. The setup for uploading the wheels from Travis CI to Rackspace is done. The last step is to add the encryption key for AppVeyor. Can you add @matthew-brett temporarily to the AppVeyor NCAR team/account so that he can encrypt the Rackspace key and open a PR similar to the Travis CI one? With Travis CI you can encrypt things for other repos without extra permissions, but with AppVeyor you can only do it when logged in to the account. The idea is that we never see the Rackspace key in plain text.

@khallock

@letmaik I just sent an invitation to join the NCAR account on AppVeyor to both you and @matthew-brett via email (there was no option to use GitHub accounts, unfortunately).


letmaik commented Jun 25, 2019

The wheel upload is working now. I updated the README at https://github.com/NCAR/wrf-python-wheels which should contain everything needed to create wheels and upload them to PyPI. The current BUILD_COMMIT is set to the latest tag, and the wheels in https://7933911d6844c6c53a7d-47bd50c35cd79bd838daf386af554a83.ssl.cf2.rackcdn.com/ correspond to that. Unfortunately (as reported in #96) there's a version mismatch between the tag and what setup.py declares. I leave it up to you to decide whether you want to upload these wheels as-is to PyPI or wait until the next wrf-python release with consistent version number. Either is fine.
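The tag-vs-setup.py mismatch mentioned above is easy to guard against before an upload. A trivial sketch, with illustrative tag formats (the real version strings are whatever the repo uses):

```python
def versions_match(git_tag, declared_version):
    """Check that a release tag (e.g. 'v1.3.1' or '1.3.1') agrees with
    the version string declared in setup.py."""
    return git_tag.lstrip("v") == declared_version
```

Running a check like this in CI would catch the inconsistency before any wheels reach PyPI.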

Let me know if you have any questions or issues with the process, I'm happy to help. Let's leave this issue open until the first wheels land at PyPI.

@erogluorhan erogluorhan added enhancement New feature or request support Support request opened by outside user/collaborator labels Sep 1, 2020
@erogluorhan
Collaborator

@pilotchute Could you please have a look at this thread top to bottom? This contribution would also be worth trying for our other repos so they can be installed with pip.

@pilotchute

I'll read through this; we'll likely work on a pip release once we have a working conda feedstock.
