Binary wheels for Linux/macOS/Windows #42
Good stuff, I'm definitely interested! I'll look into this more after the holidays. It would be nice to have binary wheel support for the soon-to-be-released 1.1, which adds OpenMP support but makes the build process a little more complicated (it requires some Fortran preprocessing, and f2py needs to be run after the preprocessing step, so I've been creating build scripts to simplify this).
Note to self: Update the numpy-distutils submodule in https://github.com/letmaik/wrf-python-wheels to point to the official numpy 1.14 release as the DLL library folder changed. See also #39 (comment)
@bladwig1 Did you give this some more thought? Let me know if you're stuck on anything, happy to help out!
@bladwig1 I'll try to take a look at this soon, but I'm focused on the NCL 6.5.0 release at the moment.
Any news here?
Now that pypi.org allows collaborators, would you be willing to handle this for us? Since you have most of it done, the only thing left to change is to enable the OpenMP multicore support. You can look at the commands in the build_scripts directory for how to do this, or just call the scripts directly.
I gave it a try and there are still some build issues with OpenMP enabled:
Other notes:
It can't find the libgomp dynamic library (the OpenMP runtime library) at runtime. Ugh, this is going to be a problem on macOS. Previously, the Clang compiler on Mac didn't support OpenMP at all (prior to Sierra?). For now, it's probably looking for libgomp wherever gfortran was installed, but that search path isn't a default one. This is going to create a problem: you could use -rpath or DYLD_LIBRARY_PATH to point to it, but then every user would need the compiler libraries installed in that directory. Or we could try static linking against it, but I'm not sure whether that would require other dependencies at link time. Or drop support for older macOS, try to link against the Clang version, and hope for the best. Or try bundling the compiler libraries in a manner similar to Windows. It might be easier to punt on this and use the non-OpenMP gnu_no_omp.sh for Mac for now (or just use the source code as it is by default, which has OpenMP turned off).
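As a quick way to reproduce the symptom described above, here is a small stdlib-only probe of whether an OpenMP runtime is discoverable on the default dynamic-linker search path. The library names are just the common GNU/LLVM/Intel ones, not anything specific to wrf-python:

```python
# Probe whether an OpenMP runtime (GNU libgomp, LLVM libomp, or Intel libiomp5)
# can be found on the default dynamic-linker search path. On a machine
# exhibiting the problem above, all three lookups come back empty.
import ctypes.util

candidates = ["gomp", "omp", "iomp5"]  # GNU, LLVM, and Intel runtime names
found = {name: ctypes.util.find_library(name) for name in candidates}
for name, path in found.items():
    print(f"lib{name}: {path or 'not found on default search path'}")
```

If none of these resolve, a wheel whose extension module links libgomp by name will fail to import unless the runtime is bundled into the wheel or the search path is adjusted.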
I suspect you're building from the 'develop' branch instead of 'master'. I'm currently in the process of fixing several computational bugs that came in, but haven't updated the CI tests yet. There should be a new release coming out in the next week or two, and then you can build from master, since it has the requirements.txt update in there.
When built in a conda environment, a directory 'wrf-python/.libs' is automatically created with
In the past, the f2py distutils examples used it, so I assumed it wouldn't be generated as part of setup. This particular pyf lines up with the omp.f90 file that is generated with OpenMP turned off, so that people wouldn't have to do any preprocessing if they just wanted to build the non-OpenMP version. I haven't done any manual edits to the pyf file, so it can be removed (will remove in next release, and update the build scripts).
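For reference, a checked-in .pyf can always be regenerated from the Fortran source with f2py's signature-generation mode, so it needn't be maintained by hand. A minimal sketch, using an invented subroutine rather than wrf-python's real sources:

```python
# Sketch: regenerate an f2py signature file (.pyf) from Fortran source instead
# of hand-maintaining a committed copy. The subroutine and module names here
# are made up for illustration.
import pathlib
import subprocess
import sys
import tempfile

SRC = """\
subroutine add_one(x, n)
  integer, intent(in) :: n
  real, intent(inout) :: x(n)
  x = x + 1.0
end subroutine add_one
"""

with tempfile.TemporaryDirectory() as tmp:
    (pathlib.Path(tmp) / "example.f90").write_text(SRC)
    # -h writes the signature file; no Fortran compiler is needed for this step.
    subprocess.run(
        [sys.executable, "-m", "numpy.f2py", "-m", "_example",
         "-h", "example.pyf", "example.f90"],
        cwd=tmp, check=True)
    pyf_text = (pathlib.Path(tmp) / "example.pyf").read_text()

print("add_one" in pyf_text)
```

In a preprocessed build like the one described above, the same command would simply be pointed at the post-preprocessing Fortran sources.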
That's ideal, but we might not be able to do this on macOS at this time.
It's a weird one; normally the multibuild tool would take care of embedding any required dependencies. I need to take a look on a macOS system, which may take a while. It's definitely supposed to work, though.
Yes, I was building from 'develop', good to know.
The file you are referring to is just the gfortran compiler runtime. This is the only thing that numpy's distutils can handle out of the box today. It doesn't copy in any other required DLLs, like the OpenMP DLL. The multibuild tool only does automatic dependency embedding for Linux and macOS, not for Windows. You can see that for scipy there is some manual copying involved as well: https://github.com/MacPython/scipy-wheels/blob/master/appveyor.yml#L182. EDIT: What I said above regarding the OpenMP DLL was incorrect. The …
No problem. Keep fighting the good fight!
My bad. With the build from conda (conda-forge channel), if you do a 'dumpbin /symbols' on libwrf_cons.libwrf_cons.BYEX7ZCX7HSM7VGRMGCD4WTB5X3CO3SQ.gfortran-win_amd64.dll, the libgomp stuff is in there. I thought those symbols were being exported in the libwrf_cons.* DLL as part of the numpy build magic, but it must be coming from the gfortran library included as part of the m2w64_fortran build tool.
I had a look at this again and brought https://github.com/letmaik/wrf-python-wheels/tree/letmaik/omp up to date with the reference at https://github.com/MacPython/scipy-wheels. This included dropping Python < 3.5 support for wheel building. Due to dependency issues with pandas I also dropped 32-bit Linux wheels. I re-checked the Windows wheels, and in fact you were absolutely right: everything related to OpenMP is, as in conda, already bundled in the DLL generated by numpy's distutils. I'm not sure why I thought this was an issue before, or whether I actually tested this. Maybe I was thinking it should redistribute the MSVC OpenMP DLL, but that's nonsense since this build uses MinGW and depends on libgomp. The tests are all passing as well, for Windows and Linux. The macOS issue remains; I haven't done any further investigation into that yet.
OK, turns out the macOS issue was just caused by a missing …
Since all the wheels are building now, I think it's time to approach the maintainers at https://github.com/MacPython and ask whether https://github.com/letmaik/wrf-python-wheels can be moved to that organisation. That would allow us to use their Rackspace account for publishing wheels whenever a new wrf-python version is released. The wheels can then be downloaded from there and re-uploaded to PyPI. If that sounds good, let me know and I'll take care of it.
Awesome! Thanks for doing this! Do you have a PyPI account so I can add you as a maintainer? Feel free to take this into your own hands.
I think it would be better if you guys remain maintainers. I can set up everything on the MacPython org and make sure you get access to trigger builds, but then it's up to you to re-upload the wheels to PyPI and make sure nothing broke between versions (which could happen if your Fortran-specific build scripts have changed, in which case this has to be reflected in the corresponding shell script used for building the wheels).
I got a response from Matthew Brett saying that it would be better not to move the repo over there, since it would slow down our builds (there's a concurrent-build limit per organisation). Instead we could move it to NCAR and get the encrypted key for the Rackspace account, which I can add to the config files.
@letmaik, would forking letmaik/wrf-python-wheels to the NCAR organization work? Or does the original repository need to be transferred? Alternatively, if @bladwig1 or I had admin privileges on your repo, I think we would be able to do the ownership transfer ourselves. We can add you as a collaborator on the NCAR/wrf-python-wheels repo once it exists, but membership to the NCAR GitHub organization (and thus the ability to create repos) is automated through the UCAR staff directory system, so I'm not sure how we would go about granting you permission to create the repo yourself.
@khallock Forking would do the trick and sounds reasonable. After that you could add me as a collaborator on that repository.
@khallock Thanks, I'm now a collaborator. A couple more things are needed to get started, which only admins can do:
Let me know if you need any help with that.
Are there any problems?
@khallock It's been 4 months since your last comment. Can you please respond? I'm still interested in helping with this, but silence doesn't help :)
Hi @letmaik, I'm sorry about the silence, things have been hectic around here recently and I haven't had many spare cycles to dedicate to this. I think I've got the NCAR/wrf-python-wheels repo configured on Travis CI, but I still need to see if I can get AppVeyor set up through the NCAR GitHub organization. I'll ask around and try to get back to you with an answer later this week. Thanks for your help on this, we appreciate it!
@letmaik I've got the wrf-python-wheels repo enabled on AppVeyor now. Is there anything else you need me to do to finish setting this up? https://travis-ci.org/ncar/wrf-python-wheels/builds
@khallock Looks great. The setup for uploading the wheels from Travis CI to Rackspace is done. The last step is to add the encryption key for AppVeyor. Can you temporarily add @matthew-brett to the AppVeyor NCAR team/account so that he can encrypt the Rackspace key and open a PR similar to the Travis CI one? With Travis CI you can encrypt things for other repos without extra permissions, but with AppVeyor you can only do it when logged in to the account. The idea is that we never see the Rackspace key in plain text.
@letmaik I just sent an invitation to join the NCAR account on AppVeyor to both you and @matthew-brett via email (there was no option to use GitHub accounts, unfortunately).
The wheel upload is working now. I updated the README at https://github.com/NCAR/wrf-python-wheels, which should contain everything needed to create wheels and upload them to PyPI. The current BUILD_COMMIT is set to the latest tag, and the wheels in https://7933911d6844c6c53a7d-47bd50c35cd79bd838daf386af554a83.ssl.cf2.rackcdn.com/ correspond to that. Unfortunately (as reported in #96) there's a version mismatch between the tag and what setup.py declares. I leave it up to you to decide whether you want to upload these wheels to PyPI as-is or wait until the next wrf-python release with a consistent version number; either is fine. Let me know if you have any questions or issues with the process, I'm happy to help. Let's leave this issue open until the first wheels land on PyPI.
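A version mismatch like the one reported in #96 can be caught before uploading by reading the Version field out of each wheel's metadata. A minimal stdlib-only sketch; the wheel here is a fake built in memory, whereas real wheels would be the files fetched from the CDN:

```python
# Sketch: before re-uploading wheels to PyPI, check that the version recorded
# in each wheel's METADATA matches the git tag that was built. A wheel is a
# zip archive with a *.dist-info/METADATA file inside.
import io
import zipfile

def wheel_version(wheel_bytes: bytes) -> str:
    """Return the Version: field from the wheel's *.dist-info/METADATA."""
    with zipfile.ZipFile(io.BytesIO(wheel_bytes)) as zf:
        meta_name = next(n for n in zf.namelist()
                         if n.endswith(".dist-info/METADATA"))
        for line in zf.read(meta_name).decode().splitlines():
            if line.startswith("Version:"):
                return line.split(":", 1)[1].strip()
    raise ValueError("no Version field found")

# Build a tiny fake wheel in memory to demonstrate (version is hypothetical):
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("wrf_python-1.1.0.dist-info/METADATA",
                "Metadata-Version: 2.1\nName: wrf-python\nVersion: 1.1.0\n")

tag = "1.1.0"
assert wheel_version(buf.getvalue()) == tag
print("version check passed")
```

Running this over every downloaded .whl before the PyPI upload would have flagged the tag/setup.py discrepancy automatically.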
@pilotchute Could you please have a look at this thread from top to bottom? This contribution would also be worth trying for other repos of ours in order to have …
I'll read through this; we'll likely work on a pip release once we have a working conda feedstock.
Currently you only provide binary packages via conda. To support
pip install wrf-python
more generally, I created a repository which builds those wheels using Travis CI/AppVeyor: https://github.com/letmaik/wrf-python-wheels. It uses https://github.com/matthew-brett/multibuild and follows the concrete adaptation from https://github.com/MacPython/scipy-wheels, which had similar requirements, e.g. building Fortran code with numpy's distutils. You can see the build logs for Linux and macOS here:
https://travis-ci.org/letmaik/wrf-python-wheels/builds/322376418
And for Windows including downloadable wheels (see artifacts tab in each job):
https://ci.appveyor.com/project/letmaik/wrf-python-wheels/build/1.0.5
Travis CI doesn't have free artifact storage, but if you guys like the general idea then maybe we can ask whether you can use the Rackspace container that scikit-learn uses, as mentioned in the README of the multibuild repo:
The idea is that whenever you release a new version (or for dev builds), you would trigger a build in the wrf-python-wheels repository through a commit. This then builds and uploads wheels to some storage where you download the wheels, do some tests if you like, and then upload them to PyPI.
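In multibuild-based setups like the scipy-wheels one this repo follows, "trigger a build through a commit" usually amounts to bumping a pinned tag in the wheel repo's CI config and pushing. A sketch of the relevant .travis.yml fragment, with hypothetical values:

```yaml
# Hypothetical fragment of the wheels repo's .travis.yml. Bumping BUILD_COMMIT
# to a new wrf-python tag and pushing the commit is what kicks off a fresh
# wheel build for all platforms.
env:
  global:
    - REPO_DIR=wrf-python
    - BUILD_COMMIT=1.1.0   # wrf-python tag (or commit) to build wheels for
```

The per-version Fortran build steps mentioned earlier would live in the repo's shell hooks, so a change in wrf-python's build scripts has to be mirrored there as well.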
I'm happy to transfer the repo over to your GitHub org.