
Make PyPI action multi-platform #15

Open
breznak opened this issue Sep 26, 2019 · 26 comments


@breznak breznak commented Sep 26, 2019

Hello,

I am evaluating GH Actions and converted my multi-platform build to it (runs smoothly on OSX, Linux, Windows).
Great job on providing the pypi-publish action 👍 It integrates very nicely into the flow.

Now I get an error on OSX that the action is only supported on Linux, which is a pity:

  • The README should mention the Linux-only support
  • Could you please provide your action on all GH Actions' platforms?
    • it used to be possible with twine

https://github.com/htm-community/htm.core/commit/babeb019b8ec3439122ea08d872ebcc265c3717c/checks

Cheers,

@breznak breznak mentioned this issue Sep 26, 2019

@webknjaz webknjaz commented Sep 26, 2019

Wow, that's a good point. I didn't realize that GitHub doesn't run the docker daemon in macOS VMs. Maybe it's even a bug in GitHub workflows.
I think one of the ways to deal with it is to build dists in whatever jobs you want and then use https://github.com/actions/upload-artifact & https://github.com/actions/download-artifact to pass the artifacts (maybe built by different jobs) to the final job that does the upload itself.
WDYT?


@webknjaz webknjaz commented Sep 26, 2019

As for improving the cross-platform support, we need to evaluate whether it's really needed.
Usually, you'd build a bunch of artifacts and then do the upload of all of them at once which seems to be possible with those actions for uploading/downloading artifacts.
If we choose to use "native" actions, we'll have to use JS, which probably implies that we'd just proxy a shell-script call. But it would also mean that we'd need to somehow force users to maintain the proper prerequisites, or install things ourselves.

@pradyunsg any thoughts?


@webknjaz webknjaz commented Sep 26, 2019

@breznak please feel free to send a PR improving the README :)


@breznak breznak commented Sep 26, 2019

Thanks for swift reply!

Wow, that's a good point. I didn't realize that GitHub doesn't run the docker daemon in macOS VMs. Maybe it's even a bug in GitHub workflows.

I don't know enough of the Actions' internals; would you investigate, please?
https://github.com/softprops/action-gh-release has similar functionality (uploads to GH Releases) and works cross-platform. (But it seems they don't use docker?)

I think one of the ways to deal with it is to build dists in whatever jobs you want and then use https://github.com/actions/upload-artifact & https://github.com/actions/download-artifact to bypass artifacts (maybe built using different jobs) to the final job that does the upload itself.

thanks, that's a possible workaround 👍

How would I define a dependency between jobs? I couldn't find something like needs/requires from CircleCI's workflows.

...
jobs:
  build: # builds the app/artifacts on all platforms
    strategy:
      matrix:
        os: [a, b, c]
    runs-on: ${{ matrix.os }}
    steps:
      - name: ...build steps...
      - uses: actions/upload-artifact@v1

  pypi: # upload artifacts to pypi
    runs-on: ubuntu-18.04
    # FIXME: how do I specify to run after (all of) "build" completed?
    steps:
      - uses: actions/download-artifact@v1
      - uses: gh-action-pypi-publish

evaluate whether it's really needed.
Usually, you'd build a bunch of artifacts and then do the upload of all of them at once which seems to be possible with those actions for uploading/downloading artifacts

the action-gh-release mentioned above does something similar, and uploads files from different build envs. The user does not need to maintain/specify anything.

PS: Greetings from CZ 👋


@breznak breznak commented Sep 26, 2019

I guess this https://github.com/remorses/pypi would have the same problem, since it runs docker (?)


@webknjaz webknjaz commented Sep 26, 2019

@chrispat I'm personally aware of job deps via needs: but FYI it was non-trivial for me to locate it in docs back when I was looking for it. Maybe it could be improved/included in some very basic quickstart? (docs feedback)
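For reference, here is a minimal sketch of chaining jobs with needs: (the job names and echo steps are purely illustrative, not taken from any real workflow):

```yaml
jobs:
  build:
    strategy:
      matrix:
        os: [ubuntu-18.04, macos-latest, windows-latest]
    runs-on: ${{ matrix.os }}
    steps:
      - run: echo "build dists here"

  publish:
    # "needs" makes this job wait until every matrix leg of "build" has finished
    needs: build
    runs-on: ubuntu-18.04
    steps:
      - run: echo "upload dists here"
```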

@breznak

softprops/action-gh-release has a similar functionality (uploads to GH Releases) and works cross-platform. (But seems they don't use docker?)

Yes. They run JS code using NodeJS, which is pre-installed in all CI runner VMs. For us to do the same, we'd need to pre-install Python, then exec a Python script and proceed from there. This needs some experimentation atm, and honestly, I think that the way of collecting artifacts and uploading via a single job looks way better.

I guess this remorses/pypi would have the same problem, since it runs docker (?)

Totally. It also looks half-baked and does more than it should (like setuptools lock-in which is the same problem that Travis CI has).

PS: Greetings from CZ 👋

Ahoj


@breznak breznak commented Sep 26, 2019

I think that the way with collecting artifacts and uploading via a single job looks way better.

I'm starting to think the same. Maybe this issue could be resolved with just a README example on using needs: and artifact upload/download to sync and upload all the wheels.


@webknjaz webknjaz commented Sep 26, 2019

Yep. But not only the README. I also have a guide PR pending: pypa/packaging.python.org#647.

Please link the results of your experiments here. I can't promise that I'll get to do it myself soon so a PR is also welcome :)


@breznak breznak commented Sep 26, 2019

README. I also have a guide PR pending: pypa/packaging.python.org#647.

hey, that's pretty cool, thanks!
I'm currently testing the artifact upload/download + pypi combination. (It will take a bit, too; apparently we don't have setup.py packaging working correctly. For that reason I hadn't known about PEP 517; it looks better than setuptools.)


@webknjaz webknjaz commented Sep 26, 2019

looks better than setuptools

It's not "better". It's just a unified way of invoking other tools, and it defaults to setuptools anyway. It kind of works on a different level.

FWIW, you can invoke a PEP 517 build backend manually using the pep517 dist from PyPI, like this (wrapped with tox):
https://github.com/sanitizers/octomachinery/blob/006bcb0/tox.ini#L68-L87
https://github.com/sanitizers/octomachinery/blob/006bcb0/.github/workflows/publish-to-test-pypi.yml#L29-L31
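For illustration, a sketch of what such a build step could look like as a bare workflow step (the flags are from the pep517 package's build module; treat the exact invocation as an assumption and check the linked tox.ini for the real one):

```yaml
# hypothetical step, not taken from the linked workflow
- name: Build dists via the PEP 517 interface
  run: |
    python -m pip install pep517
    python -m pep517.build --source --binary --out-dir dist/ .
```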


@breznak breznak commented Sep 27, 2019

Thanks for your help! The workflow with artifact upload/download works nicely; I'm able to collect all the whl/egg files from all platforms.
https://github.com/htm-community/htm.core/runs/237965449#step:7:16

Q: Are PyPI and the pypi-publish action OK with multiple wheel & egg files in the dist folder (one set from each platform)?

Run ls dist*
dist1:
htm.core-2.0.16-cp37-cp37m-linux_x86_64.whl
htm.core-2.0.16-py3.7-linux-x86_64.egg
requirements.txt

dist2:
htm.core-2.0.16-cp37-cp37m-macosx_10_14_x86_64.whl
htm.core-2.0.16-py3.7-macosx-10.14-x86_64.egg
requirements.txt

dist3:
htm.core-2.0.16-cp37-cp37m-win_amd64.whl
htm.core-2.0.16-py3.7-win-amd64.egg
requirements.txt

@webknjaz webknjaz commented Sep 27, 2019

Until #10 is addressed, you should put all of the wheels into dist/ directly. Twine will then pick up all of them.

P.S. I see you're building eggs; it's a decade-deprecated format, better drop those in favor of wheels. And it's also highly recommended to ship a source dist (a .tar.gz tarball) so that end users can choose to compile stuff on their side.
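A sketch of what that could look like in the publish job until then (the artifact names match the dist1/dist2/dist3 listing above; everything else, including the secret name, is illustrative):

```yaml
steps:
  - uses: actions/download-artifact@v1
    with:
      name: dist1
  - uses: actions/download-artifact@v1
    with:
      name: dist2
  - uses: actions/download-artifact@v1
    with:
      name: dist3
  # flatten per-platform artifact dirs so twine sees a single dist/
  - name: Merge all wheels into dist/
    run: |
      mkdir -p dist
      cp dist1/*.whl dist2/*.whl dist3/*.whl dist/
  - uses: pypa/gh-action-pypi-publish@master
    with:
      password: ${{ secrets.pypi_password }}
```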


@breznak breznak commented Sep 27, 2019

put all of the wheels into dist/ directly. Twine will then pick up all of them.

perfect, thanks!

building eggs, it's a decade-deprecated format. Better drop those in favor of wheels

oh, didn't know, I'll drop the eggs 🍳

@webknjaz webknjaz changed the title Make PYPI action multi-platform Make PyPI action multi-platform Sep 30, 2019

@breznak breznak commented Oct 3, 2019

Please link the results of your experiments here.

Hi, I'm getting back with my results; I made this method work.

All CI seems to pass:
https://github.com/htm-community/htm.core/runs/246384722

But I still have a problem: pip install does not find the version. htm-community/htm.core#19 (comment)

$ pip install -i https://test.pypi.org/simple/ htm.core==2.0.18
Looking in indexes: https://test.pypi.org/simple/
Collecting htm.core==2.0.18
ERROR: Could not find a version that satisfies the requirement htm.core==2.0.18 (from versions: none)
ERROR: No matching distribution found for htm.core==2.0.18

Could this be because of my "hack"? https://github.com/htm-community/htm.core/blob/master/.github/workflows/htmcore.yml#L135
There I simply renamed the file to *manylinux1_x86_64* to pass the PyPI checks.
Thank you for any help!


@breznak breznak commented Oct 3, 2019

Ok, I figured this!

cp37-cp37m-manylinux1_x86_64

the CI package is compiled with Python 3.7 (cp37), while on my machine only Python 3.6 was available.
After installing Python 3.7 locally, the pip install works fine!

OT: maybe pip could be more verbose on which condition was not met?

So the proper fix would be to compile in CI with the oldest possible py3.x version.

@pradyunsg pradyunsg commented Oct 5, 2019

Ok, I figured this!

Yay!

OT: maybe pip could be more verbose on which condition was not met?

Yes, yes it could. pypa/pip#6526


@webknjaz webknjaz commented Oct 10, 2019

@breznak

I simply renamed the file to *manylinux1_x86_64* to pass PyPI checks
So the proper fix would be to compile in CI with a oldest possible py3.x version.

You shouldn't do this, it's a very bad idea. You have to use the manylinux docker containers to build wheels against every Python interpreter version you want to support, and also use auditwheel to eliminate external references.
Renaming the wheel only adjusts Pip's search logic, but the shared libraries inside won't work under different interpreter versions. So you're basically tricking Pip into downloading code that 100% won't work in the env.
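A rough sketch of that approach (the image and /opt/python paths are the standard manylinux1 layout; the build commands are illustrative for a generic project, not htm.core's actual setup):

```yaml
build-manylinux:
  runs-on: ubuntu-18.04
  container: quay.io/pypa/manylinux1_x86_64
  steps:
    - uses: actions/checkout@v1
    - name: Build a wheel per interpreter, then repair
      run: |
        # one wheel per CPython version shipped in the image
        for PYBIN in /opt/python/cp3*/bin; do
          "$PYBIN/pip" wheel . -w wheelhouse/
        done
        # auditwheel vendors external shared libs and only retags
        # the wheels as manylinux1 if they actually comply
        for whl in wheelhouse/*.whl; do
          auditwheel repair "$whl" -w dist/
        done
```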


@breznak breznak commented Oct 10, 2019

So you're basically tricking Pip into downloading the code that 100% won't work in the env.

I wasn't able to upload to PyPI a wheel built on Ubuntu 18.04 (which would be fine and enough, if pip could recognize it). So I got the impression that Linux wheels have to be manylinux for PyPI.

You shouldn't do this, it's a very bad idea. You have to use manylinux docker containers to build wheels against every Python interpreter version you want to support

yes, I'm aware that what I do isn't an optimal solution; we already got some users failing on that:
htm-community/htm.core#702 (comment)

But our library is C++/Python + other dependencies, and we require a rather modern C++ standard (C++11, ideally C++17).

I thought that if we build with -static-libstdc++ it would be fine. pypa/manylinux#118 (comment)

The problem is that the manylinux (CentOS 5) C++ environment is quite ancient for modern C++ development, so unless newer manylinux images are released for PyPI, I don't know what to do:
pypa/manylinux#118

  • would using manylinux + a newer gcc (installed from a repo, compiled, or a 3rd-party docker image of manylinux+gcc9 (security implications)) + using -static-libstdc++ be any better than the current solution (of using the Ubuntu 18.04 image)?

also use auditwheel to eliminate external references.

I'll have to try this; we build the binary with -static-libstdc++ but people still get the error:

ImportError: /lib/x86_64-linux-gnu/libm.so.6: version `GLIBC_2.27' not found (required by


@webknjaz webknjaz commented Oct 10, 2019

manylinux (CentOS 5) c++ environment is quite ancient for modern c++ development

@breznak the idea is to only link against things that are 100% present in the user's OS. So if you can compile all externals statically, it should work.

You can improve the glibc dep a bit by using manylinux2010, which is based on CentOS 6, but you still have to build against the glibc present there.

Regarding non-manylinux1, I'm not sure but maybe you could use just linux. @pradyunsg do you know if that works?

Anyway, publishing wheels which are marked as manylinux1 but are not compliant is disrespectful to users, because this standard is a promise that the thing will work no matter what.


@TkTech TkTech commented Oct 21, 2019

  • would using manylinux + a newer gcc (installed from a repo, compiled, or a 3rd-party docker image of manylinux+gcc9 (security implications)) + using -static-libstdc++ be any better than the current solution (of using the Ubuntu 18.04 image)?

I'd like to think I'm trustable, hah, but in the linked issue I do provide a recipe for easily making your own manylinux-derived images with newer versions of GCC. As long as you use -static-libstdc++, these will work on 99% of Linux installs.


@maartenbreddels maartenbreddels commented Nov 28, 2019

Would manylinux2010 not be a solution?


@webknjaz webknjaz commented Nov 28, 2019

@maartenbreddels manylinux2010 allows using a newer toolchain but I think whether it's a solution or not depends on each project's specific needs.


@maartenbreddels maartenbreddels commented Dec 2, 2019

One issue could be that it requires pip>=19; for the rest, I only see benefits.


@webknjaz webknjaz commented Dec 2, 2019

It's not an issue at all. But everyone needs a solution that fits their needs. There's no universal pill.
