Support CPython 3.11, 3.12, and aarch64 processors #2331
Conversation
Hi 👋 exciting, will take a look early next week!
that worries me a bit.. :) regards, Maarten
Here are all the timings: https://github.com/ddelange/vaex/actions/runs/3965720337/usage. Depending on how often per month you release vaex, this could eat into the 2k free minutes of GitHub Actions. As the parallelization is maximised and wheels are pushed to PyPI as soon as they're built, most of the wheels will be available soon after a release regardless. Here are all the wheels: distributions.zip
Interestingly, that was 8260 minutes ^ and apparently that's OK? Then I don't understand their explanation 🤔 https://docs.github.com/en/billing/managing-billing-for-github-actions/about-billing-for-github-actions#included-storage-and-minutes
Ah, there is a fair amount of duplication in that usage table for whatever reason 🤯
A diff of current PyPI vs the zip above:

```diff
 vaex_core-4.16.1-cp310-cp310-macosx_10_9_x86_64.whl
 vaex_core-4.16.1-cp310-cp310-macosx_11_0_arm64.whl
-vaex_core-4.16.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
+vaex_core-4.16.1-cp310-cp310-manylinux_2_28_aarch64.whl
+vaex_core-4.16.1-cp310-cp310-manylinux_2_28_x86_64.whl
+vaex_core-4.16.1-cp310-cp310-musllinux_1_1_aarch64.whl
 vaex_core-4.16.1-cp310-cp310-musllinux_1_1_x86_64.whl
 vaex_core-4.16.1-cp310-cp310-win_amd64.whl
+vaex_core-4.16.1-cp311-cp311-macosx_10_9_x86_64.whl
+vaex_core-4.16.1-cp311-cp311-macosx_11_0_arm64.whl
+vaex_core-4.16.1-cp311-cp311-manylinux_2_28_aarch64.whl
+vaex_core-4.16.1-cp311-cp311-manylinux_2_28_x86_64.whl
+vaex_core-4.16.1-cp311-cp311-musllinux_1_1_aarch64.whl
+vaex_core-4.16.1-cp311-cp311-musllinux_1_1_x86_64.whl
+vaex_core-4.16.1-cp311-cp311-win_amd64.whl
 vaex_core-4.16.1-cp36-cp36m-macosx_10_9_x86_64.whl
-vaex_core-4.16.1-cp36-cp36m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
+vaex_core-4.16.1-cp36-cp36m-manylinux_2_28_aarch64.whl
+vaex_core-4.16.1-cp36-cp36m-manylinux_2_28_x86_64.whl
+vaex_core-4.16.1-cp36-cp36m-musllinux_1_1_aarch64.whl
 vaex_core-4.16.1-cp36-cp36m-musllinux_1_1_x86_64.whl
 vaex_core-4.16.1-cp36-cp36m-win_amd64.whl
 vaex_core-4.16.1-cp37-cp37m-macosx_10_9_x86_64.whl
-vaex_core-4.16.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
+vaex_core-4.16.1-cp37-cp37m-manylinux_2_28_aarch64.whl
+vaex_core-4.16.1-cp37-cp37m-manylinux_2_28_x86_64.whl
+vaex_core-4.16.1-cp37-cp37m-musllinux_1_1_aarch64.whl
 vaex_core-4.16.1-cp37-cp37m-musllinux_1_1_x86_64.whl
 vaex_core-4.16.1-cp37-cp37m-win_amd64.whl
 vaex_core-4.16.1-cp38-cp38-macosx_10_9_x86_64.whl
 vaex_core-4.16.1-cp38-cp38-macosx_11_0_arm64.whl
-vaex_core-4.16.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
+vaex_core-4.16.1-cp38-cp38-manylinux_2_28_aarch64.whl
+vaex_core-4.16.1-cp38-cp38-manylinux_2_28_x86_64.whl
+vaex_core-4.16.1-cp38-cp38-musllinux_1_1_aarch64.whl
 vaex_core-4.16.1-cp38-cp38-musllinux_1_1_x86_64.whl
 vaex_core-4.16.1-cp38-cp38-win_amd64.whl
 vaex_core-4.16.1-cp39-cp39-macosx_10_9_x86_64.whl
 vaex_core-4.16.1-cp39-cp39-macosx_11_0_arm64.whl
-vaex_core-4.16.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
+vaex_core-4.16.1-cp39-cp39-manylinux_2_28_aarch64.whl
+vaex_core-4.16.1-cp39-cp39-manylinux_2_28_x86_64.whl
+vaex_core-4.16.1-cp39-cp39-musllinux_1_1_aarch64.whl
 vaex_core-4.16.1-cp39-cp39-musllinux_1_1_x86_64.whl
 vaex_core-4.16.1-cp39-cp39-win_amd64.whl
```
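As an aside, wheel filenames like the ones above can be inspected programmatically. A minimal sketch using the third-party `packaging` library (an assumption on my side, not part of this PR):

```python
# Parse one of the new aarch64 wheel filenames into its components
# using packaging.utils.parse_wheel_filename (requires `pip install packaging`).
from packaging.utils import parse_wheel_filename

name, version, build, tags = parse_wheel_filename(
    "vaex_core-4.16.1-cp311-cp311-manylinux_2_28_aarch64.whl"
)
tag = next(iter(tags))  # single-tag wheel, so the frozenset has one entry
print(name, version, tag.interpreter, tag.abi, tag.platform)
# → vaex-core 4.16.1 cp311 cp311 manylinux_2_28_aarch64
```

Note that the distribution name comes back normalized (`vaex-core`, not `vaex_core`).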
```cpp
namespace std {
    // Specialize std::hash for PyObject* so Python objects can be used as
    // keys in std:: unordered containers; delegates to CPython's hashing.
    template<>
    struct hash<PyObject*> {
        size_t operator()(const PyObject *const &o) const {
            // Note: PyObject_Hash returns -1 (with a Python exception set) on error.
            return PyObject_Hash((PyObject*)o);
        }
    };
}
```
This was my best guess...
pybind11 ref https://github.com/pybind/pybind11/blob/769fd3b889fef6cddb060f2a0be26aee62b4da05/include/pybind11/pytypes.h#L859
failure log https://github.com/ddelange/vaex/actions/runs/3965609112/jobs/6795506653#step:6:2110
@maartenbreddels any thoughts on this (incl me updating the pybind11 submodule)?
```diff
@@ -183,12 +183,14 @@ def __str__(self):
     include_package_data=True,
     ext_modules=([extension_vaexfast] if on_rtd else [extension_vaexfast, extension_strings, extension_superutils, extension_superagg]) if not use_skbuild else [],
     zip_safe=False,
+    python_requires=">=3.6",
```
cibuildwheel parses this to determine which wheels to build
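To illustrate the idea, here is a hedged sketch of that kind of check using the third-party `packaging` library (the exact mechanism inside cibuildwheel may differ):

```python
# Compare candidate interpreter versions against a python_requires specifier,
# keeping only the versions for which wheels should be built.
from packaging.specifiers import SpecifierSet
from packaging.version import Version

python_requires = SpecifierSet(">=3.6")  # the value set in setup.py
candidates = ["2.7", "3.5", "3.6", "3.10", "3.11"]
supported = [v for v in candidates if Version(v) in python_requires]
print(supported)  # → ['3.6', '3.10', '3.11']
```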
cc @franz101
see also the diff above
I'm guessing this is blocked by #2339
Just letting you know I'm very busy and had a vacation.
FWIW there are now third-party free minutes on native arm64 machines, to get rid of the slow QEMU builds.
Could you try rebasing this?
@maartenbreddels already merged in master 👍
Yeah, a bug/artifact of our release script. Should be good now.
Hi @maartenbreddels 👋 I pulled master and fixed merge conflicts, but it looks like CI is still not very happy, seeing errors like an hdf file missing on disk. Do you think it might be related to this PR?
Just wondering here about the Python packaging: Python 3.6 and 3.7 are now deprecated; on the other hand, can we bump to 3.10 and 3.11?
Do we have any updates on this MR?
Hi @maartenbreddels 👋 Was your s3 account deleted by any chance? test_cloud_glob raises an AssertionError (see the traceback below).
Pushed some changes that should fix the failing tests in vaex-ml.
Thank you @JovanVeljanoski!
Looks like lightgbm > 4 is not available via conda-forge for Python < 3.8.
```python
base_url = 's3://vaex'

    @pytest.mark.slow
    @pytest.mark.parametrize("base_url", ["gs://vaex-data", "s3://vaex"])
    def test_cloud_glob(base_url):
>       assert set(vaex.file.glob(f'{base_url}/testing/*.hdf5', fs_options=fs_options)) >= ({f'{base_url}/testing/xys-masked.hdf5', f'{base_url}/testing/xys.hdf5'})
E       AssertionError: assert set() >= {'s3://vaex/testing/xys-masked.hdf5', 's3://vaex/testing/xys.hdf5'}
E        +  where set() = set([])
E        +    where [] = <function glob at 0x7f8447317f28>('s3://vaex/testing/*.hdf5', fs_options={'anonymous': 'true'})
E        +      where <function glob at 0x7f8447317f28> = <module 'vaex.file' from '/home/runner/work/vaex/vaex/packages/vaex-core/vaex/file/__init__.py'>.glob
E        +        where <module 'vaex.file' from '/home/runner/work/vaex/vaex/packages/vaex-core/vaex/file/__init__.py'> = vaex.file

tests/cloud_dataset_test.py:45: AssertionError
```
The hash issues are due to dask/dask#10876
This version gives different results. That's not a problem in production (it will invalidate your cache though), but for CI we test that we have stable keys (fingerprints).
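The idea behind that stability test can be sketched with a toy fingerprint function; this is a hypothetical stand-in for dask's `tokenize`, not vaex's actual implementation:

```python
import hashlib
import pickle

def fingerprint(obj) -> str:
    # Deterministic serialization + hash -> a stable cache key.
    # If the serialization scheme changes between library versions,
    # previously cached keys no longer match (cache invalidation).
    return hashlib.sha256(pickle.dumps(obj, protocol=4)).hexdigest()

# The property the CI asserts: the same input always yields the same key.
assert fingerprint((1, "a")) == fingerprint((1, "a"))
assert fingerprint((1, "a")) != fingerprint((1, "b"))
```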
Getting greener, but micromamba is failing often, and tests are hanging on macOS.
Hmm, looks like micromamba is still flaky. Maybe relevant? https://stackoverflow.com/a/77333269/5511061
macOS seems to be consistently hanging on https://github.com/vaexio/vaex/blob/master/tests/ml/cluster_test.py, any ideas there @JovanVeljanoski?
Can we make this more manageable by splitting it into multiple smaller PRs? I feel the size and complexity of this PR now holds this effort back.
I compiled a list of all stable releases during the time the last build was working; I'm not sure which package is causing the hanging tests.
Hi there. Any plans to release this soonish? Really appreciated!
@to-bee it would be a great help if you could install the wheels (see PR description) and report back your environment info plus whether the wheels work in your environment!
@ddelange yes, sure.
Hi 👋
linux-aarch64 makes up almost 10% of all platforms, ref giampaolo/psutil#2103. The Linux wheels are built against manylinux_2_28. The wheels from this PR can be installed with:
fixes #2366, fixes #2368, fixes #2397