python3Packages.dask: fix sandboxed builds #120310
Conversation
Force-pushed from 1e393f1 to 031a8b7
Result of nixpkgs-review:

- 7 packages marked as broken and skipped
- 16 packages skipped due to time constraints
- 46 packages built successfully

1 suggestion:
Did you report the issue upstream? If not, please do so, submit the patch there, and fetch it instead of including it directly in Nixpkgs.
I opened a PR upstream: dask/dask#7601. Will I need to wait for it to be merged before using it here?
No need for that. Please use […]
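The reviewer's suggestion is presumably nixpkgs' `fetchpatch`, which downloads the patch from the upstream PR and pins it by hash instead of vendoring the patch file in Nixpkgs. A sketch (the commit revision and `sha256` below are illustrative placeholders, not the values used in this PR):

```nix
patches = [
  # Illustrative only: fetch the upstream fix as a pinned patch.
  # Fetching the commit (rather than the mutable /pull/NNN.patch URL)
  # keeps the hash stable even if the PR branch is rebased.
  (fetchpatch {
    url = "https://github.com/dask/dask/commit/<rev>.patch";
    sha256 = "0000000000000000000000000000000000000000000000000000";
  })
];
```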
Force-pushed from 031a8b7 to 379f74d
I just did this. One question: can GitHub garbage-collect a patch if the upstream PR is merged with a strategy that rewrites history (e.g. rebase or squash)?
Importing `dask.dataframe` in a sandboxed build results in a `TypeError` like this:

```
  File "/nix/store/nv60iri29bia4szhhcvsdxgsci4wxvp6-python3.8-dask-2021.03.0/lib/python3.8/site-packages/dask/dataframe/io/csv.py", line 392, in <module>
    AUTO_BLOCKSIZE = auto_blocksize(TOTAL_MEM, CPU_COUNT)
  File "/nix/store/nv60iri29bia4szhhcvsdxgsci4wxvp6-python3.8-dask-2021.03.0/lib/python3.8/site-packages/dask/dataframe/io/csv.py", line 382, in auto_blocksize
    blocksize = int(total_memory // cpu_count / memory_factor)
TypeError: unsupported operand type(s) for //: 'int' and 'NoneType'
```

This occurs because `dask.dataframe` has a non-deterministic component which generates an automatic chunk size based on system information. This went unnoticed because the dask tests were disabled.

Changes:

- add a patch making the chunk-size inference more robust
- re-enable the tests

Resolves NixOS#120307
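To make the failure mode concrete, here is a minimal sketch of the kind of guard such a patch adds (hypothetical: the actual upstream fix in dask/dask#7601 may differ in detail). In a sandboxed build, system introspection can report the CPU count as `None`, so the unguarded `total_memory // cpu_count` raises the `TypeError` above at import time:

```python
DEFAULT_BLOCKSIZE = int(64e6)  # 64 MB fallback, chosen deterministically

def auto_blocksize(total_memory, cpu_count, memory_factor=10):
    """Derive a CSV chunk size from system info, with a safe fallback.

    When system information is unavailable (as in a sandboxed build),
    return a fixed default instead of failing with a TypeError.
    """
    if total_memory is None or cpu_count is None:
        return DEFAULT_BLOCKSIZE
    blocksize = int(total_memory // cpu_count / memory_factor)
    return min(blocksize, DEFAULT_BLOCKSIZE)

# Sandboxed build: introspection failed, so both values are None.
print(auto_blocksize(None, None))
# Normal machine: 8 GiB of RAM, 4 cores.
print(auto_blocksize(8 * 1024**3, 4))
```

The key property for Nix is determinism: the import no longer depends on the build machine's hardware when that information cannot be read.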
Force-pushed from 379f74d to ceeaf2d
It shouldn't: https://github.community/t/does-github-ever-purge-commits-or-files-that-were-visible-at-some-time/1944/2
This is a semi-automatically executed nixpkgs-review with the nixpkgs-review-checks extension. It is checked by a human on a best-effort basis and does not build all packages (e.g. lumo, tensorflow, or pytorch).

Result:

- 7 packages marked as broken and skipped
- 4 packages failed to build and already fail to build on hydra master
- 1 package failed to build and is a new build failure
- 57 packages built
Motivation for this change

Allow `dask.dataframe` to be used in sandboxed builds. See #120307.

Things done

- Tested using sandboxing (`sandbox` in `nix.conf` on non-NixOS Linux)
- Ran `nix-shell -p nixpkgs-review --run "nixpkgs-review wip"`
- Tested execution of all binary files (usually in `./result/bin/`)
- Determined the impact on package closure size (by running `nix path-info -S` before and after)
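For reference, a minimal sketch of the sandbox setting mentioned above, assuming a system-wide `/etc/nix/nix.conf` on non-NixOS Linux (the nix-daemon must be restarted for it to take effect):

```
# /etc/nix/nix.conf — enable sandboxed builds so local builds reproduce
# the isolated environment in which the cpu_count TypeError appeared
sandbox = true
```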