Mauricio 'Pachá' Vargas Sepúlveda / @pachadotdev:
related to ARROW-12373: the PR for this ticket adds a check so that, instead of converting negative max_partitions values (-1, -2, -3, ..., -n) to huge unsigned integers such as 18,446,744,073,709,551,613, it returns an error message saying the value is not feasible.
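(For intuition, a minimal Python sketch of why those huge numbers appear: a negative count reinterpreted as an unsigned 64-bit integer wraps around modulo 2**64.)

```python
# Reinterpreting a negative integer as an unsigned 64-bit integer wraps
# around modulo 2**64, which is where values like
# 18,446,744,073,709,551,613 come from.
for n in (-1, -2, -3):
    print(f"{n} -> {n % 2**64}")
# -1 -> 18446744073709551615
# -2 -> 18446744073709551614
# -3 -> 18446744073709551613
```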
Matt Matolcsi:
Hello, I am running into this issue with Arrow 6.0.0.9000. Would it be possible to implement the max_partitions argument for write_dataset()?
Thanks, everyone, for your hard work on Arrow; it is really great to be able to use it, especially in R.
The Python docs show that we can pass, say, 1025 partitions via the max_partitions argument:
https://arrow.apache.org/docs/_modules/pyarrow/dataset.html
but in R this argument doesn't exist; it would be good to add it for arrow 4.0.0.
This is useful, for example, with international trade datasets; see the sketch below.
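As a sketch, here is the kind of pyarrow call the description has in mind. The table and column names are hypothetical, but reporter-by-year combinations in trade data easily exceed the default limit of 1024 partition directories:

```python
import pyarrow as pa
import pyarrow.dataset as ds

# Hypothetical trade-style table: 50 reporters x 30 years = 1,500 distinct
# partition-key combinations, more than the default max_partitions of 1024.
table = pa.table({
    "reporter": [f"C{i:03d}" for i in range(50) for _ in range(30)],
    "year": [1990 + j for _ in range(50) for j in range(30)],
    "trade_value": [float(k) for k in range(1500)],
})

# Raising max_partitions lets the write succeed instead of erroring out
# when the data produces more than 1024 partition directories.
ds.write_dataset(
    table,
    "trade_dataset",
    format="parquet",
    partitioning=["reporter", "year"],
    partitioning_flavor="hive",
    max_partitions=1500,
)
```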
Reporter: Mauricio 'Pachá' Vargas Sepúlveda / @pachadotdev
Related issues:
PRs and other links:
Note: This issue was originally created as ARROW-12315. Please see the migration documentation for further details.