As a user prone to mistakes, I read a certain label set with 100 partitions and wrote the resulting dataframe to CSV in my local Docker environment.

Should we cap the partition count at some maximum? I have a hard time imagining that values above, say, 16 will ever make sense. This just needs discussion.
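One way to enforce such a cap would be to validate the partition count at the point where the user supplies it. The sketch below is purely illustrative: `MAX_PARTITIONS` and `validate_partitions` are hypothetical names, not part of any existing API in this project, and the cap of 16 is just the value floated above.

```python
# Hypothetical validation helper; names and the cap value are assumptions.
MAX_PARTITIONS = 16  # suggested ceiling from the discussion above

def validate_partitions(partitions: int) -> int:
    """Return the partition count if sane, otherwise raise ValueError."""
    if partitions < 1:
        raise ValueError(f"partitions must be >= 1, got {partitions}")
    if partitions > MAX_PARTITIONS:
        raise ValueError(
            f"partitions must be <= {MAX_PARTITIONS}, got {partitions}"
        )
    return partitions
```

Calling this before kicking off the read would turn a request like the 100-partition one above into an immediate, clear error instead of an expensive job.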