User Story
As a data scientist, I want to shuffle samples using a smaller batch size and then switch back to the normal batch size in subsequent operations, so that model quality might be improved.
As a big data engineer, I want to read many small Parquet files as larger batches, so that data management becomes more flexible.
Detailed requirements
It should require minimal data movement.
It should be flexible enough to use in complex data pipelines.
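Since the issue does not propose a concrete API, here is a minimal sketch of what a rebatching helper could look like, assuming an iterator-of-batches model. The name `rebatch` and its signature are hypothetical, not part of any existing library. It buffers items only until one output batch is full, in line with the minimal-data-movement requirement:

```python
from typing import Iterable, Iterator, List, TypeVar

T = TypeVar("T")

def rebatch(batches: Iterable[List[T]], batch_size: int) -> Iterator[List[T]]:
    """Re-chunk a stream of batches into batches of `batch_size` items.

    Only one partial output batch is buffered at a time, so the whole
    dataset is never materialized (minimal data movement). A hypothetical
    helper for illustration, not an existing API.
    """
    buffer: List[T] = []
    for batch in batches:
        buffer.extend(batch)
        while len(buffer) >= batch_size:
            yield buffer[:batch_size]
            buffer = buffer[batch_size:]
    if buffer:  # trailing partial batch
        yield buffer

# Example: small input batches re-chunked into batches of 4
print(list(rebatch([[1, 2, 3], [4, 5], [6]], 4)))  # → [[1, 2, 3, 4], [5, 6]]
```

Because it is itself an iterator, such a helper composes with other pipeline stages, e.g. shuffling in small batches and then rebatching to the normal size for downstream steps.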
API Compatibility
Only new APIs should be introduced.
Willing to contribute
Yes