I’m trying to load data into my feature store’s offline store locally, from a .parquet file stored on my computer. There is only one FileSource: my single .parquet file.
I’m using this code to load the data:
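(The original snippet didn’t survive here. For context, a minimal Feast 0.28-style feature-repo definition for a single local .parquet source looks roughly like this; the file path, entity, and field names below are hypothetical, not taken from the issue.)

```python
from feast import Entity, FeatureView, Field, FileSource
from feast.types import Int64

# Hypothetical single local parquet source (actual path/columns are unknown).
source = FileSource(
    path="data/my_features.parquet",
    timestamp_field="event_timestamp",
)

# Hypothetical entity the rows are keyed on.
my_entity = Entity(name="my_entity", join_keys=["my_entity_id"])

# One feature view over the single FileSource.
my_features = FeatureView(
    name="my_features",
    entities=[my_entity],
    schema=[Field(name="feature_1", dtype=Int64)],
    source=source,
)
```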
However, I get: MemoryError: Unable to allocate 22.4 GiB for an array with shape (30, 100000000) and data type int64
My dataset: 66 columns × 10,000 rows.
First of all, why does the error message report such a weird array shape, (30, 100000000)?
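The 22.4 GiB figure is at least internally consistent with that shape: a (30, 100000000) int64 array needs 30 × 10⁸ elements at 8 bytes each, which is easy to check:

```python
# Sanity-check the allocation size quoted in the MemoryError.
rows, cols = 30, 100_000_000
itemsize = 8  # int64 is 8 bytes per element

size_bytes = rows * cols * itemsize
size_gib = size_bytes / 2**30

print(f"{size_gib:.1f} GiB")  # → 22.4 GiB, matching the error message
```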
After I reduced the data to 66 columns × 1,000 rows, loading the offline feature store works fine.
In pandas, I can work with much larger datasets on the same machine without any memory problems.
Is Feast unable to handle larger datasets? What is the limit?
In any case, 66 columns × 10,000 rows is still not big data…
Machine: Ubuntu 20.04 LTS, python3.8, feast0.28, 16 GB RAM