Hello! I can't speak to the technical details, but maybe I can help in the meantime. There is a `PRAGMA` setting that restricts the maximum amount of memory DuckDB may use, which might get things working for you:

```sql
-- set the memory limit
PRAGMA memory_limit='1GB';
```

One other thing to try is counting a single column instead of `count(*)` (in your case, perhaps the `country` column, since it is already needed for the `WHERE` clause).
-
Hi,
I have a table with one billion rows (20 columns).
When a query needs to scan an entire column, I notice that memory consumption keeps increasing until my system saturates.
A python example:
Naively, I would have thought that this query would scan the `country` column and increment a counter, with minimal memory allocation. For my own understanding, could you please tell me why the database needs to allocate memory, and what it allocates?
Is there a strategy for analysing such big data on a laptop with DuckDB?
Thanks a lot for this project, it is great!
Camille