OOM for select count(distinct) query #47520
A select count(distinct) query produces OOM.

I created a table from external TSV files as a MergeTree:

set max_memory_usage = 8000000000;
set max_memory_usage_for_user = 8000000000;
set max_bytes_before_external_group_by = 10000000;
set max_bytes_before_external_sort = 10000000;

create table if not exists t ENGINE = MergeTree() order by v1 as
select *
from file('/usr/proj/tsv-*', TabSeparated, 'v1 text, v2 text, v3 text, seq bigint');

The table has 3B rows.

I run the following query:

select count(distinct v1) from t;

I receive the following error:

Code: 241. DB::Exception: Received from localhost:9000. DB::Exception: Memory limit (for query) exceeded: would use 7.45 GiB (attempt to allocate chunk of 6291456 bytes), maximum: 7.45 GiB.: While executing AggregatingTransform. (MEMORY_LIMIT_EXCEEDED)
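Context, not stated in the thread: by default ClickHouse executes count(distinct x) as the uniqExact aggregate function (controlled by the count_distinct_implementation setting), so the query above accumulates every distinct v1 value in a single in-memory state. The reported query is equivalent to:

select uniqExact(v1) from t;  -- one uniqExact state over ~3B rows; its hash set must fit in memory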
Comments

It is correct. You may want to increase the settings.

I am wondering if ClickHouse is supposed to start doing external GROUP BY much earlier because of the max_bytes_before_external_group_by = 10MB setting, and not produce an OOM.

I guess the problem is that max_bytes_before_external_group_by is applicable to GROUP BY aggregation only.

External aggregation writes interim aggregated results into temporary files and merges them back bit by bit. To trigger the feature, setting max_bytes_before_external_group_by alone is not enough here, because count(distinct v1) without GROUP BY keys gives the aggregator nothing to spill. Try running the query below:

set max_memory_usage = 8000000000;
set max_memory_usage_for_user = 8000000000;
set max_bytes_before_external_group_by = 10000000;
set max_bytes_before_external_sort = 10000000;
set group_by_two_level_threshold = 16;

create table if not exists t ENGINE = MergeTree() order by v1 as
select *
from file('/usr/proj/tsv-*', TabSeparated, 'v1 text, v2 text, v3 text, seq bigint');

select count(distinct v1) from t group by seq % 32;

As I mentioned, max_bytes_before_external_group_by does not affect count(distinct).
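A possible workaround, sketched here from the reasoning above rather than taken from the thread: give the aggregator GROUP BY keys so max_bytes_before_external_group_by has something to spill, then count the deduplicated rows. Table t and column v1 are the ones from the report.

set max_bytes_before_external_group_by = 10000000;
set group_by_two_level_threshold = 16;

-- the inner GROUP BY deduplicates v1 and can spill its two-level
-- hash table to temporary files instead of hitting the memory limit
select count()
from (select v1 from t group by v1);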