The parameter max_concurrent_queries sometimes seems to be ineffective #56498

@HwiLu

Description


Our ClickHouse cluster has max_concurrent_queries set to 1000:

<!-- Maximum number of concurrent queries. -->
    <max_concurrent_queries>1000</max_concurrent_queries>
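
As a side check, the value the running server actually applies can be confirmed from SQL (assuming a ClickHouse version recent enough to have the system.server_settings table, 23.3+):

SELECT name, value
FROM system.server_settings
WHERE name = 'max_concurrent_queries'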

But when I analyzed query_log, I found that single users issued more than 1000 queries within one second, as shown below:

SELECT
    user,
    max(cnt)
FROM
(
    SELECT
        user,
        event_date,
        event_time AS query_second,
        count(*) AS cnt
    FROM system.query_log
    WHERE (event_date >= '2023-10-01') AND (event_date <= yesterday()) AND (type = 'QueryStart')
    GROUP BY
        user,
        event_date,
        event_time
)
GROUP BY user

Query id: bfc6a2ea-7208-4bab-8047-38ab56033f7a

┌─user─┬─max(cnt)─┐
│      │        2 │
└──────┴──────────┘
┌─user──┬─max(cnt)─┐
│ p_etl │        4 │
└───────┴──────────┘
┌─user──────────────┬─max(cnt)─┐
│ p_realtime_writer │     2378 │
└───────────────────┴──────────┘
┌─user─────────────┬─max(cnt)─┐
│ p_idea_reader_v2 │      180 │
└──────────────────┴──────────┘
┌─user────┬─max(cnt)─┐
│ default │        9 │
└─────────┴──────────┘
┌─user───────┬─max(cnt)─┐
│ ck_monitor │       23 │
└────────────┴──────────┘
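
Note that grouping by event_time only counts QueryStart events that fall in the same one-second bucket; it does not measure how many queries were running at the same instant. A sketch that estimates peak overlap from query_log (assuming the QueryFinish rows carry query_start_time_microseconds and query_duration_ms, as recent versions do) could look like:

SELECT max(running) AS peak_concurrency
FROM
(
    SELECT sum(delta) OVER (ORDER BY ts, delta) AS running
    FROM
    (
        -- +1 event at each query's start, -1 event at its end, in microseconds;
        -- ordering by (ts, delta) processes ends before starts at equal timestamps
        SELECT toUnixTimestamp64Micro(query_start_time_microseconds) AS ts, 1 AS delta
        FROM system.query_log
        WHERE type = 'QueryFinish' AND event_date = '2023-10-11'
        UNION ALL
        SELECT toUnixTimestamp64Micro(query_start_time_microseconds) + query_duration_ms * 1000 AS ts, -1 AS delta
        FROM system.query_log
        WHERE type = 'QueryFinish' AND event_date = '2023-10-11'
    )
)

If peak_concurrency stays at or below 1000 while per-second QueryStart counts exceed it, the setting is acting as a cap on simultaneous execution rather than on starts per second.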

Then I inspected those 2000+ queries:

select user,event_time,query,type from system.query_log where user='p_realtime_writer'  and event_time='2023-10-11 11:30:01' and type='QueryStart'

Output:

...
                                                                                                                                                            │ QueryStart │
│ p_realtime_writer │ 2023-10-11 11:30:01 │ insert into factor_instr_count (`data_date`,`factor_name`,`bar_time`,`count`) 
 FORMAT RowBinary
                                                                                                                                                            │ QueryStart │
│ p_realtime_writer │ 2023-10-11 11:30:01 │ insert into factor_instr_count (`data_date`,`factor_name`,`bar_time`,`count`) 
 FORMAT RowBinary
                                                                                                                                                            │ QueryStart │
│ p_realtime_writer │ 2023-10-11 11:30:01 │ insert into factor_instr_count (`data_date`,`factor_name`,`bar_time`,`count`) 
 FORMAT RowBinary
                                                                                                                                                            │ QueryStart │
│ p_realtime_writer │ 2023-10-11 11:30:01 │ insert into factor_instr_count (`data_date`,`factor_name`,`bar_time`,`count`) 
 FORMAT RowBinary
                                                                                                                                                            │ QueryStart │
│ p_realtime_writer │ 2023-10-11 11:30:01 │ insert into factor_instr_count (`data_date`,`factor_name`,`bar_time`,`count`) 
 FORMAT RowBinary
                                                                                                                                                            │ QueryStart │
│ p_realtime_writer │ 2023-10-11 11:30:01 │ insert into factor_instr_count_test (`data_date`,`factor_name`,`bar_time`,`count`) 
 FORMAT RowBinary
                                                                                                                                                       │ QueryStart │
│ p_realtime_writer │ 2023-10-11 11:30:01 │ insert into factor_instr_count_test (`data_date`,`factor_name`,`bar_time`,`count`) 
 FORMAT RowBinary
                                                                                                                                                       │ QueryStart │
│ p_realtime_writer │ 2023-10-11 11:30:01 │ insert into factor_instr_count (`data_date`,`factor_name`,`bar_time`,`count`) 
 FORMAT RowBinary
                                                                                                                                                            │ QueryStart │
│ p_realtime_writer │ 2023-10-11 11:30:01 │ insert into factor_instr_count_test (`data_date`,`factor_name`,`bar_time`,`count`) 
 FORMAT RowBinary
                                                                                                                                                       │ QueryStart │
└───────────────────┴─────────────────────┴────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴────────────┘

2378 rows in set. Elapsed: 0.007 sec. Processed 10.69 thousand rows, 1.34 MB (1.54 million rows/s., 192.94 MB/s.)

Similarly, there are over 2000 queries of type 'QueryFinish' in the same second, so it seems that max_concurrent_queries sometimes doesn't work.
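
One thing worth checking is how long these inserts actually run: if each finishes within a few milliseconds, more than 1000 of them can start and finish inside the same one-second event_time bucket without the number running at any instant ever exceeding the limit. Their durations can be inspected with something like:

SELECT
    count() AS queries,
    quantile(0.5)(query_duration_ms) AS median_ms,
    max(query_duration_ms) AS max_ms
FROM system.query_log
WHERE user = 'p_realtime_writer'
  AND event_time = '2023-10-11 11:30:01'
  AND type = 'QueryFinish'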

Or, what is the exact meaning of this parameter? The official documentation says: "Limit on total number of currently executed queries." So what exactly does "concurrently" mean? Does it refer to queries within the same second, or within the same millisecond?

Thank you very much if you could provide an answer.
