
SQL: Improve CircuitBreaker logic for SqlParser #35300

Merged
matriv merged 4 commits into elastic:master from mt/fix-35299 on Nov 7, 2018

Conversation

@matriv matriv (Contributor) commented Nov 6, 2018

Grammar identifier rules can be skipped entirely when counting depths, as they just add another level to the tree and are always children of some other expression that is already counted.

Increased the maximum depth from 100 to 200. Testing on a production configuration with -Xss1m shows that depths of at least 250 can be handled, so the limit is set conservatively lower.

Fixes: #35299
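
For illustration only, here is a minimal sketch of the depth-counting circuit breaker described above; it uses a plain HashMap instead of the primitive-collection map in the actual listener, and the class and method names are hypothetical:

    import java.util.HashMap;
    import java.util.Map;

    // Simplified stand-in for the parser's circuit-breaker listener.
    class DepthCircuitBreaker {
        private static final int MAX_RULE_DEPTH = 200; // raised from 100 in this change

        // depth reached so far, per grammar rule (keyed by rule class name)
        private final Map<String, Integer> depthCounts = new HashMap<>();

        // Called when the parser enters a rule. Identifier rules are skipped:
        // they only add one level and always hang off an expression that is counted.
        void enterRule(String ruleName, boolean isIdentifierRule) {
            if (isIdentifierRule) {
                return;
            }
            int currentDepth = depthCounts.merge(ruleName, 1, Integer::sum);
            if (currentDepth > MAX_RULE_DEPTH) {
                throw new IllegalStateException(
                    "expression is too large to parse (tree's depth exceeds " + MAX_RULE_DEPTH + ")");
            }
        }

        // Called when the parser exits a rule; undoes the corresponding increment.
        void exitRule(String ruleName, boolean isIdentifierRule) {
            if (!isIdentifierRule) {
                depthCounts.merge(ruleName, -1, Integer::sum);
            }
        }
    }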

@elasticmachine (Collaborator) commented

Pinging @elastic/es-search-aggs

@matriv matriv (Contributor, Author) commented Nov 6, 2018

retest this please

@costin costin (Member) left a comment


Left a comment regarding the exception message.
LGTM!

            ctx.getClass() != SqlBaseParser.BackQuotedIdentifierContext.class) {
            int currentDepth = depthCounts.putOrAdd(ctx.getClass().getSimpleName(), (short) 1, (short) 1);
            if (currentDepth > MAX_RULE_DEPTH) {
                throw new ParsingException("expression is too large to parse, (tree's depth exceeds {})", MAX_RULE_DEPTH);

@costin (Member) commented:

Let's improve the exception a bit:

  1. add the location (so the user can tell where parsing stopped):
     new ParsingException(source(ctx), "")
  2. rephrase the message to better convey what happened:
     "SQL statement too large; halt parsing to prevent memory errors (stopped at depth {})" or something along those lines.

@matriv matriv merged commit 42dcdd0 into elastic:master Nov 7, 2018
@matriv matriv deleted the mt/fix-35299 branch November 7, 2018 11:08
matriv pushed a commit that referenced this pull request Nov 7, 2018

@matriv matriv (Contributor, Author) commented Nov 7, 2018

Backported to 6.x with 566073e

matriv pushed a commit that referenced this pull request Nov 7, 2018

@matriv matriv (Contributor, Author) commented Nov 7, 2018

Backported to 6.5 with edd8277

@colings86 colings86 added v6.5.0 and removed v6.5.1 labels Nov 9, 2018
pgomulka pushed a commit to pgomulka/elasticsearch that referenced this pull request Nov 13, 2018
@jimczi jimczi added v7.0.0-beta1 and removed v7.0.0 labels Feb 7, 2019