Running aggregation queries multiple times on the main branch eventually generates memory limit errors #3503
Labels: bug (Something isn't working)
PSeitz added a commit that referenced this issue on Jun 6, 2023.
PSeitz added a commit to quickwit-oss/tantivy that referenced this issue on Jun 6, 2023: "small docs improvement as follow up on bug quickwit-oss/quickwit#3503"
fulmicoton pushed a commit to quickwit-oss/tantivy that referenced this issue on Jun 12, 2023: "small docs improvement as follow up on bug quickwit-oss/quickwit#3503"
On the GitHub Archive dataset, if I run several aggregations, the first runs succeed, but after several attempts the following error is generated:
Exception: ('Error while querying', '{\n "message": "Internal error: `(Internal error: `Aborting aggregation because memory limit was exceeded. Limit: 1.00 GB, Current: 1.00 GB`., split_id: 01H26SCHJ2AR36YAXATXMSW4FW), (Internal error: `Aborting aggregation because memory limit was exceeded. Limit: 1.00 GB, Current: 1.00 GB`., split_id: 01H26SKYQVE5SQV08XSVDCK3XY), (Internal error: `Aborting aggregation because memory limit was exceeded. Limit: 1.00 GB, Current: 1.00 GB`., split_id: 01H26SV71BZGPT45364VPYEPY8), (Internal error: `Aborting aggregation because memory limit was exceeded. Limit: 1.00 GB, Current: 1.00 GB`., split_id: 01H26T2GZT54S7H4Q05XHM0NPN), (Internal error: `Aborting aggregation because memory limit was exceeded. Limit: 1.00 GB, Current: 1.00 GB`., split_id: 01H26T9Z9QNSQBQVNB066ZBT7Q), (Internal error: `Aborting aggregation because memory limit was exceeded. Limit: 1.00 GB, Current: 1.00 GB`., split_id: 01H26THAEMY8JS08A6ZNDZSJDG)`."\n}')
If I restart, the requests initially succeed and then fail again.
Here are the queries I run:
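For illustration, a minimal reproduction sketch of the pattern described above: repeatedly issuing an aggregation request against a local Quickwit search endpoint until the memory-limit error appears. The endpoint URL, index name (`gh-archive`), field name, and aggregation shape here are hypothetical assumptions for the sketch, not the reporter's actual queries (which are not shown in the issue).

```python
import json

# Hypothetical endpoint and index name -- adjust to your setup.
QUICKWIT_URL = "http://127.0.0.1:7280/api/v1/gh-archive/search"


def build_agg_request(field: str, size: int = 10) -> dict:
    """Build a terms-aggregation payload; the field name is an assumption."""
    return {
        "query": "*",
        "max_hits": 0,  # only the aggregation result is needed
        "aggs": {"top_values": {"terms": {"field": field, "size": size}}},
    }


payload = build_agg_request("type")
body = json.dumps(payload)

# In a real run, POST `body` to QUICKWIT_URL in a loop, e.g. with
# requests.post(QUICKWIT_URL, data=body,
#               headers={"Content-Type": "application/json"}),
# and watch for the "memory limit was exceeded" error after several
# iterations, as described in this report.
```

The network call is deliberately left as a comment so the sketch stays self-contained; only the request-building step is concrete.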