Results window is too large - Kaminari #577
@realmadrid2727 I was also getting the same error. I fixed it by setting …
Setting …

To make deep pagination work "properly", one can either: …
Can you please give us the status on this issue? Since increasing the …
Same problem.
Is there any recommendation here on how to handle large page numbers in large result sets? We're now hitting this on Stack Overflow as well.
Bump on this… I'd love to hear what others are doing… it seems like …
Argh, this went under my radar, sorry for the silence, everybody. The …

As a note, the Scroll API is not meant to be used for "regular searching", especially in high-concurrency scenarios (many people searching at the same time), since it keeps an "open window" into the index, and that can again break. The Scroll API is great for something like an "export this dataset" feature, which is run only now and then, either by people or by a cron job. Supporting pagination over a very large dataset probably needs a bit of re-designing and re-implementing, going in a different direction than the regular …
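The "different direction" usually means keyset (cursor) pagination: instead of skipping `from` records, each page request carries the sort values of the last document seen, so the cost of page 1,000 is the same as page 1. A minimal pure-Ruby sketch of the idea, using an in-memory array to stand in for the index (the `Order` struct and data here are illustrative, not from this thread):

```ruby
# Keyset ("cursor") pagination over an in-memory dataset.
# Each page starts strictly AFTER the sort key of the previous
# page's last record, instead of using a numeric offset.
Order = Struct.new(:id, :created_at)

# Fake dataset, pre-sorted the way the index would return it.
ORDERS = (1..100).map { |i| Order.new(i, Time.at(i * 60)) }

# Returns [page, cursor_for_next_page]; pass the cursor back in as
# `after:` to fetch the following page without any offset.
def next_page(dataset, after: nil, size: 10)
  scoped =
    if after
      # [created_at, id] is a deterministic compound sort key;
      # id acts as the tiebreaker for equal timestamps.
      dataset.select { |o| ([o.created_at, o.id] <=> after).positive? }
    else
      dataset
    end
  page = scoped.first(size)
  cursor = page.last && [page.last.created_at, page.last.id]
  [page, cursor]
end

page1, cursor = next_page(ORDERS)
page2, _ = next_page(ORDERS, after: cursor)
```

The trade-off is that you lose "jump to page N"; you can only step forward from a cursor, which is why it suits infinite scroll and exports better than numbered pagination.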
To follow up here: I provided the business end of things with the different options, and after a bit of discussion we ended up just upping the limit to an acceptable level (~60k) to cover most of the most-viewed content. At the end of the day, we're still using …
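For reference, raising the window as described above is done through the `index.max_result_window` index setting named in the error message. A sketch of the request body such an update would send; the 60,000 value mirrors this comment, while the index name in the commented-out client call is illustrative:

```ruby
# Request body for raising index.max_result_window, the limit named in
# the "Result window is too large" error. Raising it trades extra memory
# and CPU on deep pages for the ability to serve them at all.
NEW_WINDOW = 60_000 # the ~60k level mentioned in the comment above

settings_body = { index: { max_result_window: NEW_WINDOW } }

# With the elasticsearch-ruby client this would be applied with, e.g.:
#   client.indices.put_settings(index: 'users', body: settings_body)
```

Note this is per-index, so it has to be applied again if the index is recreated (reindexing, index templates, etc.).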
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions. |
This really isn't stale. |
Is it really possible that 5 years on there still isn't a good way to do front-end driven pagination of large result sets? We want to let users scroll through a month's worth of orders. Sometimes that is 100,000+ orders. What are we supposed to do? We use datatables on the front end. |
@philsmy Limit/offset pagination is not a scalable pattern for a distributed system like Elasticsearch. Use cursor-based pagination with …
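In Elasticsearch, cursor-based pagination is commonly implemented with the `search_after` parameter plus a deterministic sort that includes a unique tiebreaker field. A sketch of how the request body differs between the first and subsequent pages (the field names, page size, and sort values below are illustrative assumptions, not from this thread):

```ruby
# Build a search body that pages with `search_after` instead of `from`.
# `last_sort_values` is the `sort` array Elasticsearch returns with the
# final hit of the previous page; pass nil for the first page.
def search_body(size:, last_sort_values: nil)
  body = {
    size: size,
    # The sort must be deterministic, so include a unique tiebreaker.
    sort: [{ created_at: 'desc' }, { id: 'asc' }]
  }
  body[:search_after] = last_sort_values if last_sort_values
  body
end

first_page = search_body(size: 25)
next_page  = search_body(size: 25, last_sort_values: [1_672_531_200_000, 41])
```

Unlike `from`/`size`, this never asks any shard to materialize more than one page of hits, which is why it keeps working at arbitrary depth.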
When using this with Kaminari versus using from + size, the following error occurs when navigating to the last page in the paginated results:
```
ActionView::Template::Error ([500] {"error":{"root_cause":[{"type":"query_phase_execution_exception","reason":"Result window is too large, from + size must be less than or equal to: [10000] but was [11880]. See the scroll api for a more efficient way to request large data sets. This limit can be set by changing the [index.max_result_window] index level parameter."}],"type":"search_phase_execution_exception","reason":"all shards failed","phase":"query","grouped":true,"failed_shards":[{"shard":0,"index":"users","node":"bB32lITgax0OBbQtim3hFbI","reason":{"type":"query_phase_execution_exception","reason":"Result window is too large, from + size must be less than or equal to: [10000] but was [11880]. See the scroll api for a more efficient way to request large data sets. This limit can be set by changing the [index.max_result_window] index level parameter."}}]},"status":500}):
```
I'd limit my results to 10,000 if I knew how, but I can't seem to figure it out (similar to how in MySQL you could say `SELECT * FROM users LIMIT 10000`). Is this an issue with Kaminari?
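One way to get the `LIMIT`-style cap the reporter asks about is to clamp the requested page number before querying, so that `from + size` never exceeds the 10,000 default. This is plain arithmetic in the controller rather than Kaminari API; the helper name and values below are a hypothetical sketch:

```ruby
# Clamp a requested page so that from + size stays within the
# index.max_result_window default of 10,000. Elasticsearch computes
# from = (page - 1) * per_page, so the deepest reachable page is
# MAX_RESULT_WINDOW / per_page.
MAX_RESULT_WINDOW = 10_000

def clamped_page(requested_page, per_page)
  last_allowed_page = MAX_RESULT_WINDOW / per_page # integer division
  [[requested_page, 1].max, last_allowed_page].min
end

# With 25 results per page, only pages 1..400 are reachable:
# page 400 covers records 9,976..10,000, exactly at the limit.
```

Pairing this with a matching cap on the paginator's `total_count` keeps Kaminari from rendering links to pages the query would reject.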