Performance issue for pages that return a long list of tags #1630
Comments
Another option is to paginate the results, but then we might not be able to support search easily.
Related to #262. There's also the issue that really large pages (with several MB of HTML) may work in some browsers but not others. Mobile devices with less RAM in particular can struggle to display large pages efficiently (if at all).
For codes, I'm going to reduce the number of entries and keep only entries with at least 3 x: codes_tags: [ In terms of actual use (grouping products by country / brand), it's probably enough, and it will greatly reduce the number of unique entries.
@stephanegigandet Are you sure about that? When I analyzed the MongoDB log file, I saw "exceeded memory limit" errors for the aggregations on ingredients and codes. The solution would be to turn on the "allowDiskUse" option.
@syl10100: that's right, I turned on allowDiskUse, which allows the MongoDB query to return relatively quickly (in under 30 seconds), but then the Perl processing takes much longer.
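To illustrate the allowDiskUse option discussed above, here is a minimal sketch of a tag-counting aggregation. The collection and field names (`products`, `ingredients_tags`) and the helper are illustrative assumptions, not the actual Product Opener code; the pipeline shape (unwind, group, sort) is the kind of query that can exceed MongoDB's in-memory limit without allowDiskUse.

```python
def build_tags_count_pipeline(field):
    """Build an aggregation pipeline counting products per tag value
    in an array field (e.g. a hypothetical "ingredients_tags")."""
    return [
        {"$unwind": "$" + field},                              # one document per tag
        {"$group": {"_id": "$" + field, "count": {"$sum": 1}}},  # count per tag
        {"$sort": {"count": -1}},                              # most frequent first
    ]

# With a MongoDB driver such as pymongo, the call would look like
# (not executed here; requires a running server):
# db.products.aggregate(build_tags_count_pipeline("ingredients_tags"),
#                       allowDiskUse=True)
```

Setting allowDiskUse=True lets the $group and $sort stages spill to temporary files on disk instead of aborting with "exceeded memory limit".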
I created a new function get_taxonomy_tag_and_link_for_lang to replace multiple calls to display_taxonomy_tag / canonicalize_taxonomy_tag_link. It yields a 10X improvement in the speed of generating the HTML for the list of tags.
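The speedup above comes from doing the per-tag display/link work once instead of through several separate calls. A minimal sketch of that caching idea, assuming hypothetical names and a toy display rule (this is not the actual Perl API of display_taxonomy_tag / canonicalize_taxonomy_tag_link):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def tag_and_link_for_lang(tagtype, tagid, lang):
    """Compute both the display name and the link for a tag in one call,
    memoized per (tagtype, tagid, lang) so repeated tags are free."""
    # Stand-in for the expensive taxonomy lookups done per tag:
    display = tagid.split(":", 1)[-1].replace("-", " ").capitalize()
    link = f"/{tagtype}/{tagid}"
    return display, link
```

Combining the two lookups into one memoized function avoids recomputing taxonomy data for each of the (potentially hundreds of thousands of) tags on pages like /ingredients or /codes.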
…isplay_taxonomy_tag_link, display_taxonomy_tag and other related functions. Bug #1630
@stephanegigandet Good news! Did you turn on allowDiskUse as well?
Pages that return a long list of tags (e.g. /ingredients or /codes) fail to load because they take too long (> 1 minute) to generate; nginx typically times out before that.
The issue is not with MongoDB (especially as it has been optimised by bug #1612) but with the Perl code that processes each of the tags (in sub display_list_of_tags($$)).