Performance problems with large wiki, with long history and large pages #1940
Comments
I managed to make the
Maybe I should add that the nlab-content repo has a weird structure from a wiki standpoint, since so far it is only used as a backup method (for this wiki: https://ncatlab.org/). In particular, it has a deep folder structure, the folder names are numbers, each page is called content.md, and the internal links do not work as they are.
Thanks for letting us know, and for providing a test repo. See gollum/gollum-lib#437 for an explanation of what's causing the poor performance. Note: the fix there only takes care of the page load times; the Overview logic still uses the slower complete-tree-map approach. We could ultimately combine the approach in gollum/gollum-lib#437 with caching, but I think caching is less important than improving our logic at this point!
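For anyone curious what "complete tree map" means in practice, here is a rough sketch of the difference (generic git plumbing driven from Python, not gollum-lib's actual code; the repo and page paths are placeholders): building a map of every file in the commit's tree scales with the size of the whole wiki, while resolving a single page path only touches the tree objects along that path.

```python
#!/usr/bin/env python3
"""Illustrative sketch only (not gollum-lib code): the cost of listing
the entire tree of a commit versus resolving one page path directly."""

import subprocess

REPO = "/path/to/nlab-content"   # placeholder: local clone of the test repo
PAGE = "some/dir/content.md"     # placeholder: one page path inside it


def run_git(*args: str) -> str:
    """Run a git command inside REPO and return its stdout as text."""
    return subprocess.run(
        ["git", "-C", REPO, *args],
        check=True, capture_output=True, text=True,
    ).stdout


def full_tree_map(ref: str = "HEAD") -> dict:
    """Slow path: enumerate every blob in the commit's tree.
    On a repository with tens of thousands of files this is the price
    the complete-tree-map approach pays on each page view."""
    tree = {}
    for line in run_git("ls-tree", "-r", ref).splitlines():
        meta, path = line.split("\t", 1)
        _mode, _type, sha = meta.split()
        tree[path] = sha
    return tree


def lookup_page(path: str, ref: str = "HEAD") -> str:
    """Fast path: ask git to resolve a single path to its blob and read it,
    which only walks the tree objects along that path."""
    return run_git("cat-file", "-p", f"{ref}:{path}")


if __name__ == "__main__":
    print(len(full_tree_map()), "files found the slow way")
    print(lookup_page(PAGE)[:200])
```

Timing the two functions against a large clone is a quick way to see why the per-page lookup matters far more than caching at this point.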
I should also say: any feedback and help are appreciated!
Many thanks for looking into this!
I tried to use gollum (latest master) with this repo:
https://github.com/ncatlab/nlab-content
but it is somewhat too slow to be usable:
I guess the second problem could be solved by caching (I don't mind if there is some waiting time after updating a page), and it essentially amounts to finishing up this:
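To make the caching idea concrete, here is a rough sketch of what I have in mind (illustrative Python, not gollum code; render_page and the cache location are made up): keep the rendered HTML keyed by the page's blob SHA, so that editing a page changes the key and the stale entry is simply never hit again.

```python
"""Illustrative sketch of caching rendered pages by blob SHA.
render_page and the cache layout are hypothetical, not gollum's API."""

import hashlib
from pathlib import Path

CACHE_DIR = Path("/tmp/wiki-render-cache")   # placeholder cache location


def render_page(markdown_text: str) -> str:
    """Stand-in for the expensive markdown -> HTML rendering step."""
    return "<p>" + markdown_text + "</p>"


def cached_render(blob_sha: str, markdown_text: str) -> str:
    """Return cached HTML for this blob SHA, rendering at most once."""
    CACHE_DIR.mkdir(parents=True, exist_ok=True)
    entry = CACHE_DIR / f"{blob_sha}.html"
    if entry.exists():
        return entry.read_text()
    html = render_page(markdown_text)
    entry.write_text(html)
    return html


if __name__ == "__main__":
    text = "Hello *wiki*"
    # In the real setting this would be the page's git blob SHA.
    sha = hashlib.sha1(text.encode()).hexdigest()
    print(cached_render(sha, text))
```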
For the first problem I only have the wild guess that it might come from the long history (>100k commits), since that also seemed to cause problems here:
Closing with some details of my setup, just in case that plays a role: