\OC\Files\Filesystem::normalizePath is very slow #13221

LukasReschke opened this Issue Jan 9, 2015 · 0 comments



LukasReschke commented Jan 9, 2015

\OC\Files\Filesystem::normalizePath is very slow and takes about 1.5 seconds for 40k entries. That makes any action that does a lot of path normalization, such as searching, a really painful experience.

@LukasReschke LukasReschke self-assigned this Jan 9, 2015

LukasReschke added a commit that referenced this issue Jan 9, 2015

Simplify isValidPath and add unit tests
The check for invalid paths is actually over-complicated and performed twice, resulting in a performance penalty. Additionally, I have added unit tests for that function.

Part of #13221
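The kind of simplification this commit describes - a single pass over the path instead of duplicated checks - can be sketched as follows. This is a hypothetical Python analogue; the actual ownCloud implementation is in PHP, and the specific patterns it rejects (shown here as NUL bytes and `..` traversal segments) are assumptions for illustration.

```python
def is_valid_path(path: str) -> bool:
    """Single-pass sanity check for a file path (illustrative sketch).

    Rejects NUL bytes and parent-directory traversal in one walk
    over the string, instead of validating the same input twice.
    """
    if "\0" in path:  # NUL bytes are never valid in a path
        return False
    for segment in path.split("/"):
        if segment == "..":  # reject parent-directory traversal
            return False
    return True
```

Because the whole check is one split and one scan, it costs O(n) in the path length with no redundant second validation pass.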

LukasReschke added a commit that referenced this issue Jan 10, 2015

Cache results of `normalizePath`
`normalizePath` is a rather expensive operation and is called multiple times for a single path in every file-related operation.

In my development installation with about 9 GB of data and 60k files this leads to a performance boost of 24% for simple searches - in absolute terms, 1.86 s (!). With more files the impact will be even more noticeable. Obviously this affects every operation that uses OC\Files\Filesystem in any way.

Part of #13221
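Caching the result of an expensive, repeatedly-called normalization function can be sketched like this. This is a minimal Python analogue, not the ownCloud PHP code: the exact normalization steps (NFC normalization, collapsing duplicate slashes, dropping `.` segments) and the cache size are assumptions for illustration.

```python
import unicodedata
from functools import lru_cache


@lru_cache(maxsize=2048)
def normalize_path(path: str) -> str:
    """Memoized path normalization (illustrative sketch).

    Applies Unicode NFC normalization and collapses empty and "."
    segments; repeated calls with the same path return the cached
    result instead of redoing the work.
    """
    path = unicodedata.normalize("NFC", path)
    parts = [p for p in path.split("/") if p not in ("", ".")]
    return "/" + "/".join(parts)
```

Since file operations tend to normalize the same handful of paths over and over, even a small cache eliminates most of the repeated cost.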

LukasReschke added a commit that referenced this issue Jan 10, 2015

Verify whether value is already normalized
Apparently `normalizer_normalize` does not check whether the string actually needs to be converted, or at least does not do so very efficiently.

This simple change leads to a 4% performance gain in the processing of normalizeUnicode. Since this method is called quite often (i.e. for every file path), it has a measurable impact: searches, for example, are now 200 ms faster on my machine. Still not perfect, but a step in the right direction.

Part of #13221
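The "check before converting" fast path this commit adds can be sketched as follows. The patched function is PHP's `normalizer_normalize`; this Python analogue uses `unicodedata.is_normalized` (available since Python 3.8) to express the same idea, and the choice of NFC form is an assumption for illustration.

```python
import unicodedata


def normalize_unicode(value: str) -> str:
    """Normalize to NFC, skipping work when already normalized.

    The quick-check on an already-normalized string is much cheaper
    than running the full normalization, so the common case (most
    file names are already in NFC) takes the fast path.
    """
    if unicodedata.is_normalized("NFC", value):
        return value  # fast path: nothing to convert
    return unicodedata.normalize("NFC", value)
```

Because almost every path passed in is already normalized, the expensive conversion only runs on the rare decomposed inputs.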