Processing large number of completions blocks the main ST process #6249
Comments
It's not ideal that the GUI is blocked while processing completions, as it results in a poor editing experience. Most of the delay is a result of Python's limited performance, so even without the GUI blocking it is to be expected that this many completions show up several seconds late.
Note that this part is not that relevant to the issue. It happens once on plugin load, and in an LSP scenario it would happen in a separate process.
This comparison was made to clarify that even creating those completions is a very expensive operation which takes too much time. In a real-world plugin such as LSP, those completions would be created at runtime and would contribute to the delay, even though this is not the part which blocks the GUI. The point is that Python is too slow to handle such massive amounts of completions, so there is little ST can do except avoid blocking the GUI while processing.
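To make "Python is too slow for this" concrete, here is a small standalone timing sketch (not part of this issue's test plugin; FakeCompletionItem is a made-up stand-in for sublime.CompletionItem) that builds roughly a million small Python objects:

```python
import time


class FakeCompletionItem:
    # Made-up stand-in for a completion entry; a real plugin would build
    # sublime.CompletionItem objects, which is at least as expensive.
    __slots__ = ("trigger", "annotation", "completion")

    def __init__(self, trigger, annotation, completion):
        self.trigger = trigger
        self.annotation = annotation
        self.completion = completion


start = time.perf_counter()
items = [FakeCompletionItem("item_%d" % i, "demo", "item_%d" % i)
         for i in range(1_000_000)]
print("created %d objects in %.2f s" % (len(items), time.perf_counter() - start))
```

Even this bare construction typically takes on the order of a second on current hardware, before ST does any decoding or popup work.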
Note that while we can and should make this faster (decoding completions and setting up the auto-complete is taking a substantial amount of time), a big part of why this is slow is unfortunately just Python. There are some things we can do to make things faster on the Python side, but fundamentally making 1M objects is just slow in Python. The only real ways to manage that are:
I've mentioned this before but will say it again: in the case of LSP, we are creating completions on the async thread, so while the completions can be slow to appear, at least that doesn't block the main thread, which is the main issue here. (Blocking the async thread for a long time is also not ideal, but let's leave that discussion out of here for now.)
So you are also mentioning calling sublime.set_timeout(lambda: clist.set_completions(completions, flags)), accompanied in sublime_plugin.py by the comment "# Resolve on the main thread to prevent any sort of data race for _set_target", which suggests that doing that is currently problematic.
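For context, here is a minimal sketch of the deferred-resolution pattern being discussed, using the public sublime.CompletionList API; the listener name and the 50,000-item loop are made up for illustration and this is not LSP's actual code:

```python
import sublime
import sublime_plugin


class AsyncCompletionsListener(sublime_plugin.EventListener):
    def on_query_completions(self, view, prefix, locations):
        # Return an unresolved CompletionList immediately so the query itself
        # does not block the main thread.
        clist = sublime.CompletionList()

        def build():
            # Expensive completion construction runs off the main thread.
            completions = [
                sublime.CompletionItem("example_%d" % i) for i in range(50_000)
            ]
            # Resolve on the main thread, as the quoted sublime_plugin.py
            # comment recommends, to avoid racing the internal state.
            sublime.set_timeout(lambda: clist.set_completions(completions))

        sublime.set_timeout_async(build)
        return clist
```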
Maybe plugins heavily using async functions should maintain their own background thread, instead of relying on ST's global one.
Not sure if that's feasible in all cases. If a plugin has to interact with
Preventing or blocking other plugins due to long-lasting expensive tasks is not ideal either. You are right: GitGutter is doing exactly that, using only synchronous APIs and pushing tasks to a permanently running background thread. If all the logic of a plugin (e.g. LSP) uses its own background thread, race conditions should not be an issue, and in the worst case a single very expensive task would only block the plugin's own concurrently running background tasks, without affecting others (which is only partly true due to the GIL, but well).
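A rough sketch of the plugin-owned worker thread pattern described above (hypothetical names, not GitGutter's actual code): tasks run on a single long-lived thread and results are pushed back to the main thread with sublime.set_timeout.

```python
import queue
import threading

import sublime


class PluginWorker:
    """Single long-lived worker thread owned by one plugin."""

    def __init__(self):
        self._tasks = queue.Queue()
        threading.Thread(target=self._loop, daemon=True).start()

    def submit(self, func, on_done):
        # func() runs on the worker thread; on_done(result) runs on ST's main thread.
        self._tasks.put((func, on_done))

    def _loop(self):
        while True:
            func, on_done = self._tasks.get()
            result = func()
            sublime.set_timeout(lambda r=result, cb=on_done: cb(r))
```

With this, an expensive task only delays that plugin's own queued work instead of the shared async thread (modulo the GIL, as noted above).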
ST also seems to have special handling for calling the ST API from the async thread: it blocks the async thread until the task on the main thread is done. A lot of code currently relies on that, and even if it were possible to replicate with a custom thread, it probably wouldn't be a simple change. In any case, I don't even know whether a background thread would provide any solution for this issue, because I don't quite recall what the issue with
After some optimizations I've gotten to the point where I can have 50k completions without being annoyed by the latency - previously this was 5k. That's all the low-hanging fruit I could find, so we'll see how this performs in the next build.
Comparing build v4168 and build v4170: while I can see 4170 being faster than 4168, the delay is still noticeable. I would not close this ticket until the delay is fixed. What is the difference if someone offers you either of these: is it useful to get that many completions? Is it useful to get no completions at all? The only difference is that getting no options at all is cheaper and will not block the UI as much :) In LSP, there are at least two servers which can generate a lot of completions: tailwindcss and typescript.
Build 4184 has additional performance improvements when handling many completions. Part of this is allowing
Description of the bug
When there is a very large number of completions (where the definition of "very large" varies per machine), ST blocks the main process while processing the completions, resulting in a choppy editing experience.
Steps to reproduce
Expected behavior
The main process should not block when typing "1", "2" and "3".
Or it should be optimized, if possible, so that the lag is not noticeable (but it will still be possible to trigger lag with edge cases).
Actual behavior
The process blocks on typing "123" and nothing can be done until all completions are processed and the completions popup is shown. This can take a different amount of time depending on the number of completions and the machine ST is running on, but on my Mac M2 Pro, with 1,000,000 completions, ST blocks for over a second on each completions query.
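The original test plugin is not included in this extraction; a hypothetical minimal reproduction along the lines described (names made up) would be a synchronous on_query_completions handler that hands ST on the order of a million completions:

```python
import sublime
import sublime_plugin

_COMPLETIONS = None


class HugeCompletionsListener(sublime_plugin.EventListener):
    def on_query_completions(self, view, prefix, locations):
        global _COMPLETIONS
        if _COMPLETIONS is None:
            # Built once and cached; with ~1,000,000 items even this step
            # takes seconds in pure Python.
            _COMPLETIONS = [
                sublime.CompletionItem("completion_%07d" % i)
                for i in range(1_000_000)
            ]
        # Returning the full list synchronously makes ST process it on the
        # main thread, which is where the blocking becomes visible.
        return _COMPLETIONS
```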
Sublime Text build number
4168
Operating system & version
macOS
(Linux) Desktop environment and/or window manager
No response
Additional information
The plugin logs the time between "on_modified" notifications to roughly show how long the process is blocked for each completions query, which might be useful for debugging.
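A sketch of that kind of measurement (not necessarily the author's exact plugin; the class name is made up): log the gap between consecutive on_modified callbacks, where an unusually large gap right after triggering completions roughly equals how long the main process was blocked.

```python
import time

import sublime_plugin


class ModifiedGapLogger(sublime_plugin.EventListener):
    _last = None

    def on_modified(self, view):
        now = time.perf_counter()
        if ModifiedGapLogger._last is not None:
            # Prints to the ST console; a multi-second gap points at blocking.
            print("on_modified gap: %.3f s" % (now - ModifiedGapLogger._last))
        ModifiedGapLogger._last = now
```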
OpenGL context information
No response