LSP enhancement #40026
@itome what version of the Dart/Flutter SDK are you using? The docs are expected to be added in the … (see sdk/pkg/analysis_server/test/lsp/completion_test.dart, lines 326 to 328, at dd5fa9c).
It's possible I've missed somewhere though. That change should've shipped in Dart v2.4.
It'll have less of an impact, but it should still be significant - there's a lot of unimported code that should still be excluded. I don't think there's much we can do there since they're all valid to use though. If you can confirm which version of the SDK you're using (and that you're not using a custom version/snapshot of the server from previous testing), I'll do some measuring here. When we made previous changes, it sounded like this was solved (emacs-lsp/lsp-mode#851 (comment)) - does it seem like it's gotten worse, or was the improvement just not enough? |
I'm using Dart 2.8.0.
When I checked the Dart LSP server after the previous change it seemed to work well, so I thought there was some regression. |
@DanTup I think we could split the completion response and mark it as … |
Thanks, I was able to reproduce this - I'm seeing documentation included in the response to textDocument/completion. I don't think … Hopefully we won't need it if we fix this; I'll figure out what's gone wrong. |
Oh, I see what's going on... The server only has a resolve-like API that we can use to get documentation for the completion items that come via Suggestion Sets. That means some items (which the server returns in-line) will still have documentation attached. I think this should usually only be a small portion though. Can you confirm what size JSON you're seeing in the project you're testing with? (If you can repro with a public project and attach the file, that would help too). In emacs-lsp/lsp-mode#851 (comment) it says emacs was taking around 200ms to parse 2MB JSON, but it sounds like you're seeing significantly slower than that - is the JSON significantly larger? |
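For context on the resolve-based approach mentioned above, here is a minimal sketch of the lazy-documentation idea: keep documentation out of the initial textDocument/completion payload and attach it only when a single item is resolved. The helper names (lookUpDocs, completionItem, resolveItem) are hypothetical, not the server's actual API.

```dart
import 'dart:convert';

/// Hypothetical documentation lookup, standing in for the server's
/// internal "resolve-like API" mentioned above.
String lookUpDocs(String label) => 'Documentation for $label';

/// Initial completion items are built without documentation so the
/// textDocument/completion payload stays small.
Map<String, Object?> completionItem(String label) => {'label': label};

/// Documentation is attached only when the client resolves a single item
/// (completionItem/resolve), rather than for every item up front.
Map<String, Object?> resolveItem(Map<String, Object?> item) => {
      ...item,
      'documentation': lookUpDocs(item['label'] as String),
    };

void main() {
  final initial = [completionItem('Widget'), completionItem('Container')];
  print(jsonEncode(initial)); // small list response, no docs
  print(jsonEncode(resolveItem(initial.first))); // docs added on demand
}
```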
Any news on that? |
@ericdallo nothing has changed here recently. It's still not clear how much JSON is being transferred when this is occurring. The comment linked above suggests it takes 200ms to parse 2MB of JSON, so it's not clear if it's much slower than that, or if there is considerably more JSON. Are you able to capture the JSON that's taking a long time to parse? |
Thanks @DanTup, no I didn't capture it, but I updated my Emacs from 26 to 28 and it's much faster. I think it's related to Emacs 27's native JSON parsing feature. |
@DanTup Sorry for leaving this issue aside. This file is the LSP log. Can you check why the response is so big? (~13MB response for just one textDocument/completion call) |
Looking at the log, I think it's big because:
I don't think there's much to do about 1, though 2 could be an easy win (I don't know if it'll change the deserialise time much though) and we may need to make changes that could affect 3 anyway based on discussions in Dart-Code/Dart-Code#2290. |
I think the log is pretty printed, so I don't know whether the whitespace and newlines are included in the response or not. And I noticed that after I send
|
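As an aside on the pretty-printing question above, here is a small, self-contained sketch (in Dart, the server's language) showing how much size the indentation alone adds to the same payload; the item shape is made up for illustration.

```dart
import 'dart:convert';

/// Compare compact vs pretty-printed sizes for the same payload, to show
/// why it matters whether the log's whitespace is also on the wire.
void main() {
  final items = [
    for (var i = 0; i < 1000; i++)
      {'label': 'item$i', 'kind': 7, 'detail': 'detail for item $i'}
  ];
  final compact = jsonEncode(items);
  final pretty = const JsonEncoder.withIndent('  ').convert(items);
  print('compact: ${compact.length} bytes, pretty: ${pretty.length} bytes');
}
```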
I believe that's expected - the full list is sent to the client so that it can be filtered client-side as the user is typing. There's a flag in LSP (isIncomplete=true) that tells the client the list is not complete, so it must go back to the server as the user is typing. That said, the LSP spec is not explicit about whether the client should call back to the server if isIncomplete=false and the user presses backspace. If the intention is that it should, then perhaps these could be filtered on the server. I've opened microsoft/language-server-protocol#954 seeking clarification in the spec. |
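For reference, a minimal sketch of the flag discussed above as it appears on the wire: a completion response is a CompletionList whose isIncomplete field tells the client whether it must re-query the server as the user keeps typing. The field names come from the LSP spec; the payload itself is illustrative.

```dart
import 'dart:convert';

/// Build an LSP CompletionList. When [isIncomplete] is true the client is
/// expected to call textDocument/completion again as the user types; when
/// false it may keep filtering the cached list client-side.
Map<String, Object?> completionList(List<String> labels,
        {required bool isIncomplete}) =>
    {
      'isIncomplete': isIncomplete,
      'items': [for (final l in labels) {'label': l}],
    };

void main() {
  // A truncated/filtered list: the client must come back for more.
  print(jsonEncode(
      completionList(['Container', 'Column'], isIncomplete: true)));
}
```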
@itome how is the performance recently with latest versions of everything? I filed another issue at microsoft/language-server-protocol#1008 with an idea that I think could help us reduce the payload by over 50% (related to point 3 from the comment above), plus I'm still waiting to hear back on microsoft/language-server-protocol#954 which might give some additional options. |
@DanTup |
I would like to register my interest in this item, especially the part about the reduced payload. Basically, yes please. A relevant discussion has occurred in the vim LSC project about the performance of the Dart language server; see discussion here. The issue is that when doing simple term completion, the Dart language server, in a Flutter project, appears to be sending upwards of 6,000 candidates which then need to be filtered by the language client. In the case of LSC that is done in VimScript, which is notoriously slow when processing large lists. In a JavaScript React project Microsoft's … Nevertheless, any optimisation done via this issue would be appreciated. Best regards. |
5,630 completion items were being sent when using Flutter 1.12. With Flutter 1.17, 26,789 items are being sent. This is bringing the language client to its knees even on a fast machine. Processing this 5 times larger list is too much for the plugin. Auto-completion is now unusable. Is this the expectation? The Dart language server appears to be sending an ever-expanding list of term completions, 5 times more between the above-listed Flutter releases, and the LSP client needs to quickly filter the big list? Given that some language clients are implemented in slow languages, that may not be possible. Or are there some flags that I need to set for the language server? I am starting the Dart Analysis Server just via: … Any information would be appreciated. Best regards. |
Flutter has organized their packages so that developers generally only need to import a single library (such as package:flutter/material.dart). But I don't know why we'd be getting 26,789 suggestions, and that's definitely not something I would expect. There aren't any other flags you need to be passing in. As far as I can see you're doing everything correctly. There is, however, a flag that might provide some useful debugging information. If you could start the server with the flag … |
Instrumented log files attached. They were not run on the same machine; I don't know how to switch between Flutter versions. Flutter 1.12: … Flutter 1.17: … |
@bwilkerson - Looking at just the …

Unnecessarily long completions (both logs)

Why are we sending items like …?

Repeated completions (way worse in 1.17)

This is an API that doesn't show up in the log from Flutter 1.12. It shows up twice in the same response in Flutter 1.17. I see at least 14,801 repeated labels in the 1.17 log, and only 127 in the 1.12 log (some repeated labels may be valid if they are imported from different places?). (I was surprised to find it's a real name in ….) Some APIs such as …

APIs that should be private to SDK (both logs)
As far as I'm aware this is something internal to the SDK and should not be importable by user code. |
27,000 - 15,000 possible duplicates still results in 12,000 completion items, which is still an awful lot. Ideally, for Nate's pure VimScript LSP client LSC, completions should be 1,500 or under, way under being better. Actually, I would want the term already entered by the user (3 characters in LSC's case before triggering auto-completion) to be sent to the Analysis Server for server-side filtering. I don't know the Language Server Protocol well enough to know if that is even possible (maybe it isn't). If it is, it would greatly ease the burden on this LSP client, which is implemented in a very slow language. |
Since there's been no response to LSP or VS Code issues about filtering by prefix, I'm planning to assume the behaviour of VS Code is the expected behaviour of all clients (and that if the spec is clarified, it would be to match that). This means we can filter by prefix, which in most cases (except where you invoke completion with no prefix) should remove a significant number of completions (discussion in #42152).
I think taking away enum values would hurt usability. If you have code like:
It feels natural to want to type the name of the enum value (e.g. …). If the number of these is huge, perhaps it could be optional, though I think with the change mentioned above it might be unnecessary.
I'll see if I can repro this - if they're exact dupes, that's definitely a bug. |
Ok, I think I tracked down these dupes:
There are actually multiple
This will also disappear with the fix above (it's a static member that shouldn't have been included), however the reason it appeared multiple times is different - the class that contains it is re-exported in three different libraries. If you haven't already imported any of them, then it's valid for it to show up 3 times as you're able to select which library you want to import it from (once you've already imported a library that has it, it will no longer show up from the others). I'm working on a change to filter by prefix server-side (https://dart-review.googlesource.com/c/sdk/+/150628), and in my testing it seems to make a very significant difference. I'll ping when it's finished/merged. |
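A rough sketch of the server-side prefix filtering idea described above (not the actual CL, which computes the prefix from the document contents and reuses the server's existing fuzzy matcher); the function and variable names here are made up for illustration.

```dart
/// Keep only suggestions that match the text the user has already typed.
/// Case-insensitive prefix matching is used here for simplicity; the real
/// change relies on the server's fuzzy matcher instead.
List<String> filterByPrefix(List<String> suggestions, String prefix) {
  final p = prefix.toLowerCase();
  return [
    for (final s in suggestions)
      if (s.toLowerCase().startsWith(p)) s
  ];
}

void main() {
  final all = ['Container', 'Column', 'CustomPaint', 'Text', 'TextField'];
  // With the prefix "co" only a fraction of the list needs serializing.
  print(filterByPrefix(all, 'co')); // [Container, Column]
}
```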
@bluz71 Thanks for sending the logs!

Unnecessarily long completions
The prevailing winds (including several teams outside the Dart team) seem to be to provide increasing numbers of multi-token completions under the assumption that it will improve the usability of the system. I think this would be a good area for some UX research (@InMatrix).
I agree that we should wait to see whether such an option is necessary. If it is, then I'd suggest that it be based on "single token" vs "multi token" completion styles, as that appears to me to be the real factor behind the option.

Repeated completions

@jwren and I are also working on the suggestion generation in the server and tackling some issues like this there.
There's an outstanding request for us to be able to intelligently select the right library from which a name should be imported, especially within the Flutter SDK. There's been some discussion, but no work started yet. That arc of work should help with this issue. But aside from cases where users might legitimately want to import an element from one of several possible libraries, there shouldn't be any duplicates.

APIs that should be private to SDK

As far as I'm aware, the analyzer still looks in …
My understanding is that internal libraries are supposed to be marked as |
I'm not sure - @vsmenon might know |
Sorry, not completely caught up on the context here, but I don't think |
With pure VimScript LSP clients such as vim-lsc and vim-lsp, the performance benefit will be tremendous. So I am very pleased that it is a work in progress. Thank you and the team. Just out of curiosity, what will be the exchange required between client and server for this to function correctly? Client sends prefixed request ---> what type of LSP event is expected and in what format? Server sends back filtered list ---> I assume no change from the existing format for completions? It seems to me that certain LSP clients may, likely will, need to be changed to support this, so clarification would be appreciated. Lastly, before I forget, can you list the characteristics of this prefix matching: strict prefix, substring, fuzzy, case-sensitive? For me, strict prefix and case-sensitive is ideal, but I am sure others will differ on this. Best regards. |
There should be no change required from any clients - the server knows the state of the current document (because of …).
I believe it's fuzzy and case-insensitive (the fuzzy matcher already existed, so I'm just reusing it for LSP). You can see some of the tests at sdk/pkg/analysis_server/test/src/services/completion/filtering/fuzzy_matcher_test.dart (line 277, at f7f0e2b).
My understanding is that clients can/will still filter client-side (as they do today - since they don't know if the server applied prefix filtering), so if your editor has additional settings for the client-side filtering, they should still work (for example, VS Code has some options to enable/disable allowing simple typos to match; if you turn that off, then it'll do exact filtering even if the server supplied options that matched based on typos). |
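To illustrate the kind of matching described above, here is a toy, case-insensitive subsequence matcher; the server's real FuzzyMatcher also scores and ranks matches, while this only answers yes/no, so treat it as a sketch rather than the actual algorithm.

```dart
/// A toy, case-insensitive subsequence matcher: every character of the
/// query must appear in the candidate, in order, but not contiguously.
bool fuzzyMatches(String query, String candidate) {
  final q = query.toLowerCase();
  final c = candidate.toLowerCase();
  var qi = 0;
  for (var ci = 0; ci < c.length && qi < q.length; ci++) {
    if (c[ci] == q[qi]) qi++;
  }
  return qi == q.length;
}

void main() {
  print(fuzzyMatches('tbx', 'TextBox')); // true: t..b..x appear in order
  print(fuzzyMatches('xbt', 'TextBox')); // false: characters out of order
}
```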
Great news, so no change needed client-side. The LSC plugin (Vim LSP client) already does client-side filtering; it bubbles up strict prefix, case-sensitive matches to the top of the completion menu. These upcoming Analysis Server enhancements and fixes will be significant. Looking forward to their availability. |
Bug: #42152. Bug: #40026. Change-Id: I4c921ddff3224d736c4236861d0ff156f58be595 Reviewed-on: https://dart-review.googlesource.com/c/sdk/+/150628 Commit-Queue: Danny Tuppeny <danny@tuppeny.com> Reviewed-by: Brian Wilkerson <brianwilkerson@google.com>
Some interesting results from LSC 303 comparing various language servers. Two tests were undertaken, counting the number of server-provided completion items when completing a language keyword, and also when completing a library type after typing a 3-character prefix.
Less is more 😉 |
@bluz71 a few of the improvements have made it through to the Flutter master channel and they seem to have improved things for VS Code Vim users - can you see if it's made much difference for you? Thanks! |
Thanks for the prompt. I typically use the Dart that ships with Flutter, but in this case I temporarily downloaded the latest Dart SDK dev release. Tested on two machines:
Raw match count, same test as my last post:
Night and day performance difference. The interactive performance via the Vim LSC plugin is now smooth as silk. No stall, no stutter, unlike the previous Flutter releases; and that even includes when run on the very slow MacBook. It feels as fast as Microsoft's TypeScript language server and the … No doubt about it, a tremendous result. I would say this is now a solved issue. Side note: all language servers should try to bound their match counts (say under 1,000); it makes a huge difference client-side in the editor. Many thanks to the Dart team, can't wait for it to roll into an upcoming Flutter release. |
I checked |
@bluz71 @itome thanks for testing - I'm glad to hear the changes made a reasonable difference! If you spot perf issues anywhere else, please do raise them here (and cc me) - with logs where possible. (@itome I don't have permissions here to close this, but if you're happy with the result I think you can do it. Thanks!) |
Underselling the benefit, these changes make a huge difference. You and the team should be proud, great work! |
Thank you very much @DanTup, I'll link this fix on some lsp-dart issues |
By the way - I added a note in the docs about passing … If you have plugins that spawn the server, it would be useful to pass these so there's some visibility of where the LSP server is being used. |
I'll change |
Added to |
@ericdallo thanks! |
@DanTup
Hello, I'm using the Dart LSP server from Emacs and writing Flutter code.
The Dart analysis server returns such a large JSON response for textDocument/completion that Emacs freezes for ~10 seconds while parsing the JSON. Do you have any ideas on how to reduce it?
I checked the JSON response and noticed that documentation is included in the response. Maybe it should be excluded from the response according to this issue -> #37060
Setting suggestFromUnimportedLibraries = false has no effect for Flutter users, because import 'package:flutter/material.dart' imports almost all of the Flutter SDK's source.
Irrelevant completion items also seem to be returned. When we request completion at the position below, all classes in the Flutter SDK are returned. I don't know whether this is due to the LSP server or the client.
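Since the issue mentions the suggestFromUnimportedLibraries option, here is a minimal sketch of how an LSP client might pass it via initializationOptions in the initialize request. The option name is taken from the issue text above; the surrounding request shape follows the LSP spec, and the payload is illustrative rather than copied from any particular client.

```dart
import 'dart:convert';

/// Build an illustrative LSP `initialize` request body that passes the
/// option discussed in this issue to the Dart analysis server.
void main() {
  final initializeParams = {
    'processId': null,
    'rootUri': 'file:///path/to/project', // hypothetical project path
    'capabilities': <String, Object?>{},
    'initializationOptions': {
      // As noted above, this helps little in Flutter projects because
      // package:flutter/material.dart already pulls in most of the SDK.
      'suggestFromUnimportedLibraries': false,
    },
  };
  print(jsonEncode(initializeParams));
}
```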