
[Dart] Consistent and progressive degradation of sync during any session #3509

Closed
lukepighetti opened this issue May 9, 2020 · 41 comments


@lukepighetti

lukepighetti commented May 9, 2020

Product and Version [VS/VSCode]:
Latest
OS Version [macOS/Windows/Linux]:
Any
Live Share Extension Version:
Latest
Target Platform or Language [e.g. Node.js]:
Dart

What were you doing when you encountered the desync?
Normal use

How often have you encountered desyncs?
Every time we live share

I'm hoping I can get one of the core contributors here to try VSCode LiveShare with a Dart (Flutter) project. It only takes about five minutes for it to start falling apart, and it does so consistently. The VSCode extension in use is https://github.com/Dart-Code/Dart-Code/

I was going to create multiple issues, but I'm now thinking this is specific to Dart, since I had an issue open for a year without anyone else chiming in.

These issues all appear for guests to a live share session.

  • Type annotation stops working
  • Unable to import files via the lightbulb menu
  • Options go missing from code completion popup/light bulb menu
  • Mouse-over info from the language server stops working
  • command+. code actions stop working

These things can all be temporarily resolved by having a guest leave and rejoin. This is particularly annoying because the experience is amazing for the first minute but it degrades quickly and reliably. Our use case is video livestream pair programming, not local pair programming.

Ping @DanTup

@daytonellwanger
Collaborator

Could you link to the repo for a Dart project that I can try to repro with? It doesn't sound like the issue is repo-specific, but just in case.

@DanTup

DanTup commented May 11, 2020

@daytonellwanger is there a way to log traffic between the client/host?

@lukepighetti some things to check:

  • Does it happen if you disable autoImportCompletions? With this enabled, the completion results can be really large - presumably here, Live Share would be transmitting them over the web
  • Are there any errors in the dev console (Help -> Toggle Developer Tools in VS Code) on either the client or host?

@daytonellwanger
Collaborator

@DanTup, a lot of what's sent over the wire exists in our logs, although it's obfuscated (you can access the logs via the command "Live Share: Export logs", available through the command palette).

But yes, for completion results, those are all sent over the wire.

@lukepighetti
Author

I'll try to get together with @GroovinChip to answer these questions in the next few days.

@lukepighetti
Author

Quick question before we provide any more info: is there an easy way to handle all the IDE features locally but have live share only handle the files/typing indicators?

The reason I ask is that, in theory, if the remote project were transferred to my computer I could use it perfectly, and all we'd need at that point is to be able to follow people, see what they are typing, and keep the files synced.

@daytonellwanger
Collaborator

No. Architecturally, all guests in a Live Share session are simply terminals to the host's machine. They have very little context relative to the full project. E.g. we download files to the guest on demand when they're opened.

Handling things locally would require that your local setup match the host's almost exactly. E.g. if you're on Node 12.x and the host is on Node 9.x, the project may work for them but not for you. Maybe the project depends on some environment variables being set. Maybe it's a web app that connects to a local instance of a DB. Etc.

@lukepighetti
Author

lukepighetti commented May 15, 2020

I understand and support the decision to do terminal architecture. It makes perfect sense. Is it a quantum leap to allow synced-file architecture for projects where it makes sense? That should probably be a new issue...

Back on topic: We disabled autoImportCompletions and did a quick test, and it seemed to improve the experience, although we were still seeing 5-6 seconds before autocomplete suggestions appeared, which feels extremely slow when you're used to it taking about 300ms. We were both on 100/10 bandwidth internet in the same geographical region (US Northeast). We have a live stream coming up on Saturday and I will report back with more details after then.

@daytonellwanger
Collaborator

daytonellwanger commented May 15, 2020

Is it a quantum leap to allow synced-file architecture for projects where it makes sense?

Basically 😔. BUT this is a request we hear often, and it's not unreasonable. It's just not how we've been thinking about the product so far. It'd be great if you could create a new issue to track it. It'll help us to gauge how many other folks are interested in it and we could start fleshing out what it might look like.

Are other operations snappy for you? E.g. coediting?

Do you have any idea what the average size of a completion response is from the Dart language server (with autoImportCompletions disabled)?

@daytonellwanger
Collaborator

Are other language services requests quick? E.g. go to definition?

@lukepighetti
Author

lukepighetti commented May 15, 2020

Coediting is perfect, click to definition and hover type information are very fast.

It's only autocomplete that appears to be about 20x slower than what would be considered seamless. I'm going to have to defer to @DanTup on payload sizes, hope he can chime in when he has a moment. I've seen 500+ members come in before.

Maybe there's some way to reduce the autocomplete response size? Not sure if that should be a Live Share setting or a Dart-Code setting. Maybe Live Share needs a max length & throttle? Just throwing out ideas; I'm totally ignorant of how you guys have it set up.

@daytonellwanger
Collaborator

If the other requests are quick, that suggests to me that it's probably related to the payload size, as the handling of all language services requests is quite similar.

@daytonellwanger
Collaborator

daytonellwanger commented May 15, 2020

The language server protocol (actually I'm not sure if Dart support is via an LSP server or VS Code APIs) has recently added a lot of support for partial results and streaming results via progress notifications. Have you all investigated that at all? It would also speed up your language services in the local scenario.

As you suggested, maybe it would make sense for Live Share to automatically do this on a language's behalf. Let me think about that proposal some more. It wouldn't be easy to implement, and we haven't heard about this issue for other languages, so it might not get high priority.

@DanTup

DanTup commented May 15, 2020

Payloads with autoImportCompletions turned off shouldn't be massive - but it will depend a lot on how many libraries you've imported and how many symbols are in scope. They will typically be much larger than other types of requests (hovers, etc.) though, so it's no huge surprise they're slower. I'm not sure how we can really reduce the size without losing functionality though.

The language server protocol (actually I'm not sure if Dart support is via an LSP server or VS Code APIs) has recently added a lot of support for partial results and streaming results via progress notifications.

We're in the process of moving to LSP, but I'm not aware of any way to stream completion results (LSP or VS Code APIs). We do delay some stuff to the resolve call, but I believe we need to provide the full list of completions up-front both for LSP and not (except in the case of isIncomplete=true, but that comes at the expense of added latency and transfer while typing instead of letting the client filter it all).

How does Live Share work here? Does it just inject a completion provider on the client, forward the request to the host, and then have the host invoke the completion provider on the host using executeCompletionItemProvider? If so, do you use itemResolveCount?

It might be useful if Live Share collected metrics on some of these timings (even if not reported to you, at least accessible to users to help debug perf issues).

@lukepighetti
Author

We completed our livestream, and the quality of service with autoImportCompletions disabled was very good. I can't claim definitively that it was the culprit, but it seems to have contributed. We still have a concern about the time to retrieve auto-complete options. And obviously, having access to auto import completions would be extremely nice. Or, at the very least, having settings that are specific to when you're in a live share session.

@DanTup

DanTup commented May 19, 2020

I tried exporting the Live Share logs, but it didn't seem to include the payloads going over the wire. Adding logging to the completion item provider shows that there are around 11k items in a Flutter project.

Returning 10946 completion items, with JSON size of 17593943

If I JSON.stringify() the results, it comes to 17MB - however, a lot of this looks like our internal data and not properties that VS Code wants. I would be interested to know what data Live Share is serialising. My understanding was that VS Code would only transfer the properties it wanted when sending from the extension host to the editor, but I don't know where Live Share is getting the data from (my understanding was that it would get the VS Code results, sent in a round trip from extension host to UI and back to extension host, so it should not include these).
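A rough way to reproduce this kind of measurement outside the editor is to serialise a mock completion list and count characters. This is a sketch under the assumption that the wire format is plain, unindented JSON; the item shape and symbol names below are illustrative, not the VS Code `CompletionItem` type or real Dart output:

```typescript
// Stand-in for the shape that might go over the wire (illustrative only).
interface MockCompletionItem {
  label: string;
  kind: number;
  detail?: string;
  sortText?: string;
}

// JSON.stringify with no indentation approximates the serialised size.
function measurePayload(items: MockCompletionItem[]): { count: number; bytes: number } {
  const json = JSON.stringify(items);
  return { count: items.length, bytes: json.length };
}

// Example: 10,000 mock items in the same ballpark as the numbers above.
const mockItems: MockCompletionItem[] = Array.from({ length: 10_000 }, (_, i) => ({
  label: `symbol${i}`,
  kind: 6,
  detail: "SomeLibraryClass from package:some_lib/some_lib.dart",
  sortText: String(i).padStart(6, "0"),
}));

const { count, bytes } = measurePayload(mockItems);
console.log(`${count} items, ${bytes} serialised characters`);
```

With ~115 serialised characters per mock item this lands around 1.1MB for 10k items, which makes the 2.6MB figure for real items (carrying longer details and doc text) plausible.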

(Screenshot: serialised completion results showing internal data.)

@daytonellwanger do you know if these would be included in your payload? Is there a way I can get a log of it for testing?

@daytonellwanger
Collaborator

Yeah, unfortunately since we try to keep the logs clean of any sensitive information, we don't include that sort of thing in them.

The Live Share payload is whatever is returned by vscode.commands.executeCommand('vscode.executeCompletionItemProvider'), so you could try running that.

@DanTup

DanTup commented May 23, 2020

@daytonellwanger so do you call that method, then just JSON.stringify the results and send them to the client? If so, I'll test that to see whether the internal data is showing up.

How are you handling resolve? As far as I can tell, there's no command for you to resolve items on the host?

@DanTup

DanTup commented May 26, 2020

I called vscode.commands.executeCommand('vscode.executeCompletionItemProvider') and serialised the results, and it did not include all of the internal data mentioned above (which is what I expected/hoped).

This means the actual payload is "only" 2.6MB and not 17MB. Here's an example of roughly what each entry looks like (I've formatted it here, but presumably you're not formatting the JSON over the wire so the whitespace etc. will be collapsed):

(Screenshot: example serialised completion item JSON.)

So on average, the payload is 2600000/10946 ≈ 240 characters per completion item (this seems about right for the items in the screenshot).

I'm not sure this can easily be solved in the language extension. I wonder if Live Share would be better off doing its own filtering on the host and just supplying some subset (eg. the first 100 matches), setting isIncomplete=true to allow it to round-trip for more as the client types.
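The host-side truncation floated here could look something like the sketch below. It assumes a generic item shape with LSP-style `sortText` and `isIncomplete` fields; the names are illustrative, not Live Share's actual code:

```typescript
interface CompletionList<T> {
  items: T[];
  // When true, the client re-queries as the user types instead of
  // filtering the full list locally (mirrors LSP's isIncomplete flag).
  isIncomplete: boolean;
}

// Hypothetical host-side truncation: keep the first `limit` items by
// sortText (falling back to label) and mark the list incomplete so the
// guest round-trips for more as they type.
function truncateForGuest<T extends { label: string; sortText?: string }>(
  items: T[],
  limit: number
): CompletionList<T> {
  const sorted = [...items].sort((a, b) =>
    (a.sortText ?? a.label).localeCompare(b.sortText ?? b.label)
  );
  return {
    items: sorted.slice(0, limit),
    isIncomplete: items.length > limit,
  };
}

// Example: 250 candidates truncated to the first 100.
const list = truncateForGuest(
  Array.from({ length: 250 }, (_, i) => ({
    label: `item${i}`,
    sortText: String(i).padStart(3, "0"),
  })),
  100
);
// list.items.length === 100, list.isIncomplete === true
```

The trade-off, discussed further below, is that the guest can never scroll past the cut-off without another round trip.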

@lukepighetti
Author

Even if it sent the first 100 and then the rest after that, it seems like it would help considerably.

@daytonellwanger
Collaborator

For resolving, the executeCompletionItemProvider command takes an optional argument for specifying the number of completion items to resolve. We don't resolve any completion items today (although we should! Adding an item for that... That'll just make the payload even larger though).

Yeah, I agree that it probably doesn't make much sense for individual language extensions to have to know that they might be remoted, and hence need to be much more sensitive to payload size and support progress/streaming.

Since Live Share is already acting as a middle-man between language extensions and VS Code, we could probably implement progress/streaming for them...

@DanTup

DanTup commented May 26, 2020

the executeCompletionItemProvider command takes an optional argument for specifying the number of completion items to resolve

Yeah, I use this in tests and have issues with its design (you can't resolve individual items so end up having to resolve loads). I was wondering if you were going to tell me about some API I didn't know about. Maybe there will be more demand than just me for better APIs there now :-)

Since Live Share is already acting as a middle-man between language extensions and VS Code, we could probably implement progress/streaming for them...

FWIW, although it's much less of an issue locally, large completion payloads have been coming up in various issues when running locally too...

In Dart we pre-load a lot of completion info into the client to reduce the over-the-wire (from the language server) payload when using autoImportCompletions (of course LiveShare isn't getting the benefit of this, as it gets the full merged output). There's some info in Dart-Code/Dart-Code#2290 and discussions for TS considering something similar at microsoft/TypeScript#36265 (comment).

There's also an ongoing issue with extensions blocking the extension host when compiling huge completion lists (see microsoft/vscode#75627) that can make typing sluggish when using extensions that handle keypresses in the extension host.

I mention that, because it seems like some better handling of large completion lists could be beneficial elsewhere too - maybe there's some solution that would work well for languages, VS Code, LSP and LiveShare rather than each spending resources on their own versions (eg. if LSP supported streaming completions at an API level, maybe LiveShare would benefit directly from that).

@daytonellwanger
Collaborator

Doesn't LSP support streaming of completion items as of 3.15? See here. Not sure if VS Code yet supports it, but I would assume it does.

@DanTup

DanTup commented May 27, 2020

Doesn't LSP support streaming of completion items as of 3.15? See here. Not sure if VS Code yet supports it, but I would assume it does.

My understanding is that this is just progress notifications - eg. you can tell the client that you're doing some long-running work and when it completes. This doesn't change that you need to supply all code completions up-front in a single response.

Right now, completion can work one of two ways:

  • isIncomplete=false - all completions must be supplied up-front and the client filters them as the user types
  • isIncomplete=true - each time the user types a character, the client will go back to the server and ask for a new completion list

The first one means the entire list needs to be provided in one go. The second one allows you to provide a shorter list (eg. the 100 most likely) and then provide more filtered results as the user types; however, it means the user can't scroll through the full list (they won't ever have more than where you cut it off) and it may also transfer a lot of duplicated data due to repeated round trips.

These both have drawbacks - which is why there are some discussions going on linked above. If LSP got protocol-level support for pre-loading completions into the client, and then providing a shorter response to textDocument/completion that just references them, then LiveShare might be able to piggy-back off that. I don't know if that's a plan, but my point was that there might be a solution that works reasonably well for everyone without all needing their own solutions.
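In code, the second mode amounts to the server filtering by the typed prefix on every keystroke and returning only the head of the match list. This is a hedged sketch with illustrative names and symbols, not Dart's actual implementation:

```typescript
// Sketch of the isIncomplete=true flow: each keystroke sends the current
// prefix back to the server, which filters, sorts, and returns only the
// closest matches, flagging the list incomplete so the client re-queries.
function completionsForPrefix(
  allLabels: string[],
  prefix: string,
  limit: number
): { items: string[]; isIncomplete: boolean } {
  const matches = allLabels
    .filter((label) => label.toLowerCase().startsWith(prefix.toLowerCase()))
    .sort();
  return {
    items: matches.slice(0, limit),
    isIncomplete: matches.length > limit, // more matches exist than we sent
  };
}

// Example with a few Flutter-flavoured symbols:
const result = completionsForPrefix(
  ["StatelessWidget", "StatefulWidget", "State", "Text", "TextField"],
  "Sta",
  2
);
// result.items === ["State", "StatefulWidget"], result.isIncomplete === true
```

The drawback described above falls out directly: the client only ever holds `limit` items, so scrolling past them requires another round trip, and each keystroke re-transfers overlapping results.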

@daytonellwanger
Collaborator

daytonellwanger commented May 27, 2020

My understanding is that this is just progress notifications - eg. you can tell the client that you're doing some long-running work and when it completes. This doesn't change that you need to supply all code completions up-front in a single response.

I think you can stream results using Partial Results. Again, not sure if VS Code supports this for the completion request yet. But does that sound like it would meet your needs?

I guess this mechanism works better for requests like find all references, where you have some results displayed quickly and then others fill in later. With completion, it might not make as much sense... Perhaps we could give something like the IntelliCode suggested completions in the first partial result, as it's a small payload and contains the desired completion item like ~90% of the time.

@DanTup

DanTup commented Jun 1, 2020

@daytonellwanger ah thanks, I was unaware of that. Unfortunately you're right, and VS Code doesn't support this (microsoft/vscode-languageserver-node#528).

With completion, it might not make as much sense...

I'm not so sure - 90% of the payload is for things that aren't in-scope but are being supplied to enable auto-importing. I think supplying the 10% that is the in-scope variables (which are fast to compute and much smaller) and then later adding all the auto-import options might be a reasonable experience.

Perhaps we could give something like the IntelliCode suggested completions in the first partial result, as it's a small payload and contains the desired completion item like ~90% of the time.

I think this would still require VS Code support? In which case, if the language server also supports it, you could just proxy the requests (so you'd get the same partial results support), and if the language server didn't support partial requests, you could potentially pick the "first" (sorted by sortText) x and send in your own partial payload, with everything in another?

Something else that might work (I haven't tested) is showing a small list, then triggering the triggerSuggest command after a delay to force a refresh. I don't know if it'll work (it might just exit early if the suggest widget is already visible) and it's a total hack, but if VS Code won't get partial support anytime soon 😄
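The ordering suggested above (in-scope symbols first, since they're fast to compute and much smaller, then the bulk of auto-import options in a later partial payload) reduces to a simple partition. `isAutoImport` below is a hypothetical marker property for illustration, not a real field on any completion item type:

```typescript
// Hypothetical partition for partial results: send in-scope symbols first,
// then the (much larger) set of auto-import suggestions in a later payload.
interface MarkedItem {
  label: string;
  isAutoImport: boolean; // illustrative marker, not a real API property
}

function partitionForPartialResults(
  items: MarkedItem[]
): [MarkedItem[], MarkedItem[]] {
  const inScope = items.filter((i) => !i.isAutoImport);
  const autoImports = items.filter((i) => i.isAutoImport);
  return [inScope, autoImports];
}

const [firstPayload, laterPayload] = partitionForPartialResults([
  { label: "myLocalVar", isAutoImport: false },
  { label: "SomeLibClass", isAutoImport: true },
  { label: "buildContext", isAutoImport: false },
]);
// firstPayload holds the 2 in-scope items; laterPayload the 1 auto-import item
```

If roughly 90% of the payload is auto-import candidates, as estimated above, the first payload would be an order of magnitude smaller than the full list.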

@daytonellwanger
Collaborator

I think this would still require VS Code support? In which case, if the language server also supports it, you could just proxy the requests (so you'd get the same partial results support), and if the language server didn't support partial requests, you could potentially pick the "first" (sorted by sortText) x and send in your own partial payload, with everything in another?

Yep!

I'm inclined to just wait until VS Code adds support for partial results for completion. I don't think there's any other way to support this that isn't a total hack. Based on your earlier comments, it seems like there's growing demand for it, so hopefully they pick it up soon.

@lukepighetti
Author

lukepighetti commented Apr 1, 2021

Not sure why this was closed; usually GitHub Actions will give a reason. In any case, this is still a problem on Flutter (Dart) project live shares. The first five minutes are great, then it grinds to a halt.

@lostintangent reopened this Apr 1, 2021
@lostintangent
Member

We had a problem with an action that closed some issues unexpectedly. Apologies for that!

@daytonellwanger
Collaborator

@lukepighetti are you still seeing this? We've made a lot of progress on these issues since this was first filed, so I'd love to get a recap from you of what still isn't working.

If you are still facing issues, there are some diagnostic steps on this issue template that would be helpful. Thanks!

@lukepighetti
Author

We are still seeing this issue. We can start a live share session with someone and it performs well for about five minutes and then becomes unusable. I'd be happy to schedule a demo with anyone who is interested in seeing this first-hand.

@daytonellwanger
Collaborator

Can we re-summarize what the issue is, since there's been a lot of things discussed in this thread? Is it the problems from the original post?

Type annotation stops working
Unable to import files via the lightbulb menu
Options go missing from code completion popup/light bulb menu
Mouse-over info from the language server stops working
command+. code actions stop working

@lukepighetti
Author

lukepighetti commented Apr 30, 2021

Anything you can think of that Live Share does gets flakier over time. It's a generalized description because it's an all-encompassing, generalized issue.

A 15 min demo would be much more revealing than all this back and forth. I don't think anyone's going to get the answers they need until they have a chance to demo it themselves.

@daytonellwanger
Collaborator

A 15 min demo would be much more revealing than all this back and forth. I don't think anyone's going to get the answers they need until they have a chance to demo it themselves.

Totally agree. This is just with Dart? If I go install the Dart extension and have a ~15 minute Live Share session, I'll hit these issues?

I'm also happy to setup a time to chat so you can give us a demo of the issues you're facing. Send us an email at vsls-feedback@microsoft.com if you want to schedule something.

Thanks for all the help tracking this down!

@lukepighetti
Author

lukepighetti commented May 1, 2021

Email sent, thanks.

I suspect you'll need the whole toolchain installed and be in an active Dart project with the language server up and running if you want to recreate it yourself.

@Davsterl
Member

Davsterl commented May 2, 2021

@lukepighetti, thanks Luke! We'll set something up, and this repro will really help us track it down and fix it!

@noga-dev

noga-dev commented Jun 25, 2021

v1.0.4419 here.

Can we re-summarize what the issue is, since there's been a lot of things discussed in this thread? Is it the problems from the original post?

Type annotation stops working
Unable to import files via the lightbulb menu
Options go missing from code completion popup/light bulb menu
Mouse-over info from the language server stops working
command+. code actions stop working

In my case, the biggest issue is the Problems tab desync. What I'm seeing as a guest is stuck on an old snapshot vs. what the host is seeing. The only solution that fixes the Problems tab desync is restarting the connection, which means restarting the VSCode instance. The issue begins to occur fairly quickly, too.

Sounds like #4258

@daytonellwanger
Collaborator

daytonellwanger commented Jun 28, 2021

Yeah, sounds like #4248 #4258. Let's track it over there. Thanks for the report!

@noga-dev

Yeah, sounds like #4248. Let's track it over there. Thanks for the report!

I'd say this is worse, because here we get no indication that we're on a past snapshot. We just see problems that are no longer relevant, or existing problems go missing. It's like being stuck in the stone age.

@daytonellwanger
Collaborator

Whoops - I meant #4258 (the one you originally linked)

@github-actions

This issue has been automatically marked as stale because it has not had recent activity. It will be closed automatically in 2 days.

@derekbekoe
Collaborator

We’ve made improvements in this area since this issue was filed. We believe this may have improved your experience and are closing this issue. If this issue still persists for you, please reopen the issue and let us know.

9 participants