
Add streaming support #182

Closed
wants to merge 7 commits
Conversation

felixfbecker

@felixfbecker felixfbecker commented Feb 14, 2017

This is a proposal for adding "streaming" support to LSP, more specifically: the use case of wanting to return parts of a result before the full result is finished (#141). This is especially desirable for methods that can return lots of results, such as workspace/symbol and textDocument/completion, and are therefore often expensive, even when the UI only shows a subset of the full response anyway.

There are a couple of approaches to this:

  • Using a parser that can parse JSON as a stream
    This is not really viable because
    • the server can only "modify" one "object" in the result at a time, not update information at any path as it becomes available
    • the protocol requires sending a Content-Length header, so the sender needs to know the full content in advance
  • Polling the server for progress/updates
    Polling results in unnecessary overhead. With the request the client expressed clear interest in the result, so the server should provide what it has as soon as possible.

Introducing a new method ($/partialResult) that the server calls on the client is therefore the best approach. This new method should be passed the request ID the server is returning partial results for and the partial data itself.

The data structure used to represent the partial result should work for all methods. In the simplest case, that means adding items to a result array. That should also work if the array is nested inside an object - for example, textDocument/completion returns an object with isIncomplete and items. The language server should be able to fill out isIncomplete as well as add new items to the items array with partial results. The language server should also have the ability to provide partial objects, then fill out missing properties later. For example, it should be able to provide Locations with just URIs initially, or SymbolInformations with just the name, and fill out the range later because it might be expensive to compute, undesirable to keep in memory, and not needed to show the user initially. The client could then already make use of the partial information.

Just trying to "merge" multiple objects/arrays together will not suffice for this as there would be no way to address specific array indices or to append to an array.

In addition, using a well-defined standard would have the benefit of having libraries available in many languages that ease implementation.

This is solved by JSON Patch. A JSON Patch is nothing but an array of operations which modify a value at a certain path. In particular, it is possible to add more items to arrays as well as add properties to objects no matter how nested the value is inside the result.
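As an illustration of the partial-object use case described above (the symbol name, kind and positions here are made up), a server could first announce a SymbolInformation carrying only its name, then fill in the location with a later operation once the range has been computed:

```json
[
  { "op": "add", "path": "/-", "value": { "name": "parseConfig", "kind": 12 } },
  { "op": "add", "path": "/0/location", "value": {
      "uri": "file:///project/config.ts",
      "range": { "start": { "line": 4, "character": 0 },
                 "end":   { "line": 4, "character": 11 } } } }
]
```

In practice these two operations would likely arrive in separate $/partialResult notifications, with UI updates in between.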

The client would just continuously apply the operations to a result and then for example rerender the UI. The UI might have to be modified so it handles undefined values, for example by showing Loading... where it would normally show certain properties or filtering items that it cannot display because of incompleteness. It might also cancel the request after it has received the information it was looking for.

The final response indicates the result is complete. If the receiver opted into streaming through the Server/ClientCapability and the server sent at least one partial result notification, then the response result can just be null and is ignored by the client. Applying all patches sent through $/partialResult should yield exactly the same result as a non-streamed response.

JSON Patch has libraries for almost every language including JS, Python, PHP, Ruby, Perl, C, Java, C#, Go, Haskell and Erlang. This should make implementation easy for clients and servers.

Simple example:

// --> Client sends a references request
{ id: 1, method: "textDocument/references", params: {} }
// Result at the client: undefined

// <-- Server sends a partial result that initializes the result to an empty array
{
	method: "$/partialResult",
	params: {
		id: 1,
		patch: [{ op: "replace", path: "", value: [] }]
	}
}
// Result at the client: []

// <-- Server sends zero to n partial results that append a Location to the array
{
	method: "$/partialResult",
	params: {
		id: 1,
		patch: [{ op: "add", path: "/-", value: { uri: "file:///a/file", range: { start: { line: 1, character: 2 }, end: { line: 1, character: 3 }}}}]
	}
}
// Result at the client:  [{ uri: "file:///a/file", range: { start: { line: 1, character: 2 }, end: { line: 1, character: 3 }}}]

// <-- Server sends the final response
{
	id: 1,
	result: [{ uri: "file:///a/file", range: { start: { line: 1, character: 2 }, end: { line: 1, character: 3 }}}]
}
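The message flow above can be replayed end to end. Below is a minimal sketch (in Python) of client-side patch application; it supports only the two operations the example uses, and a real client would use a complete RFC 6902 (JSON Patch) library instead:

```python
# Minimal sketch of client-side patch application. It handles only the
# operations used in the example above ("replace" of the whole document
# and "add" with the "-" append token); it is NOT a full RFC 6902
# implementation -- real clients should use a JSON Patch library.

def apply_patch(doc, patch):
    for op in patch:
        path = op["path"]
        if path == "":
            # An operation at the empty path swaps out the whole document.
            doc = op["value"]
            continue
        *parents, leaf = path.lstrip("/").split("/")
        target = doc
        for key in parents:
            target = target[int(key)] if isinstance(target, list) else target[key]
        if isinstance(target, list) and leaf == "-":
            target.append(op["value"])            # "-" appends to the array
        elif isinstance(target, list):
            target.insert(int(leaf), op["value"])  # "add" inserts at an index
        else:
            target[leaf] = op["value"]             # set/overwrite a property
    return doc

location = {
    "uri": "file:///a/file",
    "range": {"start": {"line": 1, "character": 2},
              "end": {"line": 1, "character": 3}},
}

# Replay the partial results from the example.
result = None
result = apply_patch(result, [{"op": "replace", "path": "", "value": []}])
result = apply_patch(result, [{"op": "add", "path": "/-", "value": location}])
print(result == [location])  # True
```

After each apply_patch call the client could rerender the UI with whatever is in `result` so far.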

We experimented with this proposal at Sourcegraph in two language servers and are working on a VS Code client implementation.

@msftclas

Hi @felixfbecker, I'm your friendly neighborhood Microsoft Pull Request Bot (You can call me MSBOT). Thanks for your contribution!

In order for us to evaluate and accept your PR, we ask that you sign a contribution license agreement. It's all electronic and will take just minutes. I promise there's no faxing. https://cla.microsoft.com.

TTYL, MSBOT;

@nicksnyder

nicksnyder commented Feb 17, 2017

See microsoft/vscode#20010 for an umbrella issue that I have created to track the associated implementation details.

Specifically, these two PRs demonstrate an implementation of this proposal in VS Code

@felixfbecker
Author

ping @dbaeumer

@ljw1004
Contributor

ljw1004 commented Apr 6, 2017

Conceptually, diagnostics are a stream. (The protocol currently lacks a way for the server to indicate "there are no further diagnostics left to report" but it should have one).

Can you think of any way to unify diagnostics with streamed results?

@felixfbecker
Author

@ljw1004 could you explain where you see the streaming in diagnostics? Diagnostics are reported per file and clear previously reported diagnostics for that file. They are typically only reported when you open a file, not for all files, so there is usually no real known point at which all diagnostics for all files have been reported.

@ljw1004
Contributor

ljw1004 commented Apr 6, 2017

They are typically only reported when you open a file, not for all files

Why do you say that? In C# and Visual Studio, when you edit a file (e.g. changing the signature of a method), then it works through the entire project to find every single place that might be affected. It reports errors including for files that haven't yet been opened.

When you work on the Roslyn codebase itself in Visual Studio (it's a fairly large C# project) then it sometimes takes up to a minute to figure out all the errors in other files. In more moderately-sized projects I commonly saw 10+ seconds to figure out all the errors in other files.

For our own team, we also display errors throughout the entire project, but our project is even bigger so it takes a bit longer.

@dbaeumer
Member

Regarding the ping #182 (comment): given the priorities we currently have for VS Code I can't commit to giving reasonable feedback on this pull request in the near future. I agree that streaming support is a cool feature for the LSP, but it requires quite some work to get right, especially for clients. Maybe we should start with a progress specification first, which could mitigate some of the streaming use cases.

Regarding markers: language servers can provide markers for all resources in a workspace. There is no restriction to only handle open resources. This is why diagnostics are actually 'stream based' on a resource level and are notifications sent from the server to the client, not request results.

@matklad
Contributor

matklad commented Apr 14, 2019

Note that the Dart analysis server supports streaming in a pretty general way, without using JSON Patch, streaming parsers or other complicated tools. An example would be symbol search: https://htmlpreview.github.io/?https://github.com/dart-lang/sdk/blob/master/pkg/analysis_server/doc/api.html#request_search.findTopLevelDeclarations

The way it works is that, for "streaming" requests, the server answers immediately with a SearchID, and then proceeds with sending notifications with this specific search id.

Each notification usually contains the full set of items (so newer ones overwrite older ones), but in select cases (workspaceSymbol being one of them) the results from all notifications are concatenated.
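That shape can be sketched as follows (in Python, with invented names; the real Dart analysis server API is linked above). The request handler returns a search id immediately, results follow as notifications tagged with that id, and the client concatenates them:

```python
# Sketch of the Dart-analysis-server style of streaming: the request is
# answered immediately with an id, and results arrive afterwards as
# notifications carrying that id plus an isLast flag. All names here are
# invented for illustration.
import itertools

_search_ids = itertools.count(1)

def find_top_level_declarations(pattern, declarations, send_notification):
    """Return a search id at once; stream matches via notifications."""
    search_id = next(_search_ids)
    # A real server would do this asynchronously while the search runs;
    # here the notifications are sent synchronously for illustration.
    for name in (d for d in declarations if pattern in d):
        send_notification({"searchId": search_id,
                           "results": [name], "isLast": False})
    send_notification({"searchId": search_id, "results": [], "isLast": True})
    return search_id

# Client side: concatenate the results from every notification for our id.
received = []
sid = find_top_level_declarations("make",
                                  ["main", "makeWidget", "makeButton"],
                                  received.append)
collected = [name
             for n in received if n["searchId"] == sid
             for name in n["results"]]
print(collected)  # ['makeWidget', 'makeButton']
```

The isLast notification plays the role the final response plays in the $/partialResult proposal: it tells the client no further results are coming.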

@dbaeumer dbaeumer mentioned this pull request Jul 3, 2019
@dbaeumer
Member

dbaeumer commented Jul 3, 2019

Please see also #786

@dbaeumer dbaeumer modified the milestones: 4.0, 3.15 Sep 20, 2019
@dbaeumer
Member

Closing the issue. Corresponding support has been specced in the dbaeumer/3.15 branch.

@dbaeumer dbaeumer closed this Sep 20, 2019
@ljw1004
Contributor

ljw1004 commented Sep 20, 2019

@dbaeumer I'm looking in the dbaeumer/3.15 branch but I see nothing there about streaming - the only mention of "stream" is for the LSIF exporter, and there seems to be no mention of JSON-Patch or progress or partialResult. Am I just looking in the wrong place?

@KamasamaK
Contributor

@ljw1004 Look at the specification.md. It is achieved by supporting partial results.

@ljw1004
Contributor

ljw1004 commented Sep 20, 2019

Ah, thanks @KamasamaK - I'd simply been looking in the wrong place.


@yyoncho

yyoncho commented Jun 5, 2020

Are there any language servers implementing this feature?

@yyoncho

yyoncho commented Jun 11, 2020

Are there any language servers implementing this feature?

Someone? Or at least is there a language server planning to implement it anytime soon?

Labels
feature-request Request for new features or functionality