textDocument/definition implementation for VS Code. #3660

Closed · wants to merge 26 commits into base: 1.x

Conversation

9 participants
@wpopielarski
Contributor

wpopielarski commented Oct 22, 2017

Trial implementation of the textDocument/definition request. It supports types only, and only when the type is found in the Zinc analysis. When source(s) are found, the editor opens the potential source(s). This simple implementation doesn't use semantic data (yet).

  • move vscDefinition task to separate file
  • add tests
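For context, a textDocument/definition exchange as defined by the Language Server Protocol looks roughly like this (the URI and positions below are illustrative, not taken from this PR). Request:

```json
{
  "jsonrpc": "2.0",
  "id": 8,
  "method": "textDocument/definition",
  "params": {
    "textDocument": { "uri": "file:///path/to/Foo.scala" },
    "position": { "line": 13, "character": 17 }
  }
}
```

Response, a list of source locations (an empty list means no definition was found):

```json
{
  "jsonrpc": "2.0",
  "id": 8,
  "result": [
    {
      "uri": "file:///path/to/Foo.scala",
      "range": {
        "start": { "line": 2, "character": 6 },
        "end": { "line": 2, "character": 9 }
      }
    }
  ]
}
```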
(Outdated review threads on main/src/main/scala/sbt/Defaults.scala)
@wpopielarski wpopielarski changed the title from [WIP] textDocument/definition implementation for VS Code. to textDocument/definition implementation for VS Code. Oct 25, 2017

@kiritsuku
Contributor

kiritsuku commented Oct 27, 2017

The contraband definitions for this file are missing. Are they part of another commit?
@kiritsuku
Contributor

kiritsuku commented on main/src/main/scala/sbt/lsp/ErrorCodes.scala in d86fd9d Oct 27, 2017

These error codes already exist in protocol/src/main/scala/sbt/internal/langserver/ErrorCodes.scala
wpopielarski added some commits Oct 30, 2017

@wpopielarski
Contributor

wpopielarski commented Nov 2, 2017

Could anybody tell me what's wrong with the tests? They pass locally but not here.

@wpopielarski

Ready to be checked again.

@laughedelic
Member

laughedelic commented Nov 5, 2017

Hi @wpopielarski! I tried to test this branch, but couldn't make it work.

  1. There is one thing that I had to change locally: commenting out these two lines

    case r: LanguageServerReporter =>
    r.resetPrevious(analysis)

    This is called in the compileIncremental task, which, I guess, is used by your lspDefinition in some way. The result is that the server bombards the client with empty publishDiagnostics notifications (to clean up), and this happens in response to every textDocument/definition request. On the sbt codebase it means that every request takes at least 20 seconds, and even on a small project it's ~5s.

    Commenting these lines out is not a solution, but the point is that textDocument/definition shouldn't trigger diagnostics cleanup at all.

    On a side note, I think that diagnostics cleanup should happen somewhere else. An alternative approach is to store diagnostics and, every time new ones are generated, "merge" them and send only the necessary changes; this way there is no need to send lots of empty diagnostics on every compile. This is just an idea and, of course, out of scope for this PR.

  2. After that I could see textDocument/definition requests/responses, but I couldn't get any results: the responses are always empty. I tried on the sbt code, on some work project, and on a tiny test project, and it didn't work anywhere. Could you provide some example code on which it is supposed to work?

By the way, I tested it with Atom, but I think there is nothing VSCode-specific happening here, so it should work the same in both editors. I should install VSCode and try it out too...

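The "merge" idea mentioned in point 1 could be sketched as follows. This is only an illustration of the suggestion, with hypothetical names (`Diagnostic`, `DiagnosticsStore`), not sbt's actual reporter code:

```scala
// Hypothetical sketch: remember the last published diagnostics per file and
// republish only what changed, so a recompile does not spam every file
// with empty publishDiagnostics notifications.
final case class Diagnostic(line: Int, message: String)

final class DiagnosticsStore {
  private var previous: Map[String, List[Diagnostic]] = Map.empty

  /** Returns only the (uri -> diagnostics) pairs that need republishing. */
  def merge(current: Map[String, List[Diagnostic]]): Map[String, List[Diagnostic]] = {
    // Files that had diagnostics before but are clean now: publish one empty list.
    val cleared = (previous.keySet -- current.keySet)
      .map(uri => uri -> List.empty[Diagnostic]).toMap
    // Files whose diagnostics differ from what was last published.
    val changed = current.filter { case (uri, ds) => !previous.get(uri).contains(ds) }
    previous = current
    cleared ++ changed
  }
}
```

With something like this, an unchanged recompile produces zero notifications, and a file that becomes clean is cleared exactly once rather than on every request.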

@laughedelic
Member

laughedelic commented Nov 5, 2017

No luck in VS Code either:
(screenshot: 2017-11-05 at 05:39)

I have a feeling that I'm holding it from the wrong end.. 😅

@laughedelic
Member

laughedelic commented Nov 5, 2017

> Just Foo is not class, trait, object so not available in zinc analysis file.
>
> One of the reasons can be the lack of zinc analysis cache files, which are used to find the products of compilation. So probably you have to compile the project first and then try to jump to definition.

(screenshot: 2017-11-05 at 17:17)

I definitely compiled the project, and I don't get definitions on any symbol in this little piece of code. I tried defining a class, a trait, an object: nothing works..
As I said before, could you provide some example code which works for you?

> those are not my lines and I think you should ask Eugen Yokota about the reason for these 2 lines.

Sure, I wrote that more for @eed3si9n.

> Right now jump to definition is made from these analysis files, not the scalameta-sbt plugin.

I was actually thinking about this recently and was going to ask you. It seems that using information from scalameta's SemanticDB would be a straightforward way to get any symbol information for the language server. But I guess it's out of sbt's scope, because it requires a third-party plugin..
Anyway, in this light, is it still worth digging through the zinc analysis cache to get only partial information for some symbols? Or would it make more sense to implement this functionality outside of sbt, using scalameta-sbt (and interact with it through sbt server)?


@laughedelic
Member

laughedelic commented Nov 5, 2017

> So my question is: do you get compilation errors? This part should work as well. You should also get info logs about requests received by the sbt server.

It compiles fine, no errors (really simple code in the screenshot above). And I do see logs about receiving requests. I also see empty responses in Atom (can I see the client-server communication log somewhere in VS Code?).


@laughedelic
Member

laughedelic commented Nov 5, 2017

> I don't mean 'correct code' :) I mean if you make a failing snippet, do you get the compilation error back?

😄 I see. Yes, this part works fine, errors are reported as diagnostics.

> Finding a symbol helps to find its definition by determining its type or some local stable identifier or whatever else, but it still needs work to figure out its source file and position in that file. So analysis files are just the first step on this way, I guess.

You mean that even with scalameta we will need the analysis cache from zinc to map symbols to source locations? I thought it was also stored in the SemanticDB, but now I see that it's not.


@laughedelic
Member

laughedelic commented Nov 5, 2017

Oh, I didn't see these logs. Should they appear in the sbt shell? If I don't get them, what am I doing wrong?


@laughedelic
Member

laughedelic commented Nov 5, 2017

Nope, I only see (in the console)

[debug] onRequestMessage: JsonRpcRequestMessage(2.0, 8, textDocument/definition, {"textDocument":{"uri":".../test.scala"},"position":{"line":13,"character":17}})

And then empty textDocument/publishDiagnostics notifications. I also see that Atom receives an empty response:

Scala (sbt) rpc.sendRequest textDocument/definition sending Object ...
...
Scala (sbt) rpc.sendRequest textDocument/definition received (5673ms) []

But no logs from lspDefinitionTask anywhere.


@wpopielarski
Contributor

wpopielarski commented Nov 14, 2017

@laughedelic class, trait and object are good for a start. zinc can give info about methods and class fields too. I think we will use scalameta in the end; meantime I added a super efficient optimization (if it works :) )


@wpopielarski
Contributor

wpopielarski commented Nov 15, 2017

Well, even with simple classes there is a problem, because names are encoded, so there is no other possibility than to use scalameta :). But there is NameTransformer.

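For reference, the encoding in question is the one handled by scala.reflect.NameTransformer in the standard library: operator characters in class/trait/object names are stored in encoded form, so a textual lookup in the analysis has to encode or decode them. A minimal demonstration:

```scala
// Names like `::` are stored encoded in class files and analysis data;
// NameTransformer converts between the source form and the encoded form.
import scala.reflect.NameTransformer

object NameTransformerDemo extends App {
  println(NameTransformer.encode("::"))           // $colon$colon
  println(NameTransformer.decode("$colon$colon")) // ::
}
```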

@dwijnand

I left one comment about making sure that private/internal things are private.

Additional things for this PR:

  1. squash the commits
  2. remove the weird, side diffs from this PR (e.g import Configurations.Test, etc)
  3. check and report what the transitive dependencies of scalacache-caffeine are.
(Outdated review thread on main/src/main/scala/sbt/lsp/Definition.scala)
@wpopielarski
Contributor

wpopielarski commented Nov 28, 2017

@dwijnand
It looks like there is no compile dependency. For a simple maven project, dependency:tree returns:

[INFO] --- maven-dependency-plugin:2.8:tree (default-cli) @ my-app ---
[INFO] com.mycompany.app:my-app:jar:1.0-SNAPSHOT
[INFO] \- com.github.ben-manes.caffeine:caffeine:jar:2.6.0:compile
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
@dwijnand
Member

dwijnand commented Nov 28, 2017

The transitive dependencies of scalacache-caffeine look fine too:

com.github.cb372:scalacache-caffeine_2.12:0.20.0 [S]
+-com.github.ben-manes.caffeine:caffeine:2.5.6
+-com.github.cb372:scalacache-core_2.12:0.20.0 [S]
| +-org.scala-lang:scala-reflect:2.12.3 [S]
| +-org.slf4j:slf4j-api:1.7.25
|
+-org.slf4j:slf4j-api:1.7.25

wpopielarski and others added some commits Oct 31, 2017

Adds copyrights header to file.
Final implementation.

Adds tests to Definition.

Adds reliable implementation of scala valid identifier search (but with O(n^2) complexity).
Fixes #3297 and #3531
Add commandName as an extension method in Command
Fixes problem with finding valid scala identifier.
Adds better search for valid scala symbol.

Adds in-memory cache to sbt and lspCollectAnalyses task.

Adds lspCollectAnalyses to compile and initialize LSP actions.
implement window/logMessage
This sends sbt's log message as "window/logMessage" event to LSP.
Async LSP Definition implementation.
Groups imports and corrects logging info.

Optimizes access to analyses files.

Parallel computation should speed up processing.

Adds super efficient optimization.

Encodes class names.

Code style correction.

Escapes regexp characters in class name.
Remove 2 useless files in protocol
They have been moved to Contraband since, but never removed.
Adds backticks to class/trait/object name.
Adapts tests to changed specs2 dependency.

Tiny fixes.

Removes Scala IDE compiler clues.

@eed3si9n eed3si9n added this to the 1.1.0 milestone Nov 28, 2017

@eed3si9n
Member

eed3si9n commented Nov 28, 2017

I'm closing this in favor of #3777


@eed3si9n eed3si9n closed this Nov 28, 2017

@eed3si9n
Member

eed3si9n commented Nov 28, 2017

If you could give me push rights, I could put those commits here too.


@wpopielarski
Contributor

wpopielarski commented Nov 29, 2017

@eed3si9n thanks a lot for flattening the history, I made the mess. And I think, if #3777 gets merged, is there still a need to commit it to this branch? If I got you correctly.


@eed3si9n
Member

eed3si9n commented Nov 29, 2017

@wpopielarski I guess not. Thanks for the contribution :)

