
x/tools/gopls: bad ranking for global variables in completion #39207

stamblerre opened this issue May 22, 2020 · 1 comment

@stamblerre stamblerre commented May 22, 2020

In this case, the completion item I wanted was defined "near" the code I was writing, and it had a matching type. It seems to me that a global variable of a matching type, defined in the same file, should be ranked above completions from other packages and above variables that have to be dereferenced.
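A minimal sketch of the reported situation (the names `greeting` and `send` are illustrative, not from the report): the wanted completion is a package-level variable of the expected type, defined in the same file as the call being completed.

```go
package main

import "fmt"

// greeting is a package-level variable of the expected type (string),
// defined in the same file as the call site below.
var greeting = "hello from this file"

// send stands in for any function taking a string argument.
func send(msg string) string { return msg }

func main() {
	// When completing the argument here, `greeting` matches the expected
	// type and lives in this file, so the report argues it should outrank
	// candidates from other packages and candidates that require a
	// dereference.
	fmt.Println(send(greeting))
}
```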

[Screenshot: Screen Shot 2020-05-21 at 10.52.14 PM]

/cc @heschik

@gopherbot gopherbot added this to the Unreleased milestone May 22, 2020
@stamblerre stamblerre modified the milestones: Unreleased, gopls/v0.5.0 May 22, 2020
@muirdm muirdm commented May 22, 2020

I presume the string you want is an untyped constant string? I think there are two things causing this:

  1. We downrank untyped constant candidates so that typed constants rank higher. For example, you don't want an untyped int constant to be suggested as a candidate for a named int "myInt" when there is also a "myInt" value in scope (even though they are both assignable). Maybe the fix is to only downrank untyped constants if the expected type is a named basic type.

  2. We don't apply a score penalty to candidates that must be dereferenced.
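The assignability point in item 1 can be seen in a short sketch (the type and variable names are hypothetical): an untyped constant and a value of the exact named type are both assignable to `myInt`, so ranking has to break the tie by specificity rather than by assignability alone.

```go
package main

import "fmt"

type myInt int

const untypedOne = 1   // untyped int constant: assignable to any named int type
var typedVal myInt = 2 // value with the exact named type myInt

// assign shows that both candidates are legal for a myInt target,
// which is why completion must rank them by how specific the match is.
func assign() (fromConst, fromVal myInt) {
	fromConst = untypedOne // legal: untyped constants convert implicitly
	fromVal = typedVal     // legal, and the more specific (exact-type) match
	return
}

func main() {
	a, b := assign()
	fmt.Println(a, b)
}
```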

The scoring logic needs a retrofit. Mainly I want to consolidate scoring to a single place since it is not maintainable to have random score tweaks happening all over the place. We could add a bitmask (or boolean flags) for each candidate attribute/fact (e.g. Dereferenced | Assignable | Untyped) and then have a single scoring method that considers all dimensions of the candidate at once.
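The consolidated-scoring idea could look roughly like the following sketch. All names and the specific multipliers are invented for illustration; this is not gopls's actual API, just the shape of a single scoring method driven by candidate-attribute flags.

```go
package main

import "fmt"

// candFlags is a bitmask of candidate attributes/facts, as suggested above.
type candFlags uint

const (
	assignable candFlags = 1 << iota // candidate is assignable to the expected type
	dereferenced                     // candidate must be dereferenced to match
	untyped                          // candidate is an untyped constant
)

// score is the single place where all dimensions of a candidate are
// considered at once, instead of scattered ad hoc tweaks.
func score(flags candFlags) float64 {
	s := 1.0
	if flags&assignable == 0 {
		s *= 0.5 // not assignable to the expected type
	}
	if flags&dereferenced != 0 {
		s *= 0.9 // small penalty for needing a dereference
	}
	if flags&untyped != 0 {
		s *= 0.8 // downrank untyped constants vs. exact-type values
	}
	return s
}

func main() {
	fmt.Println(score(assignable))                // exact-type value
	fmt.Println(score(assignable | dereferenced)) // needs a dereference
	fmt.Println(score(assignable | untyped))      // untyped constant
}
```

With multiplicative penalties like these, an assignable exact-type value always outranks an otherwise-identical candidate that is untyped or needs a dereference, which is the ordering the two items above are asking for.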
