
Provide full import statement text from auto-import suggestions #31658

Closed · compojoom opened this issue May 26, 2019 · 52 comments · Fixed by #43149
Labels: Domain: Auto-import · Domain: Completion Lists · Fix Available · Rescheduled · Suggestion

Comments

@compojoom commented May 26, 2019

Edit from @DanielRosenwasser

Given

import { someThi/**/

or

import { someThi/**/ }

The suggestion is to look for auto-imports and fully complete the statement with

import { someThing } from "./some/path"

  • VSCode Version: 1.35.0-insider
  • OS Version: macOS 10.14.5

Steps to Reproduce:

  1. I'm working on a React project. I have a file that does export const Buttons.
  2. In another file I type import { Button... and I expect auto-complete to suggest Buttons and pressing Enter to complete the path.

But IntelliSense just shows the plain text Buttons (with an abc icon in front of it), and pressing Enter doesn't add the path to the import.

If I try to reference the component directly inside a function, then auto-import correctly adds the import to the top of the file as import { Button } from 'correct_path'.

Does this issue occur when all extensions are disabled?: Yes

@mjbvz (Contributor) commented May 28, 2019

Can you please share some example code or a small example project?

Also, when you are typing out the import, is the entire line just import { without any import path yet?

@compojoom (Author)

Here, I made a GIF for you:
(GIF: 2019-05-29 11 54 25)

I have a something.js; in that file I export the const something.

In index.js I type import { some and I was hoping that autocomplete would kick in and complete the line. That doesn't happen.

If I go further down in the file and type something, IntelliSense suggests something; I accept it and the import at the top of the file is inserted.

And another funny thing: if I use flowjs, whenever I define a type and IntelliSense suggests it, instead of the import being added at the top, the import happens inline. I guess this is another issue, but I just wanted to mention it in case it is related.

@mjbvz mjbvz transferred this issue from microsoft/vscode May 29, 2019
@mjbvz mjbvz changed the title suggestion doesn't import path Autoimports while typing import May 29, 2019
@mjbvz (Contributor) commented May 29, 2019

This is a feature request. The requested behavior is to support auto imports while typing out the import itself, such as:

import { ab|

Could show a completion for an auto import for abc which would complete the line to:

import { abc } './abc'

@mjbvz mjbvz removed their assignment May 29, 2019
@compojoom (Author)

OK. That's standard with IntelliJ, so I expected it with VS Code as well. Sorry then.

the completion should be

import { abc } from './abc'

Since we are talking about import features: when I try to import get from 'lodash-es', the generated import is
import { get } from '../../../../lodash-es'
Is there a way for it to be just import { ... } from 'node_module' (the bare package name)?

@RyanCavanaugh RyanCavanaugh added Awaiting More Feedback This means we'd like to hear from more people who would be helped by this feature Suggestion An idea for TypeScript labels Jun 13, 2019
@jasonwilliams commented Feb 19, 2020

Agree with @compojoom.
I was surprised that this doesn't work by design; it's pretty standard to have import suggestions in other editors. I've linked my issue, as I raised the same thing in VS Code.

@jasonwilliams commented Feb 19, 2020

  • VSCode Version: 1.42.1
  • OS Version: macOS 10.14.6

Steps to Reproduce:

  1. clone https://github.com/jasonwilliams/nextjs9-typescript-server-vscode-example
  2. Try to import test.tsx into page pages/index.tsx

(GIF: no_auto_complete)

TS Server trace

It does work if I just type it out in the document without being inside of import braces.

Looks like the difference is that a completionInfo request is sent to the TypeScript server when I type in the document and I get a response with suggestions, but inside of import I get:

[Trace  - 14:23:48.406] <semantic> Response received: completionInfo (1416). Request took 5 ms. Success: false . Message: No content available.

Typing inside import:
(Screenshot 2020-02-17 at 14 28 45)

Typing inside document:
(Screenshot 2020-02-17 at 14 35 27)

@RyanCavanaugh does the above help?

@jasonwilliams

> Can you please share some example code or a small example project?
>
> Also, when you are typing out the import, is the entire line just import { without any import path yet?

@mjbvz see my comment above.

@DanielRosenwasser DanielRosenwasser changed the title Autoimports while typing import Provide full import statement text from auto-import suggestions Sep 14, 2020
@DanielRosenwasser DanielRosenwasser added this to the Backlog milestone Sep 14, 2020
@DanielRosenwasser DanielRosenwasser added Domain: Auto-import Domain: Completion Lists The issue relates to showing completion lists in an editor Effort: Moderate Requires experience with the TypeScript codebase, but feasible. Harder than "Effort: Casual". Help Wanted You can do this and removed Awaiting More Feedback This means we'd like to hear from more people who would be helped by this feature labels Sep 14, 2020
@DanielRosenwasser (Member)

Spoke with @andrewbranch about this a little bit. We think this can do a lot to alleviate the frustration with ECMAScript's import statement ordering (identifiers before the module path) that a lot of users are unhappy with.

Here are some rules for imports that I'm currently thinking of (a rough sketch of the resulting statement text follows at the end of this comment):

  1. If there is an import path that resolves to a module, only provide suggestions to that import path (today's behavior)
  2. If there is an import path that resolves to a file that is not a module, provide no completions (today's behavior)
  3. If there is an import path that doesn't resolve to anything, or there is no import path, then
  • for named imports (import { /**/, import { Yadd/**/, import { Yadd/**/, Stuff } from)
    • complete the identifier
    • insert the missing braces if necessary
    • insert the missing from keyword if necessary
    • fix up or insert the missing path if necessary
  • for a default import (import Identifi/**/)
    • complete the identifier
    • insert the missing from keyword if necessary
    • fix up or insert the missing path if necessary
  • for a namespace import (import * as Identif/**/)
    • complete the identifier
    • insert the missing from keyword if necessary
    • fix up or insert the missing path if necessary

Most of the same logic applies except for named exports, which I think we would not want to power this feature for.

One open question is how this works when you already have existing identifiers in a named import list - do you try to filter the possible module specifiers down? It's not strictly necessary since it's probably an edge case, but it would be a nice UX improvement to try that, and if no modules are found, fall back to module specifiers based on the current identifier.

Curious to hear feedback. We're also open to pull requests, though it's probably not a good first issue.
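
To make rule 3 concrete, here is a minimal sketch, in plain TypeScript, of how the completed statement text could be assembled once the identifier and a resolved module specifier are known. The type and function names are hypothetical; this is not the actual TS Server implementation.

type ImportKind = "named" | "default" | "namespace";

// Hypothetical sketch only: build the final statement text for rule 3, assuming
// the identifier and module specifier have already been resolved elsewhere.
function buildImportStatementText(
  kind: ImportKind,
  identifier: string,
  moduleSpecifier: string,
  existingNames: string[] = [] // identifiers already typed inside the braces
): string {
  const from = `from "${moduleSpecifier}";`;
  switch (kind) {
    case "named": {
      // insert the missing braces, `from` keyword, and path as needed
      const names = [...existingNames, identifier].join(", ");
      return `import { ${names} } ${from}`;
    }
    case "default":
      return `import ${identifier} ${from}`;
    case "namespace":
      return `import * as ${identifier} ${from}`;
  }
}

// buildImportStatementText("named", "Yadda", "./yadda", ["Stuff"])
//   -> 'import { Stuff, Yadda } from "./yadda";'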

@jasonwilliams

This is great, I agree with the logic you have come up with for section 3.

> One open question is how this works when you already have existing identifiers in a named import list - do you try to filter the possible module specifiers down? It's not strictly necessary since it's probably an edge case, but it would be a nice UX improvement to try that, and if no modules are found, fall back to module specifiers based on the current identifier.

I don't follow; isn't this logic already covered by rule 1?
If you have existing identifiers in a named import list then you've already set the path, so the logic can just remain as it is today.
But I feel like I'm missing something here.

@DanielRosenwasser (Member)

> If you have existing identifiers in a named import list then you've already set the path

Well, potentially not 🙂. That's why the filtering logic is probably unnecessary. Why would you ever be in this situation? But it would be neat if we could leverage existing intent.

@jasonwilliams

> Why would you ever be in this situation?

The only scenario I can think of is if you start listing modules without providing the path: import { moda, modb, modc... }
Usually that's when you know what you want to import but haven't added the path yet, but you want to add some more modules.

> One open question is how this works when you already have existing identifiers in a named import list - do you try to filter the possible module specifiers down?

I'm interested to know how you would do this without a path. How do you know what to filter down to? Do you base this on the modules that are already in the braces? Do you then use these heuristics to both:

  • Know what modules can be added
  • Provide the path to where these modules live

@DanielRosenwasser (Member)

Today with auto-imports, we add an entry for every module that exports something with the appropriate name. That gets served up in the completion list.

In that proposal, you could start with that logic, then do a post-pass to look at what other identifiers are in the braces, and ask "does this module export all of these other identifiers as well?" If you end up with more than 0, those are your suggestions; if you end up with none, fall back to the original list for the current identifier.
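
A rough sketch of that post-pass, with hypothetical names rather than the real checker/services API:

interface AutoImportCandidate {
  moduleSpecifier: string;
  exportedNames: ReadonlySet<string>;
}

// Keep only candidates that also export every identifier already in the braces;
// if none do, fall back to the original list for the current identifier.
function filterByExistingImports(
  candidates: readonly AutoImportCandidate[],
  existingIdentifiers: readonly string[]
): readonly AutoImportCandidate[] {
  const narrowed = candidates.filter(c =>
    existingIdentifiers.every(name => c.exportedNames.has(name))
  );
  return narrowed.length > 0 ? narrowed : candidates;
}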

@jasonwilliams commented Sep 16, 2020

> Today with auto-imports, we add an entry for every module that exports something with the appropriate name. That gets served up in the completion list.

For my understanding: what do these entries look like? Is it a list of filenames or paths for every file (module) that exports something, or some inverted index of every module and the file it links back to?

> If you end up with more than 0, those are your suggestions

When you say more than 0, are you referring to modules (files) that export those identifiers, or the identifiers themselves? I'm assuming you're referring to the identifiers that match, since you're within the braces and the path doesn't matter at this point.

Assuming the above is correct, there's also enough information to autocomplete the from "./file/path/to/module" part when adding the closing brace.
I agree that if 0 modules match, you can fall back to today's behaviour: just provide the (global completion?) list and do nothing on the closing brace.

@andrewbranch (Member)

The full text of the discussion is:

> By default the request can only delay the computation of the detail and documentation properties. Since 3.16.0 the client can signal that it can resolve more properties lazily. This is done using the completionItem#resolveSupport client capability which lists all properties that can be filled in during a completionItem/resolve request. All other properties (usually sortText, filterText, insertText and textEdit) must be provided in the textDocument/completion response and must not be changed during resolve.

I read “all other properties” to mean “all properties not specified by resolveSupport.” Without any other indication from the spec, I would expect the only property that could not be resolved lazily is label, since that is the only required property in CompletionItem. Is that not the case? Why would the textEdit be needed up front to display the item in the completion list?
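
For reference, the capability being discussed is declared by the client roughly like this (a sketch of the LSP 3.16 shape; the property list shown is illustrative):

const completionClientCapabilities = {
  textDocument: {
    completion: {
      completionItem: {
        resolveSupport: {
          // Properties the client is willing to have filled in lazily via
          // completionItem/resolve; everything else must be final up front.
          properties: ["detail", "documentation", "additionalTextEdits"],
        },
      },
    },
  },
};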

@NTaylorMullen (Member)

> @NTaylorMullen I thought that was the point of the new resolveSupport client capability, though?
> That is, the platform could specify that it does allow it.
> Notably, VS does allow for this, and should be specifying as such in the client capabilities.

Wasn't sure if VS did but I know that VSCode didn't not that long ago 😄, @333fred went down that path and ran into headaches.

> I read “all other properties” to mean “all properties not specified by resolveSupport.” Without any other indication from the spec, I would expect the only property that could not be resolved lazily is label, since that is the only required property in CompletionItem. Is that not the case? Why would the textEdit be needed up front to display the item in the completion list?

I should have been more specific. Yes, resolveSupport can specify that the edit can be resolved lazily; however, in VS Code it doesn't, which I presume is a blocker?

> Why would the textEdit be needed up front to display the item in the completion list?

From my understanding it influences when to show/dismiss the completion list based on the potential "replaceable" completion span. I don't 100% recall though, to be honest.

@andrewbranch (Member)

Looking at the typescript-language-features source code for VS Code, I don’t see why it would be a problem to set vscode.CompletionItem’s insertText and replacementRange (which together are analogous to LSP’s textEdit) during the details resolution, but I haven’t tried it yet. At this stage I’m happy to move forward as long as the functionality is theoretically supported by the relevant specs, as I think I’m capable of putting the missing pieces into VS Code myself. Maybe I’ll find out why it’s a problem soon 😄
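
For context, the shape being proposed is roughly the following; whether the editor honors a late change to these fields is exactly the open question here. This is a hypothetical sketch using the vscode API names (range rather than a literal replacementRange), not actual extension code.

import * as vscode from "vscode";

// Sketch only: compute the expensive parts in resolveCompletionItem instead of
// in provideCompletionItems.
const provider: vscode.CompletionItemProvider = {
  provideCompletionItems() {
    return [new vscode.CompletionItem("someThing")];
  },
  resolveCompletionItem(item) {
    // Late-computed full statement text; VS Code may or may not respect this.
    item.insertText = 'import { someThing } from "./some/path";';
    return item;
  },
};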

@NTaylorMullen (Member) commented Dec 4, 2020

> I don’t see why it would be a problem to set vscode.CompletionItem’s insertText and replacementRange (which together are analogous to LSP’s textEdit) during the details resolution, but I haven’t tried it yet.

I'd also imagine that in VS Code, insertText would be required to be in its "final" state on the initial completion request, unfortunately. That being said, if the TypeScript code is factored in a way where updating one platform doesn't require updating another, then you're good! 😄

@333fred commented Dec 4, 2020

> Wasn't sure if VS did but I know that VSCode didn't not that long ago 😄, @333fred went down that path and ran into headaches.

Indeed, vscode only supports resolving detail, documentation, and additionalTextEdits. https://github.com/microsoft/vscode-languageserver-node/blob/f1bc8dba5b8192ce8730aaeb05ce823cfff8c9b5/client/src/common/client.ts#L1514

@andrewbranch (Member)

This might be where the confusion lies—TypeScript does not speak LSP to VS Code. As far as I can tell, I could make any imperative change in here in conjunction with a TS Server protocol change.

@NTaylorMullen (Member)

> This might be where the confusion lies—TypeScript does not speak LSP to VS Code. As far as I can tell, I could make any imperative change in here in conjunction with a TS Server protocol change.

Don't you wire your "custom" LSP messages up via VSCode's hooks though? Aka their completion resolve hook? If that's the case it'd have the same restrictions.

@andrewbranch (Member) commented Dec 9, 2020

There are only a couple of possible ways forward here. It seems like we have to calculate the full edit for all completions we offer up front, which adds a little bit of overhead¹ for module specifier resolution on every item. The options are:

  1. Don’t worry about the overhead, since it likely won’t be noticeable for users with a smaller number of dependencies, and anything is an improvement over what we do now, which is offer no completions.
  2. Use a stricter filter than we currently do to reduce the number of completions returned. For normal auto-import completions, we don’t process any symbols whose names do not contain the token text characters in order, case-insensitive (a rough sketch of this kind of match follows at the end of this comment). We could use a less fuzzy match, or simply put a hard limit on how many auto-importable symbols we’ll offer. We could also wait to return any completions until more than one character is available to filter by, or until completions are explicitly requested, though this might violate expectations (any time completions don’t show up when you expect them to feels jarring).
  3. Abandon this issue, and revive it if clients ever support lazily resolving the main edit in the future.

¹ Or currently, a lot of bit for pnpm users, a la #40584
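
The “characters in order, case-insensitive” filter mentioned in option 2 is essentially a subsequence match; a small illustrative sketch (not the actual services code):

function fuzzyMatches(symbolName: string, typed: string): boolean {
  const name = symbolName.toLowerCase();
  let searchFrom = 0;
  for (const ch of typed.toLowerCase()) {
    const index = name.indexOf(ch, searchFrom);
    if (index === -1) return false; // character not found in order
    searchFrom = index + 1;
  }
  return true;
}

// fuzzyMatches("createElement", "crEl") -> true
// fuzzyMatches("createElement", "xlr")  -> false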

@DanielRosenwasser (Member)

Can we be concrete about the specific overhead of each item? From the top-level scope, we already provide auto-import completions, so what sort of work has to be done per module to generate the full statement?

@DanielRosenwasser (Member)

> Use a stricter filter than we currently do to reduce the number of completions returned. For normal auto-import completions, we don’t process any symbols whose names do not contain the token text characters in order, case-insensitive.

I think this might be reasonable here too.

@andrewbranch (Member)

> Can we be concrete about the specific overhead of each item?

When auto import completions are not cached, the details request for a single item is usually an order of magnitude faster than the completion list request. Obviously there are huge variables that could affect this relationship, but that’s typical based on old notes of mine. Some portion of the details request is spent redoing work that was done in the completion list request, and the rest is spent on module specifier resolution and building display parts and code actions—so if we tried to do all that work in the completion list request, we’d be able to skip the former part. But let’s still call the details work an order of magnitude faster than the full list work—200ms for the list and 20ms for the details might be a ballpark guess for a small program. This order of magnitude relationship means the total response time for the list request could easily double if we tried to do the details work for 10 auto imports.

My gut feeling is that analysis is coming out overly pessimistic for typical cases, but 10 auto imports is not very many. One of the things that can make module specifier resolution more complicated is symlinks, which play a big role in lerna, yarn workspaces, and pnpm. I don’t want to write a feature that appears miserably slow for power users with big multi-project workspaces, even if it is purely adding functionality where none exists now. I’m open to a smarter filter only if the experience feels really seamless. But part of my hesitation is that this is a non-trivial amount of work that would largely not be reusable if the limitation of not being able to lazily resolve the main edit were removed. Conversely, because normal auto-imports are lazily resolved now, this would be a pretty small feature that could reuse a lot of existing code if that were an option for these kinds of completions. So trying to resolve everything eagerly feels risky, because it’s a decent amount of work and there’s likely to be a performance/experience tradeoff.
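
Plugging in the ballpark numbers above (illustrative estimates only, not measurements):

const listRequestMs = 200;    // full completion list request, small program
const detailsPerItemMs = 20;  // one details request, roughly an order of magnitude faster
const eagerAutoImports = 10;  // auto-imports whose edits would be resolved eagerly
const estimatedTotalMs = listRequestMs + eagerAutoImports * detailsPerItemMs; // 400 ms, i.e. roughly double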

@andrewbranch andrewbranch added the Rescheduled This issue was previously scheduled to an earlier milestone label Jan 14, 2021
@jasonwilliams commented Feb 10, 2021

I know this issue is already prioritised, but I thought it was worth sharing this thread https://twitter.com/markdalgleish/status/1356404787244195840?s=21 showing just how frustrating users find the import IntelliSense experience at the moment.
bmeck also gathered more here

Just for some anecdotal evidence.

@andrewbranch (Member)

What are the reasons someone might want to write an import statement manually, getting good completions, vs. using auto imports as they exist today?

@andrewbranch (Member)

@mjbvz @jrieken this feature is getting close, but here’s the next UX hiccup in practice: the existence of the import statement snippet completion blocks new completions from being requested once you start typing the identifier you want to import, for as long as the characters you type are an ordered subset of “statement.” (I forgot to turn on screencast mode in this GIF, but to finally get completions, I press Esc and then Ctrl+Space.)

(GIF: Kapture 2021-03-11 at 11 39 11)

I could change the snippet prefix to just “import,” but that would give it the same label as the plain keyword completion for import, provided from TS Server. Alternatively, could we find a way to not block a new completions request in this case? It’s blocking because continuing to type is simply filtering down the list that was returned when typing the import keyword, and it usually does make sense not to request new completions when an existing item is still a match. But I think snippets that include a space in their prefix kind of break that logic.

@jrieken (Member) commented Mar 12, 2021

The vscode.CompletionList has the isIncomplete property for that. This means that we will ask for new completions, for that provider, after each keystroke.
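
In the vscode API that looks roughly like this (minimal sketch):

import * as vscode from "vscode";

// Marking the result incomplete makes VS Code re-query this provider on further keystrokes.
const items = [new vscode.CompletionItem("import statement", vscode.CompletionItemKind.Snippet)];
const incompleteList = new vscode.CompletionList(items, /* isIncomplete */ true);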

@andrewbranch (Member)

@jrieken how would that work in this example? From i to impor, it looks like we should just be offering normal global completions. It’s only once the user has gotten to import (with trailing space) that we want to start offering import statement completions, but because of the snippet, we never get a chance to tell VS Code that the list is incomplete, unless we said it was incomplete from the very beginning.

@jrieken (Member) commented Mar 12, 2021

> unless we said it was incomplete from the very beginning.

That. Also note that we only re-request completions from those providers that signaled incomplete-ness. So, you want to split this up into a provider that is complete and one which isn't. The trigger context will also tell you if you are being asked to complete an incomplete result (which again can be another incomplete result).

@andrewbranch (Member)

That would be a huge architectural change that is not worth this feature in the slightest, especially considering the feature is not just for VS Code. There is only one (semantic) TS Server, and it’s currently leveraged by one VS Code completion provider—I think changing that is off the table for now, because it would create a ton of non-backward-compatible work for every other language server client.

@andrewbranch (Member)

For now I’ve managed to use a combination of isIncomplete and conditional triggering on a space character from the typescript-language-features completions provider. I wish the import statement snippet completion remained intact, but I guess with the additional trigger, it goes away, and the snippet completion is not re-triggered with a space character:

(GIF: Kapture 2021-03-12 at 16 21 03)

Users who don’t want that behavior can disable import statement completions with a preference, and it will avoid retriggering on that space.
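
A very rough sketch of the shape of that workaround, assuming a provider registered with a space trigger character (hypothetical, not the actual typescript-language-features code):

import * as vscode from "vscode";

vscode.languages.registerCompletionItemProvider(
  { language: "typescript" },
  {
    provideCompletionItems(document, position) {
      const linePrefix = document.lineAt(position).text.slice(0, position.character);
      const items: vscode.CompletionItem[] = [];
      if (/^\s*import\s+$/.test(linePrefix)) {
        // ...ask TS Server for import statement completions here...
      }
      // isIncomplete keeps completions being re-requested as the user keeps typing.
      return new vscode.CompletionList(items, /* isIncomplete */ true);
    },
  },
  " " // the conditional space trigger mentioned above
);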

@typescript-bot typescript-bot added the Fix Available A PR has been opened for this issue label Mar 15, 2021