
configure autocomplete provider based on cody LLM settings in site config #1035

Conversation

@taras-yemets (Contributor) commented Sep 13, 2023

Part of #931
Test in pair with sourcegraph/sourcegraph#56568

Defines the autocomplete provider config based on the completions provider and model names from the site config.
The suggested configuration hierarchy:

  1. If the cody.autocomplete.advanced.provider field in VSCode settings is set to a supported provider name, and all additional conditions (model, access token, etc.) are met, the corresponding provider config is returned. Otherwise, null is returned (no completions provider is created).
  2. If the provider name and model can be determined from the evaluated feature flags, the corresponding provider config is returned.
  3. If the completions provider name is defined in the connected Sourcegraph instance's site config, the corresponding provider config is returned. If the provider name/model can't be parsed, or the provider is not supported, null is returned (no completions provider is created).
  4. Otherwise, the Anthropic provider config is used by default.
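The four-step hierarchy above could be sketched roughly as follows. All names here (resolveProviderConfig, ResolutionInputs, etc.) are illustrative, not the PR's actual API:

```typescript
interface ProviderConfig {
    provider: string
    model?: string
}

// Hypothetical bundle of the inputs the hierarchy consults, in priority order.
interface ResolutionInputs {
    vscodeProvider?: string               // cody.autocomplete.advanced.provider
    vscodeConditionsMet?: boolean         // model, access token, etc.
    featureFlagProvider?: ProviderConfig | null
    siteConfigProvider?: string | null
    supportedProviders: string[]
}

function resolveProviderConfig(i: ResolutionInputs): ProviderConfig | null {
    // 1. An explicit VSCode setting wins; if it is misconfigured,
    //    return null rather than silently falling back.
    if (i.vscodeProvider) {
        return i.supportedProviders.includes(i.vscodeProvider) && i.vscodeConditionsMet
            ? { provider: i.vscodeProvider }
            : null
    }
    // 2. Evaluated feature flags come next.
    if (i.featureFlagProvider) {
        return i.featureFlagProvider
    }
    // 3. Site config; an unsupported or unparsable provider name yields null.
    if (i.siteConfigProvider) {
        return i.supportedProviders.includes(i.siteConfigProvider)
            ? { provider: i.siteConfigProvider }
            : null
    }
    // 4. Default to Anthropic.
    return { provider: 'anthropic' }
}
```

Note the asymmetry: a bad explicit VSCode setting produces null (so the misconfiguration is surfaced), while an empty configuration quietly falls through to the Anthropic default.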

TODO:

  • Add createProviderConfig tests

Test plan

  • CI passes

@taras-yemets taras-yemets changed the title Taras-yemets/configure-autocomplete-provider-based-on-site-config configure autocomplete provider based on cody LLM settings in site config Sep 13, 2023
@taras-yemets taras-yemets marked this pull request as ready for review September 13, 2023 15:31
@taras-yemets taras-yemets requested review from a team and chwarwick September 13, 2023 15:31
@taras-yemets (Contributor, Author) commented:

I wonder if we should fall back to the completions provider from the site config when the provider/model/token is misconfigured in VSCode settings. The current approach in this case is not to create any completions provider. Falling back may be more user-friendly. WDYT, @chwarwick?

@philipp-spiess (Contributor) commented:

> if provider/model/token is misconfigured in VSCode settings. The current approach for this case is not to create any completions provider.

Yeah, this was done on purpose: if you manually set something, we can show you what's wrong rather than silently ignoring it and leaving you with no debug info.

@chwarwick (Contributor) left a comment:

This looks good to me, but if you haven't already, can you verify that this works, or has a proper fallback, when connecting to a legacy Sourcegraph version that doesn't have the provider field available in the LLM config?

vscode/src/completions/providers/unstable-openai.ts (outdated; resolved)
```diff
@@ -46,6 +46,7 @@ query CurrentSiteCodyLlmConfiguration {
     fastChatModelMaxTokens
     completionModel
     completionModelMaxTokens
+    provider
```
@chwarwick (Contributor) commented:

I assume this doesn't cause issues if the provider field does not exist?

@taras-yemets (Contributor, Author) replied:

Great catch! How did I miss that?! Thank you, @chwarwick!
Addressed in a16c760
Honestly, I don't like the implementation. Do you know of a more robust way of getting a potentially non-existing field?
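One common pattern for the "potentially non-existing field" problem is to type the field as optional and read it with optional chaining plus a null fallback. This is a minimal sketch, assuming a hypothetical CodyLLMConfiguration shape and helper name (neither is the PR's actual code):

```typescript
// Hypothetical response shape: older Sourcegraph versions do not
// return the `provider` field, so it is typed as optional.
interface CodyLLMConfiguration {
    completionModel?: string
    provider?: string
}

function getCompletionsProvider(config: CodyLLMConfiguration | null | undefined): string | null {
    // Optional chaining (`?.`) plus nullish coalescing (`??`) keeps
    // legacy responses from throwing and normalizes "missing" to null.
    return config?.provider ?? null
}
```

Callers then only have to handle a single null case, whether the whole config or just the field is absent.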

cc: @philipp-spiess

@taras-yemets (Contributor, Author) commented:

> can you verify this works or has a proper fall back when connecting to a legacy sg version that doesn't have the provider available in the LLM config

Thanks for your review, @chwarwick!
I tested this PR with the latest changes (a16c760) and can confirm it works with Sourcegraph versions that don't have the Cody LLM provider field available.

@taras-yemets taras-yemets enabled auto-merge (squash) September 15, 2023 13:07
@taras-yemets taras-yemets merged commit c475300 into main Sep 15, 2023
20 checks passed
@taras-yemets taras-yemets deleted the taras-yemets/configure-autocomplete-provider-based-on-site-config branch September 15, 2023 13:36
taras-yemets added a commit that referenced this pull request Sep 15, 2023