Currently, the autocomplete provider in the extension must be selected manually if a user is not using the default Anthropic provider.
Sourcegraph instances can be configured to use the following providers for autocomplete:
Sourcegraph gateway
Anthropic (using own keys)
Azure OpenAI (using own keys)
When an extension connects to a Sourcegraph instance, the correct autocomplete provider should be defaulted based on how the instance is configured. For example, if the Sourcegraph instance is using the Azure OpenAI provider, the extension should default to using the `unstable-openai` provider.
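As a minimal sketch of the desired defaulting, the mapping might look like the following. The table name and the `unstable-openai` fallback logic here are illustrative assumptions, not the extension's actual code:

```typescript
// Hypothetical mapping from the instance's configured completions provider
// to the extension's default autocomplete provider ID.
const defaultBySiteProvider: Record<string, string> = {
    'azure-openai': 'unstable-openai',
    anthropic: 'anthropic',
    // For the Sourcegraph gateway provider the real choice depends on the
    // model string; 'anthropic' is just the assumed fallback here.
    sourcegraph: 'anthropic',
}

// Example lookup for an instance configured with Azure OpenAI:
const provider = defaultBySiteProvider['azure-openai'] // 'unstable-openai'
```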
Extending the current feature-flag-based solution to work for instances that are not sourcegraph.com.
I think this is a reasonable approach if we end up in a situation where we need a quick solution, but it isn't ideal.
Extending the current logic that gets the LLMConfiguration to include enough information to properly select an autocomplete provider. To me this feels like a better long-term solution.
To properly select the extension autocomplete provider, both the site completions provider and the model are potentially needed. When using the Sourcegraph provider, the model will be prefixed with the provider type (`openai`, `anthropic`, etc.). When using Azure OpenAI, the provider is the relevant piece of information, as the model name is user-defined and therefore isn't useful.
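A small helper could recover the provider type from such a prefixed model string. This is a hypothetical sketch (the function name and prefix format like `anthropic/claude-2` are assumptions based on the description above):

```typescript
// Extract the provider type from a gateway model string such as
// "anthropic/claude-2". Returns null when there is no prefix, e.g. for a
// user-defined Azure OpenAI deployment name, where the caller should fall
// back to the configured provider instead.
function providerFromModel(model: string): string | null {
    const slash = model.indexOf('/')
    return slash > 0 ? model.slice(0, slash) : null
}
```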
…nfig (#1035)
Part of #931
Tested in tandem with sourcegraph/sourcegraph#56568
Defines the autocomplete provider config based on the completions
provider and model names from the site config.
The suggested configuration hierarchy:
1. If the `cody.autocomplete.advanced.provider` field in the VSCode settings is
set to a supported provider name, and all the additional conditions (model,
access token, etc.) are met, the corresponding provider config is
returned. Otherwise, `null` is returned (the completions provider is not
created).
2. If the provider name and model can be defined based on the evaluated
feature flags, return the corresponding provider.
3. If the completions provider name is defined in the connected
Sourcegraph instance site config, we return the corresponding provider
config. If the provider name/model can't be parsed or the provider is
not supported, `null` is returned (the completions provider is not created).
4. The Anthropic provider config is used by default.
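The four-step fallback above can be sketched as a single resolver. All names here (`ResolveInputs`, `resolveAutocompleteProvider`, the supported-provider set) are illustrative assumptions, not the actual extension code:

```typescript
// Inputs for provider resolution, highest priority first.
interface ResolveInputs {
    userSetting?: string // cody.autocomplete.advanced.provider from VSCode settings
    flagProvider?: string // provider chosen via evaluated feature flags
    siteProvider?: string // provider parsed from the instance site config
}

// Assumed set of provider IDs the extension supports.
const supported = new Set(['anthropic', 'unstable-openai'])

function resolveAutocompleteProvider(i: ResolveInputs): string | null {
    // 1. An explicit VSCode setting wins; an unsupported value returns null
    //    (no completions provider is created).
    if (i.userSetting !== undefined) {
        return supported.has(i.userSetting) ? i.userSetting : null
    }
    // 2. Provider selected via evaluated feature flags.
    if (i.flagProvider !== undefined) {
        return i.flagProvider
    }
    // 3. Provider derived from the site config; unparseable or unsupported
    //    values return null.
    if (i.siteProvider !== undefined) {
        return supported.has(i.siteProvider) ? i.siteProvider : null
    }
    // 4. Default to Anthropic.
    return 'anthropic'
}
```

A step only runs when the earlier, higher-priority sources are absent, which matches the "hierarchy" framing in the PR description.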