refactor: split min latency flags (#1351)
RE: https://sourcegraph.slack.com/archives/C05AGQYD528/p1696845420506809

refactor: replace the Cody autocomplete minimum latency flag with user-,
provider-, and language-based latency flags

Replaced the `CodyAutocompleteMinimumLatency` feature flag with three new
flags (see the sketch after this list):

- `CodyAutocompleteUserLatency`: Enables minimum latency for Cody
autocomplete. Works the same as the old `CodyAutocompleteMinimumLatency`
flag, but is renamed to ensure the old flag will not be used by older
versions.
- `CodyAutocompleteLanguageLatency`: Enables language-specific minimum
latency for Cody autocomplete, applied to low-performance languages ONLY.
- `CodyAutocompleteProviderLatency`: Enables provider-specific minimum
latency for non-Anthropic providers.
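
For reference, the three flags are threaded through a single `LatencyFeatureFlags` object that is passed to `getLatency`. The following TypeScript sketch is inferred from how that object is constructed in `inline-completion-item-provider.ts` (see the diff below); the actual interface is exported from `vscode/src/completions/latency.ts`, which is not expanded in this diff.

```ts
// Sketch only: inferred from the call sites shown in the diff below,
// not copied from vscode/src/completions/latency.ts.
export interface LatencyFeatureFlags {
    user: boolean
    language: boolean
    provider: boolean
}
```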

Expected behavior:
- latency for comments will be applied if any of the feature flags is
enabled
- latency for user-based actions will only be applied if
`CodyAutocompleteUserLatency` is enabled
- latency for language-based actions will only be applied if
`CodyAutocompleteLanguageLatency` is enabled
- latency for provider-based actions will only be applied if
`CodyAutocompleteProviderLatency` is enabled

The `getLatency` function now accepts the feature flags and calculates latency
accordingly.
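
As a rough illustration of that gating (a sketch only, not the implementation in `latency.ts`): the constants, the low-performance language list, and the `isComment` parameter below are placeholders, and the real `getLatency` is called with the feature flags, provider identifier, file path, and language ID, as the diff below shows.

```ts
// Illustrative sketch. The numbers, the language list, and the isComment
// parameter are placeholders, not the values used in latency.ts.
interface LatencyFeatureFlags {
    user: boolean
    language: boolean
    provider: boolean
}

const COMMENT_LATENCY = 2000
const USER_LATENCY = 200
const LANGUAGE_LATENCY = 1000
const PROVIDER_LATENCY = 400
const LOW_PERFORMANCE_LANGUAGES = new Set(['css', 'json', 'markdown', 'plaintext'])

function getLatencySketch(
    flags: LatencyFeatureFlags,
    provider: string,
    languageId: string,
    isComment: boolean
): number {
    const anyFlagEnabled = flags.user || flags.language || flags.provider
    let latency = 0

    // Comment latency applies when any of the three flags is enabled.
    if (anyFlagEnabled && isComment) {
        latency += COMMENT_LATENCY
    }
    // User-based latency applies only when CodyAutocompleteUserLatency is enabled;
    // a flat value stands in for whatever user-based calculation latency.ts performs.
    if (flags.user) {
        latency += USER_LATENCY
    }
    // Language-based latency applies only to low-performance languages, and only
    // when CodyAutocompleteLanguageLatency is enabled.
    if (flags.language && LOW_PERFORMANCE_LANGUAGES.has(languageId)) {
        latency += LANGUAGE_LATENCY
    }
    // Provider-based latency applies only to non-Anthropic providers, and only
    // when CodyAutocompleteProviderLatency is enabled.
    if (flags.provider && provider !== 'anthropic') {
        latency += PROVIDER_LATENCY
    }
    return latency
}
```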

This allows more granular control over autocomplete latency based on
user, provider, and language.

## Test plan


Unit tests are updated.
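
The updated tests themselves are not shown in this excerpt. Purely for illustration, here is a vitest-style case written against the hypothetical sketch above; the module path `./latency-sketch` and the expected numbers follow that sketch's placeholder constants, not the real values in `latency.ts`.

```ts
import { describe, expect, it } from 'vitest'

// Hypothetical import: assumes the getLatencySketch function above is
// exported from a module named './latency-sketch'.
import { getLatencySketch } from './latency-sketch'

describe('latency flag gating (sketch)', () => {
    it('applies no latency when all three flags are disabled', () => {
        const flags = { user: false, language: false, provider: false }
        expect(getLatencySketch(flags, 'fireworks', 'css', false)).toBe(0)
    })

    it('applies provider latency to non-Anthropic providers only when the provider flag is on', () => {
        const flags = { user: false, language: false, provider: true }
        expect(getLatencySketch(flags, 'fireworks', 'typescript', false)).toBe(400)
        expect(getLatencySketch(flags, 'anthropic', 'typescript', false)).toBe(0)
    })
})
```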

---------

Co-authored-by: Philipp Spiess <hello@philippspiess.com>
Co-authored-by: Dominic Cooney <dominic.cooney@sourcegraph.com>
3 people committed Oct 11, 2023
1 parent bbfd647 commit 5680c24
Showing 4 changed files with 306 additions and 117 deletions.
4 changes: 3 additions & 1 deletion lib/shared/src/experimentation/FeatureFlagProvider.ts
```diff
@@ -15,9 +15,11 @@ export enum FeatureFlag {
     CodyAutocompleteLlamaCode7B = 'cody-autocomplete-default-llama-code-7b',
     CodyAutocompleteLlamaCode13B = 'cody-autocomplete-default-llama-code-13b',
     CodyAutocompleteGraphContext = 'cody-autocomplete-graph-context',
-    CodyAutocompleteMinimumLatency = 'cody-autocomplete-minimum-latency',
     CodyAutocompleteSyntacticTriggers = 'cody-autocomplete-syntactic-triggers',
     CodyAutocompleteStarCoderExtendedTokenWindow = 'cody-autocomplete-starcoder-extended-token-window',
+    CodyAutocompleteLanguageLatency = 'cody-autocomplete-language-latency',
+    CodyAutocompleteUserLatency = 'cody-autocomplete-user-latency',
+    CodyAutocompleteProviderLatency = 'cody-autocomplete-provider-latency',
 }

 const ONE_HOUR = 60 * 60 * 1000
```
22 changes: 17 additions & 5 deletions vscode/src/completions/inline-completion-item-provider.ts
```diff
@@ -20,7 +20,7 @@ import {
     LastInlineCompletionCandidate,
     TriggerKind,
 } from './get-inline-completions'
-import { getLatency, resetLatency } from './latency'
+import { getLatency, LatencyFeatureFlags, resetLatency } from './latency'
 import * as CompletionLogger from './logger'
 import { CompletionEvent, SuggestionID } from './logger'
 import { ProviderConfig } from './providers/provider'
@@ -135,11 +135,15 @@ export class InlineCompletionItemProvider implements vscode.InlineCompletionItem

         // We start feature flag requests early so that we have a high chance of getting a response
         // before we need it.
-        const [isIncreasedDebounceTimeEnabledPromise, minimumLatencyFlagsPromise, syntacticTriggersPromise] = [
+        const [isIncreasedDebounceTimeEnabledPromise, syntacticTriggersPromise] = [
             featureFlagProvider.evaluateFeatureFlag(FeatureFlag.CodyAutocompleteIncreasedDebounceTimeEnabled),
-            featureFlagProvider.evaluateFeatureFlag(FeatureFlag.CodyAutocompleteMinimumLatency),
             featureFlagProvider.evaluateFeatureFlag(FeatureFlag.CodyAutocompleteSyntacticTriggers),
         ]
+        const minLatencyFlagsPromises = {
+            user: featureFlagProvider.evaluateFeatureFlag(FeatureFlag.CodyAutocompleteUserLatency),
+            language: featureFlagProvider.evaluateFeatureFlag(FeatureFlag.CodyAutocompleteLanguageLatency),
+            provider: featureFlagProvider.evaluateFeatureFlag(FeatureFlag.CodyAutocompleteProviderLatency),
+        }

         const tracer = this.config.tracer ? createTracerForInvocation(this.config.tracer) : undefined
         const graphContextFetcher = this.config.graphContextFetcher ?? undefined
@@ -254,9 +258,17 @@ export class InlineCompletionItemProvider implements vscode.InlineCompletionItem
         // latency so that we don't show a result before the user has paused typing for a brief
         // moment.
         if (result.source !== InlineCompletionsResultSource.LastCandidate) {
-            const minimumLatencyFlag = await minimumLatencyFlagsPromise
-            if (triggerKind === TriggerKind.Automatic && minimumLatencyFlag) {
+            const latencyFeatureFlags: LatencyFeatureFlags = {
+                user: await minLatencyFlagsPromises.user,
+                language: await minLatencyFlagsPromises.language,
+                provider: await minLatencyFlagsPromises.provider,
+            }
+
+            const isMinLatencyEnabled =
+                latencyFeatureFlags.user || latencyFeatureFlags.language || latencyFeatureFlags.provider
+            if (triggerKind === TriggerKind.Automatic && isMinLatencyEnabled) {
                 const minimumLatency = getLatency(
+                    latencyFeatureFlags,
                     this.config.providerConfig.identifier,
                     document.uri.fsPath,
                     document.languageId,
```
(The remaining changed files are not expanded in this view.)
