refactor: split min latency flags #1351
Conversation
Some comments and questions inline.
Would it make more sense to have two feature flags that turn on their respective features individually, instead of having this dependency where `cody-autocomplete-lang-latency` requires `cody-autocomplete-user-latency` to be on in order to work?
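The independent design the reviewer is asking about can be sketched roughly as follows. This is a minimal illustration, not the actual Cody implementation: the flag shape, latency values, and function name are all assumptions made up for the example.

```typescript
// Illustrative sketch: each flag contributes its own latency component
// independently, so neither flag requires the other to be enabled.
interface LatencyFlags {
    user: boolean     // e.g. cody-autocomplete-user-latency (hypothetical shape)
    language: boolean // e.g. cody-autocomplete-lang-latency (hypothetical shape)
}

function minLatency(flags: LatencyFlags, isLowPerfLanguage: boolean): number {
    let latency = 0
    if (flags.user) {
        latency += 400 // illustrative baseline value, not the real one
    }
    if (flags.language && isLowPerfLanguage) {
        latency += 1000 // illustrative extra for low-performance languages
    }
    return latency
}
```

With this shape, enabling only the language flag still takes effect, which is the independence the reviewer is suggesting.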
Co-authored-by: Philipp Spiess <hello@philippspiess.com>
Co-authored-by: Dominic Cooney <dominic.cooney@sourcegraph.com>
@dominiccooney @philipp-spiess thanks for taking the time to review the PR. I've made the changes based on your valuable input, added a new feature flag for provider-based latency, and made sure they all work independently.
Nice work. Some suggestions inline.
Co-authored-by: Dominic Cooney <dominic.cooney@sourcegraph.com>
RE: #1351 fix: apply min latency only when last suggestion was read

The minimum latency for code suggestions is now only applied when the last suggestion was read by the user. This avoids applying the latency while the user is actively typing and has not read the previous suggestion. The `lastCompletionRequestTimestamp` property tracks when the last suggestion request was made. The latency is skipped if the time since the last request is less than 750ms.

## Test plan

Check the output channel. You should not see latency getting increased after 5 keystrokes, which is the current behavior.
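The timestamp check described above can be sketched like this. It is a simplified illustration of the idea, assuming a module-level timestamp and a helper name made up for the example; only the 750ms threshold and the `lastCompletionRequestTimestamp` name come from the PR description.

```typescript
// Threshold from the PR description: below this, the user is assumed
// to still be typing and cannot have read the previous suggestion.
const MIN_READ_INTERVAL_MS = 750

let lastCompletionRequestTimestamp = 0

// Hypothetical helper: returns true only when enough time has passed
// since the last suggestion request for the user to have read it.
function shouldApplyMinLatency(now: number): boolean {
    const elapsed = now - lastCompletionRequestTimestamp
    lastCompletionRequestTimestamp = now
    return elapsed >= MIN_READ_INTERVAL_MS
}
```

Rapid successive requests (keystrokes inside the 750ms window) therefore skip the latency entirely, instead of the current behavior of ramping latency up after 5 keystrokes.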
RE: https://sourcegraph.slack.com/archives/C05AGQYD528/p1696845420506809
refactor: replace cody autocomplete minimum latency flag with user, provider, and language based latency flags
Replaced the `CodyAutocompleteMinimumLatency` feature flag with three new flags:

- `CodyAutocompleteUserLatency`: enables minimum latency for Cody autocomplete. Works the same as the old `CodyAutocompleteMinimumLatency` flag, but renamed to make sure the old flag will not be used in older versions.
- `CodyAutocompleteLanguageLatency`: enables language-specific minimum latency for Cody autocomplete, for low-performance languages ONLY.
- `CodyAutocompleteProviderLatency`: enables provider-specific minimum latency for non-Anthropic providers.

Expected behavior:

- User-based latency is applied when `CodyAutocompleteUserLatency` is enabled.
- Language-based latency is applied when `CodyAutocompleteLanguageLatency` is enabled.
- Provider-based latency is applied when `CodyAutocompleteProviderLatency` is enabled.

The `getLatency` function now accepts feature flags and calculates latency accordingly. This allows more granular control over autocomplete latency based on user, provider, and language.
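A rough sketch of how a `getLatency`-style function could combine the three flags follows. This is not the actual implementation: the flag object shape, the latency values, and the low-performance language list are assumptions for illustration; only the flag names, the provider condition, and the function name come from the description above.

```typescript
// Hypothetical flag shape mirroring the three new feature flags.
interface AutocompleteLatencyFlags {
    user: boolean     // CodyAutocompleteUserLatency
    language: boolean // CodyAutocompleteLanguageLatency
    provider: boolean // CodyAutocompleteProviderLatency
}

// Illustrative set of low-performance languages; not the real list.
const LOW_PERFORMANCE_LANGUAGES = new Set(['css', 'html', 'markdown', 'plaintext'])

function getLatency(
    flags: AutocompleteLatencyFlags,
    provider: string,
    languageId: string
): number {
    let latency = 0
    if (flags.user) {
        latency += 350 // illustrative user-based minimum
    }
    if (flags.language && LOW_PERFORMANCE_LANGUAGES.has(languageId)) {
        latency += 1000 // illustrative extra for low-performance languages
    }
    if (flags.provider && provider !== 'anthropic') {
        latency += 400 // illustrative extra for non-Anthropic providers
    }
    return latency
}
```

Because each flag contributes independently, any subset of the three can be enabled on its own, which addresses the dependency concern raised earlier in the review.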
Test plan
Unit tests are updated.