Releases: sourcegraph/cody
Cody for VS Code 1.8.0
✨ See the What’s new in v1.8 blog post for what’s new in this release since v1.6 ✨
v1.8.0 Changes
- Chat: Adds experimental support for local Ollama chat models. Simply start the Ollama app. You should be able to find the models you have pulled from Ollama in the model dropdown list in your chat panel after restarting VS Code. For detailed instructions, see pull/3282 by @abeatrix
- Chat: Adds support for line ranges with @-mentioned files (Example: `Explain @src/README.md:1-5`). pull/3174 by @abeatrix
- Chat: Command prompts are now editable and compatible with @ mentions. pull/3243 by @abeatrix
- Chat: Add Claude 3 Sonnet and Claude 3 Opus for Pro users. pull/3301 by @philipp-spiess
- Commands: Updated the prompts for the `Explain Code` and `Find Code Smell` commands to include file ranges. pull/3243 by @abeatrix
- Custom Command: All custom commands are now listed individually under the `Custom Commands` section in the Cody sidebar. pull/3245 by @abeatrix
- Custom Commands: You can now assign keybindings to individual custom commands. Simply search for `cody.command.custom.{CUSTOM_COMMAND_NAME}` (e.g. `cody.command.custom.commit`) in the Keyboard Shortcuts editor to add a keybinding. pull/3242 by @abeatrix
- Chat/Search: Local indexes are rebuilt automatically on a daily cadence when they are stale. Staleness is determined by checking whether files have changed across Git commits and in the set of working file updates not yet committed. pull/3261 by @beyang
- Debug: Added `Export Logs` functionality to the `Settings & Support` sidebar for exporting output logs when `cody.debug.enabled` is enabled. Also available in the Command Palette under `Cody: Export Logs`. pull/3256 by @abeatrix
- Auth: Adds a new onboarding flow, behind a feature flag, that does not require the redirect back to VS Code. pull/3244 by @philipp-spiess
- Font: Adds Ollama logo. pull/3281 by @abeatrix
- Auth: Logging in via redirect should now work in Cursor. This requires Sourcegraph 5.3.2 or later. pull/3241 by @beyang
- Chat: Fixed the error `found consecutive messages with the same speaker 'assistant'` that occurred when the prompt length exceeded the limit. pull/3228 by @abeatrix
- Edit: Fixed an issue where preceding and following text would not be included for instruction-based Edits. pull/3309 by @umpox
- Debug: The `cody.debug.enabled` setting is now set to `true` by default. pull/ by @abeatrix
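As a quick sketch of the custom-command keybindings described above, an entry can also be added directly to VS Code's `keybindings.json`. The command name `commit` and the key chord here are illustrative examples, not defaults; substitute your own custom command's name.

```json
// keybindings.json (illustrative — key chord and command name are examples)
[
  {
    "key": "ctrl+alt+c",
    "command": "cody.command.custom.commit"
  }
]
```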
New Contributors
- @hitesh-1997 made their first contribution in #3111
- @PriNova made their first contribution in #3273
Full Changelog: vscode-v1.6.0...vscode-v1.8.0
Cody for VS Code 1.6.1
✨ See the What’s new in v1.6 blog post for what’s new in this release since v1.5 ✨
v1.6.1 Changes
- Autocomplete: Reduce the adaptive timeout to match latency improvements by @valerybugakov in #3283
Full Changelog: vscode-v1.6.0...vscode-v1.6.1
Cody for VS Code 1.6.0
✨ See the What’s new in v1.6 blog post for what’s new in this release since v1.5 ✨
v1.6.0 Changes
- Autocomplete: Adds a new experimental throttling mechanism that should decrease latency and backend load by @philipp-spiess in #3186
- Edit: Added keyboard shortcuts for codelens actions such as "Undo" and "Retry" in #2757
- Chat: Displays warnings for large @-mentioned files during selection by @abeatrix in #3118
- Once sourcegraph/sourcegraph#60515 is deployed, login works in VSCodium by @dominiccooney in #3167
- Autocomplete: Fixed an issue where the loading indicator might get stuck in the loading state by @philipp-spiess in #3178
- Autocomplete: Fixes an issue where Ollama results were sometimes not visible when the current line has text after the cursor by @philipp-spiess in #3213
- Chat: Fixed an issue where Cody Chat steals focus from file editor after a request is completed by @abeatrix in #3147
- Chat: Fixed an issue where the links in the welcome message for chat are unclickable by @abeatrix in #3155
- Chat: File range is now displayed correctly in the chat view by @abeatrix in #3172
- Autocomplete: Removes the latency for cached completions in #3138
- Autocomplete: Enable the recent jaccard similarity improvements by default by @philipp-spiess in #3135
- Autocomplete: Start retrieval phase earlier to improve latency by @philipp-spiess in #3149
- Autocomplete: Trigger one LLM request instead of three for multiline completions to reduce the response latency by @valerybugakov in #3176
- Autocomplete: Allow the client to pick up feature flag changes that were previously requiring a client restart by @philipp-spiess in #2992
- Chat: Add tracing by @philipp-spiess in #3168
- Command: Leading slashes are removed from command names in the command menu by @abeatrix in #3061
Full Changelog: vscode-v1.4.4...vscode-v1.6.0
Cody for VS Code 1.4.4
✨ See the What’s new in v1.4 blog post for what’s new in this release since v1.3 ✨
v1.4.4 Changes
Full Changelog: vscode-v1.4.3...vscode-v1.4.4
Cody for VS Code 1.4.3
✨ See the What’s new in v1.4 blog post for what’s new in this release since v1.3 ✨
v1.4.3 Changes
- Autocomplete: Updated the BFG binary version by @valerybugakov in #3130
Full Changelog: vscode-v1.4.2...vscode-v1.4.3
Cody for VS Code 1.4.2
✨ See the What’s new in v1.4 blog post for what’s new in this release since v1.3 ✨
v1.4.2 Changes
- Chat: Fixed an issue where Cody would sometimes exceed the context window limit for shorter context OpenAI models by @beyang in #3121
Full Changelog: vscode-v1.4.1...vscode-v1.4.2
Cody for VS Code 1.2.4
✨ See the What’s new in v1.2 blog post for what’s new in this release since v1.1 ✨
v1.2.4 Changes
- Chat: Fixed an issue where Cody would sometimes exceed the context window limit for shorter context OpenAI models by @beyang in #3121
Full Changelog: vscode-v1.2.3...vscode-v1.2.4
Cody for VS Code 1.4.1
✨ See the What’s new in v1.4 blog post for what’s new in this release since v1.3 ✨
v1.4.1 Changes
- Chat: Support @-mentions in mid-sentence by @abeatrix in #3043
- Chat: Support @-mentions in editing mode by @abeatrix in #3091
- Autocomplete: Fixed the completion partial removal upon acceptance caused by `cody.autocomplete.formatOnAccept` by @valerybugakov in #3083
- Autocomplete: Improve client-side tracing to get a better understanding of the E2E latency by @philipp-spiess in #3034
- Autocomplete: Move some work off the critical path in an attempt to further reduce latency by @philipp-spiess in #3096
- Custom Command: The `description` field is now optional and will default to using the command prompt by @abeatrix in #3025
Full Changelog: vscode-v1.4.0...vscode-v1.4.1
Cody for VS Code 1.4.0
✨ See the What’s new in v1.4 blog post for what’s new in this release since v1.3 ✨
v1.4.0 Changes
- Autocomplete: Add a new `cody.autocomplete.disableInsideComments` option to prevent completions from being displayed while writing code comments by @philipp-spiess in #3049
- Autocomplete: Added a shortcut to go to the Autocomplete settings from the Cody Settings overlay by @philipp-spiess in #3048
- Chat: Display the Cody icon in the editor title of the chat panels when `cody.editorTitleCommandIcon` is enabled by @abeatrix in #2937
- Command: The `Generate Unit Tests` command now functions as an inline edit command. When executed, the new tests will be automatically appended to the test file. If no existing test file is found, a temporary one will be created by @abeatrix in #2959
- Command: You can now highlight the output in your terminal panel and right-click to `Ask Cody to Explain` by @abeatrix in #3008
- Edit: Added a multi-model selector to the Edit input, allowing quick access to change the Edit LLM by @umpox in #2951
- Edit: Added Cody Pro support for models: GPT-4, GPT-3.5, Claude 2.1 and Claude Instant by @umpox in #2951
- Edit: Added new keyboard shortcuts for Edit (`Alt+K`) and Chat (`Alt+L`) by @umpox in #2865
- Edit: Improved the input UX. You can now adjust the range of the Edit, select from available symbols in the document, and get quick access to the "Document" and "Test" commands by @umpox in #2884
- Edit/Chat: Added "ghost" text alongside code to showcase Edit and Chat commands. Enable it by setting `cody.commandHints.enabled` to `true` by @umpox in #2865
- [Internal] Command: Added new code lenses for generating additional unit tests by @abeatrix in #2959
- Chat: Messages without enhanced context should not include the sparkle emoji in context list by @abeatrix in #3006
- Custom Command: Fixed an issue where custom commands could fail to load due to an invalid entry (e.g. missing prompt) by @abeatrix in #3012
- Edit: Fixed an issue where "Ask Cody to Explain" would result in an error by @umpox in #3015
- Autocomplete: Expanded the configuration list to include `astro`, `rust`, `svelte`, and `elixir` for enhanced detection of multiline triggers by @valerybugakov in #3044
- Autocomplete: Improved the new jaccard similarity retriever and context mixing experiments by @philipp-spiess in #2898
- Autocomplete: Multiline completions are now enabled only for languages from a predefined list by @valerybugakov in #3044
- Autocomplete: Remove obvious prompt-continuations by @philipp-spiess in #2974
- Autocomplete: Enables the new fast-path mode for all Cody community users to directly connect with our inference service by @philipp-spiess in #2927
- Autocomplete: Rename the `unstable-ollama` option to `experimental-ollama` to better communicate the current state. We still support `unstable-ollama` in the config for backward compatibility by @philipp-spiess in #3077
- Chat: Edit buttons are disabled on messages generated by the default commands by @abeatrix in #3005
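As a minimal sketch, the two user-facing options introduced in this release can be toggled together in VS Code's `settings.json`. The values shown are illustrative choices, not the extension's defaults.

```json
// settings.json (illustrative values for options introduced in v1.4.0)
{
  // Suppress autocomplete while writing code comments
  "cody.autocomplete.disableInsideComments": true,
  // Show "ghost" text hints for the Edit and Chat commands
  "cody.commandHints.enabled": true
}
```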
Full Changelog: vscode-v1.2.3...vscode-v1.4.0
Cody for VS Code 1.2.3
✨ See the What’s new in v1.2 blog post for what’s new in this release since v1.1 ✨
v1.2.3 Changes
- Autocomplete: Local inference support with deepseek-coder powered by Ollama by @valerybugakov in #2966
- Autocomplete: Add a new experimental fast-path mode for Cody community users that connects directly to our inference services by @philipp-spiess in #2927
Full Changelog: vscode-v1.2.2...vscode-v1.2.3