Use Cody's chat to get contextually-aware answers to your questions.
You can **chat** with Cody to ask questions about your code, generate code, and edit code. By default, Cody has the context of your open file and entire repository, and you can use `@` to add context for specific files, symbols, remote repositories, or other non-code artifacts.

You can chat from the **chat** panel of the supported editor extensions ([VS Code](/clients/install-vscode), [JetBrains](/clients/install-jetbrains), [Visual Studio](/clients/install-visual-studio)) or in the [web](/clients/cody-with-sourcegraph) app.
## Prerequisites
To use Cody's chat, you'll need the following:
- A Free or Pro account via Sourcegraph.com or a Sourcegraph Enterprise account
- A supported editor extension (VS Code, JetBrains, Visual Studio) installed
## How does chat work?
Cody answers questions by searching your codebase and retrieving context relevant to your questions. Cody uses several methods to search for context, including Sourcegraph's native search and keyword search. Finding and using context allows Cody to make informed responses based on your code rather than being limited to general knowledge. When Cody retrieves context to answer a question, it will tell you which code files it read to generate its response.
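The retrieval step can be pictured with a toy sketch. This is not Sourcegraph's actual implementation — just a minimal illustration of keyword-based context ranking, where candidate files are scored by how many of the question's keywords they contain and the best matches become context:

```typescript
// Toy illustration of keyword-based context retrieval (not Cody's real code):
// score each candidate file by how many keywords from the question it contains,
// then keep the top-k matching files as context for the answer.
type FileEntry = { path: string; content: string };

function rankContext(question: string, files: FileEntry[], topK = 2): string[] {
  const keywords = question.toLowerCase().split(/\W+/).filter(w => w.length > 2);
  return files
    .map(f => ({
      path: f.path,
      score: keywords.filter(k => f.content.toLowerCase().includes(k)).length,
    }))
    .filter(s => s.score > 0)
    .sort((a, b) => b.score - a.score)
    .slice(0, topK)
    .map(s => s.path);
}

const files: FileEntry[] = [
  { path: "auth.ts", content: "export function login(user: string) { /* ... */ }" },
  { path: "cart.ts", content: "export function addToCart(item: string) { /* ... */ }" },
];
console.log(rankContext("How does user login work?", files)); // ["auth.ts"]
```

Real context fetching combines several signals (native code search, keyword search, your @-mentions), but the core idea is the same: only the most relevant files are sent along with your question.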
Cody can assist you with various use cases, such as:
- Generating an API call: Cody can analyze your API schema to provide context for the code it generates
- Locating a specific component in your codebase: Cody can identify and describe the files where a particular component is defined
- Handling questions that involve multiple files, like understanding data population in a React app: Cody can locate React component definitions, helping you understand how data is passed and where it originates
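For the first use case, the generated code might look like the sketch below. The endpoint and response shape here are hypothetical, purely for illustration — in practice Cody would infer the real ones from your API schema:

```typescript
// Hypothetical example of the kind of API call Cody might generate from a
// schema. The /api/users endpoint and the User shape are made up for
// illustration; they are not part of any real API.
type User = { id: number; name: string };

async function fetchUser(baseUrl: string, id: number): Promise<User> {
  const res = await fetch(`${baseUrl}/api/users/${id}`);
  if (!res.ok) {
    throw new Error(`Request failed: ${res.status}`);
  }
  return res.json() as Promise<User>;
}
```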
## Chat features

There are several features that you can use to improve your chat experience. These features may vary depending on the [client](/cody/clients) you are using. You can learn more about client support for these features in the [feature parity reference](/cody/clients/feature-reference#chat).
## Default context

When you start a new Cody chat, the input window opens with default `@-mention` context chips for the open file and the current repository.

At any point, you can edit these context chips or remove them entirely if you do not want to use them as context. A chat without any context chips instructs Cody to use no codebase context. However, you can always `@-mention` alternate files or symbols for Cody to use as a new context source.

When you have both a repository and files @-mentioned, Cody will search the repository for context while prioritizing the mentioned files.
## Add new context

You can add custom context by adding `@-mention` context chips to the chat. At any point, you can `@-mention` a repository, file, line range, or symbol to ask questions about your codebase. Cody will use this new context to generate contextually relevant code.
## OpenCtx context providers

OpenCtx context providers are in the Experimental stage for all Cody VS Code users. Enterprise users can also use this, but with limited support. If you have feedback or questions, please visit our [support forum](https://community.sourcegraph.com/c/openctx/10).

[OpenCtx](https://openctx.org/) is an open standard for bringing contextual info about code into your dev tools. Cody Free and Pro users can use OpenCtx providers to fetch and use context from the following sources:

- [Webpages](https://openctx.org/docs/providers/web) (via URL)
- [Jira tickets](https://openctx.org/docs/providers/jira)
- [Linear issues](https://openctx.org/docs/providers/linear-issues)
- [Notion pages](https://openctx.org/docs/providers/notion)
- [Google Docs](https://openctx.org/docs/providers/google-docs)
- [Sourcegraph code search](https://openctx.org/docs/providers/sourcegraph-search)

You can `@-mention` web URLs to pull in live information like docs. You can also connect Cody to OpenCtx to `@-mention` non-code artifacts like Google Docs, Notion pages, Jira tickets, and Linear issues.

## Run offline

Support with Ollama is currently in the Experimental stage and is available for Cody Free and Pro plans.

Cody chat can run offline with Ollama. Offline mode does not require you to sign in with your Sourcegraph account to use Ollama. Click the button below the Ollama logo, and you'll be ready to go.

You can still switch to your Sourcegraph account whenever you want to use Claude, OpenAI, Gemini, Mixtral, etc.

## LLM selection

Cody allows you to select the LLM you want to use for your chat, optimized for speed versus accuracy. Cody Free and Pro users can select from multiple models. Enterprise users with the new [model configuration](/cody/clients/model-configuration) can use the LLM selection dropdown to choose a chat model.
You can read about the supported LLM models [here](/cody/capabilities/supported-models#chat-and-commands).
## Smart Apply and Execute code suggestions

Cody lets you dynamically insert code from chat into your files with **Smart Apply**. Whenever Cody provides a code suggestion, you can click the **Apply** button. Cody will then analyze your open code file, find where that relevant code should live, and add a diff. For chat messages where Cody provides multiple code suggestions, you can apply each in sequence to go from chat suggestions to written code.

Smart Apply also supports executing commands in the terminal. When you ask Cody a question related to terminal commands, you can execute the suggestion in your terminal by clicking the **Execute** button in the chat window.
## Chat history

Cody keeps a history of your chat sessions. You can view it by clicking the **History** button in the chat panel. You can **Export** it to a JSON file for later use or click the **Delete all** button to clear the chat history.

## Prompts and Commands

Cody offers quick, ready-to-use [prompts and commands](/cody/capabilities/commands) for common actions to write, describe, fix, and smell code. These allow you to run predefined actions with smart context-fetching anywhere in the editor, like:

- **New Chat**: Ask Cody a question
- **Document Code**: Add code documentation
- **Edit Code**: Edit code with instructions
- **Explain Code**: Describe your code in more detail
- **Generate Unit Tests**: Write tests for your code

Read more about [prompts and commands](/cody/capabilities/commands).

## Ask Cody to write code

Cody chat can also write code in response to your questions. For example, in VS Code, ask Cody to "write a function that sorts an array in ascending order".

Cody provides code suggestions in the chat window, along with the following options for using the code:

- Use the **Copy Code** icon to copy the code suggestion to your clipboard and paste it into your code editor
- Use the **Insert Code at Cursor** icon to insert the code suggestion at the current cursor location
- Use the **Save Code to New File** icon to save the code suggestion to a new file in your project
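
For the sorting request above, Cody's suggestion would typically resemble the following — a representative sketch, not Cody's verbatim output:

```typescript
// Sorts an array of numbers in ascending order without mutating the input.
// Note: JavaScript's default Array.sort() compares values as strings,
// so a numeric comparator is required when sorting numbers.
function sortAscending(numbers: number[]): number[] {
  return [...numbers].sort((a, b) => a - b);
}

console.log(sortAscending([5, 3, 10, 1])); // [1, 3, 5, 10]
```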

If Cody's answer isn't helpful, you can try asking again with a different context:

- **Public knowledge only**: Cody will not use your own code files as context; it’ll only use knowledge trained into the base model.
- **Current file only**: Re-run the prompt using just the current file as context.
- **Add context**: Provides @-mention context options to improve the response by explicitly including files, symbols, remote repositories, or even web pages (by URL).