2 changes: 1 addition & 1 deletion README.md
@@ -1,6 +1,6 @@
# Sourcegraph Docs

<!-- Working branch for August Release -->
<!-- Working branch for September 2024 Release -->

Welcome to the Sourcegraph documentation! We're excited to have you contribute to our docs. We've recently re-architected our docs tech stack, which is now powered by Next.js and TailwindCSS and deployed on Vercel. This guide will walk you through the process of contributing to our documentation using the new tech stack.

Expand Down
2 changes: 1 addition & 1 deletion docs.config.js
@@ -1,5 +1,5 @@
const config = {
DOCS_LATEST_VERSION: '5.6'
DOCS_LATEST_VERSION: '5.7'
};

module.exports = config;
2 changes: 1 addition & 1 deletion docs/admin/config/webhooks/index.mdx
@@ -1,4 +1,4 @@
# Webhooks

- [Incoming webhooks](/admin/config/webhooks/incoming)
- [Outgoing webhooks](/admin/config/webhooks/outgoing) (Beta)
- [Outgoing webhooks](/admin/config/webhooks/outgoing)
2 changes: 1 addition & 1 deletion docs/admin/config/webhooks/outgoing.mdx
@@ -1,6 +1,6 @@
# Outgoing webhooks

<Callout type="note"> This feature is currently in beta and supported on Sourcegraph versions 5.0 or more.</Callout>
<Callout type="note"> This feature is supported on Sourcegraph versions 5.0 or more.</Callout>

Outgoing webhooks can be configured on a Sourcegraph instance in order to send Sourcegraph events to external tools and services. This allows for deeper integrations between Sourcegraph and other applications.

Expand Down
49 changes: 23 additions & 26 deletions docs/code-search/types/search-jobs.mdx
@@ -1,13 +1,25 @@
# Search Jobs

<p className="subtitle">Use Search Jobs to search code at scale for large-scale organizations.</p>

<Callout type="note" title="Beta"> Search Jobs feature is in Beta stage and only available for Enterprise accounts. It is enabled by default since 5.3.0.</Callout>
Use Search Jobs to search code at scale in large organizations.

Search Jobs allows you to run search queries across your organization's codebase (all repositories, branches, and revisions) at scale. It enhances Sourcegraph's existing search capabilities, enabling you to run searches without query timeouts or incomplete results.

With Search Jobs, you can start a search, let it run in the background, and then download the results from the Search Jobs UI when it's done. Site administrators can **enable** or **disable** the Search Jobs feature, making it accessible to all users on the Sourcegraph instance.

## Using Search Jobs

To use Search Jobs, you need to:

- Run a search query from your Sourcegraph instance
- Click the result menu below the search bar to see if your query is supported by Search Jobs

![run-query-for-search-jobs](https://storage.googleapis.com/sourcegraph-assets/Docs/search-jobs/search-jobs-create.png)

- If your query is valid, click **Create a search job** to initiate the search job
- You will be redirected to the "Search Jobs UI" page at `/search-jobs`, where you can view all your created search jobs. If you're a site admin, you can also view search jobs from other users on the instance

![view-search-jobs](https://storage.googleapis.com/sourcegraph-assets/Docs/search-jobs/search-jobs-manage.png)

## Search results format

The downloaded results are formatted as [JSON lines](https://jsonlines.org).
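Since each line is an independent JSON object, a downloaded results file can be processed as a stream. Below is a minimal TypeScript (Node.js) sketch of doing so; the file name and the `repository`/`path` fields are illustrative assumptions, not the documented result schema.

```typescript
import { createReadStream } from "node:fs";
import { createInterface } from "node:readline";

// Stream a downloaded Search Jobs results file and parse one JSON object per
// line. The file name and the `repository`/`path` fields are illustrative
// assumptions, not the documented result schema.
async function readSearchJobResults(filePath: string): Promise<void> {
  const lines = createInterface({ input: createReadStream(filePath) });
  for await (const line of lines) {
    if (!line.trim()) continue; // skip blank lines
    const result = JSON.parse(line); // one result object per line
    console.log(result.repository, result.path);
  }
}

readSearchJobResults("search-job-results.jsonl").catch(console.error);
```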
@@ -67,21 +79,6 @@ If you would like to allow your Sourcegraph instance to control the creation and

- `SEARCH_JOBS_UPLOAD_MANAGE_BUCKET=true`

## Using Search Jobs

To use Search Jobs, you need to:

- Run a search query from your Sourcegraph instance
- Click the result menu below the search bar to see if your query is valid for the long search

![run-query-for-search-jobs](https://storage.googleapis.com/sourcegraph-assets/Docs/query-serach-jobs.png)

- If your query is valid, click **Run search job** to initiate the search job
- You will be redirected to the "Search Jobs UI" page at `/search-jobs`, where you can view all your created search jobs. If you're a site admin, you can also view search jobs from other users on the instance

![view-search-jobs](https://storage.googleapis.com/sourcegraph-assets/Docs/view-search-jobs.png)


## Supported result types

Search Jobs supports the following result types:
@@ -99,20 +96,20 @@ The following result types are not supported:
The following elements of our query language are not supported:

- file predicates, such as `file:has.content`, `file:has.owner`, `file:has.contributor`, `file:contains.content`
- `.*` regexp search
- catch-all `.*` regexp search
- Multiple `rev` filters
- Queries with the `index:` filter

<Callout type="note">The search bar already supports exhaustive searches with the `count:all` operator. However, these searches are subject to lower timeouts, making them harder to use in large codebases.</Callout>

## Disable Search Jobs
<Callout type="note">
Alternatively, the search bar supports the `count:all` operator which increases result limits and timeouts.
This works well if the search completes within a few minutes and the number of results is less than the configured display limit.
For longer running searches and searches with huge result sets, Search Jobs is the better choice.
</Callout>

Follow these steps to disable Search Jobs and to hide the feature in the Sourcegraph UI:
## Disable Search Jobs

- Login to your Sourcegraph instance and go to the site admin
- Next, click the site configuration
- From here, you'll see `experimentalFeatures`
- Set `searchJobs` to `false` and then refresh the page
To disable Search Jobs, set `DISABLE_SEARCH_JOBS=true` in your frontend and worker services.

## FAQ

2 changes: 1 addition & 1 deletion docs/code_monitoring/how-tos/index.mdx
@@ -2,4 +2,4 @@

* [Starting points](/code_monitoring/how-tos/starting_points)
* [Setting up Slack notifications](/code_monitoring/how-tos/slack)
* [Setting up Webhook notifications](/code_monitoring/how-tos/webhook) (Beta)
* [Setting up Webhook notifications](/code_monitoring/how-tos/webhook)
2 changes: 0 additions & 2 deletions docs/code_monitoring/how-tos/webhook.mdx
@@ -1,7 +1,5 @@
# Setting up Webhook notifications

<Callout type="note">This feature is in beta.</Callout>

Webhook notifications provide a way to execute custom responses to a code monitor notification.
They are implemented as a POST request to a URL of your choice. The body of the request is defined
by Sourcegraph, and contains all the information available about the cause of the notification.
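As a rough sketch of what a receiver might look like, here is a minimal TypeScript (Node.js) HTTP handler. The port and the payload fields it logs (`description`, `query`) are illustrative assumptions, since the exact body schema is defined by Sourcegraph and not reproduced here.

```typescript
import { createServer } from "node:http";

// Minimal sketch of a webhook receiver for code monitor notifications.
// The payload fields read below (`description`, `query`) are illustrative
// assumptions, not the documented body schema.
const server = createServer((req, res) => {
  if (req.method !== "POST") {
    res.statusCode = 405;
    res.end();
    return;
  }
  let body = "";
  req.on("data", (chunk) => (body += chunk));
  req.on("end", () => {
    try {
      const payload = JSON.parse(body);
      console.log("Code monitor fired:", payload.description, payload.query);
      res.statusCode = 200;
      res.end("ok");
    } catch {
      res.statusCode = 400;
      res.end("invalid JSON");
    }
  });
});

// The port is an arbitrary choice for this example.
server.listen(3000);
```

Any HTTP endpoint that accepts a POST request works equally well; this sketch only demonstrates the general shape of the integration.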
2 changes: 1 addition & 1 deletion docs/cody/capabilities/autocomplete.mdx
@@ -18,7 +18,7 @@ Cody's autocompletion model has been designed to enhance speed, accuracy, and th
- **Improved accuracy for multi-line completions**: Completions across multiple lines are more relevant and accurately aligned with the surrounding code context
- **Higher completion acceptance rates**: The average completion acceptance rate (CAR) is improved by more than 4%, providing a more intuitive user interaction

On the technical side, Cody's autocomplete is optimized for both server-side and client-side performance, ensuring seamless integration into your coding workflow. The **default** autocomplete model for Cody Free and Pro users is **[DeepSeek V2](https://huggingface.co/deepseek-ai/DeepSeek-V2)**, which significantly helps boost both the responsiveness and accuracy of autocomplete. Cody Enterprise users get **StarCoder** as the default autocomplete model.
On the technical side, Cody's autocomplete is optimized for both server-side and client-side performance, ensuring seamless integration into your coding workflow. The **default** autocomplete model for Cody Free, Pro, and Enterprise users is **[DeepSeek V2](https://huggingface.co/deepseek-ai/DeepSeek-V2)**, which significantly helps boost both the responsiveness and accuracy of autocomplete.

## Prerequisites

2 changes: 1 addition & 1 deletion docs/cody/capabilities/supported-models.mdx
@@ -31,7 +31,7 @@ Cody uses a set of models for autocomplete which are suited for the low latency

| **Provider** | **Model** | **Free** | **Pro** | **Enterprise** | | | | |
| :-------------------- | :---------------------------------------------------------------------------------------- | :------- | :------ | :------------- | --- | --- | --- | --- |
| Fireworks.ai | [DeepSeek-V2](https://huggingface.co/deepseek-ai/DeepSeek-V2) | ✅ | ✅ | - | | | | |
| Fireworks.ai | [DeepSeek-V2](https://huggingface.co/deepseek-ai/DeepSeek-V2) | ✅ | ✅ | | | | | |
| Fireworks.ai | [StarCoder](https://arxiv.org/abs/2305.06161) | - | - | ✅ | | | | |
| Anthropic | [Claude Instant](https://docs.anthropic.com/claude/docs/models-overview#model-comparison) | - | - | ✅ | | | | |
| Google Gemini (Beta) | [1.5 Flash](https://deepmind.google/technologies/gemini/flash/) | - | - | ✅ | | | | |
10 changes: 5 additions & 5 deletions docs/cody/clients/cody-with-sourcegraph.mdx
@@ -2,12 +2,10 @@

<p className="subtitle">Learn how to use Cody in the web interface with your Sourcegraph.com instance.</p>

<Callout type="info"> The new chat UI for Cody for web is currently in Beta and is available to users on Sourcegraph versions >=5.5.</Callout>

In addition to the Cody extensions for [VS Code](/cody/clients/install-vscode), [JetBrains](/cody/clients/install-jetbrains) IDEs, and [Neovim](/cody/clients/install-neovim), Cody is also available in the Sourcegraph web app. Community users can use Cody for free by logging into their accounts on Sourcegraph.com, and enterprise users can use Cody within their Sourcegraph instance.

<LinkCards>
<LinkCard href="https://sourcegraph.com/cody/chat" imgSrc="https://sourcegraph.com/.assets/img/sourcegraph-mark.svg" imgAlt="Cody for Web" title="Cody for Web (Beta)" description="Use Cody in the Sourcegraph Web App." />
<LinkCard href="https://sourcegraph.com/cody/chat" imgSrc="https://sourcegraph.com/.assets/img/sourcegraph-mark.svg" imgAlt="Cody for Web" title="Cody for Web" description="Use Cody in the Sourcegraph Web App." />
</LinkCards>

## Initial setup
@@ -27,13 +25,15 @@ The chat interface for Cody on the web is similar to the one you get with the [V

The chat interface works alongside your Code Search queries on a per-chat basis. You cannot run multiple chats and store them in parallel. A new chat window opens whenever you click the Cody button from the query editor or the top header.

<Callout type="info"> The new and improved chat UI for Cody for web is currently available to users on Sourcegraph versions >=5.5. It's recommeded to update your Sourcegraph instance to the latest version to use this new chat interface. </Callout>

## LLM Selection

Sourcegraph.com users with Cody **Free** and **Pro** can choose from a list of supported LLM models for a chat. Claude Sonnet 3 is the default LLM model, but users can select the LLM of their choice from the drop-down menu.
Sourcegraph.com users with Cody **Free** and **Pro** can choose from a list of supported LLM models for a chat. Claude 3.5 Sonnet is the default LLM, but users can select the LLM of their choice from the drop-down menu.

![llm-select-web](https://storage.googleapis.com/sourcegraph-assets/Docs/llm-select-web-0724.jpg)

Users on an Enterprise Sourcegraph instance do not have the option to choose an LLM model. Their site admin will configure the default LLM model for chat.
Users on an Enterprise Sourcegraph instance do not have the option to choose an LLM model. Their site admin will configure the default LLM model for chat. However, Enterprise users with the new [model configuration](/cody/clients/model-configuration) can use the LLM selection dropdown to choose a chat model.

## Selecting Context with @-mentions

16 changes: 16 additions & 0 deletions docs/cody/clients/enable-cody-enterprise.mdx
@@ -67,6 +67,22 @@ Cody Enterprise supports searching up to 10 repositories to find relevant contex
* In VS Code, open a new Cody chat, type `@`, and select `Remote Repositories` to search other repositories for context
* In JetBrains, use the enhanced context selector

### @-mention directory

<Callout type="info">@-mentioning directory is available for Enterprise users on VS Code, JetBrains, and Cody Web.</Callout>

To better support teams working with large monorepos, Enterprise users can `@-mention` directories when chatting with Cody. This lets you point Cody at specific directories and sub-directories within the monorepo for more precise context.

To do this, type `@` in the chat and select **Directories** to search directories in your codebase for context.

<video width="1920" height="1080" loop playsInline controls style={{ width: '100%', height: 'auto' }}>
<source src="https://storage.googleapis.com/sourcegraph-assets/Docs/Media/%40-mention-dir-0824.mp4" type="video/mp4"/>
</video>

Please note that you can only `@-mention` remote directories (i.e., directories in your Sourcegraph instance), not local directories. This means recent changes to your directories can't be used as context until your Sourcegraph instance re-indexes them.

If you want to include recent changes that haven't been indexed in your Sourcegraph instance, you can `@-mention` specific files, lines of code, or symbols.

## Supported LLM models

Sourcegraph Enterprise supports many different LLM providers and models. You can use state-of-the-art code completion models such as Anthropic's Claude or OpenAI's ChatGPT by adjusting your Sourcegraph instance's configuration.
3 changes: 2 additions & 1 deletion docs/cody/clients/feature-reference.mdx
@@ -21,7 +21,7 @@
| Local context | ✅ | ✅ | ❌ |
| OpenCtx context providers (experimental) | ✅ | ❌ | ❌ |
| **Prompts and Commands** | | | |
| Access to prompts and Prompt library | ✅ | | ✅ |
| Access to prompts and Prompt library | ✅ | | ✅ |
| Custom commands | ✅ | ❌ | ❌ |
| Edit code | ✅ | ✅ | ❌ |
| Generate unit test | ✅ | ✅ | ❌ |
@@ -49,5 +49,6 @@ Few exceptions that apply to Cody Pro and Cody Enterprise users:
- Multi-repo context is supported on VS Code, JetBrains, and Web
- [Guardrails](/cody/clients/enable-cody-enterprise#guardrails) are supported on VS Code, JetBrains, and Web
- [Repo-based Cody Context Filters](/cody/capabilities/ignore-context#cody-context-filters) are supported on VS Code, JetBrains, and Web
- `@-mention` directories are supported on VS Code, JetBrains, and Web

</Accordion>