This extension allows you to interact with DeepSeek models from within GitHub Copilot Chat, fully locally and offline. It uses Ollama under the hood to provide a seamless experience.
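To illustrate how an extension like this could talk to Ollama, here is a minimal sketch of a non-streaming request to Ollama's `/api/generate` REST endpoint. The helper function and the exact call site are assumptions for illustration, not the extension's actual code; the default model name and host mirror the settings shown later in this README.

```typescript
// Shape of a non-streaming request body for Ollama's /api/generate endpoint.
interface GenerateRequest {
  model: string;
  prompt: string;
  stream: boolean;
}

// Hypothetical helper: build the request body, defaulting to the
// extension's default model.
function buildGenerateRequest(
  prompt: string,
  model = "deepseek-coder:1.3b"
): GenerateRequest {
  return { model, prompt, stream: false };
}

// The body would be POSTed to `${host}/api/generate`, e.g.:
// await fetch("http://localhost:11434/api/generate", {
//   method: "POST",
//   body: JSON.stringify(buildGenerateRequest("Explain this function")),
// });
```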
- Install the extension from the Visual Studio Code Marketplace
- Open the GitHub Copilot Chat panel
- In the chat, type `@deepseek` followed by your prompt
**Note:** During the first run, the extension will download the model. This may take a few minutes.
- Clone this repository
- Run `npm install`
- Run `npm run package`
- Install the generated `.vsix` file in Visual Studio Code
You can configure the extension by opening the Settings panel (or `settings.json`) and editing the following settings:
```json
{
  "deepseek.model.name": "deepseek-coder:1.3b",
  "deepseek.ollama.host": "http://localhost:11434"
}
```
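As a sketch of how these settings might be consumed, the snippet below merges user-supplied values with the extension's defaults. In a real extension the values would come from `vscode.workspace.getConfiguration("deepseek")`; the plain function here is a hypothetical stand-in so the defaulting logic is visible and self-contained.

```typescript
// Resolved settings the extension would work with.
interface DeepseekSettings {
  modelName: string;
  ollamaHost: string;
}

// Defaults matching the settings example above.
const DEFAULTS: DeepseekSettings = {
  modelName: "deepseek-coder:1.3b",
  ollamaHost: "http://localhost:11434",
};

// Merge partial user settings over the defaults.
function resolveSettings(
  user: Partial<DeepseekSettings> = {}
): DeepseekSettings {
  return { ...DEFAULTS, ...user };
}
```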
**Note:** You can find a list of available DeepSeek models at ollama.com.
Currently, the extension does not have access to your files, so it cannot provide context-aware completions; this will be addressed in a future version. As a workaround, copy the code you want the model to see and paste it into the chat.
This extension requires the Ollama app to be installed and running on your system. You can install it by following the instructions at ollama.com.
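One way to check that Ollama is reachable is to query its `/api/tags` endpoint, which lists locally installed models. The URL helper below is a hypothetical utility, not part of the extension; it normalizes a trailing slash on the configured host.

```typescript
// Build the URL of Ollama's /api/tags endpoint from a configured host,
// tolerating a trailing slash.
function tagsUrl(host: string): string {
  return host.replace(/\/+$/, "") + "/api/tags";
}

// Usage (requires a running Ollama server):
// const res = await fetch(tagsUrl("http://localhost:11434"));
// if (!res.ok) console.error("Ollama does not appear to be running");
```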