
Autocomplete: add deepseek-coder:6.7b support #2966

Merged · 3 commits · Jan 31, 2024
Conversation

@valerybugakov (Member) commented on Jan 31, 2024
Context

Test plan

  1. Install and run Ollama.
  2. Download one of the supported local models:
  3. Update Cody's VS Code settings to use the unstable-ollama autocomplete provider.
  4. Confirm Cody uses Ollama by checking the Cody output channel or the autocomplete trace view (available via the command palette).
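Step 3 above can be sketched as a VS Code `settings.json` fragment. This is a hypothetical illustration: the provider name `unstable-ollama` comes from the test plan, but the exact setting key (`cody.autocomplete.advanced.provider`) is an assumption and should be checked against Cody's settings reference.

```jsonc
{
  // Assumed setting key — "unstable-ollama" is the provider named in the test plan
  "cody.autocomplete.advanced.provider": "unstable-ollama"
}
```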

@valerybugakov merged commit 6971438 into main on Jan 31, 2024
34 checks passed
@valerybugakov deleted the vb/deepseek-coder branch on January 31, 2024 at 08:05
philipp-spiess pushed a commit that referenced this pull request Feb 1, 2024
- Adds autocomplete support for local inference with
[deepseek-coder:6.7b-base-q4_K_M](https://ollama.ai/library/deepseek-coder:6.7b-base-q4_K_M),
powered by Ollama.
- Adds a layer of abstraction to simplify adding new Ollama models in the future.
- Sets `deepseek-coder:6.7` as the default model for the Ollama provider.
- Should be tested together with the follow-up performance improvements in
#2967
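The local-inference setup described above can be sketched as follows. This assumes the Ollama CLI is installed; the model tag is taken from the link in the description, and the default port is Ollama's documented default.

```shell
# Pull the quantized DeepSeek Coder base model referenced in this PR
ollama pull deepseek-coder:6.7b-base-q4_K_M

# Start the Ollama server (listens on http://localhost:11434 by default)
ollama serve
```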