Is your feature request related to a problem? Please describe.
Given an API endpoint and an API key, how would I set up Copilot to use my own locally hosted or proprietary model?
Describe the solution you'd like
If this feature exists, a step-by-step setup procedure!
Describe alternatives you've considered
N/A
Additional context
N/A
pengaustin changed the title from "Does Copilot support hosting of local models/proprietary models?" to "Does Copilot support models that are locally hosted or proprietary but we have an API & API key?" on Mar 20, 2025
In Copilot Settings, add the correct custom models for chat and embedding:
1. For each of the two, set the model name and provider separately. An API key may not be needed (e.g. for a local provider).
2. On the "Basic" tab of Copilot Settings, select the proper default models for Chat and Embedding.
3. Make sure you can refresh the index.
4. Try to chat.

I just did this myself with the Ollama provider, using Llama3.2 as the chat model and bge-m3 (Ollama) for embeddings.
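To sanity-check a setup like the one above outside the plugin, here is a minimal sketch of the kind of requests a local Ollama provider serves. It assumes Ollama's default port (11434), its native `/api/chat` and `/api/embeddings` endpoints, and the model names mentioned above (`llama3.2`, `bge-m3`); adjust these to whatever you actually pulled. This is illustrative, not the plugin's own code.

```python
import json
import urllib.request

# Assumption: Ollama running locally on its default port.
OLLAMA = "http://localhost:11434"

# Payload for the chat model configured in Copilot Settings.
chat_payload = {
    "model": "llama3.2",
    "messages": [{"role": "user", "content": "Say hello"}],
    "stream": False,
}

# Payload for the embedding model used when refreshing the index.
embed_payload = {
    "model": "bge-m3",
    "prompt": "text to index",
}

def post(path: str, payload: dict) -> dict:
    """POST a JSON payload to the local Ollama server and decode the reply."""
    req = urllib.request.Request(
        OLLAMA + path,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Uncomment once Ollama is running with both models pulled:
# print(post("/api/chat", chat_payload)["message"]["content"])
# print(len(post("/api/embeddings", embed_payload)["embedding"]))
```

If both calls succeed, the same model names and base URL should work as the custom chat and embedding models in Copilot Settings.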