
Does Copilot support models that are locally hosted or proprietary but we have an API & API key? #1378

Open
pengaustin opened this issue Mar 20, 2025 · 1 comment
Labels
question Further information is requested

Comments

@pengaustin
Is your feature request related to a problem? Please describe.
Given an API & API key, how would I set up Copilot to use my own locally hosted or proprietary model?

Describe the solution you'd like
If this feature exists, a step-by-step setup procedure!

Describe alternatives you've considered
N/A

Additional context
N/A

@pengaustin changed the title from "Does Copilot support hosting of local models/proprietary models?" to "Does Copilot support models that are locally hosted or proprietary but we have an API & API key?" on Mar 20, 2025
@logancyang added the question label on Mar 21, 2025
@beeduino

In Copilot Settings, just add the correct custom model for chat and for embedding.
For each one, set the model name and provider separately.
An API key might not be needed.
On the "Basic" tab of Copilot Settings, select the proper default models for Chat and Embedding.
Make sure you can refresh the index.
Try to chat.

I just did this myself with the Ollama provider, using Llama3.2 as the chat model and bge-m3 (Ollama) for embeddings.
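A quick way to sanity-check such a local setup before pointing Copilot at it is to call the Ollama HTTP API directly. Below is a minimal Python sketch, assuming Ollama is serving on its default port 11434 and that the llama3.2 and bge-m3 models have already been pulled (the URL and model names are assumptions; adjust them to your configuration):

```python
import requests

OLLAMA_URL = "http://localhost:11434"  # Ollama's default local endpoint (assumed)

# 1. Verify the chat model responds, mirroring what Copilot's chat does.
chat_resp = requests.post(
    f"{OLLAMA_URL}/api/chat",
    json={
        "model": "llama3.2",  # the chat model configured in Copilot Settings
        "messages": [{"role": "user", "content": "Say hello in one word."}],
        "stream": False,  # ask for a single JSON response instead of a stream
    },
    timeout=60,
)
chat_resp.raise_for_status()
print("chat reply:", chat_resp.json()["message"]["content"])

# 2. Verify the embedding model works; Copilot uses it when refreshing the index.
embed_resp = requests.post(
    f"{OLLAMA_URL}/api/embeddings",
    json={"model": "bge-m3", "prompt": "test sentence"},
    timeout=60,
)
embed_resp.raise_for_status()
print("embedding dimensions:", len(embed_resp.json()["embedding"]))
```

If both calls succeed, the same model names should work in the Copilot Settings fields described above. If Copilot still cannot reach the server, CORS is a common culprit for clients embedded in desktop apps; Ollama's OLLAMA_ORIGINS environment variable controls which origins it accepts.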
