Question
This time around, setting up opencode with Ollama for offline use is giving me a lot of trouble.
This is what my configuration file looks like:
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "ollama": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "Ollama (PC1)",
      "options": {
        "baseURL": "http://localhost:11434/v1"
      },
      "models": {
        "ministral-3:3b": {
          "name": "ministral-3:3b"
        }
      }
    }
  }
}
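As far as I understand, the model tag in the config has to match a model that is actually pulled locally; a quick way to double-check the tag (assuming a standard Ollama install) is:

ollama list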
Whenever I run the /models command, it does not list my Ollama models.
I am currently on Debian 13, running opencode version 1.0.134 and Ollama version 0.13.1.
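In case it helps with debugging, the OpenAI-compatible endpoint that the baseURL points at can be queried directly to see which models Ollama is serving (this assumes the default port 11434):

curl http://localhost:11434/v1/models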