Plugin Type
VSCode Extension
App Version
4.111.1
Description
Hey,
I am trying to set up the Kilo Code plugin to use an Ollama model running locally. Ollama is running and its API is up.
Kilo Code can talk to the API, since the models available in Ollama are visible in the UI config section.
The problem is that when talking to the agent, I never receive a response. This is a CPU-based system, so I am not expecting great results, but I would expect to see at least something returned.
Maybe I am using the wrong models, or I am missing some configuration - all hints and ideas on how to move forward are welcome :-)
Reproduction steps
- pull ibm/granite4:latest with ollama
- configure Kilo Code to use the local API: http://127.0.0.1:11434
- select ibm/granite4:latest as the model
- Type 'hi' in chat window
Nothing happens after that: the message does seem to trigger something in Ollama, but nothing is ever returned.
When I run the model by hand - ollama run ibm/granite4:latest - and interact with it manually, it works fine and even feels a little snappy. Where's the disconnect?
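To help narrow down whether the problem is in the extension or in Ollama itself, here is a minimal sketch of a direct chat request against the local API, bypassing the plugin entirely. Assumptions: Ollama is on its default port 11434 and exposes the standard /api/chat endpoint; the helper name ask_ollama is made up, and this may not be the exact payload Kilo Code sends.

```python
import json
import urllib.request

# Standard Ollama /api/chat request body (assumption: Kilo Code uses this
# endpoint; the extension's actual payload may include more fields).
payload = {
    "model": "ibm/granite4:latest",
    "messages": [{"role": "user", "content": "hi"}],
    "stream": False,  # ask for one complete reply instead of a token stream
}

def ask_ollama(base_url="http://127.0.0.1:11434"):
    """Send the chat request to a local Ollama server and return the reply text."""
    req = urllib.request.Request(
        base_url + "/api/chat",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    # CPU inference can be slow, so allow a generous timeout.
    with urllib.request.urlopen(req, timeout=600) as resp:
        return json.loads(resp.read())["message"]["content"]

# Usage (requires a running Ollama instance):
#   print(ask_ollama())
```

If this call returns text but the extension still shows nothing, the model and server are fine and the issue is on the plugin side (for example, the WSL2-vs-Windows networking boundary between VS Code and Ollama).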
Provider
Kilo Code
Model
ibm/granite4:latest
System Information
Windows 11, using WSL2