
Ollama provider compatibility broken #5013

@MartinHarding1998

Description

App Version

v3.21.3

API Provider

Ollama

Model Used

N/A

🔁 Steps to Reproduce

System Setup:

  • Linux: Debian 12 (Bookworm)
  • Roo Code v3.21.3
  • Defaults except for the model name; tested with several models, for example qwen coder and codellama.

The Issue: v3.21.3 (and, it seems, some other recent versions; I'll update the thread below with the versions I tried) refuses to work with Ollama on the same local installation, reporting that the provided model is "not available" even though it is clearly pulled in Ollama.

What I'm doing: Setting up Roo Code v3.21.3 with a specific model in Ollama.

Expected result: Roo Code accepts the model and works.

Actual result: Roo Code reports that the model is unavailable. Attempted with several models (one from HF, one from the Ollama model database, and one defined locally); none of them worked.

Note: v3.21.2 works, so the problem must have been introduced in v3.21.3.

💥 Outcome Summary

Expected Roo Code to work with the Ollama provider, but it refuses to accept the model and falls back to a context window size of "1" if you attempt to run anyway.

📄 Relevant Logs or Errors (Optional)

Developer Tools show the following two errors immediately upon opening the Roo Code settings page:

workbench.desktop.main.js:35   ERR [Extension Host] Error parsing Ollama models response: {
  "issues": [
    {
      "code": "invalid_type",
      "expected": "array",
      "received": "null",
      "path": [
        "models",
        2,
        "details",
        "families"
      ],
      "message": "Expected array, received null"
    }
  ],
  "name": "ZodError"
}




workbench.desktop.main.js:1281 [Extension Host] Error parsing Ollama models response: {
  "issues": [
    {
      "code": "invalid_type",
      "expected": "array",
      "received": "null",
      "path": [
        "models",
        2,
        "details",
        "families"
      ],
      "message": "Expected array, received null"
    }
  ],
  "name": "ZodError"
}
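For context, here is a minimal TypeScript sketch of what the ZodError above implies. This is not Roo Code's actual code; the schema shape and model names are inferred from the error and the `ollama list` output below. The point is that a strict validator requiring `details.families` to be an array rejects the whole `/api/tags` response when any one model reports `"families": null`, while a tolerant parser would simply coerce the null to an empty list:

```typescript
// Hypothetical payload shaped like Ollama's /api/tags response; the third
// entry (index 2, matching the "path" in the ZodError) has families: null.
type OllamaModel = {
  name: string;
  details: { families: string[] | null };
};

const payload: { models: OllamaModel[] } = {
  models: [
    { name: "fixed-qwen2.5-coder-32b-128k-q6_k:latest", details: { families: ["qwen2"] } },
    { name: "sammcj/qwen2.5-coder-32b-128k:q6_k", details: { families: ["qwen2"] } },
    { name: "codellama:34b-code", details: { families: null } },
  ],
};

// Strict validation, mirroring what the ZodError reports: null is rejected.
function strictFamilies(m: OllamaModel): string[] {
  if (!Array.isArray(m.details.families)) {
    throw new Error(
      `Expected array, received ${m.details.families === null ? "null" : typeof m.details.families}`
    );
  }
  return m.details.families;
}

// Tolerant parsing: treat a missing/null families field as an empty array,
// so one model without family metadata does not break the whole model list.
function lenientFamilies(m: OllamaModel): string[] {
  return m.details.families ?? [];
}

console.log(lenientFamilies(payload.models[2])); // prints []
```

In other words, a single model in the list with `"families": null` (here `codellama:34b-code` is only a guess at which one) would be enough to make strict parsing fail for the entire response, which would explain why every model shows as unavailable even though `ollama list` and the `/api/tags` requests succeed.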



All the models that I'm trying to use are definitely pulled and ready:

root@debian:/home/myuser/.config# ollama list
NAME                                                         ID              SIZE     MODIFIED     
fixed-qwen2.5-coder-32b-128k-q6_k:latest                     1af42851b709    26 GB    2 hours ago     
sammcj/qwen2.5-coder-32b-128k:q6_k                           bf89d7131e9e    26 GB    3 hours ago     
codellama:34b-code                                           d42f383a80dd    19 GB    26 hours ago    
hf.co/unsloth/Qwen2.5-Coder-32B-Instruct-128K-GGUF:latest    5101602c9038    19 GB    46 hours ago 


The Ollama service log (via systemctl status ollama) shows that Roo Code's requests reach Ollama and succeed:

Jun 22 15:18:46 debian ollama[2698]: [GIN] 2025/06/22 - 15:18:46 | 200 |   25.824063ms |       127.0.0.1 | GET      "/api/tags"
Jun 22 15:20:51 debian ollama[2698]: [GIN] 2025/06/22 - 15:20:51 | 200 |     7.74348ms |       127.0.0.1 | GET      "/api/tags"
Jun 22 15:32:59 debian ollama[2698]: [GIN] 2025/06/22 - 15:32:59 | 200 |    9.580021ms |       127.0.0.1 | GET      "/api/tags"
Jun 22 16:37:41 debian ollama[2698]: [GIN] 2025/06/22 - 16:37:41 | 200 |    1.372339ms |       127.0.0.1 | GET      "/api/tags"

Metadata

Labels

Issue - In Progress (Someone is actively working on this. Should link to a PR soon.), bug (Something isn't working)

Projects

Status

Done
