Description
Tool call output appears in the message content:
Plugins
none
OpenCode version
1.14.40
Steps to reproduce
start llama.cpp server (b9041)
./llama-server -hf unsloth/Qwen3-Coder-30B-A3B-Instruct-GGUF:Q4_K_XL --threads -1 --parallel 1 --ctx-size 131072 --temp 0.7 --min-p 0.0 --top-p 0.8 --top-k 20 --repeat-penalty 1.05 --predict 131072 --cache-type-k q8_0 --cache-type-v q8_0
set opencode.json
{
  "$schema": "https://opencode.ai/config.json",
  "enabled_providers": ["llama.cpp"],
  "lsp": true,
  "permission": { "bash": "ask", "edit": "ask" },
  "provider": {
    "llama.cpp": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "llama-server",
      "options": { "baseURL": "http://127.0.0.1:8080/v1" },
      "models": {
        "llama.cpp": { "name": "llama.cpp", "limit": { "context": 131072, "output": 131072 } }
      }
    }
  }
}
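As a quick sanity check that the configuration above is valid JSON and points OpenCode at the intended llama-server endpoint, it can be parsed standalone (a hypothetical snippet for verification only, not something OpenCode itself runs):

```python
import json

# The opencode.json contents from this report, verbatim.
config_text = '''
{
  "$schema": "https://opencode.ai/config.json",
  "enabled_providers": ["llama.cpp"],
  "lsp": true,
  "permission": {"bash": "ask", "edit": "ask"},
  "provider": {
    "llama.cpp": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "llama-server",
      "options": {"baseURL": "http://127.0.0.1:8080/v1"},
      "models": {
        "llama.cpp": {"name": "llama.cpp", "limit": {"context": 131072, "output": 131072}}
      }
    }
  }
}
'''

# json.loads raises json.JSONDecodeError if the file is malformed.
config = json.loads(config_text)
print(config["provider"]["llama.cpp"]["options"]["baseURL"])
```

If this prints `http://127.0.0.1:8080/v1`, the config itself is well-formed, which suggests the bug lies in how the tool-call response is handled rather than in the configuration.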
start opencode, run a prompt like "which package is currently installed?"
The problem does not occur on every attempt, but roughly 5 times out of 10.
output:
see screenshot
Screenshot and/or share link
Operating System
Ubuntu 24.04.4 LTS
Terminal
konsole