System prompt not sent to local LM Studio models #10781
Open
Labels
area:integration (Integrations: context providers, model providers, etc.), ide:vscode (Relates specifically to VS Code extension), kind:bug (Indicates an unexpected problem or unintended behavior), os:windows (Happening specifically on Windows)
Description
Summary
When using a local model served via LM Studio's OpenAI-compatible API, Continue does not appear to send its system prompt to the model. The model instead follows whatever system prompt is configured in LM Studio's UI preset.
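For context, an OpenAI-compatible chat completions request carries the system prompt as the first message with role `"system"`. A minimal sketch of the payload Continue would be expected to send to LM Studio's local endpoint (the model name matches the environment below; the prompt texts are illustrative placeholders, not Continue's actual prompt):

```python
import json

# Illustrative request body for an OpenAI-compatible chat completions
# endpoint such as LM Studio's local server. The system prompt shown
# here is a placeholder standing in for Continue's real system prompt.
payload = {
    "model": "openai/gpt-oss-20b-gguf",
    "messages": [
        {"role": "system", "content": "You are Continue, a coding assistant..."},
        {"role": "user", "content": "Write a Python script that draws ASCII art."},
    ],
}

# The bug reported here is that this "system" message appears to be
# missing or ignored, so LM Studio falls back to its UI preset prompt.
assert payload["messages"][0]["role"] == "system"
print(json.dumps(payload, indent=2))
```

Inspecting LM Studio's server log for incoming requests would show whether the `system` message is actually present.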
Steps to Reproduce
Expected Behavior
The model should receive Continue's system prompt and generate code as instructed.
Actual Behavior
The model follows LM Studio's UI-configured system prompt instead. For example, with a "Systems Architect" preset, the model outputs a SystemPlanJSON with `plan_summary`, `execution_steps`, etc. instead of actual code. This output gets written directly to the target file.
Evidence
A file (`draw_ascii.py`) was created with the model's planning output instead of Python code. The model's LM Studio system prompt was visible in the UI: "You are a world-class Systems Architect. Your job is to take the user's request and produce a high-level technical plan..."
Environment
openai/gpt-oss-20b-gguf (mxfp4)
Notes
The LM Studio Harmony tokens (`<|channel|>`, `<|constrain|>`, `<|message|>`) appearing in the output are a known LM Studio issue being tracked separately.
To reproduce
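Until the separate LM Studio issue is fixed, the leaked Harmony tokens noted above could be stripped client-side. A hypothetical sketch, covering only the three tokens named in this report (the real token set may be larger):

```python
import re

# Matches the Harmony control tokens observed leaking into output:
# <|channel|>, <|constrain|>, <|message|>. This list is an assumption
# based on this report and may be incomplete.
HARMONY_TOKENS = re.compile(r"<\|(?:channel|constrain|message)\|>")

def strip_harmony_tokens(text: str) -> str:
    """Remove leaked Harmony control tokens from model output."""
    return HARMONY_TOKENS.sub("", text)

print(strip_harmony_tokens("<|channel|>final<|message|>print('hi')"))
# -> finalprint('hi')
```

This is a workaround sketch only; the underlying tokenizer issue is tracked on the LM Studio side.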
No response
Log output