Labels: opentui (relates to changes in v1.0, now that opencode uses opentui), windows
Description
When using the GLM-5 model (Z.AI), OpenCode's context window gets polluted with broken/malformed thinking output. The terminal displays ? characters scattered throughout the interface, the rendered content becomes garbled, and the TUI layout breaks completely. The context window shows raw thinking tokens that should be hidden or properly parsed.
The GLM 4.7 model works correctly without any of these rendering issues under the same conditions.
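For context on the failure mode: if GLM-5 streams its reasoning inline as `<think>…</think>` blocks rather than in a separate reasoning field (an assumption; the exact delimiters GLM-5 emits are not confirmed here), then a client that renders the raw stream unfiltered would show exactly this kind of leakage. A minimal sketch of the stripping a client would need:

```python
import re

# Hypothetical delimiter: the actual tags GLM-5 uses are not confirmed.
THINK_RE = re.compile(r"<think>.*?</think>\s*", re.DOTALL)

def strip_thinking(text: str) -> str:
    """Remove inline <think>...</think> reasoning blocks from model output."""
    return THINK_RE.sub("", text)

print(strip_thinking("<think>plan the fix</think>final answer"))  # final answer
```

If GLM 4.7 delivers its reasoning in a separate response field while GLM-5 interleaves it in the content stream, that difference alone would explain why only GLM-5 pollutes the context window.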
Steps to Reproduce
- Open OpenCode on Windows (cmd or PowerShell)
- Select the GLM-5 (Z.AI Coding Plan) model
- Run any task that triggers the model's thinking/reasoning mode (e.g., a build or code generation task)
- Observe the terminal output becoming corrupted
Expected Behavior
The context window should display clean, properly formatted output — thinking tokens should be hidden or correctly rendered, just like the GLM 4.7 model does.
Actual Behavior
- Multiple `?` characters appear throughout the terminal (top, bottom, status bar)
- The thinking/reasoning output leaks into the visible context window in a broken, malformed state
- The TUI layout breaks: code snippets, markdown, and status indicators become garbled
- The status bar at the bottom shows `???■■■■■` instead of the proper indicators
- The overall context window becomes unusable and filled with noise
Specific observations from the screenshot
- The title bar and content area show `?` characters in the left margin where they shouldn't be
- Code content (Java, `EvidenciaRunner.java`) is partially rendered but interspersed with broken formatting
- The bottom status bar shows `? Build · glm-5`, followed by more `?` lines, then `? Build GLM-5 Z.AI Coding Plan`
- The very bottom shows `???■■■■■ esc interrupt`; the progress/status indicators are corrupted
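The scattered `?` glyphs are consistent with a legacy Windows console code page replacing characters it cannot encode. This is only a hypothesis about the cause, but it is easy to demonstrate: encoding the status-bar glyphs with `cp1252` (a common Windows default) turns `■` into `?` while plain ASCII survives:

```python
# Demonstrate how a non-UTF-8 Windows code page mangles TUI glyphs.
# '■' (U+25A0) has no mapping in cp1252, so it degrades to '?';
# '·' (U+00B7) does exist in cp1252 and survives.
sample = "■ Build · glm-5"
legacy = sample.encode("cp1252", errors="replace").decode("cp1252")
print(legacy)  # ? Build · glm-5
```

If this is the cause, switching the console to UTF-8 before launching OpenCode (`chcp 65001` in cmd, or setting `[Console]::OutputEncoding` in PowerShell) may work around the rendering artifacts, though it would not explain the leaked thinking tokens.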
Comparison
| Model | Behavior |
|---|---|
| GLM 4.7 | Clean output, no artifacts, thinking is properly handled |
| GLM-5 | Broken thinking tokens leak into context window, ? characters everywhere, TUI breaks |
Environment
- OS: Windows 10/11
- Terminal: cmd / PowerShell
- OpenCode version: Latest
- Model: GLM-5 (Z.AI Coding Plan)
Screenshot
(attached: shows the corrupted TUI output described in the observations above)
- [x] I have verified that this issue has not already been reported
- [x] I have read the contributing guidelines