
Gemini Pro 3: Server error. Stream terminated when used after other models in the same conversation #282138

@chryzxc

Description


Issue: Gemini Pro 3 returns "Server error. Stream terminated" when used after other models in the same conversation

Summary

When using Gemini Pro 3 in VS Code Copilot Chat, the model frequently stops mid-response and shows the error:

Server error. Stream terminated

This only occurs when the conversation includes previous messages from other models (e.g., GPT-5.1, Claude Sonnet 4.5, Claude Opus).
When starting a new chat with no prior context, Gemini Pro 3 works normally.

This suggests a problem with how mixed-model conversation context is handled when switching models within a thread.
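To illustrate the hypothesis, here is a minimal, purely hypothetical sketch of what the mixed-model history might look like and why a "strip foreign turns" step would mirror the new-chat workaround. The message shape, model names, and the `strip_foreign_turns` helper are all assumptions for illustration, not Copilot Chat internals:

```python
# Hypothetical sketch: the history sent to the model likely mixes assistant
# turns produced by different models. Keeping only user turns and assistant
# turns from the target model mimics starting a fresh chat, which avoids
# the reported error.
history = [
    {"role": "user", "content": "Refactor this function"},
    {"role": "assistant", "model": "gpt-5.1", "content": "Here is a refactor..."},
    {"role": "user", "content": "Now add tests"},
    {"role": "assistant", "model": "claude-sonnet-4.5", "content": "Tests added..."},
]

def strip_foreign_turns(history, target_model):
    """Keep user turns and only assistant turns produced by target_model."""
    return [
        m for m in history
        if m["role"] != "assistant" or m.get("model") == target_model
    ]

clean = strip_foreign_turns(history, "gemini-pro-3")
# Every assistant turn above came from another model, so only the
# two user turns survive the filter.
```

If something like this filtering is missing (or the foreign turns are serialized in a format Gemini's endpoint rejects), it would explain why a fresh conversation works while a mixed one terminates the stream.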


Steps to Reproduce

  1. Open VS Code Copilot Chat.
  2. Start a long conversation using another model (e.g., GPT-5.1, Sonnet 4.5).
  3. After a few exchanges, switch the model to Gemini Pro 3.
  4. Ask any moderately complex question.
  5. The response stops halfway and the error appears:
    Server error. Stream terminated

Expected Behavior

Gemini Pro 3 should complete responses normally, regardless of which models were used earlier in the conversation.


Actual Behavior

Gemini Pro 3 terminates the stream mid-response with a server error when used in a chat that contains previous messages from other models.


Environment

  • VS Code version: 1.106.3
  • OS: Windows
  • Network: Wired / Wireless
  • Workspace type: (any project)

Additional Notes

  • The issue does not appear related to token limits (it occurs even with short prompts).
  • No rate-limit or network warnings are shown.
  • The issue disappears completely when using a new chat with no prior model context.

Metadata

Labels

  • bug: Issue identified by VS Code Team member as probable bug
  • gemini: Issues related to the experience with Google Gemini models
