
fix: filter empty text content blocks to prevent LLM API 400 errors#11808

Closed
RomneyDa wants to merge 1 commit into main from empty-message-bugs

Conversation

@RomneyDa
Collaborator

@RomneyDa RomneyDa commented Mar 25, 2026

Summary

Empty or whitespace-only text content blocks in array message content cause 400 errors across multiple providers (Anthropic, Bedrock, Venice, and other OpenAI-compatible proxies). This has been reported in 12 open issues.

Root causes fixed:

  • chatMessageIsEmpty() only checked string content — array content like [{type:"text", text:""}] was not detected as empty, so these messages passed through filtering
  • Empty text parts in array content were never stripped: addSpaceToAnyEmptyMessages() only handled the string case, leaving empty text parts in arrays to reach provider APIs
  • Bedrock _convertMessageContentToBlocks didn't filter empty text — blank text blocks were passed directly to the Bedrock API
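The first fix can be sketched roughly as follows (types simplified for illustration; the real implementation in core/llm/messages.ts also accounts for tool calls and other part types):

```typescript
// Simplified stand-ins for Continue's message types (illustrative only).
type MessagePart =
  | { type: "text"; text: string }
  | { type: "imageUrl"; imageUrl: { url: string } };

interface ChatMessage {
  role: string;
  content: string | MessagePart[];
}

// A message is empty if its string content is blank, or if its array
// content contains nothing but whitespace-only text parts. Before the fix,
// only the string branch existed, so [{type: "text", text: ""}] slipped
// through the emptiness filter.
function chatMessageIsEmpty(message: ChatMessage): boolean {
  if (typeof message.content === "string") {
    return message.content.trim() === "";
  }
  return message.content.every(
    (part) => part.type === "text" && part.text.trim() === "",
  );
}
```

Note that an empty array also counts as empty here, since `every()` on an empty array returns true.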

Changes:

  • core/llm/messages.ts: Fix chatMessageIsEmpty to detect empty array content. Replace addSpaceToAnyEmptyMessages with stripEmptyContentParts that removes empty text parts from arrays
  • core/llm/openaiTypeConverters.ts: Extract toAssistantContent helper that filters empty text parts from assistant array content. Filter empty text parts in multi-media user messages
  • core/llm/llms/Bedrock.ts: Skip empty/whitespace text in _convertMessageContentToBlocks
  • packages/openai-adapters/src/apis/Bedrock.ts: Filter empty text parts in user message conversion
  • core/llm/countTokens.ts: Use new stripEmptyContentParts
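A minimal sketch of the stripEmptyContentParts replacement, under the same simplified types (the actual signature in core/llm/messages.ts may differ):

```typescript
type MessagePart =
  | { type: "text"; text: string }
  | { type: "imageUrl"; imageUrl: { url: string } };

interface ChatMessage {
  role: string;
  content: string | MessagePart[];
}

// Unlike the old addSpaceToAnyEmptyMessages, which padded empty content
// with " " (rejected by strict providers), this drops whitespace-only text
// parts from array content entirely. String content passes through
// untouched, preserving the existing " " fallback path for providers like
// LM Studio and Ollama.
function stripEmptyContentParts(messages: ChatMessage[]): ChatMessage[] {
  return messages.map((message) => {
    if (typeof message.content === "string") {
      return message;
    }
    return {
      ...message,
      content: message.content.filter(
        (part) => part.type !== "text" || part.text.trim() !== "",
      ),
    };
  });
}
```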

What this does NOT change:

  • Existing " " fallbacks for string content (preserves compatibility with LM Studio, Ollama, etc.)
  • toResponsesInput behavior (preserves Responses API ID chains)
  • Bedrock error on empty assistant turns (keeps error surfacing for debugging)

Issues Fixed

Fixes #9232, fixes #9765, fixes #9767, fixes #10148, fixes #10293, fixes #10504, fixes #10804, fixes #11045, fixes #11264, fixes #11446, fixes #11497, fixes #11728

Test plan

  • Existing openaiTypeConverters.test.ts tests pass (24/24)
  • Existing countTokens.test.ts tests pass
  • Prettier and lint checks pass
  • Manual testing with Bedrock provider
  • Manual testing with Anthropic direct provider
  • Manual testing with OpenAI-compatible proxy (Venice, OpenWebUI)
  • Verify tool call flows still work (assistant messages with empty content + tool calls)
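The last item is the main regression risk: assistant turns that carry tool calls often have empty text content and must not be filtered out. The invariant under test can be illustrated in isolation like this (names hypothetical, not the exact Continue API):

```typescript
interface ToolCall {
  id: string;
  function: { name: string; arguments: string };
}

interface AssistantMessage {
  role: "assistant";
  content: string;
  toolCalls?: ToolCall[];
}

// An assistant message with tool calls is meaningful even when its text
// content is blank, so any emptiness filter must check tool calls before
// deciding to drop the message.
function shouldKeepAssistantMessage(message: AssistantMessage): boolean {
  if (message.toolCalls && message.toolCalls.length > 0) {
    return true;
  }
  return message.content.trim() !== "";
}
```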

@RomneyDa RomneyDa requested a review from a team as a code owner March 25, 2026 04:26
@RomneyDa RomneyDa requested review from sestinj and removed request for a team March 25, 2026 04:26
@dosubot dosubot bot added the size:L This PR changes 100-499 lines, ignoring generated files. label Mar 25, 2026
@continue
Contributor

continue bot commented Mar 25, 2026

Docs Review: No documentation updates needed.

This PR fixes internal message handling to prevent empty/whitespace text content blocks from being sent to LLM APIs. These are implementation-level changes that:

  • Fix 400 errors across multiple providers (Anthropic, Bedrock, Venice, etc.)
  • Don't introduce new user-facing configuration options
  • Don't change how developers interact with Continue

The fixes improve reliability behind the scenes—users will simply stop seeing the errors without needing to know or configure anything differently.

Contributor

@cubic-dev-ai cubic-dev-ai bot left a comment


2 issues found across 5 files

Prompt for AI agents (unresolved issues)

Check if these issues are valid — if so, understand the root cause of each and fix them. If appropriate, use sub-agents to investigate and fix each issue separately.


<file name="core/llm/openaiTypeConverters.ts">

<violation number="1" location="core/llm/openaiTypeConverters.ts:150">
P1: Whitespace-only assistant string content is still sent instead of being treated as empty.</violation>

<violation number="2" location="core/llm/openaiTypeConverters.ts:187">
P1: Whitespace-only user string content is not normalized and can still be sent to providers.</violation>
</file>

Reply with feedback, questions, or to request a fix. Tag @cubic-dev-ai to re-run a review.
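One way to resolve both P1 findings would be to normalize whitespace-only string content alongside the array filtering. A sketch, not the code under review (the helper name toAssistantContent comes from the PR description; the string-normalization branch is an assumed extension):

```typescript
type TextPart = { type: "text"; text: string };

// Hypothetical resolution of the two findings: whitespace-only string
// content becomes null (the OpenAI-style "no content" value for assistant
// turns), and array content with only blank text parts collapses to null
// rather than an empty array.
function toAssistantContent(
  content: string | TextPart[],
): string | TextPart[] | null {
  if (typeof content === "string") {
    return content.trim() === "" ? null : content;
  }
  const parts = content.filter((part) => part.text.trim() !== "");
  return parts.length > 0 ? parts : null;
}
```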

@RomneyDa RomneyDa force-pushed the empty-message-bugs branch 3 times, most recently from 7997113 to f199b47 Compare March 25, 2026 07:41
@RomneyDa RomneyDa marked this pull request as draft March 25, 2026 07:49
@RomneyDa
Collaborator Author

This is a somewhat risky one, pending further evaluation.

@RomneyDa RomneyDa force-pushed the empty-message-bugs branch from f199b47 to 2abb9c9 Compare March 26, 2026 01:39
@RomneyDa RomneyDa changed the title fix: prevent empty/whitespace text content blocks from being sent to LLM APIs fix: filter empty text content blocks to prevent LLM API 400 errors Mar 26, 2026
fix: prevent empty/whitespace text content blocks from being sent to LLM APIs

Empty or whitespace-only text content blocks cause 400 errors across
multiple providers (Anthropic, Bedrock, Venice, OpenAI-compatible proxies).

- Fix chatMessageIsEmpty to detect empty array content, not just strings
- Replace addSpaceToAnyEmptyMessages (which added " " rejected by strict
  providers) with stripEmptyContentParts that removes empty text parts
- Fix toChatMessage to use null for empty assistant content instead of " "
- Fix toResponsesInput to skip empty assistant/system messages
- Fix Bedrock _convertMessages to gracefully skip empty turns instead of
  throwing
- Filter empty text parts in Bedrock adapter's user message conversion

Fixes #9232, #9765, #9767, #10148, #10293, #10504, #10804, #11045,
#11264, #11446, #11497, #11728
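The Bedrock-side change described in the commit could be sketched like this (shapes loosely follow the Bedrock Converse API content blocks; simplified and hypothetical):

```typescript
type ContentPart =
  | { type: "text"; text: string }
  | { type: "imageUrl"; imageUrl: { url: string } };

interface BedrockTextBlock {
  text: string;
}

// Sketch of _convertMessageContentToBlocks after the fix: whitespace-only
// text is skipped instead of being forwarded, since Bedrock rejects blank
// text blocks with a 400 error.
function convertMessageContentToBlocks(
  content: string | ContentPart[],
): BedrockTextBlock[] {
  const parts: ContentPart[] =
    typeof content === "string" ? [{ type: "text", text: content }] : content;
  const blocks: BedrockTextBlock[] = [];
  for (const part of parts) {
    if (part.type === "text") {
      if (part.text.trim() === "") continue; // skip blank text blocks
      blocks.push({ text: part.text });
    }
    // image parts would map to Bedrock image blocks; omitted in this sketch
  }
  return blocks;
}
```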
@RomneyDa
Collaborator Author

Closing in favor of #11877, which takes a more targeted approach — fixing empty text content blocks at each provider boundary (+15/-6 lines) instead of modifying the shared upstream message-filtering layer. See the comment on #11877 for rationale.

@RomneyDa RomneyDa closed this Mar 26, 2026
@github-project-automation github-project-automation bot moved this from Todo to Done in Issues and PRs Mar 26, 2026
@github-actions github-actions bot locked and limited conversation to collaborators Mar 26, 2026
