Description
What version of Codex is running?
codex-cli 0.64.0
What subscription do you have?
Gemini 3 Pro via Vertex AI API on GCP
Which model were you using?
google/gemini-3-pro-preview
What platform is your computer?
Linux 6.17.8-orbstack-00308-g8f9c941121b1 aarch64 unknown
What issue are you seeing?
› Review README.md
• Explored
└ Read README.md
■ unexpected status 400 Bad Request: [{
"error": {
"code": 400,
"message": "Unable to submit request because function call default_api:shell in the 4. content block
is missing a thought_signature. Learn more: https://docs.cloud.google.com/vertex-ai/generative-ai/docs/
thought-signatures",
"status": "INVALID_ARGUMENT"
}
}
]
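For context, the content block the error refers to is the assistant turn containing the shell function call. According to the thought-signatures page linked in the error, Gemini 3 attaches an opaque signature to each function call it emits, and that signature has to be sent back verbatim when the call is replayed in the next request. A rough sketch of the shape involved (illustrative values, not a captured payload):

// Rough shape of the replayed assistant turn on the native Gemini API.
// The signature is opaque and must be echoed exactly as received; the
// function name/args below are illustrative, not taken from a real request.
const replayedAssistantTurn = {
  role: "model",
  parts: [
    {
      functionCall: { name: "shell", args: { command: ["cat", "README.md"] } },
      thoughtSignature: "<opaque value returned by the model>",
    },
  ],
};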
What steps can reproduce the bug?
Environment / Setup
- Backend model: google/gemini-3-pro-preview
- Access method: OpenAI compatibility layer as described in the Gemini 3 docs: https://docs.cloud.google.com/vertex-ai/generative-ai/docs/start/get-started-with-gemini-3
- Scenario: Multi-turn conversation with tool calls enabled
- config.toml:

model = "google/gemini-3-pro-preview"
model_provider = "gemini-via-openai-compatibility-layer"

[model_providers.gemini-via-openai-compatibility-layer]
name = "Gemini via OpenAI compatibility layer"
base_url = "https://aiplatform.googleapis.com/v1/projects/{PROJECT_ID}/locations/global/endpoints/openapi"
env_key = "OPENAI_API_KEY"
wire_api = "chat"
query_params = {}
Reproduce Steps
$ touch README.md
$ export OPENAI_API_KEY=XXXX
$ codex
╭───────────────────────────────────────────────────────────╮
│ >_ OpenAI Codex (v0.64.0) │
│ │
│ model: google/gemini-3-pro-preview /model to change │
│ directory: /workspaces │
╰───────────────────────────────────────────────────────────╯
To get started, describe a task or try one of these commands:
/init - create an AGENTS.md file with instructions for Codex
/status - show current session configuration
/review - review any changes and find issues
› Review README.md
• Explored
└ Read README.md
■ unexpected status 400 Bad Request: [{
"error": {
"code": 400,
"message": "Unable to submit request because function call `default_api:shell` in the 4. content block
is missing a `thought_signature`. Learn more: https://docs.cloud.google.com/vertex-ai/generative-ai/docs/
thought-signatures",
"status": "INVALID_ARGUMENT"
}
}
]
What is the expected behavior?
In multi-turn conversations with tool calls, Codex CLI should:
- Preserve the Gemini thought_signature across turns for each tool call.
- Successfully continue the agent loop without 400 INVALID_ARGUMENT errors.
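At the chat wire level this amounts to capturing whatever signature field the compatibility layer attaches to each tool call and replaying it untouched on the next request. A minimal sketch of that bookkeeping, assuming the signature arrives as an extra field on the tool call (the extra_content field name below is an assumption for illustration, not confirmed from the Vertex AI docs):

// Hypothetical shape: the compatibility layer is assumed to attach the
// signature to each tool call; `extra_content` is an illustrative field name.
type ChatToolCall = {
  id: string;
  function: { name: string; arguments: string };
  extra_content?: { thought_signature?: string };
};

// Rebuild the assistant message for the next turn, keeping the signature verbatim.
// Dropping it is what produces the 400 INVALID_ARGUMENT shown above.
function replayAssistantToolCalls(toolCalls: ChatToolCall[]) {
  return {
    role: "assistant" as const,
    tool_calls: toolCalls.map((tc) => ({
      id: tc.id,
      type: "function" as const,
      function: tc.function,
      ...(tc.extra_content ? { extra_content: tc.extra_content } : {}),
    })),
  };
}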
Additional information
Likely root cause (based on related issue)
The behavior appears to match the previously reported issue in openai-agents-js:
- PR: openai-agents-js#718, "fix: preserve Gemini thought_signature in multi-turn tool calls"
From that PR, the root cause was:
- Non-streaming mode: providerData was set from the overall response metadata (result.providerMetadata) instead of the per-tool-call metadata (toolCall.providerMetadata), which actually contains the thoughtSignature.
- Streaming mode: providerData from tool-call parts was not captured at all.
As a result, the thought_signature was lost between iterations of the agent loop, causing Gemini to reject subsequent function calls.
Codex CLI seems to be hitting the same class of problem when used with Gemini 3 Pro via the OpenAI compatibility layer.
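For reference, the class of bug fixed in that PR looks roughly like the following (a simplified sketch using the identifiers mentioned above, not the actual diff): the metadata carrying the thoughtSignature has to be taken from each individual tool call rather than from the response as a whole.

// Simplified sketch of the non-streaming case; types are illustrative.
type SdkToolCall = { toolName: string; args: string; providerMetadata?: Record<string, unknown> };
type SdkResult = { toolCalls: SdkToolCall[]; providerMetadata?: Record<string, unknown> };

function toFunctionCallItems(result: SdkResult) {
  return result.toolCalls.map((toolCall) => ({
    type: "function_call" as const,
    name: toolCall.toolName,
    arguments: toolCall.args,
    // Buggy variant: `result.providerMetadata` lacks the per-call thoughtSignature.
    // Fixed variant: carry the metadata attached to this specific tool call.
    providerData: toolCall.providerMetadata,
  }));
}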
References
- Gemini thought signatures documentation: https://docs.cloud.google.com/vertex-ai/generative-ai/docs/thought-signatures
- Related PR (JavaScript SDK): openai-agents-js#718, "fix: preserve Gemini thought_signature in multi-turn tool calls"
- Related issue (Python SDK): openai-agents-python#2137, "Missing thought_signature when using Gemini 3 pro preview"