.NET: Fix function_call_output.output to be a JSON string on the wire#5705
OutputConverter was passing the JSON serialization of complex tool results (e.g. `List<TodoItem>`) directly into `OutputItemFunctionToolCallOutput` via `BinaryData.FromString`. The Responses SDK treats that `BinaryData` as the *raw JSON value* for the field, so non-string results landed on the wire as an unquoted JSON array (e.g. `"output":[{...}]`) instead of a JSON string.
The OpenAI Responses spec requires `function_call_output.output` to be a JSON string. The strict-parsing OpenAI .NET client (`FunctionCallOutputResponseItem`) consequently failed when threading a follow-up turn that replayed such an item, with: `The JSON value could not be converted... requires an element of type 'String', but the target element has type 'Array'`.
Always wrap the payload as a JSON string literal:
- string s -> JSON-encode s (quoted, with escapes)
- object o -> JSON-serialize o, then JSON-encode the resulting text
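A minimal sketch of that encoding rule (the helper name and shape here are illustrative, not the PR's actual code):

```csharp
using System;
using System.Text.Json;

static class FunctionOutputEncoding
{
    // Encode any tool result as a JSON *string literal*, per the
    // Responses spec for function_call_output.output.
    public static string EncodeAsJsonString(object? result) =>
        result switch
        {
            // string s -> JSON-encode s directly (adds quotes and escapes)
            string s => JsonSerializer.Serialize(s),
            // object o -> serialize to JSON text, then JSON-encode that text
            _ => JsonSerializer.Serialize(JsonSerializer.Serialize(result)),
        };
}
```

So `EncodeAsJsonString("sunny")` yields `"\"sunny\""`, and a `List<TodoItem>`-style result is first serialized to JSON text and then wrapped as a string literal, so it reaches the wire quoted rather than as a raw array.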
Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
Pull request overview
This PR updates the Foundry hosting stream converter to serialize function_call_output.output as a JSON string literal on the wire, aligning emitted events with the OpenAI Responses specification and avoiding strict client parsing failures for non-string tool results.
Changes:
- Wrap function tool outputs so `function_call_output.output` is always emitted as a JSON string literal (including when the tool result is an object/array).
- Add inline rationale/comments describing the spec requirement and why the additional wrapping is necessary.
| File | Description |
|---|---|
| dotnet/src/Microsoft.Agents.AI.Foundry.Hosting/OutputConverter.cs | Ensures function_call_output.output is always emitted as a JSON string literal by double-serializing non-string tool results (serialize to JSON text, then JSON-encode that text). |
Copilot's findings
- Files reviewed: 1/1 changed files
- Comments generated: 2
Automated Code Review
Reviewers: 2 | Confidence: 76%
✓ Security Reliability
The change correctly wraps function call outputs as JSON string literals to comply with the OpenAI Responses spec. No security issues found. One reliability suggestion: the InputConverter round-trip path (`ConvertFunctionToolCallOutput` at line 487) passes the `funcOutput.Output` `BinaryData` directly to `FunctionResultContent` without unwrapping the JSON string encoding introduced here. In multi-turn conversations where session-state function outputs are converted back to `ChatMessage`s, this could result in double-encoded function results reaching the Chat Completions API. Additionally, the existing test K-06 (`ConvertUpdatesToEventsAsync_FunctionResultStringPayload_EmittedAsRawTextAsync` at OutputConverterTests.cs:635) asserts `Assert.Equal("sunny", output.Output.ToString())` and would fail, since `BinaryData.FromString(JsonSerializer.Serialize("sunny"))` yields `"sunny"` (with JSON quotes) on `ToString()`.
✗ Design Approach
The outbound encoding change is directionally correct for the OpenAI wire contract, but it is incomplete as implemented: this repo also replays stored `function_call_output` items back into agent messages, and those inbound paths still treat `output` as already-decoded text. That means `previous_response_id`/history replay will now hand tools quoted or escaped payloads instead of the original result.
Automated review by alliscode's agents
…ap, tests
OutputConverter: extract EncodeFunctionResultAsJsonStringPayload helper
that special-cases JsonElement / JsonDocument so a string-kind element
does not get double-encoded into "\"value\"". Other JsonElement kinds
(object/array/number/bool) round-trip via GetRawText() and are then
JSON-string-wrapped, matching the spec.
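The special-casing described above can be sketched as follows (a sketch only; the real `EncodeFunctionResultAsJsonStringPayload` in OutputConverter.cs may differ in details):

```csharp
using System;
using System.Text.Json;

static class OutputEncodingSketch
{
    // Encode a tool result as a JSON string literal, avoiding
    // double-encoding when the result is already a string-kind element.
    public static string Encode(object? result) => result switch
    {
        // A string-kind JsonElement already *is* the string value:
        // encode its value once, not its quoted raw text, so "value"
        // does not become "\"value\"" twice over.
        JsonElement { ValueKind: JsonValueKind.String } e =>
            JsonSerializer.Serialize(e.GetString()),
        // Other element kinds (object/array/number/bool) round-trip via
        // GetRawText() and are then JSON-string-wrapped.
        JsonElement e => JsonSerializer.Serialize(e.GetRawText()),
        // JsonDocument delegates to its root element.
        JsonDocument d => Encode(d.RootElement),
        string s => JsonSerializer.Serialize(s),
        _ => JsonSerializer.Serialize(JsonSerializer.Serialize(result)),
    };
}
```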
InputConverter: symmetric DecodeFunctionResultPayload added to
ConvertFunctionCallOutput and ConvertFunctionToolCallOutput so
previously-stored function_call_output items replayed via
previous_response_id unwrap back to the original tool result text
instead of leaking the JSON-encoded form into FunctionResultContent.Result.
Legacy non-conforming raw-JSON-value payloads pass through unchanged.
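The symmetric unwrap might look like this (hypothetical sketch of a `DecodeFunctionResultPayload`-style helper; the PR's implementation may differ):

```csharp
using System;
using System.Text.Json;

static class InputDecodingSketch
{
    // Spec-compliant payloads are JSON string literals and decode back
    // to the original tool result text; legacy raw-JSON-value payloads
    // (objects/arrays) and plain text pass through unchanged.
    public static string Decode(string payload)
    {
        try
        {
            using var doc = JsonDocument.Parse(payload);
            if (doc.RootElement.ValueKind == JsonValueKind.String)
            {
                return doc.RootElement.GetString()!;
            }
        }
        catch (JsonException)
        {
            // Not valid JSON at all: treat as already-decoded text.
        }
        return payload;
    }
}
```

With this shape, `Decode("\"sunny\"")` recovers `sunny`, while a legacy raw array payload like `[{"id":1}]` flows through untouched into `FunctionResultContent.Result`.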
Tests:
- Replace ConvertUpdatesToEventsAsync_FunctionResultStringPayload_EmittedAsRawTextAsync
with EmittedAsJsonStringAsync asserting the new wire contract ("sunny" -> "\"sunny\"").
- Add coverage for object payloads, JsonElement string kind (no double-encoding),
and JsonElement array kind (JSON-stringified).
- Add InputConverter round-trip tests for spec-compliant JSON-string payloads
and legacy raw-JSON-array payloads.
All 663 tests pass on net8/net9/net10. Verified end-to-end against the local
hosted-harness sample: T1-T4 (incl. TodoList tool replay across turns) all
succeed with no SDK parse errors.
Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
This pull request updates the way function call outputs are serialized to ensure compatibility with the OpenAI Responses specification and the .NET client. The main change is that all function call outputs are now wrapped as JSON string literals, preventing parsing errors when handling complex types.