
Fix ResponseFormat=JsonSchema 500 error in chat completions API #84


Draft: wants to merge 2 commits into main

Conversation


@Copilot Copilot AI commented May 20, 2025

Problem

When using the OpenAI-compatible chat completions endpoint with response_format: { type: "json_object" }, the server responds with a 500 error instead of properly handling the unsupported parameter.

Solution

This PR makes the following changes:

  1. Added client-side validation in the JavaScript SDK to detect when the unsupported json_object response format is used (a sketch of this check follows the list)
  2. Added a clear error message explaining that only the default text format is supported
  3. Updated API documentation to indicate which formats are currently supported
  4. Added warning comments in code examples
  5. Added unit tests to verify the validation logic

These changes provide users with clear feedback when they attempt to use unsupported features, rather than letting them encounter unexpected server errors.
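
A minimal sketch of what that client-side check could look like. The helper name validateResponseFormat, the body shape, and the error wording are illustrative assumptions, not the SDK's actual API:

// Sketch only: names and error text are assumptions, not the SDK's real API.
interface ChatCompletionBody {
  messages: Array<{ role: string; content: string }>;
  response_format?: { type: string };
}

// Reject unsupported response formats before the request reaches the server.
function validateResponseFormat(body: ChatCompletionBody): void {
  const format = body.response_format?.type;
  if (format && format !== 'text') {
    throw new Error(
      `response_format type "${format}" is not supported yet; only the default "text" format is available.`
    );
  }
}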

Future Considerations

In the future, we could implement support for the json_object response format in the server implementation.

Fixes #81.

Warning

Firewall rules blocked me from connecting to one or more addresses

I tried to connect to the following addresses, but was blocked by firewall rules:

  • api.openai.com
    • Triggering command: curl -s REDACTED (dns block)

If you need me to access, download, or install something from one of these locations, you can either:



Co-authored-by: natke <3302433+natke@users.noreply.github.com>
@Copilot Copilot AI changed the title from "[WIP] Foundry returns 500 status code when using ResponseFormat=JsonSchema in ChatClient options" to "Fix ResponseFormat=JsonSchema 500 error in chat completions API" May 20, 2025
@Copilot Copilot AI requested a review from natke May 20, 2025 17:43
Copilot finished work on behalf of natke May 20, 2025 17:43
@@ -68,6 +68,23 @@ export const postWithProgress = async (
body?: Record<string, unknown>,
onProgress?: (progress: number) => void,
): Promise<Record<string, unknown>> => {
// Check if body contains chat completions request with unsupported response_format
Contributor
This client is not used for chat completion requests; the user must create their own chat client. No changes are needed here.

**Note: Currently, only the default `"text"` format is supported. The `"json_object"` format is not yet supported and will result in an error.**
Specifies the format that the model must output. Can be either:
- `{ "type": "text" }` - (default) model generates natural language responses
- `{ "type": "json_object" }` - NOT YET SUPPORTED - model guarantees to generate valid JSON
Contributor
"type": "json_schema" is also not yet supported

const chatEndpoint = 'http://example.com/v1/chat/completions';
const bodyWithJsonFormat = {
  messages: [{ role: 'user', content: 'Hello' }],
  response_format: { type: 'json_object' }
};
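
Assuming a validation helper along the lines of the sketch above (validateResponseFormat is a hypothetical name, not necessarily the one used in this PR), a test could assert that this body is rejected:

// Jest-style assertion; the actual test helpers in the PR may differ.
expect(() => validateResponseFormat(bodyWithJsonFormat)).toThrow(/only the default "text" format/);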
@copilot Could you also add a case for "type": "json_schema"?
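
A minimal sketch of the requested extra case, assuming the same hypothetical validateResponseFormat helper and assuming json_schema is rejected the same way; the payload shape below is illustrative:

// Illustrative only: the json_schema payload shape and helper name are assumptions.
const bodyWithJsonSchemaFormat = {
  messages: [{ role: 'user', content: 'Hello' }],
  response_format: {
    type: 'json_schema',
    json_schema: { name: 'reply', schema: { type: 'object' } }
  }
};

expect(() => validateResponseFormat(bodyWithJsonSchemaFormat)).toThrow();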

Successfully merging this pull request may close these issues: Foundry returns 500 status code when using ResponseFormat (aka structured response) in ChatClient options