Fix ResponseFormat=JsonSchema 500 error in chat completions API #84
base: main
Conversation
Co-authored-by: natke <3302433+natke@users.noreply.github.com>
@@ -68,6 +68,23 @@ export const postWithProgress = async (
  body?: Record<string, unknown>,
  onProgress?: (progress: number) => void,
): Promise<Record<string, unknown>> => {
  // Check if body contains chat completions request with unsupported response_format
This client is not used for chat completion requests; users must create their own chat client. No changes are needed here.
**Note: Currently, only the default `"text"` format is supported. The `"json_object"` format is not yet supported and will result in an error.**
Specifies the format that the model must output. Can be either:
- `{ "type": "text" }` - (default) model generates natural language responses
- `{ "type": "json_object" }` - NOT YET SUPPORTED - model guarantees to generate valid JSON
`"type": "json_schema"` is also not yet supported.
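To cover both formats the reviewers call out, the validation could be written against an allow-list rather than a single rejected type. This is a minimal sketch, not the PR's actual code; the helper name `validateResponseFormat` and the error message text are assumptions.

```typescript
// Sketch: only the default "text" format is accepted; "json_object" and
// "json_schema" (and anything else) are rejected with a descriptive
// message instead of surfacing as a server-side 500.
type ResponseFormat = { type: string };

const SUPPORTED_FORMATS = new Set(['text']);

function validateResponseFormat(format?: ResponseFormat): string | null {
  // An absent response_format defaults to "text", which is supported.
  if (!format || SUPPORTED_FORMATS.has(format.type)) {
    return null;
  }
  return `response_format type "${format.type}" is not yet supported; only "text" is available`;
}
```

With this shape, adding future support for `json_object` or `json_schema` is a one-line change to the allow-list.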
const chatEndpoint = 'http://example.com/v1/chat/completions';
const bodyWithJsonFormat = {
  messages: [{ role: 'user', content: 'Hello' }],
  response_format: { type: 'json_object' }
@copilot Could you also add case for "type": "json_schema"?
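The requested `json_schema` fixture could mirror the existing `json_object` one. This is a hedged sketch of such a test body, not the PR's committed test; the schema name `greeting` and its shape are illustrative assumptions, while the outer field names follow the OpenAI-compatible chat completions request format.

```typescript
// Additional test fixture: a chat completions request body using the
// "json_schema" response format, alongside the existing json_object case.
const bodyWithJsonSchemaFormat = {
  messages: [{ role: 'user', content: 'Hello' }],
  response_format: {
    type: 'json_schema',
    json_schema: {
      name: 'greeting', // illustrative schema name
      schema: {
        type: 'object',
        properties: { text: { type: 'string' } },
      },
    },
  },
};
```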
Problem
When using the OpenAI-compatible chat completions endpoint with `response_format: { type: "json_object" }`, the server responds with a 500 error instead of properly handling the unsupported parameter.

Solution
This PR makes the following changes:
- Returns a clear error when the `json_object` response format is used

These changes provide users with clear feedback when they attempt to use unsupported features, rather than letting them encounter unexpected server errors.
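The "clear feedback" behavior described above could look like the following. This is an illustrative sketch under assumptions, not the actual server code: the handler name, response shape, and error code string are all hypothetical.

```typescript
// Sketch: return a descriptive 400 instead of a 500 when an unsupported
// response_format is requested on the chat completions endpoint.
function handleChatCompletion(body: { response_format?: { type: string } }) {
  const fmt = body.response_format?.type ?? 'text';
  if (fmt !== 'text') {
    return {
      status: 400,
      error: {
        message: `response_format "${fmt}" is not yet supported`,
        code: 'unsupported_response_format',
      },
    };
  }
  // Normal completion path would run here.
  return { status: 200 };
}
```

Returning a 400 with a machine-readable error code lets clients distinguish "feature not supported" from genuine server faults.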
Future Considerations
In the future, we could implement support for the `json_object` response format in the server implementation.

Fixes #81.
Warning
Firewall rules blocked me from connecting to one or more addresses.

I tried to connect to the following addresses, but was blocked by firewall rules:
- api.openai.com
  - curl -s REDACTED (dns block)

If you need me to access, download, or install something from one of these locations, you can either: