feat: Add comprehensive historical and current Claude model IDs for API compatibility and dynamic Docker port configuration #30
Conversation
- …oving size-based buffer limits and holding incomplete blocks longer.
- …alid or incomplete tool JSON and refining detection in Node.js.
- …tation, and tests, including the default model and thinking delta logic.
- …e an example configuration file.
- …PI compatibility and dynamic Docker port configuration.
💡 Codex Review
Here are some automated review suggestions for this pull request.
Reviewed commit: 76ae2fed51
```js
const detected = parseToolCalls(outputText, toolNames);
if (detected.length > 0 && !toolCallsEmitted) {
```
Gate tool-call detection on tools being requested
In the Vercel streaming path, finalize now runs parseToolCalls(outputText, toolNames) even when the request did not include tools. Because an empty toolNames allowlist is treated as "accept all", any normal assistant response that includes valid {"tool_calls": ...} JSON can incorrectly emit a tool_calls delta and end with finish_reason: "tool_calls" instead of "stop". This changes non-tool API behavior and can break clients that only requested plain chat completions.
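The suggested fix can be sketched as follows. This is a minimal illustration, not the project's actual code: the types, the stand-in `parseToolCalls`, and the `finalize` helper are all assumptions modeled on the snippet above; only the "empty allowlist accepts all" behavior is taken from the review.

```typescript
// Hypothetical sketch: gate tool-call detection on the request having
// declared tools, so plain chat completions are never rewritten.
type ToolCall = { name: string; arguments: string };

// Stand-in for the project's parser. It reproduces the behavior the
// review flags: an empty toolNames allowlist is treated as "accept all".
function parseToolCalls(text: string, toolNames: string[]): ToolCall[] {
  try {
    const parsed = JSON.parse(text);
    if (!Array.isArray(parsed.tool_calls)) return [];
    return parsed.tool_calls.filter(
      (c: ToolCall) => toolNames.length === 0 || toolNames.includes(c.name),
    );
  } catch {
    return [];
  }
}

// Hypothetical finalize step: only run detection when tools were
// requested; otherwise always finish with "stop".
function finalize(outputText: string, toolNames: string[]): string {
  const detected =
    toolNames.length > 0 ? parseToolCalls(outputText, toolNames) : [];
  return detected.length > 0 ? "tool_calls" : "stop";
}
```

With this gate, a response that merely contains valid `{"tool_calls": ...}` JSON finishes with `"stop"` when the request declared no tools, and `"tool_calls"` only when it did.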
```go
if start < 0 {
	if len([]rune(captured)) >= 512 {
		return captured, nil, "", true
	}
	return "", nil, "", false
```
Preserve plain text when tool_calls appears without JSON
With tool mode enabled, capture can start at the raw tool_calls token even when no { precedes it (plain explanatory text). This branch now returns ready=false unconditionally, and the new flush behavior suppresses non-ready captures, so content from tool_calls onward is silently dropped at stream end. This is a correctness regression that truncates legitimate assistant output whenever it mentions tool_calls outside a parseable JSON object.
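The fix the review asks for can be sketched like this. It is a hypothetical TypeScript rendering of the Go branch above, not the project's code: the `flushCapture` name, the `streamEnded` flag, and the return shape are assumptions; the 512-character threshold and the "no `{` before the token" condition mirror the snippet.

```typescript
// Hypothetical sketch: when capture started at a bare "tool_calls" token
// and no "{" ever arrives, flush the captured text as plain content at
// stream end instead of silently dropping it.
function flushCapture(
  captured: string,
  streamEnded: boolean,
): { text: string; ready: boolean } {
  const start = captured.indexOf("{");
  if (start < 0) {
    // No JSON object began: once the buffer is large or the stream is
    // done, hand the text back verbatim rather than suppressing it.
    if (captured.length >= 512 || streamEnded) {
      return { text: captured, ready: true };
    }
    return { text: "", ready: false };
  }
  // A "{" exists; leave this capture to the tool-call JSON path.
  return { text: "", ready: false };
}
```

The key change is the `streamEnded` escape hatch: assistant prose that merely mentions `tool_calls` is returned to the client at end of stream instead of being truncated.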