Conversation

Contributor

@alaa-eddine-k commented Sep 5, 2025

Hotfix for OpenAI non-streaming requests in the Responses API.

Summary by CodeRabbit

  • Bug Fixes

    • Improved compatibility with OpenAI responses by supporting both chat and text completions, ensuring a consistent finish reason to prevent unexpected errors.
  • Chores

    • Bumped packages/core version to 1.5.61.
    • Bumped packages/sdk version to 1.1.3.


coderabbitai bot commented Sep 5, 2025

Caution

Review failed

The pull request is closed.

Walkthrough

Version bumps in core and sdk packages. In OpenAIConnector, response parsing now falls back to alternative fields: message from choices[0].message or output_text; finishReason from choices[0].finish_reason or incomplete_details, defaulting to 'stop'.

Changes

Cohort / File(s) — Summary

Version bumps
packages/core/package.json, packages/sdk/package.json
Incremented versions: core 1.5.60 → 1.5.61, sdk 1.1.2 → 1.1.3. No other metadata changes.

OpenAI connector response handling
packages/core/src/subsystems/LLMManager/LLM.service/connectors/openai/OpenAIConnector.class.ts
Adjusted extraction of message and finishReason to support both chat and text responses: use `choices?.[0]?.message || output_text` for the message, and `choices?.[0]?.finish_reason || incomplete_details || 'stop'` for the finish reason.

Sequence Diagram(s)

sequenceDiagram
  autonumber
  participant App as Caller
  participant Connector as OpenAIConnector
  participant OpenAI as OpenAI API

  App->>Connector: requestCompletion(input)
  Connector->>OpenAI: send request
  OpenAI-->>Connector: result { choices?, message?, output_text?, finish_reason?, incomplete_details? }

  rect rgb(240,248,255)
    note right of Connector: Parse response with fallbacks
    Connector->>Connector: message = choices[0].message || output_text
    Connector->>Connector: finishReason = choices[0].finish_reason || incomplete_details || 'stop'
  end

  Connector-->>App: { message, finishReason, ... }
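The fallback parsing described in the walkthrough and diagram above can be sketched as follows. This is an illustrative sketch, not the connector's actual code: the interface shapes, the `parseResult` name, and the assumption that the chat message's text lives in `message.content` are all hypothetical; only the fallback chains (`choices?.[0]?.message || output_text`, `choices?.[0]?.finish_reason || incomplete_details || 'stop'`) come from the PR.

```typescript
// Hypothetical result shape covering both a Chat Completions-style
// response (choices[0].message / finish_reason) and a Responses API-style
// response (output_text / incomplete_details).
interface OpenAIResult {
  choices?: { message?: { content?: string }; finish_reason?: string }[];
  output_text?: string;
  incomplete_details?: { reason?: string };
}

// Illustrative helper: extract message and finishReason with the
// fallbacks described above, defaulting the finish reason to 'stop'.
function parseResult(result: OpenAIResult): {
  message: string;
  finishReason: string;
} {
  const message =
    result.choices?.[0]?.message?.content || result.output_text || '';
  const finishReason =
    result.choices?.[0]?.finish_reason ||
    result.incomplete_details?.reason ||
    'stop';
  return { message, finishReason };
}
```

With this shape, a non-streaming Responses API payload that carries only `output_text` still yields a message, and a missing `finish_reason` no longer surfaces as an unexpected error downstream.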

Estimated code review effort

🎯 2 (Simple) | ⏱️ ~10 minutes

Poem

I nudge the version, hop and glide,
A tiny tweak by riverside—
If chat won’t speak, text will do,
Finish flags now clearer too.
Thump-thump! I stamp a tidy stop,
Then nibble greens and make a prop. 🐇✨


📜 Recent review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

Cache: Disabled due to data retention organization setting

Knowledge Base: Disabled due to data retention organization setting


📥 Commits

Reviewing files that changed from the base of the PR and between 04a2399 and 94289b4.

📒 Files selected for processing (3)
  • packages/core/package.json (1 hunks)
  • packages/core/src/subsystems/LLMManager/LLM.service/connectors/openai/OpenAIConnector.class.ts (1 hunks)
  • packages/sdk/package.json (1 hunks)


@alaa-eddine-k merged commit 75b6b23 into main on Sep 5, 2025
1 check was pending


2 participants