Conversation

@furqan-shaikh-dev
Contributor

Summary

Adds support for the OpenAI Responses API by introducing a new class, OCIChatOpenAI, which lets users invoke OCI Generative AI through the OpenAI Responses API.
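A minimal usage sketch of what this enables is shown below. The import path, model id, and constructor parameters are assumptions inferred from the diff hunks quoted later in this conversation, not a documented API.

```python
# Illustrative only: import path, model id, and parameter names are assumptions.
from oci_generative_ai_responses_api import OCIChatOpenAI  # module added by this PR

llm = OCIChatOpenAI(
    model="openai.gpt-4o",                            # assumed OCI model id
    compartment_id="ocid1.compartment.oc1..example",  # assumed; likely sent via the opc-compartment-id header
    region="us-chicago-1",                            # assumed; a service_endpoint/base_url may also be accepted
)

# The class extends langchain-openai's ChatOpenAI (see the diff hunks below),
# so the standard LangChain chat-model interface applies.
print(llm.invoke("Hello from the OCI Generative AI Responses API").content)
```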

Changes

  • Add a new file, oci_generative_ai_responses_api.py, which implements the OCIChatOpenAI class and OCI auth (a request-signing sketch follows this list).
  • Add comprehensive unit tests covering the most common LangChain and LangGraph scenarios.
  • Add an examples folder with sample usage.
  • Add documentation for this change to README.md.
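The PR's exact auth wiring is not shown in this conversation; for context, here is a standard OCI SDK request-signing sketch of the general approach (the endpoint URL and payload are placeholders):

```python
# Standard OCI API-key request signing with the oci SDK and requests
# (a general sketch, not this PR's code).
import oci
import requests
from oci.config import DEFAULT_LOCATION, DEFAULT_PROFILE

config = oci.config.from_file(DEFAULT_LOCATION, DEFAULT_PROFILE)
signer = oci.signer.Signer(
    tenancy=config["tenancy"],
    user=config["user"],
    fingerprint=config["fingerprint"],
    private_key_file_location=config["key_file"],
    pass_phrase=config.get("pass_phrase"),
)

# The signer acts as a requests auth handler and signs each outgoing request.
response = requests.post(
    "https://example-oci-genai-endpoint/v1/responses",  # placeholder URL
    json={"model": "openai.gpt-4o", "input": "Hello"},  # placeholder payload
    auth=signer,
)
print(response.status_code)
```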

Testing

  • Unit Tests: 5/5 tests passing
  • Integration Tests: To be raised as a separate PR

Breaking Changes

None - this is a new feature that is fully backward compatible.

@oracle-contributor-agreement

Thank you for your pull request and welcome to our community! To contribute, please sign the Oracle Contributor Agreement (OCA).
The following contributors of this PR have not signed the OCA:

To sign the OCA, please create an Oracle account and sign the OCA in Oracle's Contributor Agreement Application.

When signing the OCA, please provide your GitHub username. After signing the OCA and getting an OCA approval from Oracle, this PR will be automatically updated.

If you are an Oracle employee, please make sure that you are a member of the main Oracle GitHub organization, and your membership in this organization is public.

The oracle-contributor-agreement bot added the OCA Required label (At least one contributor does not have an approved Oracle Contributor Agreement) on Nov 4, 2025.
@furqan-shaikh-dev marked this pull request as ready for review on November 10, 2025 at 03:47.
import requests
from langchain_openai import ChatOpenAI
from oci.config import DEFAULT_LOCATION, DEFAULT_PROFILE
from openai import (
Member

openai not required in dependencies?

Contributor Author

Based on the way current packages are managed, we raise error messages when specific required packages are not installed (e.g. oracle-ads, langchain-openai), so we have followed the same approach here. When langchain-openai is not installed, an error message is thrown; langchain-openai brings in openai as a transitive dependency. wdyt?

Member

It is going to fail because you are importing at the top level. Could you check what ads is doing?

Contributor Author

DONE
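For context, the deferred-import pattern discussed above typically looks something like this (a general sketch, not the exact code in this PR):

```python
def _import_chat_openai():
    """Import ChatOpenAI lazily so importing this module does not require langchain-openai."""
    try:
        from langchain_openai import ChatOpenAI
    except ImportError as e:
        raise ImportError(
            "Could not import langchain_openai. "
            "Please install it with `pip install langchain-openai` "
            "(it pulls in openai as a transitive dependency)."
        ) from e
    return ChatOpenAI
```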

@streamnsight
Member

@YouNeedCryDear
Wouldn't it be a lot easier to switch to the oci_openai package when making calls with external models (Llama, Grok, GPT, ...)? Then the whole support would be propagated from the original openai SDK, and the only thing we would need to implement is support for Cohere models:
https://github.com/oracle-samples/oci-openai

@YouNeedCryDear
Member

> @YouNeedCryDear Wouldn't it be a lot easier to switch to the oci_openai package when making calls with external models (Llama, Grok, GPT, ...)? Then the whole support would be propagated from the original openai SDK, and the only thing we would need to implement is support for Cohere models: https://github.com/oracle-samples/oci-openai

Sounds like a good idea, but this might be a huge change to the code base, and I don't think the Responses API team is able to do that.

@YouNeedCryDear
Member

The change looks good to me. One more question regarding the file naming and structure: is ChatOCIOpenAI specifically for the Responses API, or can it also be used for chat completion? I wonder if we should rename the file to something more generic or directly move the class under oci_generative_ai.py. @furqan-shaikh-dev

@streamnsight
Member

> The change looks good to me. One more question regarding the file naming and structure: is ChatOCIOpenAI specifically for the Responses API, or can it also be used for chat completion? I wonder if we should rename the file to something more generic or directly move the class under oci_generative_ai.py. @furqan-shaikh-dev

The openai API supports both. langchain-openai tries to detect if the selected model uses reasoning and, if so, will automatically select the Responses API; however, it is also a flag you can set. In the example here: https://github.com/oracle-samples/oci-openai?tab=readme-ov-file#using-with-langchain-openai you can see how that works with langchain-openai.
So, if we are wrapping the functionality, we need to make sure the flag is supported.
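For reference, that flag is exposed directly on langchain-openai's ChatOpenAI; a minimal sketch (the base_url and api_key are placeholders, not an OCI endpoint):

```python
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    model="gpt-4o",
    base_url="https://example.com/v1",  # placeholder endpoint
    api_key="not-used-here",            # placeholder credential
    use_responses_api=True,             # force the Responses API instead of Chat Completions
)
```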

@YouNeedCryDear
Member

> The change looks good to me. One more question regarding the file naming and structure: is ChatOCIOpenAI specifically for the Responses API, or can it also be used for chat completion? I wonder if we should rename the file to something more generic or directly move the class under oci_generative_ai.py. @furqan-shaikh-dev

> The openai API supports both. langchain-openai tries to detect if the selected model uses reasoning and, if so, will automatically select the Responses API; however, it is also a flag you can set. In the example here: https://github.com/oracle-samples/oci-openai?tab=readme-ov-file#using-with-langchain-openai you can see how that works with langchain-openai. So, if we are wrapping the functionality, we need to make sure the flag is supported.

That's why I think that instead of having it under oci_generative_ai_responses_api.py, whose name is specific to the Responses API, we probably need to rename the file or just move the ChatOCIOpenAI class directly into oci_generative_ai.py.

API_KEY = "<NOTUSED>"
COMPARTMENT_ID_HEADER = "opc-compartment-id"
CONVERSATION_STORE_ID_HEADER = "opc-conversation-store-id"
OUTPUT_VERSION = "responses/v1"
Member

What is this? I find it strange to say responses/v1.

In OpenAI, the paths are like v1/responses and v1/chat/completions.
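One plausible reading, offered purely as an assumption: "responses/v1" matches a value of langchain-openai's output_version option, which selects the AIMessage output schema rather than a URL path, e.g.:

```python
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    model="gpt-4o",
    use_responses_api=True,
    output_version="responses/v1",  # output schema version, not an endpoint path (assumption)
)
```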

conversation_store_id=conversation_store_id
),
base_url=_resolve_base_url(region=region, service_endpoint=service_endpoint, base_url=base_url),
use_responses_api=True,
Member

Can customers use ChatOCIOpenAI with our compatible ChatCompletions API?

super().__init__(
model=model,
api_key=SecretStr(API_KEY),
http_client=get_sync_httpx_client(
Member

What's our plan for an async client? Does LangChain have that?

Contributor Author

Don't see it supported yet. We will keep tabs on it and add it whenever it's available.

@YouNeedCryDear merged commit 53b892f into oracle:main on Nov 24, 2025.
9 of 10 checks passed