
Conversation

@qiuosier
Member

@qiuosier qiuosier commented Aug 12, 2025

As part of ongoing enhancements, the OCI Generative AI Service is introducing additional models from xAI and OpenAI. These new models adhere to the same API specifications as the Meta models; however, the existing implementation does not automatically derive the associated provider for xAI or OpenAI models.

MetaProvider can be used for the xAI and OpenAI models provided by the OCI Generative AI service.
This PR fixes #13 (Support OpenAI and xAI models) with a small change that automatically uses MetaProvider when the user selects an xAI or OpenAI model.
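For illustration, the gist of this change is to resolve the new vendor prefixes to the existing Meta handling. A minimal sketch of the idea, with assumed class names and a hypothetical model ID (the real lookup lives in the chat and completion modules):

class MetaProvider: ...      # existing provider for the Meta-style API
class CohereProvider: ...    # existing provider for Cohere models

provider_map = {
    "cohere": CohereProvider,
    "meta": MetaProvider,
    "xai": MetaProvider,     # same API spec as the Meta models
    "openai": MetaProvider,  # likewise for OpenAI
}

model_id = "xai.grok-3"      # hypothetical model ID
provider = provider_map[model_id.split(".")[0].lower()]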

@oracle-contributor-agreement oracle-contributor-agreement bot added the OCA Verified All contributors have signed the Oracle Contributor Agreement. label Aug 12, 2025
Member

@YouNeedCryDear YouNeedCryDear left a comment


  1. I don't think the OpenAI models are GA yet, so we should probably remove them.
  2. For xAI, it is not appropriate to use the name MetaProvider. Instead, I think we should define a GenericProvider base class for the generic API type.
  3. MetaProvider could be kept for backwards compatibility, but its implementation would likely just inherit from GenericProvider (roughly as sketched below).
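
Roughly the shape of this suggestion, as a sketch only (class layout and docstrings are assumptions, not the final implementation):

class GenericProvider:
    """Formats requests and responses for models that use the generic (Meta-style) API."""

class MetaProvider(GenericProvider):
    """Kept for backwards compatibility; behavior is inherited from GenericProvider."""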

@qiuosier
Member Author

Hi @YouNeedCryDear, thanks for the comment. I made the following updates:

  1. Added a GenericProvider for both the LLM model and the chat model.
  2. Changed MetaProvider to inherit from GenericProvider.
  3. Added a _default_provider property.
  4. For models whose provider is not found in _provider_map, _default_provider is used (see the sketch below).
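
A sketch of points 3 and 4 above; the surrounding class is heavily simplified and the model ID is hypothetical, only the attribute names come from this comment:

class GenericProvider: ...
class MetaProvider(GenericProvider): ...

class ChatOCIGenAI:                           # simplified stand-in for the real class
    model_id = "xai.grok-3"                   # hypothetical model ID
    _provider_map = {"meta": MetaProvider()}  # the real map also covers cohere, etc.

    @property
    def _default_provider(self):
        return GenericProvider()              # fallback when the prefix is unknown

    def _get_provider(self):
        key = self.model_id.split(".")[0].lower()
        return self._provider_map.get(key, self._default_provider)

# e.g. ChatOCIGenAI()._get_provider() returns a GenericProvider instance for "xai.grok-3"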

Here is a screenshot of the test:
[screenshot of the test output]

@qiuosier qiuosier changed the title from "Use MetaProvider for xai and openai models." to "Add GenericProvider" on Aug 12, 2025
@qiuosier
Member Author

I have updated the code to raise an error when a custom endpoint is used without a provider.
The GenericProvider is used only if the model_id is not a custom endpoint.

def _get_provider(self, provider_map: Mapping[str, Any]) -> Any:
    if self.provider is not None:
        provider = self.provider
    elif self.model_id.startswith(CUSTOM_ENDPOINT_PREFIX):
Member


The order might need to change. You have to check whether self.model_id is None first; otherwise, this line will raise an exception.

Member Author

@qiuosier qiuosier Aug 12, 2025


Sure. I moved up the logic to check if model_id is None.

BTW, is there a reason to allow model_id to be None?
It appears to me that the proper way to validate model_id would be to not allow None as the default value in OCIGenAIBase. However, that is probably out of scope here.
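
Putting the pieces from this thread together, a sketch of the reordered checks (the constant value, error messages, and final lookup are illustrative, not the exact code):

CUSTOM_ENDPOINT_PREFIX = "ocid1.generativeaiendpoint"  # assumed value for the sketch

def _get_provider(self, provider_map):
    # Validate model_id before calling startswith on it (the ordering issue above).
    if self.model_id is None:
        raise ValueError("model_id must be provided.")
    if self.provider is not None:
        provider = self.provider              # an explicit provider always wins
    elif self.model_id.startswith(CUSTOM_ENDPOINT_PREFIX):
        # Custom endpoints carry no vendor prefix, so a provider must be given.
        raise ValueError("provider must be specified when using a custom endpoint.")
    else:
        provider = self.model_id.split(".")[0].lower()
    # Unknown prefixes fall back to the default provider (GenericProvider).
    return provider_map.get(provider, self._default_provider)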

Member


I think allowing model_id to be None was just an old langchain convention, following how other vendor packages were doing it. We can certainly revisit that detail later.

@YouNeedCryDear YouNeedCryDear merged commit 1085b9c into oracle:main Aug 12, 2025
1 check passed