Feature: Support for Custom Endpoint in OpenAIClient #41242
base: main
Conversation
Thank you for your contribution @sangyuxiaowu! We will review the pull request and get back to you soon.
@microsoft-github-policy-service agree
Hi @sangyuxiaowu. Thank you for your interest in helping to improve the Azure SDK experience and for your contribution. We've noticed that there hasn't been recent engagement on this pull request. If this is still an active work stream, please let us know by pushing some changes or leaving a comment. Otherwise, we'll close this out in 7 days.
This PR addresses issues #4152 and #39284, which request the ability to connect to custom endpoints that expose an OpenAI-compatible API using the `OpenAIClient` class.

Changes:

A `SetIsConfiguredForAzureOpenAI` method was added to the `OpenAIClient` class. This method allows setting the value of the private field `_isConfiguredForAzureOpenAI`, enabling the client to connect to any endpoint.

This change will make it possible for Semantic Kernel to interact with hosted open-source models served by technologies like vLLM, llama.cpp, etc., and allow applications to specify the endpoint, which is useful in cases where applications cannot directly access OpenAI's official endpoints and need to use an intermediary server for filtering.
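A sketch of the intended usage, assuming the API proposed in this PR: `SetIsConfiguredForAzureOpenAI` is the new method introduced here (not part of the released Azure.AI.OpenAI surface), and the endpoint URL and key below are illustrative placeholders for a self-hosted OpenAI-compatible server.

```csharp
using System;
using Azure;
using Azure.AI.OpenAI;

// Point the client at a self-hosted OpenAI-compatible server,
// e.g. one backed by vLLM or llama.cpp. URL and key are placeholders.
var client = new OpenAIClient(
    new Uri("http://localhost:8000/v1"),
    new AzureKeyCredential("placeholder-key"));

// Proposed in this PR: flip the private _isConfiguredForAzureOpenAI flag
// so the client sends requests using plain OpenAI semantics instead of
// the Azure-specific request path.
client.SetIsConfiguredForAzureOpenAI(false);
```

With the flag set to `false`, the client can target any intermediary or open-source inference server that speaks the OpenAI protocol, rather than being restricted to Azure OpenAI resource endpoints.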
Please review and let me know if any changes are required.