Is your feature request related to a problem? Please describe.
When working in a complex enterprise networking setup, it may be necessary to configure custom proxies and custom certificates when connecting to an Azure OpenAI endpoint. Think of custom internal LLM gateways that use internal certificates and are reachable only through proxies.
The current configuration parameters fall short of covering these options.
Describe the solution you'd like
The AzureOpenAI* classes should allow injecting the underlying OpenAI client and/or httpx client, which would cover any enterprise networking requirement.
Other AI frameworks (PydanticAI, LangChain, LlamaIndex) allow injecting that configuration, enabling complex enterprise networking setups.
LangChain embedding example with internal CA support (based on the truststore library) and proxy setup:
This solution probably does not play well with component serialization, and some thinking would be required there.
Describe alternatives you've considered
Setting up global proxies and certificates can be cumbersome in Python, and it is not sufficient anyway, since different endpoints may require different proxies. In our scenario we need to reach LLM endpoints through different proxies.
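To illustrate why the global approach falls short: proxy and CA settings via environment variables apply process-wide, so every HTTP client in the process inherits the same configuration and two endpoints needing different proxies cannot coexist (variable values below are illustrative):

```python
import os

# Process-wide settings inherited by every HTTP client in this process:
# there is no way to express "endpoint A via proxy A, endpoint B via proxy B".
os.environ["HTTPS_PROXY"] = "http://proxy-a.example.com:8080"   # illustrative
os.environ["SSL_CERT_FILE"] = "/etc/ssl/internal-ca.pem"        # illustrative
```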
Passing extra args through to the OpenAI client constructor could suffice, but it requires some thought about the interaction with the current fixed parameters (azure_endpoint, azure_deployment, api_key).
Exposing the OpenAI client and providing a post-construction lifecycle method that runs after from_dict could be a good solution. That way the object would be serialized/deserialized as it is now, and client code could then adjust the client connection parameters with dedicated code.
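A hypothetical shape for that lifecycle hook, using a stand-in class (all names here, including `post_construction` and `client_factory`, are illustrative and not an existing API; the string client stands in for a real `AzureOpenAI(...)` instance):

```python
from dataclasses import dataclass, field
from typing import Any, Callable, Optional

@dataclass
class AzureOpenAIGenerator:
    # Existing serializable parameters, as in the current classes.
    azure_endpoint: str
    azure_deployment: str
    api_key: str
    # Built after construction; never serialized.
    client: Any = field(default=None, repr=False)

    def to_dict(self) -> dict:
        # Only the plain parameters round-trip; the client is excluded.
        return {
            "azure_endpoint": self.azure_endpoint,
            "azure_deployment": self.azure_deployment,
            "api_key": self.api_key,
        }

    @classmethod
    def from_dict(
        cls,
        data: dict,
        client_factory: Optional[Callable[["AzureOpenAIGenerator"], Any]] = None,
    ) -> "AzureOpenAIGenerator":
        obj = cls(**data)
        obj.post_construction(client_factory)
        return obj

    def post_construction(self, client_factory=None) -> None:
        # Hypothetical hook: runs after from_dict so client code can build the
        # OpenAI client with custom httpx transport, proxies, internal CAs, etc.
        if client_factory is not None:
            self.client = client_factory(self)
        else:
            # Stand-in for the default AzureOpenAI(...) construction.
            self.client = f"default-client({self.azure_endpoint})"

# Deserialization followed by client-side customization of the connection:
gen = AzureOpenAIGenerator.from_dict(
    {
        "azure_endpoint": "https://gw.internal",
        "azure_deployment": "gpt-4o",
        "api_key": "...",
    },
    client_factory=lambda c: f"proxied-client({c.azure_endpoint})",
)
```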
Additional context
The same requirement would apply to any cloud provider that offers private endpoints and setups (AWS Bedrock, etc.).