http_client variable not available in OpenAILlm class #1357

Closed
AbdurNawaz opened this issue May 4, 2024 · 0 comments · Fixed by #1355

AbdurNawaz (Contributor) commented May 4, 2024

🚀 The feature

embedchain uses LangChain's ChatOpenAI class behind the scenes for its OpenAI LLM configuration, but does not expose some of the useful fields available on ChatOpenAI. For example, the http_client variable, which can be used to add proxy configuration when connecting to OpenAI LLMs, is not available through embedchain's interface.
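
For reference, this is roughly how http_client can be passed to LangChain's ChatOpenAI directly. It is a minimal sketch, assuming the langchain-openai package and httpx are installed; the proxy URL is a placeholder:

```python
import httpx
from langchain_openai import ChatOpenAI

# Route OpenAI traffic through a proxy; the URL below is a placeholder.
# Note: recent httpx versions take `proxy=`, older ones take `proxies=`.
proxied_client = httpx.Client(proxy="http://my-proxy.example.com:8080")

# ChatOpenAI forwards http_client to the underlying OpenAI SDK client,
# so every request to the API goes through the configured proxy.
llm = ChatOpenAI(model="gpt-3.5-turbo", http_client=proxied_client)
```

Because embedchain constructs ChatOpenAI internally, there is currently no way to supply such a client through its config.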

I have raised PR #1355 for this. I can also tweak it to support all of LangChain's params if the maintainers are okay with it.

Motivation, pitch

To enable fine-grained control over OpenAI LLM interactions, such as proxies, SSL configuration, and additional headers.
