
When creating an OpenAI client, llm_options hash not working #410

Closed
johnknapp opened this issue Dec 5, 2023 · 3 comments

Comments

@johnknapp (Contributor) commented Dec 5, 2023

My objective: override the default model names, response type, and temperature when constructing the OpenAI client.

My code:

@llm = Langchain::LLM::OpenAI.new(
  api_key: ENV['OPENAI_API_KEY'],
  llm_options: {
    chat_completion_model_name: 'gpt-3.5-turbo-1106',
    completion_model_name: 'gpt-3.5-turbo-1106',
    type: 'json_object',
    temperature: 0.7
  }
)

Generates this result:

#<Langchain::LLM::OpenAI:0x000000010cb73b28
 @client=
  #<OpenAI::Client:0x000000010c8bd9d8
   @access_token="redacted!",
   @api_type=nil,
   @api_version="v1",
   @extra_headers={},
   @faraday_middleware=nil,
   @organization_id=nil,
   @request_timeout=120,
   @uri_base="https://api.openai.com/">,
 @defaults=
  {:n=>1,
   :temperature=>0.0,
   :completion_model_name=>"gpt-3.5-turbo",
   :chat_completion_model_name=>"gpt-3.5-turbo",
   :embeddings_model_name=>"text-embedding-ada-002",
   :dimension=>1536}>

Problem statement:

  • The llm_options I am passing to Langchain::LLM::OpenAI are not picked up (see the client output above).
  • I suspect the keys in my llm_options hash are wrong.
  • None of my educated guesses at key names have worked.
  • I suspect langchain-rb doesn't do the llm key mapping.

I'd be happy to submit a PR, but I'd need help getting started; maybe @andreibondarev or another maintainer can point me in the right direction. I'd also like to include documentation on this in the PR.

@mattlindsey (Contributor) commented Dec 5, 2023

Hi @johnknapp. I took a quick look, and I think the example in the README is missing the .completion call on the second line below. Could that be the issue?

llm = Langchain::LLM::OpenAI.new(api_key: ENV["OPENAI_API_KEY"])
llm_response = llm.chat(prompt: prompt_text).completion
parser.parse(llm_response)
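
(As I read it, .completion extracts the generated text from the chat response, so parser.parse receives a String rather than the whole response object.)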

@johnknapp Also note that @defaults in your issue above doesn't get set unless you also pass in default_options, I think, so it makes sense that the values there don't match what you passed.
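
To make that concrete, here's a rough sketch of what the constructor appears to do, reconstructed from the inspect output above (a paraphrase, not the actual langchainrb source):

module Langchain
  module LLM
    class OpenAI
      # Defaults copied from the inspect output in the issue above.
      DEFAULTS = {
        n: 1,
        temperature: 0.0,
        completion_model_name: "gpt-3.5-turbo",
        chat_completion_model_name: "gpt-3.5-turbo",
        embeddings_model_name: "text-embedding-ada-002",
        dimension: 1536
      }.freeze

      def initialize(api_key:, llm_options: {}, default_options: {})
        # llm_options is forwarded to the underlying HTTP client
        # (request_timeout, uri_base, etc.), so model names and
        # temperature placed there are silently ignored.
        @client = ::OpenAI::Client.new(access_token: api_key, **llm_options)
        # Only default_options is merged over the library defaults.
        @defaults = DEFAULTS.merge(default_options)
      end
    end
  end
end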

@johnknapp (Contributor, Author)

Thanks @mattlindsey I'll give that a go!

And looking at PR 409, I noticed you got the suggestion from @andreibondarev to use default_options instead of llm_options, which solved my default-override issue very nicely.
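
The call that ended up working looked roughly like this (a sketch using my original model and temperature keys, moved under default_options):

@llm = Langchain::LLM::OpenAI.new(
  api_key: ENV['OPENAI_API_KEY'],
  default_options: {
    chat_completion_model_name: 'gpt-3.5-turbo-1106',
    completion_model_name: 'gpt-3.5-turbo-1106',
    temperature: 0.7
  }
)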

I'll close this, though I still think we could improve the docs.

@andreibondarev (Collaborator)

Would someone be willing to submit a PR to fix the documentation, by any chance?
