The llm_options I am passing to Langchain::LLM::OpenAI are not picked up (see the client result).
I suspect the keys in my llm_options hash are wrong, but none of my educated guesses at key names are working.
I also suspect langchain-rb doesn't do the LLM key mapping.
I'd be happy to submit a PR but I'd need help! Maybe @andreibondarev or another maintainer can help me get started. I'd also like to submit a PR with documentation on this.
The text was updated successfully, but these errors were encountered:
@johnknapp Also note that @defaults in your issue above doesn't get set unless you also pass in default_options; I think it makes sense that they don't match.
And looking at PR 409, I noticed the suggestion from @andreibondarev to use default_options instead of llm_options, which solved my default-override issue very nicely.
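For reference, the llm_options vs. default_options distinction discussed above can be sketched in plain Ruby (no gem required). This is only an illustration of the merge pattern: as I understand it, default_options are merged over the library's built-in request defaults, while llm_options are forwarded elsewhere (to the underlying HTTP client) and never touch those defaults. The constant, method name, and default values below are hypothetical stand-ins, not langchainrb's actual internals.

```ruby
# Hypothetical built-in defaults, standing in for the library's @defaults.
BUILT_IN_DEFAULTS = {
  chat_completion_model_name: "gpt-3.5-turbo",
  temperature: 0.0
}.freeze

# default_options are merged over the built-ins, so user keys win.
# Anything passed as llm_options would be invisible here — it goes to
# the HTTP client, not into the request defaults.
def build_defaults(default_options = {})
  BUILT_IN_DEFAULTS.merge(default_options)
end

defaults = build_defaults(
  chat_completion_model_name: "gpt-3.5-turbo-1106",
  temperature: 0.7
)
# defaults[:temperature] is now 0.7, overriding the built-in 0.0.
```

This is why keys like chat_completion_model_name had no effect when passed under llm_options: they were never merged into the request defaults.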
I'll close this, though I still think we could improve the docs.
My objective:
Langchain::OutputParsers::StructuredOutputParser
My code:
```ruby
@llm = Langchain::LLM::OpenAI.new(
  api_key: ENV['OPENAI_API_KEY'],
  llm_options: {
    chat_completion_model_name: 'gpt-3.5-turbo-1106',
    completion_model_name: 'gpt-3.5-turbo-1106',
    type: 'json_object',
    temperature: 0.7
  }
)
```
Generates this result: