
ollama_spec fails when single file run #493

Closed
kokuyouwind opened this issue Feb 27, 2024 · 0 comments · Fixed by #494

Comments

@kokuyouwind (Contributor)

Description

Running `rspec ./spec/langchain/llm/ollama_spec.rb` on its own fails.

Failures:

  1) Langchain::LLM::Ollama#complete returns a completion
     Failure/Error: @defaults = DEFAULTS.deep_merge(default_options)

     NoMethodError:
       undefined method `deep_merge' for {:temperature=>0.8, :completion_model_name=>"llama2", :embeddings_model_name=>"llama2", :chat_completion_model_name=>"llama2"}:Hash
     # ./lib/langchain/llm/ollama.rb:37:in `initialize'
     # ./spec/langchain/llm/ollama_spec.rb:6:in `new'
     # ./spec/langchain/llm/ollama_spec.rb:6:in `block (2 levels) in <top (required)>'
     # ./spec/langchain/llm/ollama_spec.rb:31:in `block (3 levels) in <top (required)>'
     # ./spec/support/vcr.rb:26:in `block (2 levels) in <top (required)>'
...(and so on)

The problem seems to be that `deep_merge` is called without explicitly requiring active_support. When the whole suite is run with project-wide rspec, it happens to work because active_support gets loaded as a side effect of some other require.
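
For illustration only, a minimal sketch of one way the file could be made self-contained (the actual change is in #494 and may differ): explicitly require the ActiveSupport core extension that defines `Hash#deep_merge`, so the method exists even when only this file is loaded.

```ruby
# Sketch — assumes deep_merge comes from ActiveSupport and that adding the
# explicit require at the top of lib/langchain/llm/ollama.rb is the fix.
require "active_support/core_ext/hash/deep_merge"

DEFAULTS = {
  temperature: 0.8,
  completion_model_name: "llama2",
  embeddings_model_name: "llama2",
  chat_completion_model_name: "llama2"
}.freeze

# With the explicit require in place, deep_merge no longer raises NoMethodError
# when the spec file is run in isolation:
defaults = DEFAULTS.deep_merge(temperature: 0.5)
# => {:temperature=>0.5, :completion_model_name=>"llama2", ...}
```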
