patterns-ai-core / langchainrb

Build LLM-powered applications in Ruby

Home Page: https://rubydoc.info/gems/langchainrb


ollama_spec fails when run as a single file

kokuyouwind opened this issue

Description

Running rspec ./spec/langchain/llm/ollama_spec.rb on its own fails:

Failures:

  1) Langchain::LLM::Ollama#complete returns a completion
     Failure/Error: @defaults = DEFAULTS.deep_merge(default_options)

     NoMethodError:
       undefined method `deep_merge' for {:temperature=>0.8, :completion_model_name=>"llama2", :embeddings_model_name=>"llama2", :chat_completion_model_name=>"llama2"}:Hash
     # ./lib/langchain/llm/ollama.rb:37:in `initialize'
     # ./spec/langchain/llm/ollama_spec.rb:6:in `new'
     # ./spec/langchain/llm/ollama_spec.rb:6:in `block (2 levels) in <top (required)>'
     # ./spec/langchain/llm/ollama_spec.rb:31:in `block (3 levels) in <top (required)>'
     # ./spec/support/vcr.rb:26:in `block (2 levels) in <top (required)>'
...(and so on)
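The error can be reproduced outside the spec entirely. A minimal standalone sketch (assuming the activesupport gem is installed; the hash values here are illustrative, not taken from the gem):

```ruby
# Hash#deep_merge is not part of plain Ruby -- it is an ActiveSupport
# core extension, so it raises NoMethodError until it is required.
begin
  {a: {b: 1}}.deep_merge(a: {c: 2})
rescue NoMethodError => e
  puts e.message  # undefined method `deep_merge' for {:a=>{:b=>1}}:Hash
end

# After loading the targeted core extension, the call works:
require "active_support/core_ext/hash/deep_merge"
puts({a: {b: 1}}.deep_merge(a: {c: 2}))  # => {:a=>{:b=>1, :c=>2}}
```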

The problem seems to be that deep_merge is called without explicitly requiring active_support. When the whole project-wide rspec suite runs, some other require happens to load active_support first, so the method is available and the spec passes; when the file is run in isolation, nothing has loaded it yet.
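One possible fix is to add a targeted require at the top of lib/langchain/llm/ollama.rb so the file no longer depends on load order. A sketch, with the surrounding class body assumed from the stack trace and the DEFAULTS values taken from the failure output above, not copied from the gem:

```ruby
# lib/langchain/llm/ollama.rb (sketch)
require "active_support/core_ext/hash/deep_merge"

module Langchain
  module LLM
    class Ollama < Base
      # Defaults as shown in the NoMethodError message above.
      DEFAULTS = {
        temperature: 0.8,
        completion_model_name: "llama2",
        embeddings_model_name: "llama2",
        chat_completion_model_name: "llama2"
      }.freeze

      def initialize(url:, default_options: {})
        @url = url
        # With the explicit require above, #deep_merge is defined even
        # when this file is loaded in isolation by a single spec run.
        @defaults = DEFAULTS.deep_merge(default_options)
      end
    end
  end
end
```

An alternative would be to require active_support from spec_helper, but a targeted core_ext require keeps the dependency explicit in the file that actually uses it.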