OpenAI rejects the temperature parameter for the gpt-4o-search-preview and gpt-4o-mini-search-preview models, but RubyLLM includes it in the request by default.
chat = RubyLLM.chat(model: 'gpt-4o-mini-search-preview')
After making a request:
chat.ask('What movie won best picture in 2025?')
D, [2025-05-07T15:09:25.047545 #52031] DEBUG -- RubyLLM: request: POST https://api.openai.com/v1/chat/completions
D, [2025-05-07T15:09:25.048236 #52031] DEBUG -- RubyLLM: request: {:model=>"gpt-4o-mini-search-preview", :messages=> [{:role=>"user", :content=>"What movie won best picture in 2025?"}], :temperature=>0.7, :stream=>false}
the request fails with a BadRequestError:
DEBUG -- RubyLLM: error: ruby_llm-1.2.0/lib/ruby_llm/error.rb:60:in 'parse_error': Model incompatible request argument supplied: temperature (RubyLLM::BadRequestError)
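Until this is handled inside the library, one possible workaround is to strip the offending key from the request payload before it is sent. The sketch below is illustrative only: SEARCH_PREVIEW_MODELS and sanitize_payload are hypothetical names, not part of the RubyLLM API, and the payload shape mirrors the debug log above.

```ruby
# Models that OpenAI's API rejects :temperature for (per the error above).
SEARCH_PREVIEW_MODELS = %w[gpt-4o-search-preview gpt-4o-mini-search-preview].freeze

# Hypothetical helper: drop :temperature when the target model won't accept it.
def sanitize_payload(payload)
  return payload unless SEARCH_PREVIEW_MODELS.include?(payload[:model])

  payload.reject { |key, _| key == :temperature }
end

payload = {
  model: 'gpt-4o-mini-search-preview',
  messages: [{ role: 'user', content: 'What movie won best picture in 2025?' }],
  temperature: 0.7,
  stream: false
}

sanitize_payload(payload) # :temperature removed; other keys untouched
```

A fix in RubyLLM itself would likely live in the OpenAI provider's payload-rendering code, keyed off the model id in the same way.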