
Incorrect temperature parameter for OpenAI gpt-4o-search-preview and gpt-4o-mini-search-preview models #155

@croko

Description

OpenAI doesn't accept the temperature parameter for the gpt-4o-search-preview and gpt-4o-mini-search-preview models.
chat = RubyLLM.chat(model: 'gpt-4o-mini-search-preview')

After sending a request:

chat.ask('What movie won best picture in 2025?')

D, [2025-05-07T15:09:25.047545 #52031] DEBUG -- RubyLLM: request: POST https://api.openai.com/v1/chat/completions
D, [2025-05-07T15:09:25.048236 #52031] DEBUG -- RubyLLM: request: {:model=>"gpt-4o-mini-search-preview", :messages=> [{:role=>"user", :content=>"What movie won best picture in 2025?"}], :temperature=>0.7, :stream=>false}

an error is returned:

DEBUG -- RubyLLM: error: ruby_llm-1.2.0/lib/ruby_llm/error.rb:60:in 'parse_error': Model incompatible request argument supplied: temperature (RubyLLM::BadRequestError)
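A minimal sketch of a possible fix: strip parameters that a given model family rejects before the payload is sent. The model names and the :temperature key come from the request log above; the UNSUPPORTED_PARAMS table and sanitize_payload helper are hypothetical illustrations, not existing RubyLLM code.

```ruby
# Hypothetical mapping of model-name patterns to request keys the
# API rejects for those models (assumption, not part of RubyLLM).
UNSUPPORTED_PARAMS = {
  /-search-preview\z/ => [:temperature]
}.freeze

# Return a copy of the payload without keys the target model rejects.
def sanitize_payload(payload)
  model = payload[:model].to_s
  UNSUPPORTED_PARAMS.each do |pattern, keys|
    payload = payload.reject { |k, _| keys.include?(k) } if model.match?(pattern)
  end
  payload
end

payload = {
  model: "gpt-4o-mini-search-preview",
  messages: [{ role: "user", content: "What movie won best picture in 2025?" }],
  temperature: 0.7,
  stream: false
}

sanitize_payload(payload) # :temperature is dropped for this model
```

For non-search-preview models the payload passes through unchanged, so default temperature handling is unaffected.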

Metadata


Assignees: No one assigned
Labels: bug (Something isn't working)
Projects: No projects
Milestone: No milestone
Relationships: None yet
Development: No branches or pull requests
