
[Feature]: Support think parameter for Ollama models #11680

@saattrupdan

Description


The Feature

Ollama now supports thinking via the new `think` parameter in the `ollama.generate` and `ollama.chat` functions. This doesn't seem to be implemented in LiteLLM yet, however.
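For reference, a minimal sketch of what the request to Ollama's `/api/chat` endpoint looks like with thinking enabled. The model name `deepseek-r1` and the prompt are just examples; the `think` flag and the separate `thinking` field in the response message are from Ollama's thinking announcement:

```python
import json

# Example request body for Ollama's /api/chat endpoint with thinking
# enabled ("deepseek-r1" is an example reasoning-capable model).
payload = {
    "model": "deepseek-r1",
    "messages": [{"role": "user", "content": "Why is the sky blue?"}],
    "think": True,   # ask the model to emit its reasoning separately
    "stream": False,
}

body = json.dumps(payload)

# With think=True, the response message carries the reasoning in a
# separate "thinking" field alongside the final "content", e.g.:
# {"message": {"role": "assistant", "thinking": "...", "content": "..."}}
```

LiteLLM would presumably need to pass this `think` flag through its Ollama provider and surface the returned `thinking` text (e.g. as reasoning content) in its unified response format.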

Motivation, pitch

Support for reasoning Ollama models.

LiteLLM is hiring a founding backend engineer, are you interested in joining us and shipping to all our users?

No

Twitter / LinkedIn details

No response

Metadata

Assignees: no one assigned
Labels: enhancement (New feature or request)
