
optillm supports logits or logprobs in the API #182


Description

@codelion

Add documentation showing how to use optillm with a local inference server to get logits.

This is a commonly requested ollama feature (ollama/ollama#2415) that is already supported in optillm and works well.
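A minimal sketch of what such documentation could show: requesting per-token log-probabilities through optillm's OpenAI-compatible endpoint using the standard `logprobs`/`top_logprobs` chat-completions parameters. The base URL and model name below are assumptions, not values from this issue; adjust them to your local setup.

```python
import json
import urllib.request

# Assumed local optillm proxy endpoint -- adjust host/port to your setup.
OPTILLM_URL = "http://localhost:8000/v1/chat/completions"


def build_logprobs_request(prompt, model="my-local-model"):
    """Standard OpenAI-style payload asking for per-token logprobs.

    The model name is a placeholder for whatever your local
    inference server is serving.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "logprobs": True,   # return log-probabilities for sampled tokens
        "top_logprobs": 5,  # plus the 5 most likely alternatives per token
    }


def extract_logprobs(response):
    """Pull (token, logprob) pairs out of a chat-completion response dict."""
    content = response["choices"][0]["logprobs"]["content"]
    return [(tok["token"], tok["logprob"]) for tok in content]


def query(prompt):
    """Send the request to the local proxy (requires a running server)."""
    data = json.dumps(build_logprobs_request(prompt)).encode("utf-8")
    req = urllib.request.Request(
        OPTILLM_URL, data=data,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return extract_logprobs(json.load(resp))
```

With a server running, `query("Hello")` would return a list of `(token, logprob)` pairs for the generated completion.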

Labels: documentation (Improvements or additions to documentation)
