Ollama backend #87

@krzysztofjeziorny

Description

In the last Wagtail webinar, @tomdyson mentioned that this project can use Ollama with the LLaVA model as a backend. Is this already possible, or is it planned for a future release? I've looked through the docs but didn't find any examples.

Thanks for this interesting project!
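
For context on what such a backend would talk to: Ollama exposes a local REST API (by default at `http://localhost:11434/api/generate`) that accepts base64-encoded images for multimodal models such as LLaVA. A minimal sketch of the request payload — this illustrates Ollama's API shape only, not any confirmed configuration of this project:

```python
import base64
import json

# Ollama's default local endpoint for one-shot generation.
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_llava_request(prompt: str, image_bytes: bytes) -> dict:
    """Build the JSON payload Ollama expects for a multimodal (LLaVA) call.

    Images are passed as base64-encoded strings in the `images` list;
    `stream: False` asks for a single JSON response instead of a stream.
    """
    return {
        "model": "llava",
        "prompt": prompt,
        "images": [base64.b64encode(image_bytes).decode("ascii")],
        "stream": False,
    }


payload = build_llava_request(
    "Describe this image for use as alt text.",
    b"\x89PNG...",  # placeholder bytes; in practice, the raw image file contents
)
print(json.dumps(payload, indent=2))
```

Sending this payload as a POST body to `OLLAMA_URL` (e.g. with `requests.post`) returns the model's description in the `response` field, assuming a local Ollama server with the `llava` model pulled.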

Labels

backend (Backend / AI integration), models (Related to LLM models and configuration)
