Ollama keeps unloading my models each request #3594

@lubek-dc

Description

Issue

[screenshot: GPU memory with the model loaded]

Everything stays loaded until I enter a prompt in aider; then it starts loading up each GPU again.

[screenshots: GPUs reloading the model after the prompt]

Also keep in mind that I am the only user of this instance. I also use Open WebUI, and when I use the same model there it works fine: it doesn't unload the model, it just works.

Version and model info

Aider: 0.77.1
Model: Ollama/Deepseek-r1:70b
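One possible explanation (an assumption, not confirmed in this issue): Ollama will reload a model whenever a request arrives with different runtime options (e.g. a different `num_ctx`) than the instance currently in memory, and it also unloads models after an idle timeout. If aider and Open WebUI send different options, the model gets reloaded on every switch. A minimal configuration sketch to keep models resident, assuming the standard Ollama server settings:

```shell
# OLLAMA_KEEP_ALIVE controls how long a model stays loaded after a
# request; -1 means "never unload". Set it before starting the server.
export OLLAMA_KEEP_ALIVE=-1
ollama serve &

# After sending a prompt, check which models are loaded and their
# expiry with:
ollama ps
```

If the reloads persist with `OLLAMA_KEEP_ALIVE=-1`, that would point at mismatched per-request options from aider rather than the idle timeout.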
