[bug]: can not import the models in ollama #7730

@zq19

Description

Is there an existing issue for this problem?

  • I have searched the existing issues

Operating system

macOS

GPU vendor

None (CPU)

GPU model

No response

GPU VRAM

No response

Version number

1.4.1

Browser

Chrome Version 131.0.6778.205 (Official Build) (arm64)

Python dependencies

No response

What happened

Adding a model via "URL or Local Path" with the path ~/.ollama/models/manifests/registry.ollama.ai/library/llama3.2/latest fails with the error:

~/.ollama/models/manifests/registry.ollama.ai/library/llama3.2/latest: unrecognized suffix
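A likely cause: the path points at Ollama's manifest file, which has no file extension, while the importer appears to expect a weights file with a recognized suffix (e.g. .gguf or .safetensors). The actual weights live as content-addressed blobs under ~/.ollama/models/blobs. As a workaround sketch (assuming Ollama's OCI-style manifest layout with a "layers" list, a "mediaType" of "application/vnd.ollama.image.model" for the weights layer, and blob filenames of the form sha256-<hex>; none of this is verified against this InvokeAI version), the manifest can be resolved to the blob path and that path given to InvokeAI instead:

```python
import json
from pathlib import Path

def find_model_blob(manifest_path: Path) -> Path:
    """Resolve an Ollama manifest file to the GGUF blob it references.

    Assumes the manifest is JSON with a "layers" list whose model layer
    has mediaType "application/vnd.ollama.image.model" and a digest like
    "sha256:<hex>", stored under <models>/blobs/sha256-<hex>.
    """
    manifest = json.loads(manifest_path.read_text())
    for layer in manifest.get("layers", []):
        if layer.get("mediaType") == "application/vnd.ollama.image.model":
            digest = layer["digest"]  # e.g. "sha256:abc123..."
            blob_name = digest.replace(":", "-")
            # manifest sits at <models>/manifests/<registry>/<ns>/<model>/<tag>,
            # so the models root is four directories up from its parent
            models_root = manifest_path.parents[4]
            return models_root / "blobs" / blob_name
    raise ValueError(f"no model layer found in {manifest_path}")
```

Passing the returned blob path (a plain GGUF file) to the model installer may sidestep the "unrecognized suffix" check, though the blob itself also lacks a .gguf suffix, so renaming or copying it may be needed.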

What you expected to happen

InvokeAI should be able to import and use the models already downloaded by Ollama.

How to reproduce the problem

No response

Additional context

No response

Discord username

No response

Labels: bug (Something isn't working)