Status: Closed
Labels: bug (Something isn't working)
Description
Is there an existing issue for this problem?
- I have searched the existing issues
Operating system
macOS
GPU vendor
None (CPU)
GPU model
No response
GPU VRAM
No response
Version number
1.4.1
Browser
Chrome Version 131.0.6778.205 (Official Build) (arm64)
Python dependencies
No response
What happened
In Add Models, entering the following under "URL or Local Path" fails with an "unrecognized suffix" error:
~/.ollama/models/manifests/registry.ollama.ai/library/llama3.2/latest: unrecognized suffix
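For context, that path points at Ollama's JSON manifest, not at a model file, which is probably why the importer rejects the suffix; the actual weights sit as content-addressed blobs under ~/.ollama/models/blobs. Below is a rough sketch (not InvokeAI code, and assuming Ollama's current on-disk layout where the manifest layer with mediaType "application/vnd.ollama.image.model" names the GGUF blob) of how the manifest could be resolved to the blob path one would point the importer at instead:

```python
#!/usr/bin/env python3
"""Resolve an Ollama manifest to its underlying model blob.

Sketch only: assumes Ollama's standard layout, where a manifest is a JSON
file whose "layers" reference blobs stored as
~/.ollama/models/blobs/sha256-<digest>, and the layer with mediaType
"application/vnd.ollama.image.model" is the GGUF weights file.
"""
import json
import sys
from pathlib import Path


def resolve_model_blob(manifest_path: str) -> Path:
    manifest = json.loads(Path(manifest_path).expanduser().read_text())
    # Find the layer that holds the actual model weights.
    for layer in manifest.get("layers", []):
        if layer.get("mediaType") == "application/vnd.ollama.image.model":
            digest = layer["digest"]              # e.g. "sha256:abc123..."
            blob_name = digest.replace(":", "-")  # blob files use "sha256-abc123..."
            return Path("~/.ollama/models/blobs").expanduser() / blob_name
    raise ValueError("no model layer found in manifest")


if __name__ == "__main__":
    # Example:
    #   python resolve_blob.py ~/.ollama/models/manifests/registry.ollama.ai/library/llama3.2/latest
    print(resolve_model_blob(sys.argv[1]))
```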
What you expected to happen
Invoke should be able to use the models already downloaded by Ollama.
How to reproduce the problem
No response
Additional context
No response
Discord username
No response