Is there currently a way to run quantized GGUF versions of Qwen-Image family models in LocalAI 3.6.0+? #6476
Branskugel
started this conversation in
General
Replies: 0 comments
Recently I mistakenly tried running those with the diffusers backend and was told on Discord that its purpose is to run safetensors models, so the question arises: which backend, if LocalAI supports one, should I use to run quantized Qwen-Image models? And if it isn't supported yet, is it planned for a future release?
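For context, this is roughly how I call the model through LocalAI's OpenAI-compatible image generation endpoint. It's just a minimal sketch of my setup: the model name `qwen-image-gguf` is whatever I called my own model definition pointing at the GGUF file, and the host/port are the defaults on my machine, so treat them as placeholders.

```python
# Minimal sketch of an image-generation request against LocalAI's
# OpenAI-compatible endpoint. Model name and URL are placeholders
# from my local setup, not official names.
import requests

LOCALAI_URL = "http://localhost:8080/v1/images/generations"

payload = {
    # "qwen-image-gguf" is the name of my own model definition that
    # points at the quantized GGUF file; rename to match your config.
    "model": "qwen-image-gguf",
    "prompt": "A watercolor painting of a lighthouse at dusk",
    "size": "1024x1024",
}

resp = requests.post(LOCALAI_URL, json=payload, timeout=600)
resp.raise_for_status()

# The response carries either a URL or base64 data for the generated image,
# depending on how the server is configured.
print(resp.json())
```

This request works fine for me with safetensors-based image models, so the question is purely about which backend can load the GGUF-quantized Qwen-Image weights behind it.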