qdrddr changed the title from "[Question] Is GGUF quantized models supported?" to "[Question] Is GGUF model package format supported with quantized models?" on Apr 26, 2024
Sorry for the basic question, but aren't Hugging Face and GGUF two different formats?
I am specifically interested in GGUF, yet the manual you provided covers the HF format. Or does the converter support both? @Hzfengsy
❓ General Questions
Hi, can I use an existing GGUF model package of quantized models, such as this one?
https://huggingface.co/MaziyarPanahi/Mixtral-8x22B-Instruct-v0.1-GGUF
If not, can I convert it to MLC?
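For reference, a hedged sketch of how MLC's weight-conversion flow is typically invoked. The `mlc_llm convert_weight` and `mlc_llm gen_config` subcommands and the `q4f16_1` quantization code come from MLC LLM's documentation; the directory names and the conversation-template value below are placeholders/assumptions. Note this flow consumes Hugging Face weights (safetensors/PyTorch), not GGUF files:

```shell
# Assumption: mlc_llm is installed and the model was downloaded in
# Hugging Face format (safetensors/PyTorch), NOT as a GGUF file.
# Directory names are placeholders.

# 1. Convert the HF weights into MLC's weight format, quantizing to 4-bit.
mlc_llm convert_weight ./models/Mixtral-8x22B-Instruct-v0.1 \
    --quantization q4f16_1 \
    -o ./dist/Mixtral-8x22B-Instruct-v0.1-q4f16_1-MLC

# 2. Generate the chat config for the converted model.
#    (--conv-template value is an assumption; pick the one matching the model.)
mlc_llm gen_config ./models/Mixtral-8x22B-Instruct-v0.1 \
    --quantization q4f16_1 \
    --conv-template mistral_default \
    -o ./dist/Mixtral-8x22B-Instruct-v0.1-q4f16_1-MLC
```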