I'm trying to load my bloomz-3b-awq model via vLLM, but it fails with the error below. Does anyone know how to fix it?
```
/usr/local/lib/python3.10/dist-packages/pydantic/main.cpython-310-x86_64-linux-gnu.so in pydantic.main.BaseModel.__init__()

ValidationError: 1 validation error for VLLM
__root__
  Quantization is not supported for <class 'vllm.model_executor.models.bloom.BloomForCausalLM'>. (type=value_error)
```