After a little investigation, it seems vLLM uses safetensors to load the adapter, but `f.get_tensor()` returns empty tensors in some cases. I still have no idea how to fix this.
More context:
I trained the adapter using trl's `SFTTrainer`.
As a workaround, I'm going to set `save_safetensors=False` in `SFTConfig`.
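For reference, the workaround is a one-line config change; this is a minimal sketch (`save_safetensors` is inherited from transformers' `TrainingArguments`, and `output_dir` here is a placeholder):

```python
from trl import SFTConfig

# Sketch: disable safetensors serialization so the adapter weights are saved
# in the PyTorch .bin format instead of adapter_model.safetensors.
config = SFTConfig(
    output_dir="out",        # placeholder path
    save_safetensors=False,  # workaround: sidestep the empty-tensor issue on load
)
```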
Your current environment
🐛 Describe the bug
Llama 3 8B with a tuned LoRA adapter fails on inference.
Model init: