Hmm, Siglip does support FA2 as per this code block. Can you check your transformers version, and whether FA2 was available for Siglip in that version? If not, update to the latest release.
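For reference, a quick way to check both points is sketched below; it assumes only that transformers is installed, and note that `_supports_flash_attn_2` is an internal attribute on model classes, not a public API:

```python
# Minimal check of the installed transformers version and whether the
# installed SiglipVisionModel class advertises Flash Attention 2 support.
import transformers
from transformers import SiglipVisionModel

# FA2 support for SigLIP only exists in newer transformers releases.
print(transformers.__version__)

# Internal flag set by architectures that have FA2 wired up; treat this
# as informational rather than a stable API.
print(getattr(SiglipVisionModel, "_supports_flash_attn_2", False))
```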
When using SiglipVisionModel inside VideoLLaMA2.1-7B-AV, I encounter the following error:
ValueError: SiglipVisionModel does not support Flash Attention 2.0 yet.
I do not need Flash Attention for my use case and would like to disable it.
Could you provide an official way to toggle it off?
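For what it's worth, recent transformers releases let you pick the attention backend at load time, which sidesteps FA2 entirely. A minimal sketch, assuming the vision tower can be loaded directly with from_pretrained (the checkpoint name is illustrative and may differ from what VideoLLaMA2.1-7B-AV actually uses):

```python
import torch
from transformers import SiglipVisionModel

# Force a non-FA2 attention backend when loading the SigLIP vision tower.
vision_tower = SiglipVisionModel.from_pretrained(
    "google/siglip-so400m-patch14-384",  # illustrative checkpoint
    torch_dtype=torch.float16,
    attn_implementation="eager",         # or "sdpa"; both avoid the FA2 code path
)
```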