Sorry for the late reply. I've been very busy lately.
It's an experimental feature; you can simply replace the forward function and the other related functions of `LlamaAttention` with those from `LlamaFlashAttention2` (see the sketch below). I've also updated the code.
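A minimal sketch of that swap, assuming this LLaMA port follows the HuggingFace layout where `LlamaFlashAttention2` overrides `forward()` plus flash-attn helpers. The helper names (`_flash_attention_forward`, `_upad_input`) are assumptions taken from that layout; adapt them to the actual class definitions in `modeling_llama.py`.

```python
from lavis.models.blip2_models import modeling_llama as ml

# Point the vanilla attention path at the flash-attn implementation, so
# every existing LlamaAttention instance uses it without re-instantiation.
# The two helper names below are assumptions from the HuggingFace layout.
ml.LlamaAttention.forward = ml.LlamaFlashAttention2.forward
ml.LlamaAttention._flash_attention_forward = ml.LlamaFlashAttention2._flash_attention_forward
ml.LlamaAttention._upad_input = ml.LlamaFlashAttention2._upad_input
```

Patching the class rather than individual modules means models that were already built pick up the flash-attn path immediately.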
The `LlamaFlashAttention2` class has no `__init__()`. As a result, running the system with flash-attn will crash.

LMDrive/LAVIS/lavis/models/blip2_models/modeling_llama.py, line 415 at ae0643d
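If the crash comes from instantiating `LlamaFlashAttention2` without a usable constructor, one possible shape of a fix inside `modeling_llama.py` is a sketch like the following. It assumes the class can subclass `LlamaAttention`, as in the HuggingFace implementation it was ported from; this is a workaround sketch, not the repo's confirmed fix.

```python
class LlamaFlashAttention2(LlamaAttention):
    def __init__(self, config):
        # Delegate to LlamaAttention so the q/k/v/o projections and rotary
        # embeddings are created; the flash-attn forward() reuses these
        # attributes unchanged.
        super().__init__(config)

    # ... the existing forward() and flash-attn helpers stay as-is ...
```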