Visualize Attention #177
Original question:

Does anyone have code for visualizing the output of the attention heads in the last layer? I'm thinking of something similar to what DINOv1 provides through its visualize_attention.py script.

Comments:

Thanks a lot!

Uh oh, I just noticed your documentation says the solution doesn't work with xFormers, and that's exactly what we're using. So if anyone has code for that case, I'd appreciate it.

Change these lines and it will work with xFormers.
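For reference, here is a minimal NumPy sketch of the computation the question is after: in a ViT, the last layer's attention from the CLS token to the patch tokens can be reshaped into a spatial grid and rendered as a per-head heatmap, which is essentially what DINOv1's visualize_attention.py does. The function name, shapes, and random inputs below are illustrative assumptions, not the DINOv2 API; in a real model you would capture q and k (or the attention matrix itself) from the final attention block, for example with a forward hook.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def cls_attention_maps(q, k, grid_size):
    """Illustrative helper (hypothetical name, not a DINOv2 function).

    q, k: per-head query/key tensors of shape (heads, tokens, dim),
    where token 0 is the CLS token and the remaining tokens are the
    grid_size * grid_size image patches.
    Returns the CLS token's attention over the patch tokens, reshaped
    to (heads, grid_size, grid_size) for plotting as heatmaps.
    """
    heads, tokens, dim = q.shape
    # Scaled dot-product attention weights: (heads, tokens, tokens).
    attn = softmax(q @ k.transpose(0, 2, 1) / np.sqrt(dim), axis=-1)
    # Row 0 is how the CLS token attends; drop its self-attention entry.
    cls_attn = attn[:, 0, 1:]
    return cls_attn.reshape(heads, grid_size, grid_size)

# Demo with random tensors standing in for a real model's last layer.
rng = np.random.default_rng(0)
heads, grid, dim = 6, 4, 8
tokens = grid * grid + 1  # patches plus CLS
q = rng.standard_normal((heads, tokens, dim))
k = rng.standard_normal((heads, tokens, dim))
maps = cls_attention_maps(q, k, grid)
print(maps.shape)  # one (grid, grid) map per head
```

One reason the xFormers path is awkward: its memory-efficient attention kernels avoid materializing the full attention matrix, so there are no weights to read back out; recomputing the attention explicitly for the layer you want to visualize, as sketched above, is the usual workaround.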