moreh-dev/flash-attention
Forked from Dao-AILab/flash-attention

Fast and memory-efficient exact attention
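Since this repository is a fork of Dao-AILab/flash-attention, the upstream Python API should apply. Below is a minimal sketch of calling `flash_attn_func`, assuming the `flash-attn` package is installed and a CUDA GPU is available; shapes and dtypes follow the upstream documentation.

```python
# Minimal sketch of the upstream flash-attn API (assumes the
# `flash-attn` package is installed and a CUDA device is available).
import torch
from flash_attn import flash_attn_func

batch, seqlen, nheads, headdim = 2, 1024, 8, 64

# FlashAttention expects fp16 or bf16 tensors on a CUDA device,
# laid out as (batch, seqlen, nheads, headdim).
q = torch.randn(batch, seqlen, nheads, headdim, dtype=torch.float16, device="cuda")
k = torch.randn_like(q)
v = torch.randn_like(q)

# Exact (not approximate) attention, computed without materializing
# the full seqlen x seqlen attention matrix.
out = flash_attn_func(q, k, v, dropout_p=0.0, causal=True)
print(out.shape)  # (batch, seqlen, nheads, headdim)
```
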
Releases
No releases published

Packages
No packages published
Languages
- Python 51.9%
- C++ 35.7%
- Cuda 12.2%
- Other 0.2%