fix: add pip installable flash attention (#863)
* fix: setup.py

* fix: ci

* fix: install flash-attn needs torch

* fix: typo

* fix: passby flash attn not installed

* fix: ci

Co-authored-by: Ziniu Yu <ziniuyu@gmail.com>
OrangeSodahub and ZiniuYu committed Nov 24, 2022
1 parent 53cd063 commit 0223e6f
Showing 2 changed files with 4 additions and 4 deletions.
7 changes: 3 additions & 4 deletions .github/workflows/ci.yml
```diff
@@ -155,14 +155,13 @@ jobs:
         run: |
           python -m pip install --upgrade pip
           python -m pip install wheel pytest pytest-cov nvidia-pyindex
-          pip install -e "client/[test]"
-          pip install -e "server/[tensorrt]"
           {
             python -m pip install torch==1.10.0+cu111 -f https://download.pytorch.org/whl/torch_stable.html
             python -m pip install git+https://github.com/HazyResearch/flash-attention.git
+            pip install -e "server/[flash-attn]"
           } || {
             echo "flash attention was not installed."
           }
+          pip install -e "client/[test]"
+          pip install -e "server/[tensorrt]"
       - name: Test
         id: test
         run: |
```
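The `{ ... } || { ... }` grouping in the workflow is plain POSIX shell: if any command in the first group fails, the fallback group runs instead, and the CI step as a whole still exits successfully. A minimal sketch of the pattern (here `false` stands in for a pip install that may fail; it is a hypothetical placeholder, not the project's command):

```shell
#!/bin/sh
# Optional-dependency install pattern: attempt the install in the first group;
# on failure the fallback group runs, so the step does not fail the job.
{
    echo "attempting optional install"
    false   # placeholder for a pip install that may fail (hypothetical)
} || {
    echo "flash attention was not installed."
}
echo "remaining steps still run"
```

Because the `||` list succeeds whenever the fallback group succeeds, later commands in the same `run:` block (here, the editable installs of `client` and `server`) execute regardless of whether flash-attn installed.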
1 change: 1 addition & 0 deletions server/setup.py
```diff
@@ -59,6 +59,7 @@
         'tensorrt': ['nvidia-tensorrt'],
         'transformers': ['transformers>=4.16.2'],
         'search': ['annlite>=0.3.10'],
+        'flash-attn': ['flash-attn'],
     },
     classifiers=[
         'Development Status :: 5 - Production/Stable',
```
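The commit message's "passby flash attn not installed" suggests the server must tolerate the extra being absent at runtime. A common way to do that is a guarded import; the sketch below is a hedged illustration (the `FlashMHA` import path follows the HazyResearch repository's v1 layout, and `attention_backend` is a hypothetical helper, not this project's code):

```python
# Hypothetical sketch: degrade gracefully when an optional extra is missing.
try:
    from flash_attn.flash_attention import FlashMHA  # provided by the 'flash-attn' extra
    FLASH_ATTN_AVAILABLE = True
except ImportError:
    FLASH_ATTN_AVAILABLE = False

def attention_backend() -> str:
    """Return which attention implementation to use at runtime."""
    return 'flash-attn' if FLASH_ATTN_AVAILABLE else 'torch'
```

With this guard, installing `pip install -e "server/[flash-attn]"` enables the fast path, while a plain install simply falls back to the default implementation instead of crashing on import.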
