
Could someone who got this running share their conda environment configuration for reference? #33

Open

xiaoweiweixiao opened this issue Mar 29, 2023 · 6 comments

@xiaoweiweixiao

Which versions of the libraries below are you using? And which CUDA version?
torch
transformers
peft
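
A quick way to report these is a short Python check (just a sketch; it assumes the three packages are already installed and prints the CUDA version the installed torch was built against):

```python
# Print the versions being asked about, plus the CUDA build of torch.
import torch
import transformers
import peft

print("torch        :", torch.__version__)
print("transformers :", transformers.__version__)
print("peft         :", peft.__version__)
print("CUDA (torch) :", torch.version.cuda)        # CUDA toolkit torch was built against
print("GPU available:", torch.cuda.is_available())
```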

@cywjava

cywjava commented Mar 29, 2023

Package Version


accelerate 0.18.0
aiohttp 3.8.4
aiosignal 1.3.1
async-timeout 4.0.2
asynctest 0.13.0
attrs 22.2.0
cachetools 5.3.0
certifi 2022.12.7
charset-normalizer 3.1.0
cpm-kernels 1.0.11
datasets 2.10.1
DBUtils 3.0.2
dill 0.3.6
filelock 3.10.7
frozenlist 1.3.3
fsspec 2023.1.0
gpt-db-tools 0.0.1
huggingface-hub 0.13.3
icetk 0.0.4
idna 3.4
importlib-metadata 6.1.0
multidict 6.0.4
multiprocess 0.70.14
numpy 1.21.6
nvidia-cublas-cu11 11.10.3.66
nvidia-cuda-nvrtc-cu11 11.7.99
nvidia-cuda-runtime-cu11 11.7.99
nvidia-cudnn-cu11 8.5.0.96
nvidia-ml-py 11.525.84
nvitop 1.0.0
packaging 23.0
pandas 1.3.5
peft 0.2.0
Pillow 9.4.0
pip 22.0.4
protobuf 3.20.0
psutil 5.9.4
psycopg2-binary 2.9.5
pyarrow 11.0.0
python-dateutil 2.8.2
pytz 2023.2
PyYAML 6.0
regex 2022.10.31
requests 2.28.2
responses 0.18.0
sentencepiece 0.1.97
setuptools 47.1.0
six 1.16.0
termcolor 2.2.0
tokenizers 0.13.2
torch 1.13.1
torchvision 0.14.1
tqdm 4.65.0
transformers 4.26.1
typing_extensions 4.5.0
urllib3 1.26.15
wheel 0.40.0
xxhash 3.2.0
yarl 1.8.2
zipp 3.15.0

This is the environment I used to fine-tune chatglm-6b; it runs and fine-tunes successfully.
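
With these versions (transformers 4.26.1, peft 0.2.0, torch 1.13.1), wrapping ChatGLM-6B with a LoRA adapter looks roughly like this. This is only a minimal sketch, not the exact training script from this thread, and the `target_modules` value is an assumption about ChatGLM-6B's attention layer name:

```python
# Minimal sketch: load ChatGLM-6B and attach a LoRA adapter via peft 0.2.0.
from transformers import AutoModel, AutoTokenizer
from peft import LoraConfig, TaskType, get_peft_model

tokenizer = AutoTokenizer.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True)
model = AutoModel.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True).half().cuda()

lora_config = LoraConfig(
    task_type=TaskType.CAUSAL_LM,
    r=8,
    lora_alpha=32,
    lora_dropout=0.1,
    target_modules=["query_key_value"],  # assumed projection name in ChatGLM-6B
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # only the LoRA weights should be trainable
```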

@cywjava

cywjava commented Mar 29, 2023

cuda 11.7

@xiaoweiweixiao
Author

> (quotes the package list and comment above)

Thanks a lot! Can ChatGLM run on transformers 4.26.1?

@cywjava

cywjava commented Mar 29, 2023

> Thanks a lot! Can ChatGLM run on transformers 4.26.1?

Definitely. That's exactly the model I'm running.
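
For anyone who wants to double-check their environment, a minimal smoke test under transformers 4.26.1 (note the `chat` helper comes from the model repo's remote code, not from transformers itself):

```python
# Minimal sketch: load ChatGLM-6B and run one chat turn to confirm the setup works.
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True)
model = AutoModel.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True).half().cuda()
model = model.eval()

response, history = model.chat(tokenizer, "你好", history=[])
print(response)
```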

@xiaoweiweixiao
Author

Thanks! But my CUDA tops out at 11.3, so I can't use 11.7.

@xiaoweiweixiao
Author

Has anyone gotten this running with CUDA 11.3? Please share your conda environment for reference. Thanks a lot!
