
cannot use gpu #47

Closed
mathpopo opened this issue Sep 5, 2023 · 6 comments


mathpopo commented Sep 5, 2023

I selected local mode, downloaded LLaMA, and selected GPU, but the GPU is not used at all — only the CPU is fully utilized.


mathpopo commented Sep 5, 2023

[?] Parameter count (smaller is faster, larger is more capable): 7B

7B
16B
34B

[?] Quality (lower is faster, higher is more capable): Low | Size: 3.01 GB, RAM usage: 5.51 GB

Low | Size: 3.01 GB, RAM usage: 5.51 GB
Medium | Size: 4.24 GB, RAM usage: 6.74 GB
High | Size: 7.16 GB, RAM usage: 9.66 GB

[?] Use GPU? (Large models might crash on GPU, but will run more quickly) (...: y


hotwa commented Sep 5, 2023

I'm having the same issue.

@KillianLucas
Collaborator

Hey @hotwa and @mathpopo! I've been seeing this across several systems. We're going to change interface packages for CodeLlama in the next week or so, and hopefully that should solve this problem. I'll keep you updated. In the meantime, this works for some users:

pip install --force-reinstall --upgrade llama-cpp-python

Let me know if the GPU gets used after rebuilding llama-cpp-python like that. Thanks!
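For NVIDIA GPUs specifically, a plain reinstall can still produce a CPU-only build, because pip compiles the package without any GPU flags by default. At the time of this thread, llama-cpp-python enabled CUDA offload via the `LLAMA_CUBLAS` CMake flag — a sketch of a GPU-enabled rebuild, assuming the CUDA toolkit is installed:

```shell
# Rebuild llama-cpp-python from source with cuBLAS (CUDA) support.
# CMAKE_ARGS and FORCE_CMAKE are read by the package's build script;
# --no-cache-dir makes pip recompile instead of reusing a cached wheel.
CMAKE_ARGS="-DLLAMA_CUBLAS=on" FORCE_CMAKE=1 \
  pip install --force-reinstall --no-cache-dir llama-cpp-python
```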


mathpopo commented Sep 6, 2023

@KillianLucas Sorry, that didn't work.
Screenshot from 2023-09-06 15-31-39

(base) chenxin@chenxin-Nitro-AN515-52:$ conda activate open-interpreter
(open-interpreter) chenxin@chenxin-Nitro-AN515-52:$ pip install --force-reinstall --upgrade llama-cpp-python
Looking in indexes: https://pypi.org/simple, https://pypi.ngc.nvidia.com
Collecting llama-cpp-python
Downloading llama_cpp_python-0.1.83.tar.gz (1.8 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.8/1.8 MB 935.3 kB/s eta 0:00:00
Installing build dependencies ... done
Getting requirements to build wheel ... done
Preparing metadata (pyproject.toml) ... done
Collecting typing-extensions>=4.5.0 (from llama-cpp-python)
Obtaining dependency information for typing-extensions>=4.5.0 from https://files.pythonhosted.org/packages/ec/6b/63cc3df74987c36fe26157ee12e09e8f9db4de771e0f3404263117e75b95/typing_extensions-4.7.1-py3-none-any.whl.metadata
Downloading typing_extensions-4.7.1-py3-none-any.whl.metadata (3.1 kB)
Collecting numpy>=1.20.0 (from llama-cpp-python)
Obtaining dependency information for numpy>=1.20.0 from https://files.pythonhosted.org/packages/32/6a/65dbc57a89078af9ff8bfcd4c0761a50172d90192eaeb1b6f56e5fbf1c3d/numpy-1.25.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata
Downloading numpy-1.25.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (5.6 kB)
Collecting diskcache>=5.6.1 (from llama-cpp-python)
Obtaining dependency information for diskcache>=5.6.1 from https://files.pythonhosted.org/packages/3f/27/4570e78fc0bf5ea0ca45eb1de3818a23787af9b390c0b0a0033a1b8236f9/diskcache-5.6.3-py3-none-any.whl.metadata
Downloading diskcache-5.6.3-py3-none-any.whl.metadata (20 kB)
Downloading diskcache-5.6.3-py3-none-any.whl (45 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 45.5/45.5 kB 1.6 MB/s eta 0:00:00
Downloading numpy-1.25.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (18.2 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 18.2/18.2 MB 1.2 MB/s eta 0:00:00
Downloading typing_extensions-4.7.1-py3-none-any.whl (33 kB)
Building wheels for collected packages: llama-cpp-python
Building wheel for llama-cpp-python (pyproject.toml) ... done
Created wheel for llama-cpp-python: filename=llama_cpp_python-0.1.83-cp311-cp311-linux_x86_64.whl size=413883 sha256=54697e6f41cc6048446d5f80e174a933487d406e5fb75302d4d55dc65bfe5426
Stored in directory: /tmp/pip-ephem-wheel-cache-l9dvavp4/wheels/25/d2/5b/9f3c919284f260835ba686d041a70e06ae7c35adc62493188e
Successfully built llama-cpp-python
Installing collected packages: typing-extensions, numpy, diskcache, llama-cpp-python
Attempting uninstall: typing-extensions
Found existing installation: typing_extensions 4.7.1
Uninstalling typing_extensions-4.7.1:
Successfully uninstalled typing_extensions-4.7.1
Attempting uninstall: numpy
Found existing installation: numpy 1.25.2
Uninstalling numpy-1.25.2:
Successfully uninstalled numpy-1.25.2
Attempting uninstall: diskcache
Found existing installation: diskcache 5.6.3
Uninstalling diskcache-5.6.3:
Successfully uninstalled diskcache-5.6.3
Attempting uninstall: llama-cpp-python
Found existing installation: llama-cpp-python 0.1.83
Uninstalling llama-cpp-python-0.1.83:
Successfully uninstalled llama-cpp-python-0.1.83
Successfully installed diskcache-5.6.3 llama-cpp-python-0.1.83 numpy-1.25.2 typing-extensions-4.7.1
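Note that the log above shows pip compiling llama-cpp-python from source with default settings, so without GPU build flags the resulting wheel is CPU-only. One way to confirm whether the GPU is actually being used during generation, assuming an NVIDIA card with the driver utilities installed:

```shell
# Watch GPU activity while the model is generating; with a CPU-only
# llama-cpp-python build, GPU utilization stays near 0% and no python
# process appears in the process list.
nvidia-smi
```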


mathpopo commented Sep 6, 2023

@KillianLucas Is it like this?

Can you speak English?

Yes, I can.

Can you write C++ code?

Yes, I can.

Can you write a bubble sort algorithm?

Yes, I can.

@jordanbtucker
Collaborator

This is now a duplicate of #168. If you still need help, please leave a comment on that issue.


4 participants