(base) root@yons-MS-7E06:/mnt/code/python/privateGPT# CMAKE_ARGS='-DLLAMA_CUBLAS=on' poetry run pip install --force-reinstall --no-cache-dir llama-cpp-python
Warning: Found deprecated key 'default' or 'secondary' in pyproject.toml configuration for source tsinghua. Please provide the key 'priority' instead. Accepted values are: 'default', 'primary', 'secondary', 'supplemental', 'explicit'.
Looking in indexes: http://mirrors.aliyun.com/pypi/simple/
Collecting llama-cpp-python
Downloading http://mirrors.aliyun.com/pypi/packages/af/a6/6b836876620823551650db19d217118b9ef0983a936aa7895ed5d05df9c0/llama_cpp_python-0.2.39.tar.gz (10.8 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 10.8/10.8 MB 45.4 MB/s eta 0:00:00
Installing build dependencies ... done
Getting requirements to build wheel ... done
Installing backend dependencies ... done
Preparing metadata (pyproject.toml) ... done
Collecting typing-extensions>=4.5.0 (from llama-cpp-python)
Downloading http://mirrors.aliyun.com/pypi/packages/b7/f4/6a90020cd2d93349b442bfcb657d0dc91eee65491600b2cb1d388bc98e6b/typing_extensions-4.9.0-py3-none-any.whl (32 kB)
Collecting numpy>=1.20.0 (from llama-cpp-python)
Downloading http://mirrors.aliyun.com/pypi/packages/3a/d0/edc009c27b406c4f9cbc79274d6e46d634d139075492ad055e3d68445925/numpy-1.26.4-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (18.3 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 18.3/18.3 MB 44.9 MB/s eta 0:00:00
Collecting diskcache>=5.6.1 (from llama-cpp-python)
Downloading http://mirrors.aliyun.com/pypi/packages/3f/27/4570e78fc0bf5ea0ca45eb1de3818a23787af9b390c0b0a0033a1b8236f9/diskcache-5.6.3-py3-none-any.whl (45 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 45.5/45.5 kB 400.6 MB/s eta 0:00:00
Collecting jinja2>=2.11.3 (from llama-cpp-python)
Downloading http://mirrors.aliyun.com/pypi/packages/30/6d/6de6be2d02603ab56e72997708809e8a5b0fbfee080735109b40a3564843/Jinja2-3.1.3-py3-none-any.whl (133 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 133.2/133.2 kB 349.6 MB/s eta 0:00:00
Collecting MarkupSafe>=2.0 (from jinja2>=2.11.3->llama-cpp-python)
Downloading http://mirrors.aliyun.com/pypi/packages/97/18/c30da5e7a0e7f4603abfc6780574131221d9148f323752c2755d48abad30/MarkupSafe-2.1.5-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (28 kB)
Building wheels for collected packages: llama-cpp-python
Building wheel for llama-cpp-python (pyproject.toml) ... error
error: subprocess-exited-with-error
× Building wheel for llama-cpp-python (pyproject.toml) did not run successfully.
│ exit code: 1
╰─> [103 lines of output]
*** scikit-build-core 0.8.0 using CMake 3.28.1 (wheel)
*** Configuring CMake...
2024-02-07 17:13:07,860 - scikit_build_core - WARNING - libdir/ldlibrary: /root/miniconda3/envs/privateGPT/lib/libpython3.11.a is not a real file!
2024-02-07 17:13:07,860 - scikit_build_core - WARNING - Can't find a Python library, got libdir=/root/miniconda3/envs/privateGPT/lib, ldlibrary=libpython3.11.a, multiarch=x86_64-linux-gnu, masd=None
loading initial cache file /tmp/tmpxld2danh/build/CMakeInit.txt
-- The C compiler identification is GNU 9.4.0
-- The CXX compiler identification is GNU 9.4.0
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working C compiler: /usr/bin/cc - skipped
-- Detecting C compile features
-- Detecting C compile features - done
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Check for working CXX compiler: /usr/bin/c++ - skipped
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Found Git: /usr/bin/git (found version "2.25.1")
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Failed
-- Check if compiler accepts -pthread
-- Check if compiler accepts -pthread - yes
-- Found Threads: TRUE
-- Found CUDAToolkit: /usr/local/cuda-12.3/targets/x86_64-linux/include (found version "12.3.107")
-- cuBLAS found
CMake Error at /tmp/pip-build-env-c2k3mvlg/normal/lib/python3.11/site-packages/cmake/data/share/cmake-3.28/Modules/CMakeDetermineCompilerId.cmake:780 (message):
Compiling the CUDA compiler identification source file
"CMakeCUDACompilerId.cu" failed.
note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for llama-cpp-python
Failed to build llama-cpp-python
ERROR: Could not build wheels for llama-cpp-python, which is required to install pyproject.toml-based projects
[notice] A new release of pip is available: 23.3.1 -> 24.0
[notice] To update, run: pip install --upgrade pip
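The key line in the log is the CMake error from CMakeDetermineCompilerId.cmake: nvcc failed to compile the trivial identification source CMakeCUDACompilerId.cu, which usually points to a broken or mismatched CUDA toolchain (nvcc missing from PATH, or CUDA 12.3 paired with an unsupported host compiler) rather than a problem with llama-cpp-python or pip. A minimal diagnostic sketch that reproduces CMake's check by hand, so the real nvcc error is visible instead of being swallowed by the wheel build (the /usr/local/cuda-12.3 path is taken from the log above; everything else is illustrative):

```shell
# Reproduce CMake's CUDA compiler-identification step manually.
# If nvcc is not on PATH, CMake cannot identify a CUDA compiler at all.
if command -v nvcc >/dev/null 2>&1; then
    nvcc --version
    # Compile a trivial CUDA source the way CMakeCUDACompilerId.cu is compiled;
    # any host-compiler incompatibility error will be printed here directly.
    printf 'int main(void) { return 0; }\n' > /tmp/check.cu
    nvcc /tmp/check.cu -o /tmp/check && echo "nvcc works"
else
    echo "nvcc not on PATH"
fi
```

If nvcc turns out to be missing from PATH, exporting PATH=/usr/local/cuda-12.3/bin:$PATH (and CUDACXX=/usr/local/cuda-12.3/bin/nvcc, which CMake reads to locate the CUDA compiler) before re-running the pip command above is a common fix.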
How have others solved this issue?
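Separately, the deprecation warning at the top of the poetry output ("Found deprecated key 'default' or 'secondary' … for source tsinghua") is unrelated to the build failure, but it can be silenced by replacing the old key with 'priority' in pyproject.toml. A sketch of the updated form, assuming a source entry roughly like this (the URL is an assumed value for the tsinghua mirror, not taken from the log):

```toml
# pyproject.toml: the old `default = true` / `secondary = true` keys on a
# [[tool.poetry.source]] entry are replaced by the single `priority` key.
[[tool.poetry.source]]
name = "tsinghua"
url = "https://pypi.tuna.tsinghua.edu.cn/simple/"
priority = "primary"
```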