
Not Able To Utilize AMD GPUs #1545

Open
Essak786 opened this issue Jun 21, 2024 · 1 comment
Labels
bug Something isn't working

Comments

@Essak786

Can someone help me configure this?
Using Python 3.11
ROCm Version 5.5.1

× Building wheel for llama-cpp-python (pyproject.toml) did not run successfully.
│ exit code: 1
╰─> [55 lines of output]
*** scikit-build-core 0.9.6 using CMake 3.29.6 (wheel)
*** Configuring CMake...
2024-06-21 21:21:30,202 - scikit_build_core - WARNING - Can't find a Python library, got libdir=None, ldlibrary=None, multiarch=None, masd=None
loading initial cache file C:\Users\essak\AppData\Local\Temp\tmps6wly3ej\build\CMakeInit.txt
-- Building for: Visual Studio 17 2022
-- Selecting Windows SDK version 10.0.22621.0 to target Windows 10.0.22631.
-- The C compiler identification is MSVC 19.40.33811.0
-- The CXX compiler identification is MSVC 19.40.33811.0
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working C compiler: C:/Program Files (x86)/Microsoft Visual Studio/2022/BuildTools/VC/Tools/MSVC/14.40.33807/bin/Hostx64/x64/cl.exe - skipped
-- Detecting C compile features
-- Detecting C compile features - done
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Check for working CXX compiler: C:/Program Files (x86)/Microsoft Visual Studio/2022/BuildTools/VC/Tools/MSVC/14.40.33807/bin/Hostx64/x64/cl.exe - skipped
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Could NOT find Git (missing: GIT_EXECUTABLE)
CMake Warning at vendor/llama.cpp/scripts/build-info.cmake:14 (message):
Git not found. Build info will not be accurate.
Call Stack (most recent call first):
vendor/llama.cpp/CMakeLists.txt:151 (include)

-- Performing Test CMAKE_HAVE_LIBC_PTHREAD
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Failed
-- Looking for pthread_create in pthreads
-- Looking for pthread_create in pthreads - not found
-- Looking for pthread_create in pthread
-- Looking for pthread_create in pthread - not found
-- Found Threads: TRUE
-- Found OpenMP_C: -openmp (found version "2.0")
-- Found OpenMP_CXX: -openmp (found version "2.0")
-- Found OpenMP: TRUE (found version "2.0")
-- OpenMP found
CMake Error at vendor/llama.cpp/CMakeLists.txt:593 (find_package):
  By not providing "Findhip.cmake" in CMAKE_MODULE_PATH this project has
  asked CMake to find a package configuration file provided by "hip", but
  CMake did not find one.

  Could not find a package configuration file provided by "hip" with any of
  the following names:

    hipConfig.cmake
    hip-config.cmake

  Add the installation prefix of "hip" to CMAKE_PREFIX_PATH or set "hip_DIR"
  to a directory containing one of the above files.  If "hip" provides a
  separate development package or SDK, be sure it has been installed.

-- Configuring incomplete, errors occurred!

*** CMake configuration failed
[end of output]

note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for llama-cpp-python
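The CMake error itself points at the fix: find_package(hip) needs to locate hipConfig.cmake via CMAKE_PREFIX_PATH or hip_DIR. A minimal sketch of one way to retry the build from a Windows command prompt, assuming the AMD HIP SDK is installed — note that both the install path and the GGML_HIPBLAS flag name are assumptions (older llama-cpp-python releases used LLAMA_HIPBLAS, and the path must match wherever hipConfig.cmake actually lives on your machine):

```shell
:: Tell CMake where the HIP SDK lives so find_package(hip) can locate
:: hipConfig.cmake. The path below is an assumption; point it at the
:: directory that actually contains hipConfig.cmake on your system.
set HIP_PATH=C:\Program Files\AMD\ROCm\5.5
set CMAKE_ARGS=-DGGML_HIPBLAS=on -DCMAKE_PREFIX_PATH="%HIP_PATH%"
pip install --no-cache-dir --force-reinstall llama-cpp-python
```

Installing Git would also silence the "Could NOT find Git" warning earlier in the log, though that warning alone does not fail the build.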


@Essak786
Author

[screenshot attachment]

@abetlen abetlen added the bug Something isn't working label Jun 22, 2024