The cmake .. Command Failed #25

Open
NickDeBeenSAE opened this issue Aug 9, 2023 · 5 comments
NickDeBeenSAE commented Aug 9, 2023

The following happened because the build wasn't configured from inside the build directory of gpt4all-backend, as the repository's instructions describe.

cmake ..
CMake Warning:
  Ignoring extra path from command line:

   ".."


CMake Error: The source directory "/home/kali/gpt4all" does not appear to contain CMakeLists.txt.
Specify --help for usage, or press the help button on the CMake GUI.
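
The "Ignoring extra path" warning plus the "does not appear to contain CMakeLists.txt" error suggest cmake was invoked from a directory whose parent is the repo root rather than gpt4all-backend. A minimal sketch of the intended invocation, assuming the standard gpt4all-backend layout (paths are illustrative):

```shell
# Configure from a build directory directly under gpt4all-backend,
# so ".." resolves to the directory that contains CMakeLists.txt
# (not the repository root, which has no top-level CMakeLists.txt).
cd ~/gpt4all/gpt4all-backend
mkdir -p build && cd build
cmake ..
```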
NickDeBeenSAE (Author):

Ok, never mind, it worked the second time around via copy and paste.

NickDeBeenSAE (Author):

I'll keep you updated.


NickDeBeenSAE commented Aug 9, 2023

Ok, here's that update:

poetry install --directory /home/nickdebeen/Downloads/gpt4all
Installing dependencies from lock file

NickDeBeenSAE (Author):

As you can see, it's not listing anything.

Therefore, Poetry itself has a bug.
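
Before concluding it's a Poetry bug, one way to see what Poetry is actually doing (assuming a Poetry version with the standard CLI flags) is to rerun the install with full verbosity and a dry run, which prints the resolver's decisions without installing anything:

```shell
# -vvv  : maximum verbosity, shows each dependency as it is processed
# --dry-run : resolve and report, but do not install
poetry install --directory /home/nickdebeen/Downloads/gpt4all -vvv --dry-run
```

If everything in the lock file is already installed, a quiet "Installing dependencies from lock file" with no package list is expected behavior rather than a bug.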

NickDeBeenSAE (Author):

cmake ..
-- Interprocedural optimization support detected
-- CMAKE_SYSTEM_PROCESSOR: x86_64
-- Configuring ggml implementation target llama-mainline-default in /home/nickdebeen/gpt4all/gpt4all-backend/llama.cpp-mainline
-- x86 detected
-- Configuring ggml implementation target llama-230511-default in /home/nickdebeen/gpt4all/gpt4all-backend/llama.cpp-230511
-- x86 detected
-- Configuring ggml implementation target llama-230519-default in /home/nickdebeen/gpt4all/gpt4all-backend/llama.cpp-230519
-- x86 detected
-- Configuring model implementation target llamamodel-mainline-default
-- Configuring model implementation target replit-mainline-default
-- Configuring model implementation target llamamodel-230519-default
-- Configuring model implementation target llamamodel-230511-default
-- Configuring model implementation target gptj-default
-- Configuring model implementation target falcon-default
-- Configuring model implementation target mpt-default
-- Configuring model implementation target bert-default
-- Configuring model implementation target starcoder-default
-- Configuring ggml implementation target llama-mainline-avxonly in /home/nickdebeen/gpt4all/gpt4all-backend/llama.cpp-mainline
-- x86 detected
-- Configuring ggml implementation target llama-230511-avxonly in /home/nickdebeen/gpt4all/gpt4all-backend/llama.cpp-230511
-- x86 detected
-- Configuring ggml implementation target llama-230519-avxonly in /home/nickdebeen/gpt4all/gpt4all-backend/llama.cpp-230519
-- x86 detected
-- Configuring model implementation target llamamodel-mainline-avxonly
-- Configuring model implementation target replit-mainline-avxonly
-- Configuring model implementation target llamamodel-230519-avxonly
-- Configuring model implementation target llamamodel-230511-avxonly
-- Configuring model implementation target gptj-avxonly
-- Configuring model implementation target falcon-avxonly
-- Configuring model implementation target mpt-avxonly
-- Configuring model implementation target bert-avxonly
-- Configuring model implementation target starcoder-avxonly
-- Configuring done (0.3s)
CMake Error at llama.cpp.cmake:296 (add_library):
  Cannot find source file:

    llama.cpp-mainline/ggml.c

  Tried extensions .c .C .c++ .cc .cpp .cxx .cu .mpp .m .M .mm .ixx .cppm
  .ccm .cxxm .c++m .h .hh .h++ .hm .hpp .hxx .in .txx .f .F .for .f77 .f90
  .f95 .f03 .hip .ispc
Call Stack (most recent call first):
  CMakeLists.txt:71 (include_ggml)


CMake Error at llama.cpp.cmake:325 (add_library):
  Cannot find source file:

    llama.cpp-mainline/llama.cpp

  Tried extensions .c .C .c++ .cc .cpp .cxx .cu .mpp .m .M .mm .ixx .cppm
  .ccm .cxxm .c++m .h .hh .h++ .hm .hpp .hxx .in .txx .f .F .for .f77 .f90
  .f95 .f03 .hip .ispc
Call Stack (most recent call first):
  CMakeLists.txt:71 (include_ggml)


CMake Error at llama.cpp.cmake:296 (add_library):
  Cannot find source file:

    llama.cpp-230511/ggml.c

  Tried extensions .c .C .c++ .cc .cpp .cxx .cu .mpp .m .M .mm .ixx .cppm
  .ccm .cxxm .c++m .h .hh .h++ .hm .hpp .hxx .in .txx .f .F .for .f77 .f90
  .f95 .f03 .hip .ispc
Call Stack (most recent call first):
  CMakeLists.txt:74 (include_ggml)


CMake Error at llama.cpp.cmake:325 (add_library):
  Cannot find source file:

    llama.cpp-230511/llama.cpp

  Tried extensions .c .C .c++ .cc .cpp .cxx .cu .mpp .m .M .mm .ixx .cppm
  .ccm .cxxm .c++m .h .hh .h++ .hm .hpp .hxx .in .txx .f .F .for .f77 .f90
  .f95 .f03 .hip .ispc
Call Stack (most recent call first):
  CMakeLists.txt:74 (include_ggml)


CMake Error at llama.cpp.cmake:296 (add_library):
  Cannot find source file:

    llama.cpp-230519/ggml.c

  Tried extensions .c .C .c++ .cc .cpp .cxx .cu .mpp .m .M .mm .ixx .cppm
  .ccm .cxxm .c++m .h .hh .h++ .hm .hpp .hxx .in .txx .f .F .for .f77 .f90
  .f95 .f03 .hip .ispc
Call Stack (most recent call first):
  CMakeLists.txt:75 (include_ggml)


CMake Error at llama.cpp.cmake:325 (add_library):
  Cannot find source file:

    llama.cpp-230519/llama.cpp

  Tried extensions .c .C .c++ .cc .cpp .cxx .cu .mpp .m .M .mm .ixx .cppm
  .ccm .cxxm .c++m .h .hh .h++ .hm .hpp .hxx .in .txx .f .F .for .f77 .f90
  .f95 .f03 .hip .ispc
Call Stack (most recent call first):
  CMakeLists.txt:75 (include_ggml)


CMake Error at llama.cpp.cmake:296 (add_library):
  No SOURCES given to target: ggml-mainline-default
Call Stack (most recent call first):
  CMakeLists.txt:71 (include_ggml)


CMake Error at llama.cpp.cmake:325 (add_library):
  No SOURCES given to target: llama-mainline-default
Call Stack (most recent call first):
  CMakeLists.txt:71 (include_ggml)


CMake Error at llama.cpp.cmake:296 (add_library):
  No SOURCES given to target: ggml-230511-default
Call Stack (most recent call first):
  CMakeLists.txt:74 (include_ggml)


CMake Error at llama.cpp.cmake:325 (add_library):
  No SOURCES given to target: llama-230511-default
Call Stack (most recent call first):
  CMakeLists.txt:74 (include_ggml)


CMake Error at llama.cpp.cmake:296 (add_library):
  No SOURCES given to target: ggml-230519-default
Call Stack (most recent call first):
  CMakeLists.txt:75 (include_ggml)


CMake Error at llama.cpp.cmake:325 (add_library):
  No SOURCES given to target: llama-230519-default
Call Stack (most recent call first):
  CMakeLists.txt:75 (include_ggml)


CMake Error at llama.cpp.cmake:296 (add_library):
  No SOURCES given to target: ggml-mainline-avxonly
Call Stack (most recent call first):
  CMakeLists.txt:71 (include_ggml)


CMake Error at llama.cpp.cmake:325 (add_library):
  No SOURCES given to target: llama-mainline-avxonly
Call Stack (most recent call first):
  CMakeLists.txt:71 (include_ggml)


CMake Error at llama.cpp.cmake:296 (add_library):
  No SOURCES given to target: ggml-230511-avxonly
Call Stack (most recent call first):
  CMakeLists.txt:74 (include_ggml)


CMake Error at llama.cpp.cmake:325 (add_library):
  No SOURCES given to target: llama-230511-avxonly
Call Stack (most recent call first):
  CMakeLists.txt:74 (include_ggml)


CMake Error at llama.cpp.cmake:296 (add_library):
  No SOURCES given to target: ggml-230519-avxonly
Call Stack (most recent call first):
  CMakeLists.txt:75 (include_ggml)


CMake Error at llama.cpp.cmake:325 (add_library):
  No SOURCES given to target: llama-230519-avxonly
Call Stack (most recent call first):
  CMakeLists.txt:75 (include_ggml)


CMake Generate step failed.  Build files cannot be regenerated correctly.
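
The "Cannot find source file: llama.cpp-mainline/ggml.c" errors indicate the vendored llama.cpp source trees are empty. If, as is likely, gpt4all-backend pulls llama.cpp in via git submodules, initializing them and reconfiguring from a clean build directory should populate those files (a sketch; the checkout path is illustrative):

```shell
# From the repository root: fetch the nested llama.cpp checkouts.
cd ~/gpt4all
git submodule update --init --recursive

# Then reconfigure from a clean build directory so CMake
# re-discovers the now-present ggml.c / llama.cpp sources.
cd gpt4all-backend
rm -rf build && mkdir build && cd build
cmake ..
```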
