Microsoft Windows [Version 10.0.19045.4170]
(c) Microsoft Corporation. All rights reserved.

D:\Dot\resources\llm\python>python.exe -m pip uninstall llama-cpp-python
Found existing installation: llama_cpp_python 0.2.56
Uninstalling llama_cpp_python-0.2.56:
  Would remove:
    d:\dot\resources\llm\python\lib\site-packages\bin\convert-lora-to-ggml.py
    d:\dot\resources\llm\python\lib\site-packages\bin\convert.py
    d:\dot\resources\llm\python\lib\site-packages\bin\ggml_shared.dll
    d:\dot\resources\llm\python\lib\site-packages\bin\llama.dll
    d:\dot\resources\llm\python\lib\site-packages\bin\llava-cli.exe
    d:\dot\resources\llm\python\lib\site-packages\bin\llava.dll
    d:\dot\resources\llm\python\lib\site-packages\include\ggml-alloc.h
    d:\dot\resources\llm\python\lib\site-packages\include\ggml-backend.h
    d:\dot\resources\llm\python\lib\site-packages\include\ggml.h
    d:\dot\resources\llm\python\lib\site-packages\include\llama.h
    d:\dot\resources\llm\python\lib\site-packages\lib\cmake\llama\llamaconfig.cmake
    d:\dot\resources\llm\python\lib\site-packages\lib\cmake\llama\llamaconfigversion.cmake
    d:\dot\resources\llm\python\lib\site-packages\lib\ggml_shared.lib
    d:\dot\resources\llm\python\lib\site-packages\lib\llama.lib
    d:\dot\resources\llm\python\lib\site-packages\lib\llava.lib
    d:\dot\resources\llm\python\lib\site-packages\llama_cpp\*
    d:\dot\resources\llm\python\lib\site-packages\llama_cpp_python-0.2.56.dist-info\*
Proceed (Y/n)? y
  Successfully uninstalled llama_cpp_python-0.2.56

D:\Dot\resources\llm\python>python.exe" -c "import os; os.environ['CMAKE_ARGS'] = '-DLLAMA_CUBLAS=on'; os.environ['FORCE_CMAKE'] = '1'; import pip._internal; pip._internal.main(['install', '--upgrade', '--force-reinstall', 'llama-cpp-python', '--no-cache-dir'])"
'python.exe" -c "import' is not recognized as an internal or external command,
operable program or batch file.
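The first reinstall attempt fails because of a stray double quote after `python.exe`, so cmd.exe looks for a program literally named `python.exe" -c "import`. A minimal corrected invocation, sketched from the command in this transcript: set the build flags as shell environment variables instead of mutating `os.environ` inside `-c`, and call pip as a module.

```shell
:: Set the llama.cpp build flags in the shell rather than inside a Python one-liner
set CMAKE_ARGS=-DLLAMA_CUBLAS=on
set FORCE_CMAKE=1

:: Invoke pip as a module -- no quoting pitfalls and no pip._internal hack
python.exe -m pip install --upgrade --force-reinstall --no-cache-dir llama-cpp-python
```

Using `-m pip` also sidesteps the "old script wrapper" deprecation warning that pip prints on the retry below, since `pip._internal.main` is not a supported entry point.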
D:\Dot\resources\llm\python>python.exe -c "import os; os.environ['CMAKE_ARGS'] = '-DLLAMA_CUBLAS=on'; os.environ['FORCE_CMAKE'] = '1'; import pip._internal; pip._internal.main(['install', '--upgrade', '--force-reinstall', 'llama-cpp-python', '--no-cache-dir'])"
WARNING: pip is being invoked by an old script wrapper. This will fail in a future version of pip.
Please see https://github.com/pypa/pip/issues/5599 for advice on fixing the underlying issue.
To avoid this problem you can invoke Python with '-m pip' instead of running pip directly.
Collecting llama-cpp-python
  Downloading llama_cpp_python-0.2.56.tar.gz (36.9 MB)
     ---------------------------------------- 36.9/36.9 MB 7.3 MB/s eta 0:00:00
  Installing build dependencies ... done
  Getting requirements to build wheel ... done
  Installing backend dependencies ... done
  Preparing metadata (pyproject.toml) ... done
Collecting typing-extensions>=4.5.0 (from llama-cpp-python)
  Downloading typing_extensions-4.10.0-py3-none-any.whl.metadata (3.0 kB)
Collecting numpy>=1.20.0 (from llama-cpp-python)
  Downloading numpy-1.26.4-cp311-cp311-win_amd64.whl.metadata (61 kB)
     ---------------------------------------- 61.0/61.0 kB ? eta 0:00:00
Collecting diskcache>=5.6.1 (from llama-cpp-python)
  Downloading diskcache-5.6.3-py3-none-any.whl.metadata (20 kB)
Collecting jinja2>=2.11.3 (from llama-cpp-python)
  Downloading Jinja2-3.1.3-py3-none-any.whl.metadata (3.3 kB)
Collecting MarkupSafe>=2.0 (from jinja2>=2.11.3->llama-cpp-python)
  Downloading MarkupSafe-2.1.5-cp311-cp311-win_amd64.whl.metadata (3.1 kB)
Downloading diskcache-5.6.3-py3-none-any.whl (45 kB)
   ---------------------------------------- 45.5/45.5 kB ? eta 0:00:00
Downloading Jinja2-3.1.3-py3-none-any.whl (133 kB)
   ---------------------------------------- 133.2/133.2 kB ? eta 0:00:00
Downloading numpy-1.26.4-cp311-cp311-win_amd64.whl (15.8 MB)
   ---------------------------------------- 15.8/15.8 MB 7.3 MB/s eta 0:00:00
Downloading typing_extensions-4.10.0-py3-none-any.whl (33 kB)
Downloading MarkupSafe-2.1.5-cp311-cp311-win_amd64.whl (17 kB)
Building wheels for collected packages: llama-cpp-python
  Building wheel for llama-cpp-python (pyproject.toml) ... error
  error: subprocess-exited-with-error

  × Building wheel for llama-cpp-python (pyproject.toml) did not run successfully.
  │ exit code: 1
  ╰─> [472 lines of output]
      *** scikit-build-core 0.8.2 using CMake 3.28.3 (wheel)
      *** Configuring CMake...
      2024-03-17 23:23:25,681 - scikit_build_core - WARNING - Can't find a Python library, got libdir=None, ldlibrary=None, multiarch=None, masd=None
      loading initial cache file C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\CMakeInit.txt
      -- Building for: Visual Studio 17 2022
      -- Selecting Windows SDK version 10.0.20348.0 to target Windows 10.0.19045.
      -- The C compiler identification is MSVC 19.39.33522.0
      -- The CXX compiler identification is MSVC 19.39.33522.0
      -- Detecting C compiler ABI info
      -- Detecting C compiler ABI info - done
      -- Check for working C compiler: C:/Program Files/Microsoft Visual Studio/2022/Community/VC/Tools/MSVC/14.39.33519/bin/Hostx64/x64/cl.exe - skipped
      -- Detecting C compile features
      -- Detecting C compile features - done
      -- Detecting CXX compiler ABI info
      -- Detecting CXX compiler ABI info - done
      -- Check for working CXX compiler: C:/Program Files/Microsoft Visual Studio/2022/Community/VC/Tools/MSVC/14.39.33519/bin/Hostx64/x64/cl.exe - skipped
      -- Detecting CXX compile features
      -- Detecting CXX compile features - done
      -- Could NOT find Git (missing: GIT_EXECUTABLE)
      CMake Warning at vendor/llama.cpp/scripts/build-info.cmake:14 (message):
        Git not found. Build info will not be accurate.
      Call Stack (most recent call first):
        vendor/llama.cpp/CMakeLists.txt:129 (include)

      -- Performing Test CMAKE_HAVE_LIBC_PTHREAD
      -- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Failed
      -- Looking for pthread_create in pthreads
      -- Looking for pthread_create in pthreads - not found
      -- Looking for pthread_create in pthread
      -- Looking for pthread_create in pthread - not found
      -- Found Threads: TRUE
      -- Could not find nvcc, please set CUDAToolkit_ROOT.
      CMake Warning at vendor/llama.cpp/CMakeLists.txt:407 (message):
        cuBLAS not found
      -- Warning: ccache not found - consider installing it for faster compilation or disable this warning with LLAMA_CCACHE=OFF
      -- CMAKE_SYSTEM_PROCESSOR: AMD64
      -- CMAKE_GENERATOR_PLATFORM: x64
      -- x86 detected
      -- Performing Test HAS_AVX_1
      -- Performing Test HAS_AVX_1 - Success
      -- Performing Test HAS_AVX2_1
      -- Performing Test HAS_AVX2_1 - Success
      -- Performing Test HAS_FMA_1
      -- Performing Test HAS_FMA_1 - Success
      -- Performing Test HAS_AVX512_1
      -- Performing Test HAS_AVX512_1 - Failed
      -- Performing Test HAS_AVX512_2
      -- Performing Test HAS_AVX512_2 - Failed
      CMake Warning (dev) at CMakeLists.txt:21 (install):
        Target llama has PUBLIC_HEADER files but no PUBLIC_HEADER DESTINATION.
      This warning is for project developers. Use -Wno-dev to suppress it.
      CMake Warning (dev) at CMakeLists.txt:30 (install):
        Target llama has PUBLIC_HEADER files but no PUBLIC_HEADER DESTINATION.
      This warning is for project developers. Use -Wno-dev to suppress it.
      -- Configuring done (23.2s)
      -- Generating done (0.1s)
      -- Build files have been written to: C:/Users/User/AppData/Local/Temp/tmpk5yhn46o/build
      *** Building project with Visual Studio 17 2022...
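Note the two warnings above: "Could not find nvcc, please set CUDAToolkit_ROOT" and "cuBLAS not found". Despite `-DLLAMA_CUBLAS=on`, CMake could not locate the CUDA Toolkit, so the configure step continued as a CPU-only build. A sketch of a fix, assuming the NVIDIA CUDA Toolkit is installed; the v12.4 path below is an assumption, substitute whatever version is actually on the machine:

```shell
:: Point CMake's FindCUDAToolkit at the CUDA install so nvcc and cuBLAS are found
:: (the v12.4 path is an assumption -- check your own installation directory)
set CUDAToolkit_ROOT=C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.4
set CMAKE_ARGS=-DLLAMA_CUBLAS=on
set FORCE_CMAKE=1
python.exe -m pip install --upgrade --force-reinstall --no-cache-dir llama-cpp-python
```

If the toolkit is not installed at all, it has to be installed first (including the Visual Studio integration, which this generator relies on); setting the variable alone will not help.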
      Change Dir: 'C:/Users/User/AppData/Local/Temp/tmpk5yhn46o/build'

      Run Build Command(s): "C:/Program Files/Microsoft Visual Studio/2022/Community/MSBuild/Current/Bin/amd64/MSBuild.exe" ALL_BUILD.vcxproj /p:Configuration=Release /p:Platform=x64 /p:VisualStudioVersion=17.0 /v:n
      MSBuild version 17.9.5+33de0b227 for .NET Framework
      Build started 17/03/2024 23:23:49.

      Project "C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\ALL_BUILD.vcxproj" on node 1 (default targets).
      Project "C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\ALL_BUILD.vcxproj" (1) is building "C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\ZERO_CHECK.vcxproj" (2) on node 1 (default targets).
      PrepareForBuild:
        Creating directory "x64\Release\ZERO_CHECK\".
      C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(541,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\ZERO_CHECK.vcxproj]
        Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details.
        Creating directory "x64\Release\ZERO_CHECK\ZERO_CHECK.tlog\".
      InitializeBuildStatus:
        Creating "x64\Release\ZERO_CHECK\ZERO_CHECK.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified.
        Touching "x64\Release\ZERO_CHECK\ZERO_CHECK.tlog\unsuccessfulbuild".
      CustomBuild:
        1>Checking Build System
      FinalizeBuildStatus:
        Deleting file "x64\Release\ZERO_CHECK\ZERO_CHECK.tlog\unsuccessfulbuild".
        Touching "x64\Release\ZERO_CHECK\ZERO_CHECK.tlog\ZERO_CHECK.lastbuildstate".
      Done Building Project "C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\ZERO_CHECK.vcxproj" (default targets).
      Project "C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\ALL_BUILD.vcxproj" (1) is building "C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\common\build_info.vcxproj" (3) on node 1 (default targets).
      PrepareForBuild:
        Creating directory "build_info.dir\Release\".
      C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(541,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\common\build_info.vcxproj]
        Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details.
        Creating directory "build_info.dir\Release\build_info.tlog\".
      InitializeBuildStatus:
        Creating "build_info.dir\Release\build_info.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified.
        Touching "build_info.dir\Release\build_info.tlog\unsuccessfulbuild".
      CustomBuild:
        Generating build details from Git
        -- Could NOT find Git (missing: GIT_EXECUTABLE)
        CMake Warning at scripts/build-info.cmake:14 (message):
          Git not found. Build info will not be accurate.
        Call Stack (most recent call first):
          scripts/gen-build-info-cpp.cmake:1 (include)

        Building Custom Rule C:/Users/User/AppData/Local/Temp/pip-install-_w9oec7v/llama-cpp-python_c1e84d153a0c41218240e881ea8730da/vendor/llama.cpp/common/CMakeLists.txt
      ClCompile:
        C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.39.33519\bin\HostX64\x64\CL.exe /c /nologo /W1 /WX- /diagnostics:column /O2 /Ob2 /D _MBCS /D WIN32 /D _WINDOWS /D NDEBUG /D _CRT_SECURE_NO_WARNINGS /D _XOPEN_SOURCE=600 /D "CMAKE_INTDIR=\"Release\"" /Gm- /EHsc /MD /GS /arch:AVX2 /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /Fo"build_info.dir\Release\\" /Fd"build_info.dir\Release\build_info.pdb" /external:W1 /Gd /TP /errorReport:queue "C:\Users\User\AppData\Local\Temp\pip-install-_w9oec7v\llama-cpp-python_c1e84d153a0c41218240e881ea8730da\vendor\llama.cpp\common\build-info.cpp"
        build-info.cpp
      Lib:
        C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.39.33519\bin\HostX64\x64\Lib.exe /OUT:"build_info.dir\Release\build_info.lib" /NOLOGO /MACHINE:X64 /machine:x64 "build_info.dir\Release\build-info.obj"
        build_info.vcxproj -> C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\common\build_info.dir\Release\build_info.lib
      FinalizeBuildStatus:
        Deleting file "build_info.dir\Release\build_info.tlog\unsuccessfulbuild".
        Touching "build_info.dir\Release\build_info.tlog\build_info.lastbuildstate".
      Done Building Project "C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\common\build_info.vcxproj" (default targets).
      Project "C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\ALL_BUILD.vcxproj" (1) is building "C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\ggml.vcxproj" (4) on node 1 (default targets).
      PrepareForBuild:
        Creating directory "ggml.dir\Release\".
      C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(541,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\ggml.vcxproj]
        Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details.
        Creating directory "ggml.dir\Release\ggml.tlog\".
      InitializeBuildStatus:
        Creating "ggml.dir\Release\ggml.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified.
        Touching "ggml.dir\Release\ggml.tlog\unsuccessfulbuild".
      CustomBuild:
        Building Custom Rule C:/Users/User/AppData/Local/Temp/pip-install-_w9oec7v/llama-cpp-python_c1e84d153a0c41218240e881ea8730da/vendor/llama.cpp/CMakeLists.txt
      ClCompile:
        C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.39.33519\bin\HostX64\x64\CL.exe /c /I"C:\Users\User\AppData\Local\Temp\pip-install-_w9oec7v\llama-cpp-python_c1e84d153a0c41218240e881ea8730da\vendor\llama.cpp\." /nologo /W1 /WX- /diagnostics:column /O2 /Ob2 /D _MBCS /D WIN32 /D _WINDOWS /D NDEBUG /D _CRT_SECURE_NO_WARNINGS /D _XOPEN_SOURCE=600 /D "CMAKE_INTDIR=\"Release\"" /Gm- /MD /GS /arch:AVX2 /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /std:c11 /Fo"ggml.dir\Release\\" /Fd"ggml.dir\Release\ggml.pdb" /external:W1 /Gd /TC /errorReport:queue "C:\Users\User\AppData\Local\Temp\pip-install-_w9oec7v\llama-cpp-python_c1e84d153a0c41218240e881ea8730da\vendor\llama.cpp\ggml.c" "C:\Users\User\AppData\Local\Temp\pip-install-_w9oec7v\llama-cpp-python_c1e84d153a0c41218240e881ea8730da\vendor\llama.cpp\ggml-alloc.c" "C:\Users\User\AppData\Local\Temp\pip-install-_w9oec7v\llama-cpp-python_c1e84d153a0c41218240e881ea8730da\vendor\llama.cpp\ggml-backend.c" "C:\Users\User\AppData\Local\Temp\pip-install-_w9oec7v\llama-cpp-python_c1e84d153a0c41218240e881ea8730da\vendor\llama.cpp\ggml-quants.c"
        ggml.c
        ggml-alloc.c
        ggml-backend.c
        ggml-quants.c
        Generating Code...
      Lib:
        C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.39.33519\bin\HostX64\x64\Lib.exe /OUT:"ggml.dir\Release\ggml.lib" /NOLOGO /MACHINE:X64 /machine:x64 ggml.dir\Release\ggml.obj "ggml.dir\Release\ggml-alloc.obj" "ggml.dir\Release\ggml-backend.obj" "ggml.dir\Release\ggml-quants.obj"
        ggml.vcxproj -> C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\ggml.dir\Release\ggml.lib
      FinalizeBuildStatus:
        Deleting file "ggml.dir\Release\ggml.tlog\unsuccessfulbuild".
        Touching "ggml.dir\Release\ggml.tlog\ggml.lastbuildstate".
      Done Building Project "C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\ggml.vcxproj" (default targets).
      Project "C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\ALL_BUILD.vcxproj" (1) is building "C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\examples\llava\llava.vcxproj" (5) on node 1 (default targets).
      Project "C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\examples\llava\llava.vcxproj" (5) is building "C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\llama.vcxproj" (6) on node 1 (default targets).
      C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(541,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\llama.vcxproj]
        Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details.
        Creating directory "C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\bin\Release\".
        Creating directory "llama.dir\Release\llama.tlog\".
      InitializeBuildStatus:
        Creating "llama.dir\Release\llama.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified.
        Touching "llama.dir\Release\llama.tlog\unsuccessfulbuild".
      CustomBuild:
        Building Custom Rule C:/Users/User/AppData/Local/Temp/pip-install-_w9oec7v/llama-cpp-python_c1e84d153a0c41218240e881ea8730da/vendor/llama.cpp/CMakeLists.txt
      ClCompile:
        C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.39.33519\bin\HostX64\x64\CL.exe /c /I"C:\Users\User\AppData\Local\Temp\pip-install-_w9oec7v\llama-cpp-python_c1e84d153a0c41218240e881ea8730da\vendor\llama.cpp\." /nologo /W1 /WX- /diagnostics:column /O2 /Ob2 /D _WINDLL /D _MBCS /D WIN32 /D _WINDOWS /D NDEBUG /D LLAMA_SHARED /D LLAMA_BUILD /D _CRT_SECURE_NO_WARNINGS /D _XOPEN_SOURCE=600 /D "CMAKE_INTDIR=\"Release\"" /D llama_EXPORTS /Gm- /EHsc /MD /GS /arch:AVX2 /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /Fo"llama.dir\Release\\" /Fd"llama.dir\Release\vc143.pdb" /external:W1 /Gd /TP /errorReport:queue "C:\Users\User\AppData\Local\Temp\pip-install-_w9oec7v\llama-cpp-python_c1e84d153a0c41218240e881ea8730da\vendor\llama.cpp\llama.cpp"
        llama.cpp
      C:\Users\User\AppData\Local\Temp\pip-install-_w9oec7v\llama-cpp-python_c1e84d153a0c41218240e881ea8730da\vendor\llama.cpp\llama.cpp(3772,69): warning C4566: character represented by universal-character-name '\u010A' cannot be represented in the current code page (1252) [C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\llama.vcxproj]
      MakeDirsForLink:
        Creating directory "C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\Release\".
      PreLinkEvent:
        Auto build dll exports
        setlocal
        cd C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp
        if %errorlevel% neq 0 goto :cmEnd
        C:
        if %errorlevel% neq 0 goto :cmEnd
        C:\Users\User\AppData\Local\Temp\pip-build-env-hhxjugyq\normal\Lib\site-packages\cmake\data\bin\cmake.exe -E __create_def C:/Users/User/AppData/Local/Temp/tmpk5yhn46o/build/vendor/llama.cpp/llama.dir/Release/exports.def C:/Users/User/AppData/Local/Temp/tmpk5yhn46o/build/vendor/llama.cpp/llama.dir/Release//objects.txt
        if %errorlevel% neq 0 goto :cmEnd
        :cmEnd
        endlocal & call :cmErrorLevel %errorlevel% & goto :cmDone
        :cmErrorLevel
        exit /b %1
        :cmDone
        if %errorlevel% neq 0 goto :VCEnd
        :VCEnd
      Link:
        C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.39.33519\bin\HostX64\x64\link.exe /ERRORREPORT:QUEUE /OUT:"C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\bin\Release\llama.dll" /INCREMENTAL:NO /NOLOGO kernel32.lib user32.lib gdi32.lib winspool.lib shell32.lib ole32.lib oleaut32.lib uuid.lib comdlg32.lib advapi32.lib /DEF:"C:/Users/User/AppData/Local/Temp/tmpk5yhn46o/build/vendor/llama.cpp/llama.dir/Release/exports.def" /MANIFEST /MANIFESTUAC:"level='asInvoker' uiAccess='false'" /manifest:embed /PDB:"C:/Users/User/AppData/Local/Temp/tmpk5yhn46o/build/bin/Release/llama.pdb" /SUBSYSTEM:CONSOLE /TLBID:1 /DYNAMICBASE /NXCOMPAT /IMPLIB:"C:/Users/User/AppData/Local/Temp/tmpk5yhn46o/build/vendor/llama.cpp/Release/llama.lib" /MACHINE:X64 /machine:x64 /DLL llama.dir\Release\llama.obj C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\ggml.dir\Release\ggml.obj "C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\ggml.dir\Release\ggml-alloc.obj" "C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\ggml.dir\Release\ggml-backend.obj" "C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\ggml.dir\Release\ggml-quants.obj"
        Creating library C:/Users/User/AppData/Local/Temp/tmpk5yhn46o/build/vendor/llama.cpp/Release/llama.lib and object C:/Users/User/AppData/Local/Temp/tmpk5yhn46o/build/vendor/llama.cpp/Release/llama.exp
        llama.vcxproj -> C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\bin\Release\llama.dll
      FinalizeBuildStatus:
        Deleting file "llama.dir\Release\llama.tlog\unsuccessfulbuild".
        Touching "llama.dir\Release\llama.tlog\llama.lastbuildstate".
      Done Building Project "C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\llama.vcxproj" (default targets).
      PrepareForBuild:
        Creating directory "llava.dir\Release\".
      C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(541,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\examples\llava\llava.vcxproj]
        Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details.
        Creating directory "llava.dir\Release\llava.tlog\".
      InitializeBuildStatus:
        Creating "llava.dir\Release\llava.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified.
        Touching "llava.dir\Release\llava.tlog\unsuccessfulbuild".
      CustomBuild:
        Building Custom Rule C:/Users/User/AppData/Local/Temp/pip-install-_w9oec7v/llama-cpp-python_c1e84d153a0c41218240e881ea8730da/vendor/llama.cpp/examples/llava/CMakeLists.txt
      ClCompile:
        C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.39.33519\bin\HostX64\x64\CL.exe /c /I"C:\Users\User\AppData\Local\Temp\pip-install-_w9oec7v\llama-cpp-python_c1e84d153a0c41218240e881ea8730da\vendor\llama.cpp\examples\llava\." /I"C:\Users\User\AppData\Local\Temp\pip-install-_w9oec7v\llama-cpp-python_c1e84d153a0c41218240e881ea8730da\vendor\llama.cpp\examples\llava\..\.." /I"C:\Users\User\AppData\Local\Temp\pip-install-_w9oec7v\llama-cpp-python_c1e84d153a0c41218240e881ea8730da\vendor\llama.cpp\examples\llava\..\..\common" /I"C:\Users\User\AppData\Local\Temp\pip-install-_w9oec7v\llama-cpp-python_c1e84d153a0c41218240e881ea8730da\vendor\llama.cpp\." /nologo /W1 /WX- /diagnostics:column /O2 /Ob2 /D _MBCS /D WIN32 /D _WINDOWS /D NDEBUG /D LLAMA_SHARED /D LLAMA_BUILD /D GGML_USE_CUBLAS /D "CMAKE_INTDIR=\"Release\"" /Gm- /EHsc /MD /GS /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /Fo"llava.dir\Release\\" /Fd"llava.dir\Release\llava.pdb" /external:W1 /Gd /TP /errorReport:queue "C:\Users\User\AppData\Local\Temp\pip-install-_w9oec7v\llama-cpp-python_c1e84d153a0c41218240e881ea8730da\vendor\llama.cpp\examples\llava\llava.cpp" "C:\Users\User\AppData\Local\Temp\pip-install-_w9oec7v\llama-cpp-python_c1e84d153a0c41218240e881ea8730da\vendor\llama.cpp\examples\llava\clip.cpp"
        llava.cpp
        clip.cpp
      C:\Users\User\AppData\Local\Temp\pip-install-_w9oec7v\llama-cpp-python_c1e84d153a0c41218240e881ea8730da\vendor\llama.cpp\examples\llava\clip.cpp(835,9): warning C4297: 'clip_model_load': function assumed not to throw an exception but does [C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\examples\llava\llava.vcxproj]
        C:\Users\User\AppData\Local\Temp\pip-install-_w9oec7v\llama-cpp-python_c1e84d153a0c41218240e881ea8730da\vendor\llama.cpp\examples\llava\clip.cpp(835,9):
        __declspec(nothrow), throw(), noexcept(true), or noexcept was specified on the function
      C:\Users\User\AppData\Local\Temp\pip-install-_w9oec7v\llama-cpp-python_c1e84d153a0c41218240e881ea8730da\vendor\llama.cpp\examples\llava\clip.cpp(1180,13): warning C4297: 'clip_model_load': function assumed not to throw an exception but does [C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\examples\llava\llava.vcxproj]
        C:\Users\User\AppData\Local\Temp\pip-install-_w9oec7v\llama-cpp-python_c1e84d153a0c41218240e881ea8730da\vendor\llama.cpp\examples\llava\clip.cpp(1180,13):
        __declspec(nothrow), throw(), noexcept(true), or noexcept was specified on the function
      C:\Users\User\AppData\Local\Temp\pip-install-_w9oec7v\llama-cpp-python_c1e84d153a0c41218240e881ea8730da\vendor\llama.cpp\examples\llava\clip.cpp(2024,5): warning C4297: 'clip_n_mmproj_embd': function assumed not to throw an exception but does [C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\examples\llava\llava.vcxproj]
        C:\Users\User\AppData\Local\Temp\pip-install-_w9oec7v\llama-cpp-python_c1e84d153a0c41218240e881ea8730da\vendor\llama.cpp\examples\llava\clip.cpp(2024,5):
        __declspec(nothrow), throw(), noexcept(true), or noexcept was specified on the function
        Generating Code...
      Lib:
        C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.39.33519\bin\HostX64\x64\Lib.exe /OUT:"llava.dir\Release\llava.lib" /NOLOGO /MACHINE:X64 /machine:x64 llava.dir\Release\llava.obj llava.dir\Release\clip.obj
        llava.vcxproj -> C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\examples\llava\llava.dir\Release\llava.lib
      FinalizeBuildStatus:
        Deleting file "llava.dir\Release\llava.tlog\unsuccessfulbuild".
        Touching "llava.dir\Release\llava.tlog\llava.lastbuildstate".
      Done Building Project "C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\examples\llava\llava.vcxproj" (default targets).
      Project "C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\ALL_BUILD.vcxproj" (1) is building "C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\common\common.vcxproj" (7) on node 1 (default targets).
      PrepareForBuild:
        Creating directory "common.dir\Release\".
      C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(541,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\common\common.vcxproj]
        Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details.
        Creating directory "C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\common\Release\".
        Creating directory "common.dir\Release\common.tlog\".
      InitializeBuildStatus:
        Creating "common.dir\Release\common.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified.
        Touching "common.dir\Release\common.tlog\unsuccessfulbuild".
      CustomBuild:
        Building Custom Rule C:/Users/User/AppData/Local/Temp/pip-install-_w9oec7v/llama-cpp-python_c1e84d153a0c41218240e881ea8730da/vendor/llama.cpp/common/CMakeLists.txt
      ClCompile:
        C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.39.33519\bin\HostX64\x64\CL.exe /c /I"C:\Users\User\AppData\Local\Temp\pip-install-_w9oec7v\llama-cpp-python_c1e84d153a0c41218240e881ea8730da\vendor\llama.cpp\common\." /I"C:\Users\User\AppData\Local\Temp\pip-install-_w9oec7v\llama-cpp-python_c1e84d153a0c41218240e881ea8730da\vendor\llama.cpp\." /nologo /W1 /WX- /diagnostics:column /O2 /Ob2 /D _MBCS /D WIN32 /D _WINDOWS /D NDEBUG /D _CRT_SECURE_NO_WARNINGS /D _XOPEN_SOURCE=600 /D "CMAKE_INTDIR=\"Release\"" /Gm- /EHsc /MD /GS /arch:AVX2 /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /Fo"common.dir\Release\\" /Fd"C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\common\Release\common.pdb" /external:W1 /Gd /TP /errorReport:queue "C:\Users\User\AppData\Local\Temp\pip-install-_w9oec7v\llama-cpp-python_c1e84d153a0c41218240e881ea8730da\vendor\llama.cpp\common\common.cpp" "C:\Users\User\AppData\Local\Temp\pip-install-_w9oec7v\llama-cpp-python_c1e84d153a0c41218240e881ea8730da\vendor\llama.cpp\common\sampling.cpp" "C:\Users\User\AppData\Local\Temp\pip-install-_w9oec7v\llama-cpp-python_c1e84d153a0c41218240e881ea8730da\vendor\llama.cpp\common\console.cpp" "C:\Users\User\AppData\Local\Temp\pip-install-_w9oec7v\llama-cpp-python_c1e84d153a0c41218240e881ea8730da\vendor\llama.cpp\common\grammar-parser.cpp" "C:\Users\User\AppData\Local\Temp\pip-install-_w9oec7v\llama-cpp-python_c1e84d153a0c41218240e881ea8730da\vendor\llama.cpp\common\train.cpp"
        common.cpp
        sampling.cpp
        console.cpp
        grammar-parser.cpp
        train.cpp
        Generating Code...
      Lib:
        C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.39.33519\bin\HostX64\x64\Lib.exe /OUT:"C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\common\Release\common.lib" /NOLOGO /MACHINE:X64 /machine:x64 common.dir\Release\common.obj common.dir\Release\sampling.obj common.dir\Release\console.obj "common.dir\Release\grammar-parser.obj" common.dir\Release\train.obj "C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\common\build_info.dir\Release\build-info.obj"
        common.vcxproj -> C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\common\Release\common.lib
      FinalizeBuildStatus:
        Deleting file "common.dir\Release\common.tlog\unsuccessfulbuild".
        Touching "common.dir\Release\common.tlog\common.lastbuildstate".
      Done Building Project "C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\common\common.vcxproj" (default targets).
      Project "C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\ALL_BUILD.vcxproj" (1) is building "C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\ggml_shared.vcxproj" (8) on node 1 (default targets).
      C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(541,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\ggml_shared.vcxproj]
        Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details.
        Creating directory "ggml_shared.dir\Release\ggml_shared.tlog\".
      InitializeBuildStatus:
        Creating "ggml_shared.dir\Release\ggml_shared.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified.
        Touching "ggml_shared.dir\Release\ggml_shared.tlog\unsuccessfulbuild".
      CustomBuild:
        Building Custom Rule C:/Users/User/AppData/Local/Temp/pip-install-_w9oec7v/llama-cpp-python_c1e84d153a0c41218240e881ea8730da/vendor/llama.cpp/CMakeLists.txt
      PreLinkEvent:
        Auto build dll exports
        setlocal
        cd C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp
        if %errorlevel% neq 0 goto :cmEnd
        C:
        if %errorlevel% neq 0 goto :cmEnd
        C:\Users\User\AppData\Local\Temp\pip-build-env-hhxjugyq\normal\Lib\site-packages\cmake\data\bin\cmake.exe -E __create_def C:/Users/User/AppData/Local/Temp/tmpk5yhn46o/build/vendor/llama.cpp/ggml_shared.dir/Release/exports.def C:/Users/User/AppData/Local/Temp/tmpk5yhn46o/build/vendor/llama.cpp/ggml_shared.dir/Release//objects.txt
        if %errorlevel% neq 0 goto :cmEnd
        :cmEnd
        endlocal & call :cmErrorLevel %errorlevel% & goto :cmDone
        :cmErrorLevel
        exit /b %1
        :cmDone
        if %errorlevel% neq 0 goto :VCEnd
        :VCEnd
      Link:
        C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.39.33519\bin\HostX64\x64\link.exe /ERRORREPORT:QUEUE /OUT:"C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\bin\Release\ggml_shared.dll" /INCREMENTAL:NO /NOLOGO kernel32.lib user32.lib gdi32.lib winspool.lib shell32.lib ole32.lib oleaut32.lib uuid.lib comdlg32.lib advapi32.lib /DEF:"C:/Users/User/AppData/Local/Temp/tmpk5yhn46o/build/vendor/llama.cpp/ggml_shared.dir/Release/exports.def" /MANIFEST /MANIFESTUAC:"level='asInvoker' uiAccess='false'" /manifest:embed /PDB:"C:/Users/User/AppData/Local/Temp/tmpk5yhn46o/build/bin/Release/ggml_shared.pdb" /SUBSYSTEM:CONSOLE /TLBID:1 /DYNAMICBASE /NXCOMPAT /IMPLIB:"C:/Users/User/AppData/Local/Temp/tmpk5yhn46o/build/vendor/llama.cpp/Release/ggml_shared.lib" /MACHINE:X64 /machine:x64 /DLL C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\ggml.dir\Release\ggml.obj "C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\ggml.dir\Release\ggml-alloc.obj" "C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\ggml.dir\Release\ggml-backend.obj" "C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\ggml.dir\Release\ggml-quants.obj"
        Creating library C:/Users/User/AppData/Local/Temp/tmpk5yhn46o/build/vendor/llama.cpp/Release/ggml_shared.lib and object C:/Users/User/AppData/Local/Temp/tmpk5yhn46o/build/vendor/llama.cpp/Release/ggml_shared.exp
        ggml_shared.vcxproj -> C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\bin\Release\ggml_shared.dll
      FinalizeBuildStatus:
        Deleting file "ggml_shared.dir\Release\ggml_shared.tlog\unsuccessfulbuild".
        Touching "ggml_shared.dir\Release\ggml_shared.tlog\ggml_shared.lastbuildstate".
      Done Building Project "C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\ggml_shared.vcxproj" (default targets).
      Project "C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\ALL_BUILD.vcxproj" (1) is building "C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\ggml_static.vcxproj" (9) on node 1 (default targets).
      PrepareForBuild:
        Creating directory "ggml_static.dir\Release\".
      C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(541,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\ggml_static.vcxproj]
        Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details.
        Creating directory "ggml_static.dir\Release\ggml_static.tlog\".
      InitializeBuildStatus:
        Creating "ggml_static.dir\Release\ggml_static.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified.
        Touching "ggml_static.dir\Release\ggml_static.tlog\unsuccessfulbuild".
CustomBuild:
  Building Custom Rule C:/Users/User/AppData/Local/Temp/pip-install-_w9oec7v/llama-cpp-python_c1e84d153a0c41218240e881ea8730da/vendor/llama.cpp/CMakeLists.txt
Lib:
  C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.39.33519\bin\HostX64\x64\Lib.exe /OUT:"C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\Release\ggml_static.lib" /NOLOGO /MACHINE:X64 /machine:x64 C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\ggml.dir\Release\ggml.obj "C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\ggml.dir\Release\ggml-alloc.obj" "C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\ggml.dir\Release\ggml-backend.obj" "C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\ggml.dir\Release\ggml-quants.obj"
  ggml_static.vcxproj -> C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\Release\ggml_static.lib
FinalizeBuildStatus:
  Deleting file "ggml_static.dir\Release\ggml_static.tlog\unsuccessfulbuild".
  Touching "ggml_static.dir\Release\ggml_static.tlog\ggml_static.lastbuildstate".
Done Building Project "C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\ggml_static.vcxproj" (default targets).
Project "C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\ALL_BUILD.vcxproj" (1) is building "C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\examples\llava\llava-cli.vcxproj" (10) on node 1 (default targets).
PrepareForBuild:
  Creating directory "llava-cli.dir\Release\".
C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(541,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\examples\llava\llava-cli.vcxproj]
  Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details.
  Creating directory "C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\examples\llava\Release\".
  Creating directory "llava-cli.dir\Release\llava-cli.tlog\".
InitializeBuildStatus:
  Creating "llava-cli.dir\Release\llava-cli.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified.
  Touching "llava-cli.dir\Release\llava-cli.tlog\unsuccessfulbuild".
CustomBuild:
  Building Custom Rule C:/Users/User/AppData/Local/Temp/pip-install-_w9oec7v/llama-cpp-python_c1e84d153a0c41218240e881ea8730da/vendor/llama.cpp/examples/llava/CMakeLists.txt
ClCompile:
  C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.39.33519\bin\HostX64\x64\CL.exe /c /I"C:\Users\User\AppData\Local\Temp\pip-install-_w9oec7v\llama-cpp-python_c1e84d153a0c41218240e881ea8730da\vendor\llama.cpp\common\." /I"C:\Users\User\AppData\Local\Temp\pip-install-_w9oec7v\llama-cpp-python_c1e84d153a0c41218240e881ea8730da\vendor\llama.cpp\." /I"C:\Users\User\AppData\Local\Temp\pip-install-_w9oec7v\llama-cpp-python_c1e84d153a0c41218240e881ea8730da\vendor\llama.cpp\examples\llava\." /I"C:\Users\User\AppData\Local\Temp\pip-install-_w9oec7v\llama-cpp-python_c1e84d153a0c41218240e881ea8730da\vendor\llama.cpp\examples\llava\..\.." /I"C:\Users\User\AppData\Local\Temp\pip-install-_w9oec7v\llama-cpp-python_c1e84d153a0c41218240e881ea8730da\vendor\llama.cpp\examples\llava\..\..\common" /nologo /W1 /WX- /diagnostics:column /O2 /Ob2 /D _MBCS /D WIN32 /D _WINDOWS /D NDEBUG /D GGML_USE_CUBLAS /D "CMAKE_INTDIR=\"Release\"" /Gm- /EHsc /MD /GS /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /Fo"llava-cli.dir\Release\\" /Fd"llava-cli.dir\Release\vc143.pdb" /external:W1 /Gd /TP /errorReport:queue "C:\Users\User\AppData\Local\Temp\pip-install-_w9oec7v\llama-cpp-python_c1e84d153a0c41218240e881ea8730da\vendor\llama.cpp\examples\llava\llava-cli.cpp"
  llava-cli.cpp
Link:
  C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.39.33519\bin\HostX64\x64\link.exe /ERRORREPORT:QUEUE /OUT:"C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\examples\llava\Release\llava-cli.exe" /INCREMENTAL:NO /NOLOGO ..\..\common\Release\common.lib ..\..\Release\llama.lib kernel32.lib user32.lib gdi32.lib winspool.lib shell32.lib ole32.lib oleaut32.lib uuid.lib comdlg32.lib advapi32.lib /MANIFEST /MANIFESTUAC:"level='asInvoker' uiAccess='false'" /manifest:embed /PDB:"C:/Users/User/AppData/Local/Temp/tmpk5yhn46o/build/vendor/llama.cpp/examples/llava/Release/llava-cli.pdb" /SUBSYSTEM:CONSOLE /TLBID:1 /DYNAMICBASE /NXCOMPAT /IMPLIB:"C:/Users/User/AppData/Local/Temp/tmpk5yhn46o/build/vendor/llama.cpp/examples/llava/Release/llava-cli.lib" /MACHINE:X64 /machine:x64 "llava-cli.dir\Release\llava-cli.obj" C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\examples\llava\llava.dir\Release\llava.obj C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\examples\llava\llava.dir\Release\clip.obj
     Creating library C:/Users/User/AppData/Local/Temp/tmpk5yhn46o/build/vendor/llama.cpp/examples/llava/Release/llava-cli.lib and object C:/Users/User/AppData/Local/Temp/tmpk5yhn46o/build/vendor/llama.cpp/examples/llava/Release/llava-cli.exp
clip.obj : error LNK2019: unresolved external symbol
ggml_backend_cuda_init referenced in function clip_model_load [C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\examples\llava\llava-cli.vcxproj]
C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\examples\llava\Release\llava-cli.exe : fatal error LNK1120: 1 unresolved externals [C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\examples\llava\llava-cli.vcxproj]
Done Building Project "C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\examples\llava\llava-cli.vcxproj" (default targets) -- FAILED.
Project "C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\ALL_BUILD.vcxproj" (1) is building "C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\examples\llava\llava_shared.vcxproj" (11) on node 1 (default targets).
PrepareForBuild:
  Creating directory "llava_shared.dir\Release\".
C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(541,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\examples\llava\llava_shared.vcxproj]
  Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details.
  Creating directory "llava_shared.dir\Release\llava_shared.tlog\".
InitializeBuildStatus:
  Creating "llava_shared.dir\Release\llava_shared.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified.
  Touching "llava_shared.dir\Release\llava_shared.tlog\unsuccessfulbuild".
CustomBuild:
  Building Custom Rule C:/Users/User/AppData/Local/Temp/pip-install-_w9oec7v/llama-cpp-python_c1e84d153a0c41218240e881ea8730da/vendor/llama.cpp/examples/llava/CMakeLists.txt
Link:
  C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.39.33519\bin\HostX64\x64\link.exe /ERRORREPORT:QUEUE /OUT:"C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\examples\llava\Release\llava.dll" /INCREMENTAL:NO /NOLOGO ..\..\Release\llama.lib kernel32.lib user32.lib gdi32.lib winspool.lib shell32.lib ole32.lib oleaut32.lib uuid.lib comdlg32.lib advapi32.lib /MANIFEST /MANIFESTUAC:"level='asInvoker' uiAccess='false'" /manifest:embed /PDB:"C:/Users/User/AppData/Local/Temp/tmpk5yhn46o/build/vendor/llama.cpp/examples/llava/Release/llava.pdb" /SUBSYSTEM:CONSOLE /TLBID:1 /DYNAMICBASE /NXCOMPAT /IMPLIB:"C:/Users/User/AppData/Local/Temp/tmpk5yhn46o/build/vendor/llama.cpp/examples/llava/Release/llava.lib" /MACHINE:X64 /machine:x64 /DLL C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\examples\llava\llava.dir\Release\llava.obj C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\examples\llava\llava.dir\Release\clip.obj C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\ggml.dir\Release\ggml.obj "C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\ggml.dir\Release\ggml-alloc.obj" "C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\ggml.dir\Release\ggml-backend.obj" "C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\ggml.dir\Release\ggml-quants.obj"
     Creating library C:/Users/User/AppData/Local/Temp/tmpk5yhn46o/build/vendor/llama.cpp/examples/llava/Release/llava.lib and object C:/Users/User/AppData/Local/Temp/tmpk5yhn46o/build/vendor/llama.cpp/examples/llava/Release/llava.exp
clip.obj : error LNK2019: unresolved external symbol ggml_backend_cuda_init referenced in function clip_model_load
[C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\examples\llava\llava_shared.vcxproj]
C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\examples\llava\Release\llava.dll : fatal error LNK1120: 1 unresolved externals [C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\examples\llava\llava_shared.vcxproj]
Done Building Project "C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\examples\llava\llava_shared.vcxproj" (default targets) -- FAILED.
Project "C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\ALL_BUILD.vcxproj" (1) is building "C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\examples\llava\llava_static.vcxproj" (12) on node 1 (default targets).
PrepareForBuild:
  Creating directory "llava_static.dir\Release\".
C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(541,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\examples\llava\llava_static.vcxproj]
  Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details.
  Creating directory "llava_static.dir\Release\llava_static.tlog\".
InitializeBuildStatus:
  Creating "llava_static.dir\Release\llava_static.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified.
  Touching "llava_static.dir\Release\llava_static.tlog\unsuccessfulbuild".
CustomBuild:
  Building Custom Rule C:/Users/User/AppData/Local/Temp/pip-install-_w9oec7v/llama-cpp-python_c1e84d153a0c41218240e881ea8730da/vendor/llama.cpp/examples/llava/CMakeLists.txt
Lib:
  C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.39.33519\bin\HostX64\x64\Lib.exe /OUT:"C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\examples\llava\Release\llava_static.lib" /NOLOGO /MACHINE:X64 /machine:x64 C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\examples\llava\llava.dir\Release\llava.obj C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\examples\llava\llava.dir\Release\clip.obj
  llava_static.vcxproj -> C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\examples\llava\Release\llava_static.lib
FinalizeBuildStatus:
  Deleting file "llava_static.dir\Release\llava_static.tlog\unsuccessfulbuild".
  Touching "llava_static.dir\Release\llava_static.tlog\llava_static.lastbuildstate".
Done Building Project "C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\examples\llava\llava_static.vcxproj" (default targets).
Done Building Project "C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\ALL_BUILD.vcxproj" (default targets) -- FAILED.

Build FAILED.

"C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\ALL_BUILD.vcxproj" (default target) (1) ->
"C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\ZERO_CHECK.vcxproj" (default target) (2) ->
(PrepareForBuild target) ->
  C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(541,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build.
[C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\ZERO_CHECK.vcxproj]

"C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\ALL_BUILD.vcxproj" (default target) (1) ->
"C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\common\build_info.vcxproj" (default target) (3) ->
  C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(541,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\common\build_info.vcxproj]

"C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\ALL_BUILD.vcxproj" (default target) (1) ->
"C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\ggml.vcxproj" (default target) (4) ->
  C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(541,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\ggml.vcxproj]

"C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\ALL_BUILD.vcxproj" (default target) (1) ->
"C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\examples\llava\llava.vcxproj" (default target) (5) ->
"C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\llama.vcxproj" (default target) (6) ->
  C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(541,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build.
[C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\llama.vcxproj]

"C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\ALL_BUILD.vcxproj" (default target) (1) ->
"C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\examples\llava\llava.vcxproj" (default target) (5) ->
"C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\llama.vcxproj" (default target) (6) ->
(ClCompile target) ->
  C:\Users\User\AppData\Local\Temp\pip-install-_w9oec7v\llama-cpp-python_c1e84d153a0c41218240e881ea8730da\vendor\llama.cpp\llama.cpp(3772,69): warning C4566: character represented by universal-character-name '\u010A' cannot be represented in the current code page (1252) [C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\llama.vcxproj]

"C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\ALL_BUILD.vcxproj" (default target) (1) ->
"C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\examples\llava\llava.vcxproj" (default target) (5) ->
(PrepareForBuild target) ->
  C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(541,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build.
[C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\examples\llava\llava.vcxproj]

"C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\ALL_BUILD.vcxproj" (default target) (1) ->
"C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\examples\llava\llava.vcxproj" (default target) (5) ->
(ClCompile target) ->
  C:\Users\User\AppData\Local\Temp\pip-install-_w9oec7v\llama-cpp-python_c1e84d153a0c41218240e881ea8730da\vendor\llama.cpp\examples\llava\clip.cpp(835,9): warning C4297: 'clip_model_load': function assumed not to throw an exception but does [C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\examples\llava\llava.vcxproj]
  C:\Users\User\AppData\Local\Temp\pip-install-_w9oec7v\llama-cpp-python_c1e84d153a0c41218240e881ea8730da\vendor\llama.cpp\examples\llava\clip.cpp(1180,13): warning C4297: 'clip_model_load': function assumed not to throw an exception but does [C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\examples\llava\llava.vcxproj]
  C:\Users\User\AppData\Local\Temp\pip-install-_w9oec7v\llama-cpp-python_c1e84d153a0c41218240e881ea8730da\vendor\llama.cpp\examples\llava\clip.cpp(2024,5): warning C4297: 'clip_n_mmproj_embd': function assumed not to throw an exception but does [C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\examples\llava\llava.vcxproj]

"C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\ALL_BUILD.vcxproj" (default target) (1) ->
"C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\common\common.vcxproj" (default target) (7) ->
(PrepareForBuild target) ->
  C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(541,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build.
[C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\common\common.vcxproj]

"C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\ALL_BUILD.vcxproj" (default target) (1) ->
"C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\ggml_shared.vcxproj" (default target) (8) ->
  C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(541,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\ggml_shared.vcxproj]

"C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\ALL_BUILD.vcxproj" (default target) (1) ->
"C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\ggml_static.vcxproj" (default target) (9) ->
  C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(541,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\ggml_static.vcxproj]

"C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\ALL_BUILD.vcxproj" (default target) (1) ->
"C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\examples\llava\llava-cli.vcxproj" (default target) (10) ->
  C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(541,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build.
[C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\examples\llava\llava-cli.vcxproj]

"C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\ALL_BUILD.vcxproj" (default target) (1) ->
"C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\examples\llava\llava_shared.vcxproj" (default target) (11) ->
  C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(541,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\examples\llava\llava_shared.vcxproj]

"C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\ALL_BUILD.vcxproj" (default target) (1) ->
"C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\examples\llava\llava_static.vcxproj" (default target) (12) ->
  C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(541,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build.
[C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\examples\llava\llava_static.vcxproj]

"C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\ALL_BUILD.vcxproj" (default target) (1) ->
"C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\examples\llava\llava-cli.vcxproj" (default target) (10) ->
(Link target) ->
  clip.obj : error LNK2019: unresolved external symbol ggml_backend_cuda_init referenced in function clip_model_load [C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\examples\llava\llava-cli.vcxproj]
  C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\examples\llava\Release\llava-cli.exe : fatal error LNK1120: 1 unresolved externals [C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\examples\llava\llava-cli.vcxproj]

"C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\ALL_BUILD.vcxproj" (default target) (1) ->
"C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\examples\llava\llava_shared.vcxproj" (default target) (11) ->
  clip.obj : error LNK2019: unresolved external symbol ggml_backend_cuda_init referenced in function clip_model_load [C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\examples\llava\llava_shared.vcxproj]
  C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\examples\llava\Release\llava.dll : fatal error LNK1120: 1 unresolved externals [C:\Users\User\AppData\Local\Temp\tmpk5yhn46o\build\vendor\llama.cpp\examples\llava\llava_shared.vcxproj]

    15 Warning(s)
    4 Error(s)

Time Elapsed 00:01:00.92

*** CMake build failed
[end of output]

note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for llama-cpp-python
Failed to build llama-cpp-python
ERROR: Could not build wheels for llama-cpp-python, which is required to install pyproject.toml-based projects

[notice] A new release of pip is available: 23.3.2 -> 24.0
[notice] To update, run: python.exe -m pip install --upgrade pip

D:\Dot\resources\llm\python>
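Reading the link commands above, only ggml.obj, ggml-alloc.obj, ggml-backend.obj and ggml-quants.obj are passed to the linker while clip.obj is compiled with GGML_USE_CUBLAS defined, so its call to ggml_backend_cuda_init has no CUDA-built object to resolve against; that pattern usually means CMake never compiled the CUDA sources (for example because it did not find the CUDA Toolkit). A minimal retry sketch, setting the same CMAKE_ARGS and FORCE_CMAKE values as plain cmd.exe environment variables instead of via inline Python, and assuming the CUDA Toolkit is installed and on PATH so CMake can detect it:

```shell
:: Same settings the inline-Python attempt tried to pass, as cmd.exe
:: environment variables (no quoting pitfalls, no pip "old script
:: wrapper" warning because we go through -m pip):
set CMAKE_ARGS=-DLLAMA_CUBLAS=on
set FORCE_CMAKE=1
python.exe -m pip install --upgrade --force-reinstall --no-cache-dir --verbose llama-cpp-python
```

The `--verbose` flag is optional but keeps the full CMake configure output visible, which is where a missing CUDA Toolkit would be reported before the link stage fails again.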