- c:\cUI\python_embeded>python -m pip install C:\Users\MPD_01\Downloads\llama_cpp_python-0.3.7.tar.gz
- Processing c:\users\mpd_01\downloads\llama_cpp_python-0.3.7.tar.gz
- Installing build dependencies ... done
- Getting requirements to build wheel ... done
- Installing backend dependencies ... done
- Preparing metadata (pyproject.toml) ... done
- Requirement already satisfied: typing-extensions>=4.5.0 in c:\cui\python_embeded\lib\site-packages (from llama_cpp_python==0.3.7) (4.12.2)
- Requirement already satisfied: numpy>=1.20.0 in c:\cui\python_embeded\lib\site-packages (from llama_cpp_python==0.3.7) (1.26.4)
- Requirement already satisfied: diskcache>=5.6.1 in c:\cui\python_embeded\lib\site-packages (from llama_cpp_python==0.3.7) (5.6.3)
- Requirement already satisfied: jinja2>=2.11.3 in c:\cui\python_embeded\lib\site-packages (from llama_cpp_python==0.3.7) (3.1.5)
- Requirement already satisfied: MarkupSafe>=2.0 in c:\cui\python_embeded\lib\site-packages (from jinja2>=2.11.3->llama_cpp_python==0.3.7) (3.0.2)
- Building wheels for collected packages: llama_cpp_python
- Building wheel for llama_cpp_python (pyproject.toml) ... error
- error: subprocess-exited-with-error
- × Building wheel for llama_cpp_python (pyproject.toml) did not run successfully.
- │ exit code: 1
- ╰─> [575 lines of output]
- *** scikit-build-core 0.11.0 using CMake 3.31.6 (wheel)
- *** Configuring CMake...
- 2025-03-05 11:07:41,991 - scikit_build_core - WARNING - Can't find a Python library, got libdir=None, ldlibrary=None, multiarch=None, masd=None
- loading initial cache file C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\CMakeInit.txt
- -- Building for: Visual Studio 17 2022
- -- Selecting Windows SDK version 10.0.18362.0 to target Windows 10.0.19045.
- -- The C compiler identification is MSVC 19.43.34808.0
- -- The CXX compiler identification is MSVC 19.43.34808.0
- -- Detecting C compiler ABI info
- -- Detecting C compiler ABI info - done
- -- Check for working C compiler: C:/Program Files/Microsoft Visual Studio/2022/Community/VC/Tools/MSVC/14.43.34808/bin/Hostx64/x64/cl.exe - skipped
- -- Detecting C compile features
- -- Detecting C compile features - done
- -- Detecting CXX compiler ABI info
- -- Detecting CXX compiler ABI info - done
- -- Check for working CXX compiler: C:/Program Files/Microsoft Visual Studio/2022/Community/VC/Tools/MSVC/14.43.34808/bin/Hostx64/x64/cl.exe - skipped
- -- Detecting CXX compile features
- -- Detecting CXX compile features - done
- -- Found Git: C:/Program Files/Git/cmd/git.exe (found version "2.48.1.windows.1")
- -- Performing Test CMAKE_HAVE_LIBC_PTHREAD
- -- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Failed
- -- Looking for pthread_create in pthreads
- -- Looking for pthread_create in pthreads - not found
- -- Looking for pthread_create in pthread
- -- Looking for pthread_create in pthread - not found
- -- Found Threads: TRUE
- -- Warning: ccache not found - consider installing it for faster compilation or disable this warning with GGML_CCACHE=OFF
- -- CMAKE_SYSTEM_PROCESSOR: AMD64
- -- CMAKE_GENERATOR_PLATFORM: x64
- -- Including CPU backend
- -- Found OpenMP_C: -openmp (found version "2.0")
- -- Found OpenMP_CXX: -openmp (found version "2.0")
- -- Found OpenMP: TRUE (found version "2.0")
- -- x86 detected
- -- Performing Test HAS_AVX_1
- -- Performing Test HAS_AVX_1 - Success
- -- Performing Test HAS_AVX2_1
- -- Performing Test HAS_AVX2_1 - Success
- -- Performing Test HAS_FMA_1
- -- Performing Test HAS_FMA_1 - Success
- -- Performing Test HAS_AVX512_1
- -- Performing Test HAS_AVX512_1 - Success
- -- Adding CPU backend variant ggml-cpu: /arch:AVX512 GGML_AVX512
- CMake Warning at vendor/llama.cpp/ggml/CMakeLists.txt:285 (message):
- GGML build version fixed at 1 likely due to a shallow clone.
- CMake Warning (dev) at CMakeLists.txt:13 (install):
- Target llama has PUBLIC_HEADER files but no PUBLIC_HEADER DESTINATION.
- Call Stack (most recent call first):
- CMakeLists.txt:97 (llama_cpp_python_install_target)
- This warning is for project developers. Use -Wno-dev to suppress it.
- CMake Warning (dev) at CMakeLists.txt:21 (install):
- Target llama has PUBLIC_HEADER files but no PUBLIC_HEADER DESTINATION.
- Call Stack (most recent call first):
- CMakeLists.txt:97 (llama_cpp_python_install_target)
- This warning is for project developers. Use -Wno-dev to suppress it.
- CMake Warning (dev) at CMakeLists.txt:13 (install):
- Target ggml has PUBLIC_HEADER files but no PUBLIC_HEADER DESTINATION.
- Call Stack (most recent call first):
- CMakeLists.txt:98 (llama_cpp_python_install_target)
- This warning is for project developers. Use -Wno-dev to suppress it.
- CMake Warning (dev) at CMakeLists.txt:21 (install):
- Target ggml has PUBLIC_HEADER files but no PUBLIC_HEADER DESTINATION.
- Call Stack (most recent call first):
- CMakeLists.txt:98 (llama_cpp_python_install_target)
- This warning is for project developers. Use -Wno-dev to suppress it.
- -- Configuring done (6.5s)
- -- Generating done (0.1s)
- -- Build files have been written to: C:/Users/MPD_01/AppData/Local/Temp/tmpqnxcasyy/build
- *** Building project with Visual Studio 17 2022...
- Change Dir: 'C:/Users/MPD_01/AppData/Local/Temp/tmpqnxcasyy/build'
- Run Build Command(s): "C:/Program Files/Microsoft Visual Studio/2022/Community/MSBuild/Current/Bin/amd64/MSBuild.exe" ALL_BUILD.vcxproj /p:Configuration=Release /p:Platform=x64 /p:VisualStudioVersion=17.0 /v:n
- MSBuild version 17.13.15+18b3035f6 for .NET Framework
- Build started 03/05/2025 11:07:48.
- Project "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\ALL_BUILD.vcxproj" on node 1 (default targets).
- Project "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\ALL_BUILD.vcxproj" (1) is building "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\ZERO_CHECK.vcxproj" (2) on node 1 (default targets).
- PrepareForBuild:
- Creating directory "x64\Release\ZERO_CHECK\".
- C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(544,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\ZERO_CHECK.vcxproj]
- Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details.
- Creating directory "x64\Release\ZERO_CHECK\ZERO_CHECK.tlog\".
- InitializeBuildStatus:
- Creating "x64\Release\ZERO_CHECK\ZERO_CHECK.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified.
- Touching "x64\Release\ZERO_CHECK\ZERO_CHECK.tlog\unsuccessfulbuild".
- CustomBuild:
- 1>Checking Build System
- FinalizeBuildStatus:
- Deleting file "x64\Release\ZERO_CHECK\ZERO_CHECK.tlog\unsuccessfulbuild".
- Touching "x64\Release\ZERO_CHECK\ZERO_CHECK.tlog\ZERO_CHECK.lastbuildstate".
- Done Building Project "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\ZERO_CHECK.vcxproj" (default targets).
- Project "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\ALL_BUILD.vcxproj" (1) is building "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\common\build_info.vcxproj" (3) on node 1 (default targets).
- PrepareForBuild:
- Creating directory "build_info.dir\Release\".
- C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(544,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\common\build_info.vcxproj]
- Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details.
- Creating directory "build_info.dir\Release\build_info.tlog\".
- InitializeBuildStatus:
- Creating "build_info.dir\Release\build_info.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified.
- Touching "build_info.dir\Release\build_info.tlog\unsuccessfulbuild".
- CustomBuild:
- Generating build details from Git
- -- Found Git: C:/Program Files/Git/cmd/git.exe (found version "2.48.1.windows.1")
- Building Custom Rule C:/Users/MPD_01/AppData/Local/Temp/pip-req-build-jyivbl21/vendor/llama.cpp/common/CMakeLists.txt
- ClCompile:
- C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.43.34808\bin\HostX64\x64\CL.exe /c /nologo /W1 /WX- /diagnostics:column /O2 /Ob2 /D _MBCS /D WIN32 /D _WINDOWS /D NDEBUG /D _CRT_SECURE_NO_WARNINGS /D "CMAKE_INTDIR=\"Release\"" /EHsc /MD /GS /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /Fo"build_info.dir\Release\\" /Fd"build_info.dir\Release\build_info.pdb" /external:W1 /Gd /TP /errorReport:queue /utf-8 /bigobj "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\common\build-info.cpp"
- build-info.cpp
- Lib:
- C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.43.34808\bin\HostX64\x64\Lib.exe /OUT:"build_info.dir\Release\build_info.lib" /NOLOGO /MACHINE:X64 /machine:x64 "build_info.dir\Release\build-info.obj"
- build_info.vcxproj -> C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\common\build_info.dir\Release\build_info.lib
- FinalizeBuildStatus:
- Deleting file "build_info.dir\Release\build_info.tlog\unsuccessfulbuild".
- Touching "build_info.dir\Release\build_info.tlog\build_info.lastbuildstate".
- Done Building Project "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\common\build_info.vcxproj" (default targets).
- Project "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\ALL_BUILD.vcxproj" (1) is building "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava.vcxproj" (4) on node 1 (default targets).
- Project "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava.vcxproj" (4) is building "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\ggml\src\ggml.vcxproj" (5) on node 1 (default targets).
- Project "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\ggml\src\ggml.vcxproj" (5) is building "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\ggml\src\ggml-base.vcxproj" (6) on node 1 (default targets).
- PrepareForBuild:
- Creating directory "ggml-base.dir\Release\".
- C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(544,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\ggml\src\ggml-base.vcxproj]
- Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details.
- Creating directory "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\bin\Release\".
- Creating directory "ggml-base.dir\Release\ggml-base.tlog\".
- InitializeBuildStatus:
- Creating "ggml-base.dir\Release\ggml-base.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified.
- Touching "ggml-base.dir\Release\ggml-base.tlog\unsuccessfulbuild".
- CustomBuild:
- Building Custom Rule C:/Users/MPD_01/AppData/Local/Temp/pip-req-build-jyivbl21/vendor/llama.cpp/ggml/src/CMakeLists.txt
- ClCompile:
- C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.43.34808\bin\HostX64\x64\CL.exe /c /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\." /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\..\include" /nologo /W1 /WX- /diagnostics:column /O2 /Ob2 /D _WINDLL /D _MBCS /D WIN32 /D _WINDOWS /D NDEBUG /D GGML_BUILD /D GGML_SHARED /D _CRT_SECURE_NO_WARNINGS /D GGML_SCHED_MAX_COPIES=4 /D _XOPEN_SOURCE=600 /D "CMAKE_INTDIR=\"Release\"" /D ggml_base_EXPORTS /EHsc /MD /GS /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /std:c11 /Fo"ggml-base.dir\Release\\" /Fd"ggml-base.dir\Release\vc143.pdb" /external:W1 /Gd /TC /errorReport:queue /utf-8 /bigobj "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\ggml.c" "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\ggml-alloc.c" "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\ggml-quants.c"
- ggml.c
- C:\Program Files (x86)\Windows Kits\10\Include\10.0.18362.0\um\winbase.h(9305,5): warning C5105: macro expansion producing 'defined' has undefined behavior [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\ggml\src\ggml-base.vcxproj]
- (compiling source file '../../../../../../pip-req-build-jyivbl21/vendor/llama.cpp/ggml/src/ggml.c')
- ggml-alloc.c
- ggml-quants.c
- Generating Code...
- C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.43.34808\bin\HostX64\x64\CL.exe /c /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\." /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\..\include" /nologo /W1 /WX- /diagnostics:column /O2 /Ob2 /D _WINDLL /D _MBCS /D WIN32 /D _WINDOWS /D NDEBUG /D GGML_BUILD /D GGML_SHARED /D _CRT_SECURE_NO_WARNINGS /D GGML_SCHED_MAX_COPIES=4 /D _XOPEN_SOURCE=600 /D "CMAKE_INTDIR=\"Release\"" /D ggml_base_EXPORTS /EHsc /MD /GS /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /std:c++17 /Fo"ggml-base.dir\Release\\" /Fd"ggml-base.dir\Release\vc143.pdb" /external:W1 /Gd /TP /errorReport:queue /utf-8 /bigobj "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\ggml-backend.cpp" "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\ggml-opt.cpp" "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\ggml-threading.cpp" "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\gguf.cpp"
- ggml-backend.cpp
- ggml-opt.cpp
- ggml-threading.cpp
- gguf.cpp
- Generating Code...
- MakeDirsForLink:
- Creating directory "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\ggml\src\Release\".
- Link:
- C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.43.34808\bin\HostX64\x64\link.exe /ERRORREPORT:QUEUE /OUT:"C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\bin\Release\ggml-base.dll" /INCREMENTAL:NO /NOLOGO kernel32.lib user32.lib gdi32.lib winspool.lib shell32.lib ole32.lib oleaut32.lib uuid.lib comdlg32.lib advapi32.lib /MANIFEST /MANIFESTUAC:"level='asInvoker' uiAccess='false'" /manifest:embed /PDB:"C:/Users/MPD_01/AppData/Local/Temp/tmpqnxcasyy/build/bin/Release/ggml-base.pdb" /SUBSYSTEM:CONSOLE /TLBID:1 /DYNAMICBASE /NXCOMPAT /IMPLIB:"C:/Users/MPD_01/AppData/Local/Temp/tmpqnxcasyy/build/vendor/llama.cpp/ggml/src/Release/ggml-base.lib" /MACHINE:X64 /machine:x64 /DLL "ggml-base.dir\Release\ggml.obj"
- "ggml-base.dir\Release\ggml-alloc.obj"
- "ggml-base.dir\Release\ggml-backend.obj"
- "ggml-base.dir\Release\ggml-opt.obj"
- "ggml-base.dir\Release\ggml-threading.obj"
- "ggml-base.dir\Release\ggml-quants.obj"
- "ggml-base.dir\Release\gguf.obj"
- Creating library C:/Users/MPD_01/AppData/Local/Temp/tmpqnxcasyy/build/vendor/llama.cpp/ggml/src/Release/ggml-base.lib and object C:/Users/MPD_01/AppData/Local/Temp/tmpqnxcasyy/build/vendor/llama.cpp/ggml/src/Release/ggml-base.exp
- ggml-base.vcxproj -> C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\bin\Release\ggml-base.dll
- FinalizeBuildStatus:
- Deleting file "ggml-base.dir\Release\ggml-base.tlog\unsuccessfulbuild".
- Touching "ggml-base.dir\Release\ggml-base.tlog\ggml-base.lastbuildstate".
- Done Building Project "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\ggml\src\ggml-base.vcxproj" (default targets).
- Project "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\ggml\src\ggml.vcxproj" (5) is building "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\ggml\src\ggml-cpu.vcxproj" (7) on node 1 (default targets).
- PrepareForBuild:
- Creating directory "ggml-cpu.dir\Release\".
- C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(544,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\ggml\src\ggml-cpu.vcxproj]
- Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details.
- Creating directory "ggml-cpu.dir\Release\ggml-cpu.tlog\".
- InitializeBuildStatus:
- Creating "ggml-cpu.dir\Release\ggml-cpu.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified.
- Touching "ggml-cpu.dir\Release\ggml-cpu.tlog\unsuccessfulbuild".
- CustomBuild:
- Building Custom Rule C:/Users/MPD_01/AppData/Local/Temp/pip-req-build-jyivbl21/vendor/llama.cpp/ggml/src/CMakeLists.txt
- MakeDirsForCl:
- Creating directory "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\ggml\src\ggml-cpu.dir\Release\ggml-cpu".
- ClCompile:
- C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.43.34808\bin\HostX64\x64\CL.exe /c /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\.." /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\." /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\ggml-cpu" /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\..\include" /nologo /W1 /WX- /diagnostics:column /O2 /Ob2 /D _WINDLL /D _MBCS /D WIN32 /D _WINDOWS /D NDEBUG /D GGML_BACKEND_BUILD /D GGML_BACKEND_SHARED /D GGML_USE_OPENMP /D GGML_USE_LLAMAFILE /D GGML_USE_CPU_AARCH64 /D GGML_AVX512 /D _CRT_SECURE_NO_WARNINGS /D GGML_SCHED_MAX_COPIES=4 /D _XOPEN_SOURCE=600 /D GGML_SHARED /D "CMAKE_INTDIR=\"Release\"" /D ggml_cpu_EXPORTS /EHsc /MD /GS /arch:AVX512 /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /openmp /std:c11 /Fo"ggml-cpu.dir\Release\/ggml-cpu/ggml-cpu.c.obj" /Fd"ggml-cpu.dir\Release\vc143.pdb" /external:W1 /Gd /TC /errorReport:queue /utf-8 /bigobj "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\ggml-cpu\ggml-cpu.c"
- ggml-cpu.c
- C:\Program Files (x86)\Windows Kits\10\Include\10.0.18362.0\um\winbase.h(9305,5): warning C5105: macro expansion producing 'defined' has undefined behavior [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\ggml\src\ggml-cpu.vcxproj]
- (compiling source file '../../../../../../pip-req-build-jyivbl21/vendor/llama.cpp/ggml/src/ggml-cpu/ggml-cpu.c')
- C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.43.34808\bin\HostX64\x64\CL.exe /c /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\.." /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\." /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\ggml-cpu" /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\..\include" /nologo /W1 /WX- /diagnostics:column /O2 /Ob2 /D _WINDLL /D _MBCS /D WIN32 /D _WINDOWS /D NDEBUG /D GGML_BACKEND_BUILD /D GGML_BACKEND_SHARED /D GGML_USE_OPENMP /D GGML_USE_LLAMAFILE /D GGML_USE_CPU_AARCH64 /D GGML_AVX512 /D _CRT_SECURE_NO_WARNINGS /D GGML_SCHED_MAX_COPIES=4 /D _XOPEN_SOURCE=600 /D GGML_SHARED /D "CMAKE_INTDIR=\"Release\"" /D ggml_cpu_EXPORTS /EHsc /MD /GS /arch:AVX512 /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /openmp /std:c++17 /Fo"ggml-cpu.dir\Release\/ggml-cpu/ggml-cpu.cpp.obj" /Fd"ggml-cpu.dir\Release\vc143.pdb" /external:W1 /Gd /TP /errorReport:queue /utf-8 /bigobj "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\ggml-cpu\ggml-cpu.cpp"
- ggml-cpu.cpp
- C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.43.34808\bin\HostX64\x64\CL.exe /c /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\.." /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\." /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\ggml-cpu" /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\..\include" /nologo /W1 /WX- /diagnostics:column /O2 /Ob2 /D _WINDLL /D _MBCS /D WIN32 /D _WINDOWS /D NDEBUG /D GGML_BACKEND_BUILD /D GGML_BACKEND_SHARED /D GGML_USE_OPENMP /D GGML_USE_LLAMAFILE /D GGML_USE_CPU_AARCH64 /D GGML_AVX512 /D _CRT_SECURE_NO_WARNINGS /D GGML_SCHED_MAX_COPIES=4 /D _XOPEN_SOURCE=600 /D GGML_SHARED /D "CMAKE_INTDIR=\"Release\"" /D ggml_cpu_EXPORTS /EHsc /MD /GS /arch:AVX512 /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /openmp /std:c++17 /Fo"ggml-cpu.dir\Release\\" /Fd"ggml-cpu.dir\Release\vc143.pdb" /external:W1 /Gd /TP /errorReport:queue /utf-8 /bigobj "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\ggml-cpu\ggml-cpu-aarch64.cpp" "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\ggml-cpu\ggml-cpu-hbm.cpp" "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\ggml-cpu\ggml-cpu-traits.cpp" "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\ggml-cpu\amx\amx.cpp" "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\ggml-cpu\amx\mmq.cpp" "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\ggml-cpu\llamafile\sgemm.cpp"
- ggml-cpu-aarch64.cpp
- ggml-cpu-hbm.cpp
- ggml-cpu-traits.cpp
- amx.cpp
- mmq.cpp
- sgemm.cpp
- Generating Code...
- C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.43.34808\bin\HostX64\x64\CL.exe /c /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\.." /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\." /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\ggml-cpu" /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\..\include" /nologo /W1 /WX- /diagnostics:column /O2 /Ob2 /D _WINDLL /D _MBCS /D WIN32 /D _WINDOWS /D NDEBUG /D GGML_BACKEND_BUILD /D GGML_BACKEND_SHARED /D GGML_USE_OPENMP /D GGML_USE_LLAMAFILE /D GGML_USE_CPU_AARCH64 /D GGML_AVX512 /D _CRT_SECURE_NO_WARNINGS /D GGML_SCHED_MAX_COPIES=4 /D _XOPEN_SOURCE=600 /D GGML_SHARED /D "CMAKE_INTDIR=\"Release\"" /D ggml_cpu_EXPORTS /EHsc /MD /GS /arch:AVX512 /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /openmp /std:c11 /Fo"ggml-cpu.dir\Release\\" /Fd"ggml-cpu.dir\Release\vc143.pdb" /external:W1 /Gd /TC /errorReport:queue /utf-8 /bigobj "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\ggml-cpu\ggml-cpu-quants.c"
- ggml-cpu-quants.c
- Link:
- C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.43.34808\bin\HostX64\x64\link.exe /ERRORREPORT:QUEUE /OUT:"C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\bin\Release\ggml-cpu.dll" /INCREMENTAL:NO /NOLOGO "Release\ggml-base.lib" kernel32.lib user32.lib gdi32.lib winspool.lib shell32.lib ole32.lib oleaut32.lib uuid.lib comdlg32.lib advapi32.lib /MANIFEST /MANIFESTUAC:"level='asInvoker' uiAccess='false'" /manifest:embed /PDB:"C:/Users/MPD_01/AppData/Local/Temp/tmpqnxcasyy/build/bin/Release/ggml-cpu.pdb" /SUBSYSTEM:CONSOLE /TLBID:1 /DYNAMICBASE /NXCOMPAT /IMPLIB:"C:/Users/MPD_01/AppData/Local/Temp/tmpqnxcasyy/build/vendor/llama.cpp/ggml/src/Release/ggml-cpu.lib" /MACHINE:X64 /machine:x64 /DLL "ggml-cpu.dir\Release\ggml-cpu-aarch64.obj"
- "ggml-cpu.dir\Release\ggml-cpu-hbm.obj"
- "ggml-cpu.dir\Release\ggml-cpu-quants.obj"
- "ggml-cpu.dir\Release\ggml-cpu-traits.obj"
- "ggml-cpu.dir\Release\amx.obj"
- "ggml-cpu.dir\Release\mmq.obj"
- "ggml-cpu.dir\Release\sgemm.obj"
- "ggml-cpu.dir\Release\/ggml-cpu/ggml-cpu.c.obj"
- "ggml-cpu.dir\Release\/ggml-cpu/ggml-cpu.cpp.obj"
- Creating library C:/Users/MPD_01/AppData/Local/Temp/tmpqnxcasyy/build/vendor/llama.cpp/ggml/src/Release/ggml-cpu.lib and object C:/Users/MPD_01/AppData/Local/Temp/tmpqnxcasyy/build/vendor/llama.cpp/ggml/src/Release/ggml-cpu.exp
- ggml-cpu.vcxproj -> C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\bin\Release\ggml-cpu.dll
- FinalizeBuildStatus:
- Deleting file "ggml-cpu.dir\Release\ggml-cpu.tlog\unsuccessfulbuild".
- Touching "ggml-cpu.dir\Release\ggml-cpu.tlog\ggml-cpu.lastbuildstate".
- Done Building Project "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\ggml\src\ggml-cpu.vcxproj" (default targets).
- PrepareForBuild:
- Creating directory "ggml.dir\Release\".
- C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(544,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\ggml\src\ggml.vcxproj]
- Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details.
- Creating directory "ggml.dir\Release\ggml.tlog\".
- InitializeBuildStatus:
- Creating "ggml.dir\Release\ggml.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified.
- Touching "ggml.dir\Release\ggml.tlog\unsuccessfulbuild".
- CustomBuild:
- Building Custom Rule C:/Users/MPD_01/AppData/Local/Temp/pip-req-build-jyivbl21/vendor/llama.cpp/ggml/src/CMakeLists.txt
- ClCompile:
- C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.43.34808\bin\HostX64\x64\CL.exe /c /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\..\include" /nologo /W1 /WX- /diagnostics:column /O2 /Ob2 /D _WINDLL /D _MBCS /D WIN32 /D _WINDOWS /D NDEBUG /D GGML_USE_CPU /D GGML_BUILD /D GGML_SHARED /D _CRT_SECURE_NO_WARNINGS /D GGML_SCHED_MAX_COPIES=4 /D _XOPEN_SOURCE=600 /D GGML_BACKEND_SHARED /D "CMAKE_INTDIR=\"Release\"" /D ggml_EXPORTS /EHsc /MD /GS /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /std:c++17 /Fo"ggml.dir\Release\\" /Fd"ggml.dir\Release\vc143.pdb" /external:W1 /Gd /TP /errorReport:queue /utf-8 /bigobj "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\ggml-backend-reg.cpp"
- ggml-backend-reg.cpp
- Link:
- C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.43.34808\bin\HostX64\x64\link.exe /ERRORREPORT:QUEUE /OUT:"C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\bin\Release\ggml.dll" /INCREMENTAL:NO /NOLOGO "Release\ggml-cpu.lib" "Release\ggml-base.lib" kernel32.lib user32.lib gdi32.lib winspool.lib shell32.lib ole32.lib oleaut32.lib uuid.lib comdlg32.lib advapi32.lib /MANIFEST /MANIFESTUAC:"level='asInvoker' uiAccess='false'" /manifest:embed /PDB:"C:/Users/MPD_01/AppData/Local/Temp/tmpqnxcasyy/build/bin/Release/ggml.pdb" /SUBSYSTEM:CONSOLE /TLBID:1 /DYNAMICBASE /NXCOMPAT /IMPLIB:"C:/Users/MPD_01/AppData/Local/Temp/tmpqnxcasyy/build/vendor/llama.cpp/ggml/src/Release/ggml.lib" /MACHINE:X64 /machine:x64 /DLL "ggml.dir\Release\ggml-backend-reg.obj"
- Creating library C:/Users/MPD_01/AppData/Local/Temp/tmpqnxcasyy/build/vendor/llama.cpp/ggml/src/Release/ggml.lib and object C:/Users/MPD_01/AppData/Local/Temp/tmpqnxcasyy/build/vendor/llama.cpp/ggml/src/Release/ggml.exp
- ggml.vcxproj -> C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\bin\Release\ggml.dll
- FinalizeBuildStatus:
- Deleting file "ggml.dir\Release\ggml.tlog\unsuccessfulbuild".
- Touching "ggml.dir\Release\ggml.tlog\ggml.lastbuildstate".
- Done Building Project "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\ggml\src\ggml.vcxproj" (default targets).
- Project "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava.vcxproj" (4) is building "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\src\llama.vcxproj" (8) on node 1 (default targets).
- PrepareForBuild:
- Creating directory "llama.dir\Release\".
- C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(544,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\src\llama.vcxproj]
- Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details.
- Creating directory "llama.dir\Release\llama.tlog\".
- InitializeBuildStatus:
- Creating "llama.dir\Release\llama.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified.
- Touching "llama.dir\Release\llama.tlog\unsuccessfulbuild".
- CustomBuild:
- Building Custom Rule C:/Users/MPD_01/AppData/Local/Temp/pip-req-build-jyivbl21/vendor/llama.cpp/src/CMakeLists.txt
- ClCompile:
- C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.43.34808\bin\HostX64\x64\CL.exe /c /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\src\." /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\src\..\include" /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\src\..\common" /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\..\include" /nologo /W1 /WX- /diagnostics:column /O2 /Ob2 /D _WINDLL /D _MBCS /D WIN32 /D _WINDOWS /D NDEBUG /D LLAMA_BUILD /D LLAMA_SHARED /D _CRT_SECURE_NO_WARNINGS /D GGML_USE_CPU /D GGML_SHARED /D GGML_BACKEND_SHARED /D "CMAKE_INTDIR=\"Release\"" /D llama_EXPORTS /EHsc /MD /GS /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /std:c++17 /Fo"llama.dir\Release\\" /Fd"llama.dir\Release\vc143.pdb" /external:W1 /Gd /TP /errorReport:queue /utf-8 /bigobj "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\src\llama.cpp" "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\src\llama-adapter.cpp" "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\src\llama-arch.cpp" "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\src\llama-batch.cpp" "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\src\llama-chat.cpp" "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\src\llama-context.cpp" "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\src\llama-grammar.cpp" "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\src\llama-hparams.cpp" "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\src\llama-impl.cpp" "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\src\llama-kv-cache.cpp" "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\src\llama-mmap.cpp" "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\src\llama-model-loader.cpp" "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\src\llama-model.cpp" "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\src\llama-quant.cpp" "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\src\llama-sampling.cpp" "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\src\llama-vocab.cpp" "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\src\unicode.cpp" "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\src\unicode-data.cpp"
- llama.cpp
- llama-adapter.cpp
- llama-arch.cpp
- llama-batch.cpp
- llama-chat.cpp
- llama-context.cpp
- llama-grammar.cpp
- llama-hparams.cpp
- llama-impl.cpp
- llama-kv-cache.cpp
- llama-mmap.cpp
- llama-model-loader.cpp
- llama-model.cpp
- llama-quant.cpp
- C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.43.34808\include\type_traits(1706,98): warning C4244: 'argument': conversion from 'unsigned __int64' to 'int', possible loss of data [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\src\llama.vcxproj]
- (compiling source file '../../../../../pip-req-build-jyivbl21/vendor/llama.cpp/src/llama-quant.cpp')
- C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.43.34808\include\type_traits(1706,98):
- the template instantiation context (the oldest one first) is
- C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\src\llama-quant.cpp(114,17):
- see reference to function template instantiation 'std::thread &std::vector<std::thread,std::allocator<std::thread>>::emplace_back<llama_tensor_dequantize_impl::<lambda_99809f70a3686e6322ff64c5ea2ca728>&,ggml_type&,uint8_t*,float*,size_t&>(llama_tensor_dequantize_impl::<lambda_99809f70a3686e6322ff64c5ea2ca728> &,ggml_type &,uint8_t *&&,float *&&,size_t &)' being compiled
- C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\src\llama-quant.cpp(114,29):
- see the first reference to 'std::vector<std::thread,std::allocator<std::thread>>::emplace_back' in 'llama_tensor_dequantize_impl'
- C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.43.34808\include\vector(909,24):
- see reference to function template instantiation '_Ty &std::vector<_Ty,std::allocator<_Ty>>::_Emplace_one_at_back<llama_tensor_dequantize_impl::<lambda_99809f70a3686e6322ff64c5ea2ca728>&,ggml_type&,unsigned char*,float*,size_t&>(llama_tensor_dequantize_impl::<lambda_99809f70a3686e6322ff64c5ea2ca728> &,ggml_type &,unsigned char *&&,float *&&,size_t &)' being compiled
- with
- [
- _Ty=std::thread
- ]
- C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.43.34808\include\vector(830,20):
- see reference to function template instantiation '_Ty &std::vector<_Ty,std::allocator<_Ty>>::_Emplace_back_with_unused_capacity<llama_tensor_dequantize_impl::<lambda_99809f70a3686e6322ff64c5ea2ca728>&,ggml_type&,unsigned char*,float*,size_t&>(llama_tensor_dequantize_impl::<lambda_99809f70a3686e6322ff64c5ea2ca728> &,ggml_type &,unsigned char *&&,float *&&,size_t &)' being compiled
- with
- [
- _Ty=std::thread
- ]
- C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.43.34808\include\vector(848,27):
- see reference to function template instantiation 'void std::_Default_allocator_traits<_Alloc>::construct<_Ty,llama_tensor_dequantize_impl::<lambda_99809f70a3686e6322ff64c5ea2ca728>&,ggml_type&,unsigned char*,float*,size_t&>(_Alloc &,_Objty *const ,llama_tensor_dequantize_impl::<lambda_99809f70a3686e6322ff64c5ea2ca728> &,ggml_type &,unsigned char *&&,float *&&,size_t &)' being compiled
- with
- [
- _Alloc=std::allocator<std::thread>,
- _Ty=std::thread,
- _Objty=std::thread
- ]
- C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.43.34808\include\xmemory(732,82):
- see reference to function template instantiation 'std::thread::thread<llama_tensor_dequantize_impl::<lambda_99809f70a3686e6322ff64c5ea2ca728>&,ggml_type&,_T,float*,size_t&,0>(_Fn,ggml_type &,_T &&,float *&&,size_t &)' being compiled
- with
- [
- _T=uint8_t *,
- _Fn=llama_tensor_dequantize_impl::<lambda_99809f70a3686e6322ff64c5ea2ca728> &
- ]
- C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.43.34808\include\thread(93,9):
- see reference to function template instantiation 'void std::thread::_Start<llama_tensor_dequantize_impl::<lambda_99809f70a3686e6322ff64c5ea2ca728>&,ggml_type&,_Ty,float*,size_t&>(_Fn,ggml_type &,_Ty &&,float *&&,size_t &)' being compiled
- with
- [
- _Ty=uint8_t *,
- _Fn=llama_tensor_dequantize_impl::<lambda_99809f70a3686e6322ff64c5ea2ca728> &
- ]
- C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.43.34808\include\thread(76,40):
- see reference to function template instantiation 'unsigned int (__cdecl *std::thread::_Get_invoke<std::thread::_Start::_Tuple,0,1,2,3,4>(std::integer_sequence<size_t,0,1,2,3,4>) noexcept)(void *) noexcept' being compiled
- C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.43.34808\include\thread(67,17):
- see reference to function template instantiation 'unsigned int std::thread::_Invoke<_Tuple,0,1,2,3,4>(void *) noexcept' being compiled
- with
- [
- _Tuple=std::thread::_Start::_Tuple
- ]
- C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.43.34808\include\thread(60,14):
- see reference to function template instantiation 'void std::invoke<llama_tensor_dequantize_impl::<lambda_99809f70a3686e6322ff64c5ea2ca728>,ggml_type,uint8_t*,float*,unsigned __int64>(_Callable &&,_Ty1 &&,uint8_t *&&,float *&&,unsigned __int64 &&) noexcept(false)' being compiled
- with
- [
- _Callable=llama_tensor_dequantize_impl::<lambda_99809f70a3686e6322ff64c5ea2ca728>,
- _Ty1=ggml_type
- ]
- llama-sampling.cpp
- llama-vocab.cpp
- unicode.cpp
- unicode-data.cpp
- Generating Code...
- MakeDirsForLink:
- Creating directory "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\src\Release\".
- Link:
- C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.43.34808\bin\HostX64\x64\link.exe /ERRORREPORT:QUEUE /OUT:"C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\bin\Release\llama.dll" /INCREMENTAL:NO /NOLOGO ..\ggml\src\Release\ggml.lib "..\ggml\src\Release\ggml-cpu.lib" "..\ggml\src\Release\ggml-base.lib" kernel32.lib user32.lib gdi32.lib winspool.lib shell32.lib ole32.lib oleaut32.lib uuid.lib comdlg32.lib advapi32.lib /MANIFEST /MANIFESTUAC:"level='asInvoker' uiAccess='false'" /manifest:embed /PDB:"C:/Users/MPD_01/AppData/Local/Temp/tmpqnxcasyy/build/bin/Release/llama.pdb" /SUBSYSTEM:CONSOLE /TLBID:1 /DYNAMICBASE /NXCOMPAT /IMPLIB:"C:/Users/MPD_01/AppData/Local/Temp/tmpqnxcasyy/build/vendor/llama.cpp/src/Release/llama.lib" /MACHINE:X64 /machine:x64 /DLL llama.dir\Release\llama.obj
- "llama.dir\Release\llama-adapter.obj"
- "llama.dir\Release\llama-arch.obj"
- "llama.dir\Release\llama-batch.obj"
- "llama.dir\Release\llama-chat.obj"
- "llama.dir\Release\llama-context.obj"
- "llama.dir\Release\llama-grammar.obj"
- "llama.dir\Release\llama-hparams.obj"
- "llama.dir\Release\llama-impl.obj"
- "llama.dir\Release\llama-kv-cache.obj"
- "llama.dir\Release\llama-mmap.obj"
- "llama.dir\Release\llama-model-loader.obj"
- "llama.dir\Release\llama-model.obj"
- "llama.dir\Release\llama-quant.obj"
- "llama.dir\Release\llama-sampling.obj"
- "llama.dir\Release\llama-vocab.obj"
- llama.dir\Release\unicode.obj
- "llama.dir\Release\unicode-data.obj"
- Creating library C:/Users/MPD_01/AppData/Local/Temp/tmpqnxcasyy/build/vendor/llama.cpp/src/Release/llama.lib and object C:/Users/MPD_01/AppData/Local/Temp/tmpqnxcasyy/build/vendor/llama.cpp/src/Release/llama.exp
- llama.vcxproj -> C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\bin\Release\llama.dll
- FinalizeBuildStatus:
- Deleting file "llama.dir\Release\llama.tlog\unsuccessfulbuild".
- Touching "llama.dir\Release\llama.tlog\llama.lastbuildstate".
- Done Building Project "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\src\llama.vcxproj" (default targets).
- PrepareForBuild:
- Creating directory "llava.dir\Release\".
- C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(544,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava.vcxproj]
- Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details.
- Creating directory "llava.dir\Release\llava.tlog\".
- InitializeBuildStatus:
- Creating "llava.dir\Release\llava.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified.
- Touching "llava.dir\Release\llava.tlog\unsuccessfulbuild".
- CustomBuild:
- Building Custom Rule C:/Users/MPD_01/AppData/Local/Temp/pip-req-build-jyivbl21/vendor/llama.cpp/examples/llava/CMakeLists.txt
- ClCompile:
- C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.43.34808\bin\HostX64\x64\CL.exe /c /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\examples\llava\." /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\examples\llava\..\.." /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\examples\llava\..\..\common" /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\include" /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\include" /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\..\include" /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\src\." /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\src\..\include" /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\src\..\common" /nologo /W1 /WX- /diagnostics:column /O2 /Ob2 /D _MBCS /D WIN32 /D _WINDOWS /D NDEBUG /D LLAMA_SHARED /D LLAMA_BUILD /D GGML_USE_CPU /D GGML_SHARED /D GGML_BACKEND_SHARED /D "CMAKE_INTDIR=\"Release\"" /EHsc /MD /GS /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /std:c++17 /Fo"llava.dir\Release\\" /Fd"llava.dir\Release\llava.pdb" /external:W1 /Gd /TP /errorReport:queue "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\examples\llava\llava.cpp" "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\examples\llava\clip.cpp"
- llava.cpp
- clip.cpp
- C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\examples\llava\clip.cpp(1131,9): warning C4297: 'clip_model_load': function assumed not to throw an exception but does [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava.vcxproj]
- C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\examples\llava\clip.cpp(1131,9):
- __declspec(nothrow), throw(), noexcept(true), or noexcept was specified on the function
- C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\examples\llava\clip.cpp(1586,13): warning C4297: 'clip_model_load': function assumed not to throw an exception but does [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava.vcxproj]
- C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\examples\llava\clip.cpp(1586,13):
- __declspec(nothrow), throw(), noexcept(true), or noexcept was specified on the function
- C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\examples\llava\clip.cpp(2820,5): warning C4297: 'clip_n_mmproj_embd': function assumed not to throw an exception but does [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava.vcxproj]
- C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\examples\llava\clip.cpp(2820,5):
- __declspec(nothrow), throw(), noexcept(true), or noexcept was specified on the function
- Generating Code...
- Lib:
- C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.43.34808\bin\HostX64\x64\Lib.exe /OUT:"llava.dir\Release\llava.lib" /NOLOGO /MACHINE:X64 /machine:x64 llava.dir\Release\llava.obj
- llava.dir\Release\clip.obj
- llava.vcxproj -> C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava.dir\Release\llava.lib
- FinalizeBuildStatus:
- Deleting file "llava.dir\Release\llava.tlog\unsuccessfulbuild".
- Touching "llava.dir\Release\llava.tlog\llava.lastbuildstate".
- Done Building Project "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava.vcxproj" (default targets).
- Project "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\ALL_BUILD.vcxproj" (1) is building "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\common\common.vcxproj" (9) on node 1 (default targets).
- PrepareForBuild:
- Creating directory "common.dir\Release\".
- C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(544,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\common\common.vcxproj]
- Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details.
- Creating directory "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\common\Release\".
- Creating directory "common.dir\Release\common.tlog\".
- InitializeBuildStatus:
- Creating "common.dir\Release\common.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified.
- Touching "common.dir\Release\common.tlog\unsuccessfulbuild".
- CustomBuild:
- Building Custom Rule C:/Users/MPD_01/AppData/Local/Temp/pip-req-build-jyivbl21/vendor/llama.cpp/common/CMakeLists.txt
- ClCompile:
- C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.43.34808\bin\HostX64\x64\CL.exe /c /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\common\." /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\src\." /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\src\..\include" /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\src\..\common" /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\..\include" /nologo /W1 /WX- /diagnostics:column /O2 /Ob2 /D _MBCS /D WIN32 /D _WINDOWS /D NDEBUG /D _CRT_SECURE_NO_WARNINGS /D LLAMA_SHARED /D GGML_USE_CPU /D GGML_SHARED /D GGML_BACKEND_SHARED /D "CMAKE_INTDIR=\"Release\"" /EHsc /MD /GS /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /std:c++17 /Fo"common.dir\Release\\" /Fd"C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\common\Release\common.pdb" /external:W1 /Gd /TP /errorReport:queue /utf-8 /bigobj "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\common\arg.cpp" "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\common\common.cpp" "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\common\console.cpp" "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\common\json-schema-to-grammar.cpp" "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\common\log.cpp" "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\common\ngram-cache.cpp" "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\common\sampling.cpp" "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\common\speculative.cpp"
- arg.cpp
- common.cpp
- console.cpp
- json-schema-to-grammar.cpp
- log.cpp
- C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\common\log.cpp(28,79): error C2039: 'system_clock': is not a member of 'std::chrono' [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\common\common.vcxproj]
- C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.43.34808\include\__msvc_chrono.hpp(286,11):
- see declaration of 'std::chrono'
- C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\common\log.cpp(28,79): error C3083: 'system_clock': the symbol to the left of a '::' must be a type [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\common\common.vcxproj]
- C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\common\log.cpp(28,93): error C2039: 'now': is not a member of 'std::chrono' [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\common\common.vcxproj]
- C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.43.34808\include\__msvc_chrono.hpp(286,11):
- see declaration of 'std::chrono'
- C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\common\log.cpp(28,93): error C3861: 'now': identifier not found [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\common\common.vcxproj]
- C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\common\log.cpp(28,25): error C2672: 'std::chrono::duration_cast': no matching overloaded function found [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\common\common.vcxproj]
- C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.43.34808\include\__msvc_chrono.hpp(87,19):
- could be '_To std::chrono::duration_cast(const std::chrono::duration<_Rep,_Period> &) noexcept(<expr>)'
- ngram-cache.cpp
- sampling.cpp
- speculative.cpp
- Generating Code...
- Done Building Project "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\common\common.vcxproj" (default targets) -- FAILED.
- Project "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\ALL_BUILD.vcxproj" (1) is building "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava_shared.vcxproj" (13) on node 1 (default targets).
- PrepareForBuild:
- Creating directory "llava_shared.dir\Release\".
- C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(544,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava_shared.vcxproj]
- Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details.
- Creating directory "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\Release\".
- Creating directory "llava_shared.dir\Release\llava_shared.tlog\".
- InitializeBuildStatus:
- Creating "llava_shared.dir\Release\llava_shared.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified.
- Touching "llava_shared.dir\Release\llava_shared.tlog\unsuccessfulbuild".
- CustomBuild:
- Building Custom Rule C:/Users/MPD_01/AppData/Local/Temp/pip-req-build-jyivbl21/vendor/llama.cpp/examples/llava/CMakeLists.txt
- Link:
- C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.43.34808\bin\HostX64\x64\link.exe /ERRORREPORT:QUEUE /OUT:"C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\Release\llava.dll" /INCREMENTAL:NO /NOLOGO ..\..\src\Release\llama.lib ..\..\ggml\src\Release\ggml.lib "..\..\ggml\src\Release\ggml-cpu.lib" "..\..\ggml\src\Release\ggml-base.lib" kernel32.lib user32.lib gdi32.lib winspool.lib shell32.lib ole32.lib oleaut32.lib uuid.lib comdlg32.lib advapi32.lib /MANIFEST /MANIFESTUAC:"level='asInvoker' uiAccess='false'" /manifest:embed /PDB:"C:/Users/MPD_01/AppData/Local/Temp/tmpqnxcasyy/build/vendor/llama.cpp/examples/llava/Release/llava.pdb" /SUBSYSTEM:CONSOLE /TLBID:1 /DYNAMICBASE /NXCOMPAT /IMPLIB:"C:/Users/MPD_01/AppData/Local/Temp/tmpqnxcasyy/build/vendor/llama.cpp/examples/llava/Release/llava.lib" /MACHINE:X64 /machine:x64 /DLL C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava.dir\Release\llava.obj
- C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava.dir\Release\clip.obj
- Creating library C:/Users/MPD_01/AppData/Local/Temp/tmpqnxcasyy/build/vendor/llama.cpp/examples/llava/Release/llava.lib and object C:/Users/MPD_01/AppData/Local/Temp/tmpqnxcasyy/build/vendor/llama.cpp/examples/llava/Release/llava.exp
- llava_shared.vcxproj -> C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\Release\llava.dll
- FinalizeBuildStatus:
- Deleting file "llava_shared.dir\Release\llava_shared.tlog\unsuccessfulbuild".
- Touching "llava_shared.dir\Release\llava_shared.tlog\llava_shared.lastbuildstate".
- Done Building Project "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava_shared.vcxproj" (default targets).
- Project "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\ALL_BUILD.vcxproj" (1) is building "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava_static.vcxproj" (14) on node 1 (default targets).
- PrepareForBuild:
- Creating directory "llava_static.dir\Release\".
- C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(544,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava_static.vcxproj]
- Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details.
- Creating directory "llava_static.dir\Release\llava_static.tlog\".
- InitializeBuildStatus:
- Creating "llava_static.dir\Release\llava_static.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified.
- Touching "llava_static.dir\Release\llava_static.tlog\unsuccessfulbuild".
- CustomBuild:
- Building Custom Rule C:/Users/MPD_01/AppData/Local/Temp/pip-req-build-jyivbl21/vendor/llama.cpp/examples/llava/CMakeLists.txt
- Lib:
- C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.43.34808\bin\HostX64\x64\Lib.exe /OUT:"C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\Release\llava_static.lib" /NOLOGO /MACHINE:X64 /machine:x64 C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava.dir\Release\llava.obj
- C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava.dir\Release\clip.obj
- llava_static.vcxproj -> C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\Release\llava_static.lib
- FinalizeBuildStatus:
- Deleting file "llava_static.dir\Release\llava_static.tlog\unsuccessfulbuild".
- Touching "llava_static.dir\Release\llava_static.tlog\llava_static.lastbuildstate".
- Done Building Project "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava_static.vcxproj" (default targets).
- Done Building Project "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\ALL_BUILD.vcxproj" (default targets) -- FAILED.
- Build FAILED.
- "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\ALL_BUILD.vcxproj" (default target) (1) ->
- "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\ZERO_CHECK.vcxproj" (default target) (2) ->
- (PrepareForBuild target) ->
- C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(544,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\ZERO_CHECK.vcxproj]
- "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\ALL_BUILD.vcxproj" (default target) (1) ->
- "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\common\build_info.vcxproj" (default target) (3) ->
- C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(544,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\common\build_info.vcxproj]
- "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\ALL_BUILD.vcxproj" (default target) (1) ->
- "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava.vcxproj" (default target) (4) ->
- "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\ggml\src\ggml.vcxproj" (default target) (5) ->
- "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\ggml\src\ggml-base.vcxproj" (default target) (6) ->
- C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(544,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\ggml\src\ggml-base.vcxproj]
- "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\ALL_BUILD.vcxproj" (default target) (1) ->
- "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava.vcxproj" (default target) (4) ->
- "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\ggml\src\ggml.vcxproj" (default target) (5) ->
- "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\ggml\src\ggml-base.vcxproj" (default target) (6) ->
- (ClCompile target) ->
- C:\Program Files (x86)\Windows Kits\10\Include\10.0.18362.0\um\winbase.h(9305,5): warning C5105: macro expansion producing 'defined' has undefined behavior [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\ggml\src\ggml-base.vcxproj]
- "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\ALL_BUILD.vcxproj" (default target) (1) ->
- "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava.vcxproj" (default target) (4) ->
- "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\ggml\src\ggml.vcxproj" (default target) (5) ->
- "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\ggml\src\ggml-cpu.vcxproj" (default target) (7) ->
- (PrepareForBuild target) ->
- C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(544,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\ggml\src\ggml-cpu.vcxproj]
- "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\ALL_BUILD.vcxproj" (default target) (1) ->
- "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava.vcxproj" (default target) (4) ->
- "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\ggml\src\ggml.vcxproj" (default target) (5) ->
- "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\ggml\src\ggml-cpu.vcxproj" (default target) (7) ->
- (ClCompile target) ->
- C:\Program Files (x86)\Windows Kits\10\Include\10.0.18362.0\um\winbase.h(9305,5): warning C5105: macro expansion producing 'defined' has undefined behavior [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\ggml\src\ggml-cpu.vcxproj]
- "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\ALL_BUILD.vcxproj" (default target) (1) ->
- "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava.vcxproj" (default target) (4) ->
- "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\ggml\src\ggml.vcxproj" (default target) (5) ->
- (PrepareForBuild target) ->
- C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(544,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\ggml\src\ggml.vcxproj]
- "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\ALL_BUILD.vcxproj" (default target) (1) ->
- "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava.vcxproj" (default target) (4) ->
- "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\src\llama.vcxproj" (default target) (8) ->
- C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(544,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\src\llama.vcxproj]
- "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\ALL_BUILD.vcxproj" (default target) (1) ->
- "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava.vcxproj" (default target) (4) ->
- "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\src\llama.vcxproj" (default target) (8) ->
- (ClCompile target) ->
- C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.43.34808\include\type_traits(1706,98): warning C4244: 'argument': conversion from 'unsigned __int64' to 'int', possible loss of data [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\src\llama.vcxproj]
- "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\ALL_BUILD.vcxproj" (default target) (1) ->
- "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava.vcxproj" (default target) (4) ->
- (PrepareForBuild target) ->
- C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(544,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava.vcxproj]
- "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\ALL_BUILD.vcxproj" (default target) (1) ->
- "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava.vcxproj" (default target) (4) ->
- (ClCompile target) ->
- C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\examples\llava\clip.cpp(1131,9): warning C4297: 'clip_model_load': function assumed not to throw an exception but does [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava.vcxproj]
- C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\examples\llava\clip.cpp(1586,13): warning C4297: 'clip_model_load': function assumed not to throw an exception but does [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava.vcxproj]
- C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\examples\llava\clip.cpp(2820,5): warning C4297: 'clip_n_mmproj_embd': function assumed not to throw an exception but does [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava.vcxproj]
- "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\ALL_BUILD.vcxproj" (default target) (1) ->
- "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\common\common.vcxproj" (default target) (9) ->
- (PrepareForBuild target) ->
- C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(544,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\common\common.vcxproj]
- "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\ALL_BUILD.vcxproj" (default target) (1) ->
- "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava_shared.vcxproj" (default target) (13) ->
- C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(544,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava_shared.vcxproj]
- "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\ALL_BUILD.vcxproj" (default target) (1) ->
- "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava_static.vcxproj" (default target) (14) ->
- C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(544,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava_static.vcxproj]
- "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\ALL_BUILD.vcxproj" (default target) (1) ->
- "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\common\common.vcxproj" (default target) (9) ->
- (ClCompile target) ->
- C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\common\log.cpp(28,79): error C2039: 'system_clock': is not a member of 'std::chrono' [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\common\common.vcxproj]
- C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\common\log.cpp(28,79): error C3083: 'system_clock': the symbol to the left of a '::' must be a type [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\common\common.vcxproj]
- C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\common\log.cpp(28,93): error C2039: 'now': is not a member of 'std::chrono' [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\common\common.vcxproj]
- C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\common\log.cpp(28,93): error C3861: 'now': identifier not found [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\common\common.vcxproj]
- C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\common\log.cpp(28,25): error C2672: 'std::chrono::duration_cast': no matching overloaded function found [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\common\common.vcxproj]
- 16 Warning(s)
- 5 Error(s)
- Time Elapsed 00:00:43.43
- *** CMake build failed
- [end of output]
- note: This error originates from a subprocess, and is likely not a problem with pip.
- ERROR: Failed building wheel for llama_cpp_python
- Failed to build llama_cpp_python
- ERROR: Failed to build installable wheels for some pyproject.toml based projects (llama_cpp_python)