  1. c:\cUI\python_embeded>python -m pip install C:\Users\MPD_01\Downloads\llama_cpp_python-0.3.7.tar.gz
  2. Processing c:\users\mpd_01\downloads\llama_cpp_python-0.3.7.tar.gz
  3. Installing build dependencies ... done
  4. Getting requirements to build wheel ... done
  5. Installing backend dependencies ... done
  6. Preparing metadata (pyproject.toml) ... done
  7. Requirement already satisfied: typing-extensions>=4.5.0 in c:\cui\python_embeded\lib\site-packages (from llama_cpp_python==0.3.7) (4.12.2)
  8. Requirement already satisfied: numpy>=1.20.0 in c:\cui\python_embeded\lib\site-packages (from llama_cpp_python==0.3.7) (1.26.4)
  9. Requirement already satisfied: diskcache>=5.6.1 in c:\cui\python_embeded\lib\site-packages (from llama_cpp_python==0.3.7) (5.6.3)
  10. Requirement already satisfied: jinja2>=2.11.3 in c:\cui\python_embeded\lib\site-packages (from llama_cpp_python==0.3.7) (3.1.5)
  11. Requirement already satisfied: MarkupSafe>=2.0 in c:\cui\python_embeded\lib\site-packages (from jinja2>=2.11.3->llama_cpp_python==0.3.7) (3.0.2)
  12. Building wheels for collected packages: llama_cpp_python
  13. Building wheel for llama_cpp_python (pyproject.toml) ... error
  14. error: subprocess-exited-with-error
  15.  
  16. × Building wheel for llama_cpp_python (pyproject.toml) did not run successfully.
  17. │ exit code: 1
  18. ╰─> [575 lines of output]
  19. *** scikit-build-core 0.11.0 using CMake 3.31.6 (wheel)
  20. *** Configuring CMake...
  21. 2025-03-05 11:07:41,991 - scikit_build_core - WARNING - Can't find a Python library, got libdir=None, ldlibrary=None, multiarch=None, masd=None
  22. loading initial cache file C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\CMakeInit.txt
  23. -- Building for: Visual Studio 17 2022
  24. -- Selecting Windows SDK version 10.0.18362.0 to target Windows 10.0.19045.
  25. -- The C compiler identification is MSVC 19.43.34808.0
  26. -- The CXX compiler identification is MSVC 19.43.34808.0
  27. -- Detecting C compiler ABI info
  28. -- Detecting C compiler ABI info - done
  29. -- Check for working C compiler: C:/Program Files/Microsoft Visual Studio/2022/Community/VC/Tools/MSVC/14.43.34808/bin/Hostx64/x64/cl.exe - skipped
  30. -- Detecting C compile features
  31. -- Detecting C compile features - done
  32. -- Detecting CXX compiler ABI info
  33. -- Detecting CXX compiler ABI info - done
  34. -- Check for working CXX compiler: C:/Program Files/Microsoft Visual Studio/2022/Community/VC/Tools/MSVC/14.43.34808/bin/Hostx64/x64/cl.exe - skipped
  35. -- Detecting CXX compile features
  36. -- Detecting CXX compile features - done
  37. -- Found Git: C:/Program Files/Git/cmd/git.exe (found version "2.48.1.windows.1")
  38. -- Performing Test CMAKE_HAVE_LIBC_PTHREAD
  39. -- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Failed
  40. -- Looking for pthread_create in pthreads
  41. -- Looking for pthread_create in pthreads - not found
  42. -- Looking for pthread_create in pthread
  43. -- Looking for pthread_create in pthread - not found
  44. -- Found Threads: TRUE
  45. -- Warning: ccache not found - consider installing it for faster compilation or disable this warning with GGML_CCACHE=OFF
  46. -- CMAKE_SYSTEM_PROCESSOR: AMD64
  47. -- CMAKE_GENERATOR_PLATFORM: x64
  48. -- Including CPU backend
  49. -- Found OpenMP_C: -openmp (found version "2.0")
  50. -- Found OpenMP_CXX: -openmp (found version "2.0")
  51. -- Found OpenMP: TRUE (found version "2.0")
  52. -- x86 detected
  53. -- Performing Test HAS_AVX_1
  54. -- Performing Test HAS_AVX_1 - Success
  55. -- Performing Test HAS_AVX2_1
  56. -- Performing Test HAS_AVX2_1 - Success
  57. -- Performing Test HAS_FMA_1
  58. -- Performing Test HAS_FMA_1 - Success
  59. -- Performing Test HAS_AVX512_1
  60. -- Performing Test HAS_AVX512_1 - Success
  61. -- Adding CPU backend variant ggml-cpu: /arch:AVX512 GGML_AVX512
  62. CMake Warning at vendor/llama.cpp/ggml/CMakeLists.txt:285 (message):
  63. GGML build version fixed at 1 likely due to a shallow clone.
  64.  
  65.  
  66. CMake Warning (dev) at CMakeLists.txt:13 (install):
  67. Target llama has PUBLIC_HEADER files but no PUBLIC_HEADER DESTINATION.
  68. Call Stack (most recent call first):
  69. CMakeLists.txt:97 (llama_cpp_python_install_target)
  70. This warning is for project developers. Use -Wno-dev to suppress it.
  71.  
  72. CMake Warning (dev) at CMakeLists.txt:21 (install):
  73. Target llama has PUBLIC_HEADER files but no PUBLIC_HEADER DESTINATION.
  74. Call Stack (most recent call first):
  75. CMakeLists.txt:97 (llama_cpp_python_install_target)
  76. This warning is for project developers. Use -Wno-dev to suppress it.
  77.  
  78. CMake Warning (dev) at CMakeLists.txt:13 (install):
  79. Target ggml has PUBLIC_HEADER files but no PUBLIC_HEADER DESTINATION.
  80. Call Stack (most recent call first):
  81. CMakeLists.txt:98 (llama_cpp_python_install_target)
  82. This warning is for project developers. Use -Wno-dev to suppress it.
  83.  
  84. CMake Warning (dev) at CMakeLists.txt:21 (install):
  85. Target ggml has PUBLIC_HEADER files but no PUBLIC_HEADER DESTINATION.
  86. Call Stack (most recent call first):
  87. CMakeLists.txt:98 (llama_cpp_python_install_target)
  88. This warning is for project developers. Use -Wno-dev to suppress it.
  89.  
  90. -- Configuring done (6.5s)
  91. -- Generating done (0.1s)
  92. -- Build files have been written to: C:/Users/MPD_01/AppData/Local/Temp/tmpqnxcasyy/build
  93. *** Building project with Visual Studio 17 2022...
  94. Change Dir: 'C:/Users/MPD_01/AppData/Local/Temp/tmpqnxcasyy/build'
  95.  
  96. Run Build Command(s): "C:/Program Files/Microsoft Visual Studio/2022/Community/MSBuild/Current/Bin/amd64/MSBuild.exe" ALL_BUILD.vcxproj /p:Configuration=Release /p:Platform=x64 /p:VisualStudioVersion=17.0 /v:n
  97. MSBuild version 17.13.15+18b3035f6 for .NET Framework
  98. Build started 03/05/2025 11:07:48.
  99.  
  100. Project "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\ALL_BUILD.vcxproj" on node 1 (default targets).
  101. Project "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\ALL_BUILD.vcxproj" (1) is building "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\ZERO_CHECK.vcxproj" (2) on node 1 (default targets).
  102. PrepareForBuild:
  103. Creating directory "x64\Release\ZERO_CHECK\".
  104. C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(544,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\ZERO_CHECK.vcxproj]
  105. Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details.
  106. Creating directory "x64\Release\ZERO_CHECK\ZERO_CHECK.tlog\".
  107. InitializeBuildStatus:
  108. Creating "x64\Release\ZERO_CHECK\ZERO_CHECK.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified.
  109. Touching "x64\Release\ZERO_CHECK\ZERO_CHECK.tlog\unsuccessfulbuild".
  110. CustomBuild:
  111. 1>Checking Build System
  112. FinalizeBuildStatus:
  113. Deleting file "x64\Release\ZERO_CHECK\ZERO_CHECK.tlog\unsuccessfulbuild".
  114. Touching "x64\Release\ZERO_CHECK\ZERO_CHECK.tlog\ZERO_CHECK.lastbuildstate".
  115. Done Building Project "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\ZERO_CHECK.vcxproj" (default targets).
  116. Project "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\ALL_BUILD.vcxproj" (1) is building "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\common\build_info.vcxproj" (3) on node 1 (default targets).
  117. PrepareForBuild:
  118. Creating directory "build_info.dir\Release\".
  119. C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(544,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\common\build_info.vcxproj]
  120. Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details.
  121. Creating directory "build_info.dir\Release\build_info.tlog\".
  122. InitializeBuildStatus:
  123. Creating "build_info.dir\Release\build_info.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified.
  124. Touching "build_info.dir\Release\build_info.tlog\unsuccessfulbuild".
  125. CustomBuild:
  126. Generating build details from Git
  127. -- Found Git: C:/Program Files/Git/cmd/git.exe (found version "2.48.1.windows.1")
  128. Building Custom Rule C:/Users/MPD_01/AppData/Local/Temp/pip-req-build-jyivbl21/vendor/llama.cpp/common/CMakeLists.txt
  129. ClCompile:
  130. C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.43.34808\bin\HostX64\x64\CL.exe /c /nologo /W1 /WX- /diagnostics:column /O2 /Ob2 /D _MBCS /D WIN32 /D _WINDOWS /D NDEBUG /D _CRT_SECURE_NO_WARNINGS /D "CMAKE_INTDIR=\"Release\"" /EHsc /MD /GS /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /Fo"build_info.dir\Release\\" /Fd"build_info.dir\Release\build_info.pdb" /external:W1 /Gd /TP /errorReport:queue /utf-8 /bigobj "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\common\build-info.cpp"
  131. build-info.cpp
  132. Lib:
  133. C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.43.34808\bin\HostX64\x64\Lib.exe /OUT:"build_info.dir\Release\build_info.lib" /NOLOGO /MACHINE:X64 /machine:x64 "build_info.dir\Release\build-info.obj"
  134. build_info.vcxproj -> C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\common\build_info.dir\Release\build_info.lib
  135. FinalizeBuildStatus:
  136. Deleting file "build_info.dir\Release\build_info.tlog\unsuccessfulbuild".
  137. Touching "build_info.dir\Release\build_info.tlog\build_info.lastbuildstate".
  138. Done Building Project "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\common\build_info.vcxproj" (default targets).
  139. Project "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\ALL_BUILD.vcxproj" (1) is building "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava.vcxproj" (4) on node 1 (default targets).
  140. Project "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava.vcxproj" (4) is building "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\ggml\src\ggml.vcxproj" (5) on node 1 (default targets).
  141. Project "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\ggml\src\ggml.vcxproj" (5) is building "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\ggml\src\ggml-base.vcxproj" (6) on node 1 (default targets).
  142. PrepareForBuild:
  143. Creating directory "ggml-base.dir\Release\".
  144. C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(544,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\ggml\src\ggml-base.vcxproj]
  145. Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details.
  146. Creating directory "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\bin\Release\".
  147. Creating directory "ggml-base.dir\Release\ggml-base.tlog\".
  148. InitializeBuildStatus:
  149. Creating "ggml-base.dir\Release\ggml-base.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified.
  150. Touching "ggml-base.dir\Release\ggml-base.tlog\unsuccessfulbuild".
  151. CustomBuild:
  152. Building Custom Rule C:/Users/MPD_01/AppData/Local/Temp/pip-req-build-jyivbl21/vendor/llama.cpp/ggml/src/CMakeLists.txt
  153. ClCompile:
  154. C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.43.34808\bin\HostX64\x64\CL.exe /c /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\." /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\..\include" /nologo /W1 /WX- /diagnostics:column /O2 /Ob2 /D _WINDLL /D _MBCS /D WIN32 /D _WINDOWS /D NDEBUG /D GGML_BUILD /D GGML_SHARED /D _CRT_SECURE_NO_WARNINGS /D GGML_SCHED_MAX_COPIES=4 /D _XOPEN_SOURCE=600 /D "CMAKE_INTDIR=\"Release\"" /D ggml_base_EXPORTS /EHsc /MD /GS /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /std:c11 /Fo"ggml-base.dir\Release\\" /Fd"ggml-base.dir\Release\vc143.pdb" /external:W1 /Gd /TC /errorReport:queue /utf-8 /bigobj "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\ggml.c" "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\ggml-alloc.c" "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\ggml-quants.c"
  155. ggml.c
  156. C:\Program Files (x86)\Windows Kits\10\Include\10.0.18362.0\um\winbase.h(9305,5): warning C5105: macro expansion producing 'defined' has undefined behavior [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\ggml\src\ggml-base.vcxproj]
  157. (compiling source file '../../../../../../pip-req-build-jyivbl21/vendor/llama.cpp/ggml/src/ggml.c')
  158.  
  159. ggml-alloc.c
  160. ggml-quants.c
  161. Generating Code...
  162. C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.43.34808\bin\HostX64\x64\CL.exe /c /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\." /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\..\include" /nologo /W1 /WX- /diagnostics:column /O2 /Ob2 /D _WINDLL /D _MBCS /D WIN32 /D _WINDOWS /D NDEBUG /D GGML_BUILD /D GGML_SHARED /D _CRT_SECURE_NO_WARNINGS /D GGML_SCHED_MAX_COPIES=4 /D _XOPEN_SOURCE=600 /D "CMAKE_INTDIR=\"Release\"" /D ggml_base_EXPORTS /EHsc /MD /GS /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /std:c++17 /Fo"ggml-base.dir\Release\\" /Fd"ggml-base.dir\Release\vc143.pdb" /external:W1 /Gd /TP /errorReport:queue /utf-8 /bigobj "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\ggml-backend.cpp" "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\ggml-opt.cpp" "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\ggml-threading.cpp" "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\gguf.cpp"
  163. ggml-backend.cpp
  164. ggml-opt.cpp
  165. ggml-threading.cpp
  166. gguf.cpp
  167. Generating Code...
  168. MakeDirsForLink:
  169. Creating directory "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\ggml\src\Release\".
  170. Link:
  171. C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.43.34808\bin\HostX64\x64\link.exe /ERRORREPORT:QUEUE /OUT:"C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\bin\Release\ggml-base.dll" /INCREMENTAL:NO /NOLOGO kernel32.lib user32.lib gdi32.lib winspool.lib shell32.lib ole32.lib oleaut32.lib uuid.lib comdlg32.lib advapi32.lib /MANIFEST /MANIFESTUAC:"level='asInvoker' uiAccess='false'" /manifest:embed /PDB:"C:/Users/MPD_01/AppData/Local/Temp/tmpqnxcasyy/build/bin/Release/ggml-base.pdb" /SUBSYSTEM:CONSOLE /TLBID:1 /DYNAMICBASE /NXCOMPAT /IMPLIB:"C:/Users/MPD_01/AppData/Local/Temp/tmpqnxcasyy/build/vendor/llama.cpp/ggml/src/Release/ggml-base.lib" /MACHINE:X64 /machine:x64 /DLL "ggml-base.dir\Release\ggml.obj"
  172. "ggml-base.dir\Release\ggml-alloc.obj"
  173. "ggml-base.dir\Release\ggml-backend.obj"
  174. "ggml-base.dir\Release\ggml-opt.obj"
  175. "ggml-base.dir\Release\ggml-threading.obj"
  176. "ggml-base.dir\Release\ggml-quants.obj"
  177. "ggml-base.dir\Release\gguf.obj"
  178. Creating library C:/Users/MPD_01/AppData/Local/Temp/tmpqnxcasyy/build/vendor/llama.cpp/ggml/src/Release/ggml-base.lib and object C:/Users/MPD_01/AppData/Local/Temp/tmpqnxcasyy/build/vendor/llama.cpp/ggml/src/Release/ggml-base.exp
  179. ggml-base.vcxproj -> C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\bin\Release\ggml-base.dll
  180. FinalizeBuildStatus:
  181. Deleting file "ggml-base.dir\Release\ggml-base.tlog\unsuccessfulbuild".
  182. Touching "ggml-base.dir\Release\ggml-base.tlog\ggml-base.lastbuildstate".
  183. Done Building Project "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\ggml\src\ggml-base.vcxproj" (default targets).
  184. Project "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\ggml\src\ggml.vcxproj" (5) is building "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\ggml\src\ggml-cpu.vcxproj" (7) on node 1 (default targets).
  185. PrepareForBuild:
  186. Creating directory "ggml-cpu.dir\Release\".
  187. C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(544,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\ggml\src\ggml-cpu.vcxproj]
  188. Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details.
  189. Creating directory "ggml-cpu.dir\Release\ggml-cpu.tlog\".
  190. InitializeBuildStatus:
  191. Creating "ggml-cpu.dir\Release\ggml-cpu.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified.
  192. Touching "ggml-cpu.dir\Release\ggml-cpu.tlog\unsuccessfulbuild".
  193. CustomBuild:
  194. Building Custom Rule C:/Users/MPD_01/AppData/Local/Temp/pip-req-build-jyivbl21/vendor/llama.cpp/ggml/src/CMakeLists.txt
  195. MakeDirsForCl:
  196. Creating directory "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\ggml\src\ggml-cpu.dir\Release\ggml-cpu".
  197. ClCompile:
  198. C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.43.34808\bin\HostX64\x64\CL.exe /c /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\.." /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\." /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\ggml-cpu" /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\..\include" /nologo /W1 /WX- /diagnostics:column /O2 /Ob2 /D _WINDLL /D _MBCS /D WIN32 /D _WINDOWS /D NDEBUG /D GGML_BACKEND_BUILD /D GGML_BACKEND_SHARED /D GGML_USE_OPENMP /D GGML_USE_LLAMAFILE /D GGML_USE_CPU_AARCH64 /D GGML_AVX512 /D _CRT_SECURE_NO_WARNINGS /D GGML_SCHED_MAX_COPIES=4 /D _XOPEN_SOURCE=600 /D GGML_SHARED /D "CMAKE_INTDIR=\"Release\"" /D ggml_cpu_EXPORTS /EHsc /MD /GS /arch:AVX512 /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /openmp /std:c11 /Fo"ggml-cpu.dir\Release\/ggml-cpu/ggml-cpu.c.obj" /Fd"ggml-cpu.dir\Release\vc143.pdb" /external:W1 /Gd /TC /errorReport:queue /utf-8 /bigobj "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\ggml-cpu\ggml-cpu.c"
  199. ggml-cpu.c
  200. C:\Program Files (x86)\Windows Kits\10\Include\10.0.18362.0\um\winbase.h(9305,5): warning C5105: macro expansion producing 'defined' has undefined behavior [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\ggml\src\ggml-cpu.vcxproj]
  201. (compiling source file '../../../../../../pip-req-build-jyivbl21/vendor/llama.cpp/ggml/src/ggml-cpu/ggml-cpu.c')
  202.  
  203. C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.43.34808\bin\HostX64\x64\CL.exe /c /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\.." /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\." /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\ggml-cpu" /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\..\include" /nologo /W1 /WX- /diagnostics:column /O2 /Ob2 /D _WINDLL /D _MBCS /D WIN32 /D _WINDOWS /D NDEBUG /D GGML_BACKEND_BUILD /D GGML_BACKEND_SHARED /D GGML_USE_OPENMP /D GGML_USE_LLAMAFILE /D GGML_USE_CPU_AARCH64 /D GGML_AVX512 /D _CRT_SECURE_NO_WARNINGS /D GGML_SCHED_MAX_COPIES=4 /D _XOPEN_SOURCE=600 /D GGML_SHARED /D "CMAKE_INTDIR=\"Release\"" /D ggml_cpu_EXPORTS /EHsc /MD /GS /arch:AVX512 /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /openmp /std:c++17 /Fo"ggml-cpu.dir\Release\/ggml-cpu/ggml-cpu.cpp.obj" /Fd"ggml-cpu.dir\Release\vc143.pdb" /external:W1 /Gd /TP /errorReport:queue /utf-8 /bigobj "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\ggml-cpu\ggml-cpu.cpp"
  204. ggml-cpu.cpp
  205. C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.43.34808\bin\HostX64\x64\CL.exe /c /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\.." /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\." /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\ggml-cpu" /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\..\include" /nologo /W1 /WX- /diagnostics:column /O2 /Ob2 /D _WINDLL /D _MBCS /D WIN32 /D _WINDOWS /D NDEBUG /D GGML_BACKEND_BUILD /D GGML_BACKEND_SHARED /D GGML_USE_OPENMP /D GGML_USE_LLAMAFILE /D GGML_USE_CPU_AARCH64 /D GGML_AVX512 /D _CRT_SECURE_NO_WARNINGS /D GGML_SCHED_MAX_COPIES=4 /D _XOPEN_SOURCE=600 /D GGML_SHARED /D "CMAKE_INTDIR=\"Release\"" /D ggml_cpu_EXPORTS /EHsc /MD /GS /arch:AVX512 /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /openmp /std:c++17 /Fo"ggml-cpu.dir\Release\\" /Fd"ggml-cpu.dir\Release\vc143.pdb" /external:W1 /Gd /TP /errorReport:queue /utf-8 /bigobj "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\ggml-cpu\ggml-cpu-aarch64.cpp" "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\ggml-cpu\ggml-cpu-hbm.cpp" "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\ggml-cpu\ggml-cpu-traits.cpp" "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\ggml-cpu\amx\amx.cpp" "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\ggml-cpu\amx\mmq.cpp" "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\ggml-cpu\llamafile\sgemm.cpp"
  206. ggml-cpu-aarch64.cpp
  207. ggml-cpu-hbm.cpp
  208. ggml-cpu-traits.cpp
  209. amx.cpp
  210. mmq.cpp
  211. sgemm.cpp
  212. Generating Code...
  213. C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.43.34808\bin\HostX64\x64\CL.exe /c /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\.." /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\." /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\ggml-cpu" /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\..\include" /nologo /W1 /WX- /diagnostics:column /O2 /Ob2 /D _WINDLL /D _MBCS /D WIN32 /D _WINDOWS /D NDEBUG /D GGML_BACKEND_BUILD /D GGML_BACKEND_SHARED /D GGML_USE_OPENMP /D GGML_USE_LLAMAFILE /D GGML_USE_CPU_AARCH64 /D GGML_AVX512 /D _CRT_SECURE_NO_WARNINGS /D GGML_SCHED_MAX_COPIES=4 /D _XOPEN_SOURCE=600 /D GGML_SHARED /D "CMAKE_INTDIR=\"Release\"" /D ggml_cpu_EXPORTS /EHsc /MD /GS /arch:AVX512 /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /openmp /std:c11 /Fo"ggml-cpu.dir\Release\\" /Fd"ggml-cpu.dir\Release\vc143.pdb" /external:W1 /Gd /TC /errorReport:queue /utf-8 /bigobj "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\ggml-cpu\ggml-cpu-quants.c"
  214. ggml-cpu-quants.c
  215. Link:
  216. C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.43.34808\bin\HostX64\x64\link.exe /ERRORREPORT:QUEUE /OUT:"C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\bin\Release\ggml-cpu.dll" /INCREMENTAL:NO /NOLOGO "Release\ggml-base.lib" kernel32.lib user32.lib gdi32.lib winspool.lib shell32.lib ole32.lib oleaut32.lib uuid.lib comdlg32.lib advapi32.lib /MANIFEST /MANIFESTUAC:"level='asInvoker' uiAccess='false'" /manifest:embed /PDB:"C:/Users/MPD_01/AppData/Local/Temp/tmpqnxcasyy/build/bin/Release/ggml-cpu.pdb" /SUBSYSTEM:CONSOLE /TLBID:1 /DYNAMICBASE /NXCOMPAT /IMPLIB:"C:/Users/MPD_01/AppData/Local/Temp/tmpqnxcasyy/build/vendor/llama.cpp/ggml/src/Release/ggml-cpu.lib" /MACHINE:X64 /machine:x64 /DLL "ggml-cpu.dir\Release\ggml-cpu-aarch64.obj"
  217. "ggml-cpu.dir\Release\ggml-cpu-hbm.obj"
  218. "ggml-cpu.dir\Release\ggml-cpu-quants.obj"
  219. "ggml-cpu.dir\Release\ggml-cpu-traits.obj"
  220. "ggml-cpu.dir\Release\amx.obj"
  221. "ggml-cpu.dir\Release\mmq.obj"
  222. "ggml-cpu.dir\Release\sgemm.obj"
  223. "ggml-cpu.dir\Release\/ggml-cpu/ggml-cpu.c.obj"
  224. "ggml-cpu.dir\Release\/ggml-cpu/ggml-cpu.cpp.obj"
  225. Creating library C:/Users/MPD_01/AppData/Local/Temp/tmpqnxcasyy/build/vendor/llama.cpp/ggml/src/Release/ggml-cpu.lib and object C:/Users/MPD_01/AppData/Local/Temp/tmpqnxcasyy/build/vendor/llama.cpp/ggml/src/Release/ggml-cpu.exp
  226. ggml-cpu.vcxproj -> C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\bin\Release\ggml-cpu.dll
  227. FinalizeBuildStatus:
  228. Deleting file "ggml-cpu.dir\Release\ggml-cpu.tlog\unsuccessfulbuild".
  229. Touching "ggml-cpu.dir\Release\ggml-cpu.tlog\ggml-cpu.lastbuildstate".
  230. Done Building Project "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\ggml\src\ggml-cpu.vcxproj" (default targets).
  231. PrepareForBuild:
  232. Creating directory "ggml.dir\Release\".
  233. C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(544,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\ggml\src\ggml.vcxproj]
  234. Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details.
  235. Creating directory "ggml.dir\Release\ggml.tlog\".
  236. InitializeBuildStatus:
  237. Creating "ggml.dir\Release\ggml.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified.
  238. Touching "ggml.dir\Release\ggml.tlog\unsuccessfulbuild".
  239. CustomBuild:
  240. Building Custom Rule C:/Users/MPD_01/AppData/Local/Temp/pip-req-build-jyivbl21/vendor/llama.cpp/ggml/src/CMakeLists.txt
  241. ClCompile:
  242. C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.43.34808\bin\HostX64\x64\CL.exe /c /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\..\include" /nologo /W1 /WX- /diagnostics:column /O2 /Ob2 /D _WINDLL /D _MBCS /D WIN32 /D _WINDOWS /D NDEBUG /D GGML_USE_CPU /D GGML_BUILD /D GGML_SHARED /D _CRT_SECURE_NO_WARNINGS /D GGML_SCHED_MAX_COPIES=4 /D _XOPEN_SOURCE=600 /D GGML_BACKEND_SHARED /D "CMAKE_INTDIR=\"Release\"" /D ggml_EXPORTS /EHsc /MD /GS /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /std:c++17 /Fo"ggml.dir\Release\\" /Fd"ggml.dir\Release\vc143.pdb" /external:W1 /Gd /TP /errorReport:queue /utf-8 /bigobj "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\ggml-backend-reg.cpp"
  243. ggml-backend-reg.cpp
  244. Link:
  245. C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.43.34808\bin\HostX64\x64\link.exe /ERRORREPORT:QUEUE /OUT:"C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\bin\Release\ggml.dll" /INCREMENTAL:NO /NOLOGO "Release\ggml-cpu.lib" "Release\ggml-base.lib" kernel32.lib user32.lib gdi32.lib winspool.lib shell32.lib ole32.lib oleaut32.lib uuid.lib comdlg32.lib advapi32.lib /MANIFEST /MANIFESTUAC:"level='asInvoker' uiAccess='false'" /manifest:embed /PDB:"C:/Users/MPD_01/AppData/Local/Temp/tmpqnxcasyy/build/bin/Release/ggml.pdb" /SUBSYSTEM:CONSOLE /TLBID:1 /DYNAMICBASE /NXCOMPAT /IMPLIB:"C:/Users/MPD_01/AppData/Local/Temp/tmpqnxcasyy/build/vendor/llama.cpp/ggml/src/Release/ggml.lib" /MACHINE:X64 /machine:x64 /DLL "ggml.dir\Release\ggml-backend-reg.obj"
  246. Creating library C:/Users/MPD_01/AppData/Local/Temp/tmpqnxcasyy/build/vendor/llama.cpp/ggml/src/Release/ggml.lib and object C:/Users/MPD_01/AppData/Local/Temp/tmpqnxcasyy/build/vendor/llama.cpp/ggml/src/Release/ggml.exp
  247. ggml.vcxproj -> C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\bin\Release\ggml.dll
  248. FinalizeBuildStatus:
  249. Deleting file "ggml.dir\Release\ggml.tlog\unsuccessfulbuild".
  250. Touching "ggml.dir\Release\ggml.tlog\ggml.lastbuildstate".
  251. Done Building Project "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\ggml\src\ggml.vcxproj" (default targets).
  252. Project "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava.vcxproj" (4) is building "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\src\llama.vcxproj" (8) on node 1 (default targets).
  253. PrepareForBuild:
  254. Creating directory "llama.dir\Release\".
  255. C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(544,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\src\llama.vcxproj]
  256. Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details.
  257. Creating directory "llama.dir\Release\llama.tlog\".
  258. InitializeBuildStatus:
  259. Creating "llama.dir\Release\llama.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified.
  260. Touching "llama.dir\Release\llama.tlog\unsuccessfulbuild".
  261. CustomBuild:
  262. Building Custom Rule C:/Users/MPD_01/AppData/Local/Temp/pip-req-build-jyivbl21/vendor/llama.cpp/src/CMakeLists.txt
  263. ClCompile:
  264. C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.43.34808\bin\HostX64\x64\CL.exe /c /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\src\." /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\src\..\include" /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\src\..\common" /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\..\include" /nologo /W1 /WX- /diagnostics:column /O2 /Ob2 /D _WINDLL /D _MBCS /D WIN32 /D _WINDOWS /D NDEBUG /D LLAMA_BUILD /D LLAMA_SHARED /D _CRT_SECURE_NO_WARNINGS /D GGML_USE_CPU /D GGML_SHARED /D GGML_BACKEND_SHARED /D "CMAKE_INTDIR=\"Release\"" /D llama_EXPORTS /EHsc /MD /GS /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /std:c++17 /Fo"llama.dir\Release\\" /Fd"llama.dir\Release\vc143.pdb" /external:W1 /Gd /TP /errorReport:queue /utf-8 /bigobj "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\src\llama.cpp" "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\src\llama-adapter.cpp" "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\src\llama-arch.cpp" "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\src\llama-batch.cpp" "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\src\llama-chat.cpp" "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\src\llama-context.cpp" "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\src\llama-grammar.cpp" "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\src\llama-hparams.cpp" "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\src\llama-impl.cpp" "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\src\llama-kv-cache.cpp" "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\src\llama-mmap.cpp" "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\src\llama-model-loader.cpp" "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\src\llama-model.cpp" "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\src\llama-quant.cpp" "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\src\llama-sampling.cpp" "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\src\llama-vocab.cpp" "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\src\unicode.cpp" "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\src\unicode-data.cpp"
  265. llama.cpp
  266. llama-adapter.cpp
  267. llama-arch.cpp
  268. llama-batch.cpp
  269. llama-chat.cpp
  270. llama-context.cpp
  271. llama-grammar.cpp
  272. llama-hparams.cpp
  273. llama-impl.cpp
  274. llama-kv-cache.cpp
  275. llama-mmap.cpp
  276. llama-model-loader.cpp
  277. llama-model.cpp
  278. llama-quant.cpp
  279. C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.43.34808\include\type_traits(1706,98): warning C4244: 'argument': conversion from 'unsigned __int64' to 'int', possible loss of data [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\src\llama.vcxproj]
  280. (compiling source file '../../../../../pip-req-build-jyivbl21/vendor/llama.cpp/src/llama-quant.cpp')
  281. C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.43.34808\include\type_traits(1706,98):
  282. the template instantiation context (the oldest one first) is
  283. C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\src\llama-quant.cpp(114,17):
  284. see reference to function template instantiation 'std::thread &std::vector<std::thread,std::allocator<std::thread>>::emplace_back<llama_tensor_dequantize_impl::<lambda_99809f70a3686e6322ff64c5ea2ca728>&,ggml_type&,uint8_t*,float*,size_t&>(llama_tensor_dequantize_impl::<lambda_99809f70a3686e6322ff64c5ea2ca728> &,ggml_type &,uint8_t *&&,float *&&,size_t &)' being compiled
  285. C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\src\llama-quant.cpp(114,29):
  286. see the first reference to 'std::vector<std::thread,std::allocator<std::thread>>::emplace_back' in 'llama_tensor_dequantize_impl'
  287. C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.43.34808\include\vector(909,24):
  288. see reference to function template instantiation '_Ty &std::vector<_Ty,std::allocator<_Ty>>::_Emplace_one_at_back<llama_tensor_dequantize_impl::<lambda_99809f70a3686e6322ff64c5ea2ca728>&,ggml_type&,unsigned char*,float*,size_t&>(llama_tensor_dequantize_impl::<lambda_99809f70a3686e6322ff64c5ea2ca728> &,ggml_type &,unsigned char *&&,float *&&,size_t &)' being compiled
  289. with
  290. [
  291. _Ty=std::thread
  292. ]
  293. C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.43.34808\include\vector(830,20):
  294. see reference to function template instantiation '_Ty &std::vector<_Ty,std::allocator<_Ty>>::_Emplace_back_with_unused_capacity<llama_tensor_dequantize_impl::<lambda_99809f70a3686e6322ff64c5ea2ca728>&,ggml_type&,unsigned char*,float*,size_t&>(llama_tensor_dequantize_impl::<lambda_99809f70a3686e6322ff64c5ea2ca728> &,ggml_type &,unsigned char *&&,float *&&,size_t &)' being compiled
  295. with
  296. [
  297. _Ty=std::thread
  298. ]
  299. C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.43.34808\include\vector(848,27):
  300. see reference to function template instantiation 'void std::_Default_allocator_traits<_Alloc>::construct<_Ty,llama_tensor_dequantize_impl::<lambda_99809f70a3686e6322ff64c5ea2ca728>&,ggml_type&,unsigned char*,float*,size_t&>(_Alloc &,_Objty *const ,llama_tensor_dequantize_impl::<lambda_99809f70a3686e6322ff64c5ea2ca728> &,ggml_type &,unsigned char *&&,float *&&,size_t &)' being compiled
  301. with
  302. [
  303. _Alloc=std::allocator<std::thread>,
  304. _Ty=std::thread,
  305. _Objty=std::thread
  306. ]
  307. C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.43.34808\include\xmemory(732,82):
  308. see reference to function template instantiation 'std::thread::thread<llama_tensor_dequantize_impl::<lambda_99809f70a3686e6322ff64c5ea2ca728>&,ggml_type&,_T,float*,size_t&,0>(_Fn,ggml_type &,_T &&,float *&&,size_t &)' being compiled
  309. with
  310. [
  311. _T=uint8_t *,
  312. _Fn=llama_tensor_dequantize_impl::<lambda_99809f70a3686e6322ff64c5ea2ca728> &
  313. ]
  314. C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.43.34808\include\thread(93,9):
  315. see reference to function template instantiation 'void std::thread::_Start<llama_tensor_dequantize_impl::<lambda_99809f70a3686e6322ff64c5ea2ca728>&,ggml_type&,_Ty,float*,size_t&>(_Fn,ggml_type &,_Ty &&,float *&&,size_t &)' being compiled
  316. with
  317. [
  318. _Ty=uint8_t *,
  319. _Fn=llama_tensor_dequantize_impl::<lambda_99809f70a3686e6322ff64c5ea2ca728> &
  320. ]
  321. C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.43.34808\include\thread(76,40):
  322. see reference to function template instantiation 'unsigned int (__cdecl *std::thread::_Get_invoke<std::thread::_Start::_Tuple,0,1,2,3,4>(std::integer_sequence<size_t,0,1,2,3,4>) noexcept)(void *) noexcept' being compiled
  323. C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.43.34808\include\thread(67,17):
  324. see reference to function template instantiation 'unsigned int std::thread::_Invoke<_Tuple,0,1,2,3,4>(void *) noexcept' being compiled
  325. with
  326. [
  327. _Tuple=std::thread::_Start::_Tuple
  328. ]
  329. C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.43.34808\include\thread(60,14):
  330. see reference to function template instantiation 'void std::invoke<llama_tensor_dequantize_impl::<lambda_99809f70a3686e6322ff64c5ea2ca728>,ggml_type,uint8_t*,float*,unsigned __int64>(_Callable &&,_Ty1 &&,uint8_t *&&,float *&&,unsigned __int64 &&) noexcept(false)' being compiled
  331. with
  332. [
  333. _Callable=llama_tensor_dequantize_impl::<lambda_99809f70a3686e6322ff64c5ea2ca728>,
  334. _Ty1=ggml_type
  335. ]
  336.  
  337. llama-sampling.cpp
  338. llama-vocab.cpp
  339. unicode.cpp
  340. unicode-data.cpp
  341. Generating Code...
  342. MakeDirsForLink:
  343. Creating directory "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\src\Release\".
  344. Link:
  345. C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.43.34808\bin\HostX64\x64\link.exe /ERRORREPORT:QUEUE /OUT:"C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\bin\Release\llama.dll" /INCREMENTAL:NO /NOLOGO ..\ggml\src\Release\ggml.lib "..\ggml\src\Release\ggml-cpu.lib" "..\ggml\src\Release\ggml-base.lib" kernel32.lib user32.lib gdi32.lib winspool.lib shell32.lib ole32.lib oleaut32.lib uuid.lib comdlg32.lib advapi32.lib /MANIFEST /MANIFESTUAC:"level='asInvoker' uiAccess='false'" /manifest:embed /PDB:"C:/Users/MPD_01/AppData/Local/Temp/tmpqnxcasyy/build/bin/Release/llama.pdb" /SUBSYSTEM:CONSOLE /TLBID:1 /DYNAMICBASE /NXCOMPAT /IMPLIB:"C:/Users/MPD_01/AppData/Local/Temp/tmpqnxcasyy/build/vendor/llama.cpp/src/Release/llama.lib" /MACHINE:X64 /machine:x64 /DLL llama.dir\Release\llama.obj
  346. "llama.dir\Release\llama-adapter.obj"
  347. "llama.dir\Release\llama-arch.obj"
  348. "llama.dir\Release\llama-batch.obj"
  349. "llama.dir\Release\llama-chat.obj"
  350. "llama.dir\Release\llama-context.obj"
  351. "llama.dir\Release\llama-grammar.obj"
  352. "llama.dir\Release\llama-hparams.obj"
  353. "llama.dir\Release\llama-impl.obj"
  354. "llama.dir\Release\llama-kv-cache.obj"
  355. "llama.dir\Release\llama-mmap.obj"
  356. "llama.dir\Release\llama-model-loader.obj"
  357. "llama.dir\Release\llama-model.obj"
  358. "llama.dir\Release\llama-quant.obj"
  359. "llama.dir\Release\llama-sampling.obj"
  360. "llama.dir\Release\llama-vocab.obj"
  361. llama.dir\Release\unicode.obj
  362. "llama.dir\Release\unicode-data.obj"
  363. Creating library C:/Users/MPD_01/AppData/Local/Temp/tmpqnxcasyy/build/vendor/llama.cpp/src/Release/llama.lib and object C:/Users/MPD_01/AppData/Local/Temp/tmpqnxcasyy/build/vendor/llama.cpp/src/Release/llama.exp
  364. llama.vcxproj -> C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\bin\Release\llama.dll
  365. FinalizeBuildStatus:
  366. Deleting file "llama.dir\Release\llama.tlog\unsuccessfulbuild".
  367. Touching "llama.dir\Release\llama.tlog\llama.lastbuildstate".
  368. Done Building Project "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\src\llama.vcxproj" (default targets).
  369. PrepareForBuild:
  370. Creating directory "llava.dir\Release\".
  371. C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(544,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava.vcxproj]
  372. Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details.
  373. Creating directory "llava.dir\Release\llava.tlog\".
  374. InitializeBuildStatus:
  375. Creating "llava.dir\Release\llava.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified.
  376. Touching "llava.dir\Release\llava.tlog\unsuccessfulbuild".
  377. CustomBuild:
  378. Building Custom Rule C:/Users/MPD_01/AppData/Local/Temp/pip-req-build-jyivbl21/vendor/llama.cpp/examples/llava/CMakeLists.txt
  379. ClCompile:
  380. C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.43.34808\bin\HostX64\x64\CL.exe /c /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\examples\llava\." /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\examples\llava\..\.." /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\examples\llava\..\..\common" /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\include" /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\include" /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\..\include" /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\src\." /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\src\..\include" /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\src\..\common" /nologo /W1 /WX- /diagnostics:column /O2 /Ob2 /D _MBCS /D WIN32 /D _WINDOWS /D NDEBUG /D LLAMA_SHARED /D LLAMA_BUILD /D GGML_USE_CPU /D GGML_SHARED /D GGML_BACKEND_SHARED /D "CMAKE_INTDIR=\"Release\"" /EHsc /MD /GS /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /std:c++17 /Fo"llava.dir\Release\\" /Fd"llava.dir\Release\llava.pdb" /external:W1 /Gd /TP /errorReport:queue "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\examples\llava\llava.cpp" "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\examples\llava\clip.cpp"
  381. llava.cpp
  382. clip.cpp
  383. C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\examples\llava\clip.cpp(1131,9): warning C4297: 'clip_model_load': function assumed not to throw an exception but does [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava.vcxproj]
  384. C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\examples\llava\clip.cpp(1131,9):
  385. __declspec(nothrow), throw(), noexcept(true), or noexcept was specified on the function
  386.  
  387. C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\examples\llava\clip.cpp(1586,13): warning C4297: 'clip_model_load': function assumed not to throw an exception but does [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava.vcxproj]
  388. C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\examples\llava\clip.cpp(1586,13):
  389. __declspec(nothrow), throw(), noexcept(true), or noexcept was specified on the function
  390.  
  391. C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\examples\llava\clip.cpp(2820,5): warning C4297: 'clip_n_mmproj_embd': function assumed not to throw an exception but does [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava.vcxproj]
  392. C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\examples\llava\clip.cpp(2820,5):
  393. __declspec(nothrow), throw(), noexcept(true), or noexcept was specified on the function
  394.  
  395. Generating Code...
  396. Lib:
  397. C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.43.34808\bin\HostX64\x64\Lib.exe /OUT:"llava.dir\Release\llava.lib" /NOLOGO /MACHINE:X64 /machine:x64 llava.dir\Release\llava.obj
  398. llava.dir\Release\clip.obj
  399. llava.vcxproj -> C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava.dir\Release\llava.lib
  400. FinalizeBuildStatus:
  401. Deleting file "llava.dir\Release\llava.tlog\unsuccessfulbuild".
  402. Touching "llava.dir\Release\llava.tlog\llava.lastbuildstate".
  403. Done Building Project "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava.vcxproj" (default targets).
  404. Project "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\ALL_BUILD.vcxproj" (1) is building "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\common\common.vcxproj" (9) on node 1 (default targets).
  405. PrepareForBuild:
  406. Creating directory "common.dir\Release\".
  407. C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(544,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\common\common.vcxproj]
  408. Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details.
  409. Creating directory "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\common\Release\".
  410. Creating directory "common.dir\Release\common.tlog\".
  411. InitializeBuildStatus:
  412. Creating "common.dir\Release\common.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified.
  413. Touching "common.dir\Release\common.tlog\unsuccessfulbuild".
  414. CustomBuild:
  415. Building Custom Rule C:/Users/MPD_01/AppData/Local/Temp/pip-req-build-jyivbl21/vendor/llama.cpp/common/CMakeLists.txt
  416. ClCompile:
  417. C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.43.34808\bin\HostX64\x64\CL.exe /c /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\common\." /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\src\." /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\src\..\include" /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\src\..\common" /I"C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\ggml\src\..\include" /nologo /W1 /WX- /diagnostics:column /O2 /Ob2 /D _MBCS /D WIN32 /D _WINDOWS /D NDEBUG /D _CRT_SECURE_NO_WARNINGS /D LLAMA_SHARED /D GGML_USE_CPU /D GGML_SHARED /D GGML_BACKEND_SHARED /D "CMAKE_INTDIR=\"Release\"" /EHsc /MD /GS /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /std:c++17 /Fo"common.dir\Release\\" /Fd"C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\common\Release\common.pdb" /external:W1 /Gd /TP /errorReport:queue /utf-8 /bigobj "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\common\arg.cpp" "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\common\common.cpp" "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\common\console.cpp" "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\common\json-schema-to-grammar.cpp" "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\common\log.cpp" "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\common\ngram-cache.cpp" "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\common\sampling.cpp" "C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\common\speculative.cpp"
  418. arg.cpp
  419. common.cpp
  420. console.cpp
  421. json-schema-to-grammar.cpp
  422. log.cpp
  423. C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\common\log.cpp(28,79): error C2039: 'system_clock': is not a member of 'std::chrono' [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\common\common.vcxproj]
  424. C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.43.34808\include\__msvc_chrono.hpp(286,11):
  425. see declaration of 'std::chrono'
  426.  
  427. C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\common\log.cpp(28,79): error C3083: 'system_clock': the symbol to the left of a '::' must be a type [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\common\common.vcxproj]
  428. C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\common\log.cpp(28,93): error C2039: 'now': is not a member of 'std::chrono' [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\common\common.vcxproj]
  429. C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.43.34808\include\__msvc_chrono.hpp(286,11):
  430. see declaration of 'std::chrono'
  431.  
  432. C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\common\log.cpp(28,93): error C3861: 'now': identifier not found [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\common\common.vcxproj]
  433. C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\common\log.cpp(28,25): error C2672: 'std::chrono::duration_cast': no matching overloaded function found [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\common\common.vcxproj]
  434. C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.43.34808\include\__msvc_chrono.hpp(87,19):
  435. could be '_To std::chrono::duration_cast(const std::chrono::duration<_Rep,_Period> &) noexcept(<expr>)'
  436.  
  437. ngram-cache.cpp
  438. sampling.cpp
  439. speculative.cpp
  440. Generating Code...
  441. Done Building Project "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\common\common.vcxproj" (default targets) -- FAILED.
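The four diagnostics above (C2039, C3083, C3861 and C2672, all pointing at common/log.cpp line 28) are the only fatal errors in this build; everything else in the log is a warning. The compiler's own notes show that this translation unit only sees the std::chrono namespace through MSVC's internal __msvc_chrono.hpp and cannot find system_clock there, so the now() call and the surrounding duration_cast fail as well. That is the pattern usually reported when a source file relies on <chrono> being pulled in transitively and the toolchain (here MSVC 19.43 / VS 17.13) no longer does so. A commonly cited workaround, offered here only as an assumption since this log does not show the sources, is to add an explicit #include <chrono> near the top of vendor/llama.cpp/common/log.cpp and rebuild. The sketch below is a self-contained reproduction of the failing construct, showing that it compiles once <chrono> is included directly:

    // Sketch only: mirrors the kind of expression that fails at common/log.cpp(28).
    // Compiles cleanly with "cl /std:c++17 /EHsc" when <chrono> is included
    // explicitly rather than being reached only via __msvc_chrono.hpp.
    #include <chrono>    // the include whose absence would explain C2039/C3861 above
    #include <cstdint>
    #include <cstdio>

    int main() {
        using namespace std::chrono;
        // A microsecond timestamp taken from system_clock and narrowed with
        // duration_cast, the same names the errors above report as missing or unusable.
        const int64_t t_us =
            duration_cast<microseconds>(system_clock::now().time_since_epoch()).count();
        std::printf("timestamp_us = %lld\n", static_cast<long long>(t_us));
        return 0;
    }
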
  442. Project "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\ALL_BUILD.vcxproj" (1) is building "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava_shared.vcxproj" (13) on node 1 (default targets).
  443. PrepareForBuild:
  444. Creating directory "llava_shared.dir\Release\".
  445. C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(544,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava_shared.vcxproj]
  446. Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details.
  447. Creating directory "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\Release\".
  448. Creating directory "llava_shared.dir\Release\llava_shared.tlog\".
  449. InitializeBuildStatus:
  450. Creating "llava_shared.dir\Release\llava_shared.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified.
  451. Touching "llava_shared.dir\Release\llava_shared.tlog\unsuccessfulbuild".
  452. CustomBuild:
  453. Building Custom Rule C:/Users/MPD_01/AppData/Local/Temp/pip-req-build-jyivbl21/vendor/llama.cpp/examples/llava/CMakeLists.txt
  454. Link:
  455. C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.43.34808\bin\HostX64\x64\link.exe /ERRORREPORT:QUEUE /OUT:"C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\Release\llava.dll" /INCREMENTAL:NO /NOLOGO ..\..\src\Release\llama.lib ..\..\ggml\src\Release\ggml.lib "..\..\ggml\src\Release\ggml-cpu.lib" "..\..\ggml\src\Release\ggml-base.lib" kernel32.lib user32.lib gdi32.lib winspool.lib shell32.lib ole32.lib oleaut32.lib uuid.lib comdlg32.lib advapi32.lib /MANIFEST /MANIFESTUAC:"level='asInvoker' uiAccess='false'" /manifest:embed /PDB:"C:/Users/MPD_01/AppData/Local/Temp/tmpqnxcasyy/build/vendor/llama.cpp/examples/llava/Release/llava.pdb" /SUBSYSTEM:CONSOLE /TLBID:1 /DYNAMICBASE /NXCOMPAT /IMPLIB:"C:/Users/MPD_01/AppData/Local/Temp/tmpqnxcasyy/build/vendor/llama.cpp/examples/llava/Release/llava.lib" /MACHINE:X64 /machine:x64 /DLL C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava.dir\Release\llava.obj
  456. C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava.dir\Release\clip.obj
  457. Creating library C:/Users/MPD_01/AppData/Local/Temp/tmpqnxcasyy/build/vendor/llama.cpp/examples/llava/Release/llava.lib and object C:/Users/MPD_01/AppData/Local/Temp/tmpqnxcasyy/build/vendor/llama.cpp/examples/llava/Release/llava.exp
  458. llava_shared.vcxproj -> C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\Release\llava.dll
  459. FinalizeBuildStatus:
  460. Deleting file "llava_shared.dir\Release\llava_shared.tlog\unsuccessfulbuild".
  461. Touching "llava_shared.dir\Release\llava_shared.tlog\llava_shared.lastbuildstate".
  462. Done Building Project "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava_shared.vcxproj" (default targets).
  463. Project "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\ALL_BUILD.vcxproj" (1) is building "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava_static.vcxproj" (14) on node 1 (default targets).
  464. PrepareForBuild:
  465. Creating directory "llava_static.dir\Release\".
  466. C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(544,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava_static.vcxproj]
  467. Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details.
  468. Creating directory "llava_static.dir\Release\llava_static.tlog\".
  469. InitializeBuildStatus:
  470. Creating "llava_static.dir\Release\llava_static.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified.
  471. Touching "llava_static.dir\Release\llava_static.tlog\unsuccessfulbuild".
  472. CustomBuild:
  473. Building Custom Rule C:/Users/MPD_01/AppData/Local/Temp/pip-req-build-jyivbl21/vendor/llama.cpp/examples/llava/CMakeLists.txt
  474. Lib:
  475. C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.43.34808\bin\HostX64\x64\Lib.exe /OUT:"C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\Release\llava_static.lib" /NOLOGO /MACHINE:X64 /machine:x64 C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava.dir\Release\llava.obj
  476. C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava.dir\Release\clip.obj
  477. llava_static.vcxproj -> C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\Release\llava_static.lib
  478. FinalizeBuildStatus:
  479. Deleting file "llava_static.dir\Release\llava_static.tlog\unsuccessfulbuild".
  480. Touching "llava_static.dir\Release\llava_static.tlog\llava_static.lastbuildstate".
  481. Done Building Project "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava_static.vcxproj" (default targets).
  482. Done Building Project "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\ALL_BUILD.vcxproj" (default targets) -- FAILED.
  483.  
  484. Build FAILED.
  485.  
  486. "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\ALL_BUILD.vcxproj" (default target) (1) ->
  487. "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\ZERO_CHECK.vcxproj" (default target) (2) ->
  488. (PrepareForBuild target) ->
  489. C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(544,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\ZERO_CHECK.vcxproj]
  490.  
  491.  
  492. "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\ALL_BUILD.vcxproj" (default target) (1) ->
  493. "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\common\build_info.vcxproj" (default target) (3) ->
  494. C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(544,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\common\build_info.vcxproj]
  495.  
  496.  
  497. "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\ALL_BUILD.vcxproj" (default target) (1) ->
  498. "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava.vcxproj" (default target) (4) ->
  499. "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\ggml\src\ggml.vcxproj" (default target) (5) ->
  500. "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\ggml\src\ggml-base.vcxproj" (default target) (6) ->
  501. C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(544,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\ggml\src\ggml-base.vcxproj]
  502.  
  503.  
  504. "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\ALL_BUILD.vcxproj" (default target) (1) ->
  505. "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava.vcxproj" (default target) (4) ->
  506. "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\ggml\src\ggml.vcxproj" (default target) (5) ->
  507. "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\ggml\src\ggml-base.vcxproj" (default target) (6) ->
  508. (ClCompile target) ->
  509. C:\Program Files (x86)\Windows Kits\10\Include\10.0.18362.0\um\winbase.h(9305,5): warning C5105: macro expansion producing 'defined' has undefined behavior [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\ggml\src\ggml-base.vcxproj]
  510.  
  511.  
  512. "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\ALL_BUILD.vcxproj" (default target) (1) ->
  513. "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava.vcxproj" (default target) (4) ->
  514. "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\ggml\src\ggml.vcxproj" (default target) (5) ->
  515. "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\ggml\src\ggml-cpu.vcxproj" (default target) (7) ->
  516. (PrepareForBuild target) ->
  517. C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(544,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\ggml\src\ggml-cpu.vcxproj]
  518.  
  519.  
  520. "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\ALL_BUILD.vcxproj" (default target) (1) ->
  521. "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava.vcxproj" (default target) (4) ->
  522. "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\ggml\src\ggml.vcxproj" (default target) (5) ->
  523. "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\ggml\src\ggml-cpu.vcxproj" (default target) (7) ->
  524. (ClCompile target) ->
  525. C:\Program Files (x86)\Windows Kits\10\Include\10.0.18362.0\um\winbase.h(9305,5): warning C5105: macro expansion producing 'defined' has undefined behavior [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\ggml\src\ggml-cpu.vcxproj]
  526.  
  527.  
  528. "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\ALL_BUILD.vcxproj" (default target) (1) ->
  529. "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava.vcxproj" (default target) (4) ->
  530. "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\ggml\src\ggml.vcxproj" (default target) (5) ->
  531. (PrepareForBuild target) ->
  532. C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(544,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\ggml\src\ggml.vcxproj]
  533.  
  534.  
  535. "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\ALL_BUILD.vcxproj" (default target) (1) ->
  536. "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava.vcxproj" (default target) (4) ->
  537. "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\src\llama.vcxproj" (default target) (8) ->
  538. C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(544,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\src\llama.vcxproj]
  539.  
  540.  
  541. "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\ALL_BUILD.vcxproj" (default target) (1) ->
  542. "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava.vcxproj" (default target) (4) ->
  543. "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\src\llama.vcxproj" (default target) (8) ->
  544. (ClCompile target) ->
  545. C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.43.34808\include\type_traits(1706,98): warning C4244: 'argument': conversion from 'unsigned __int64' to 'int', possible loss of data [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\src\llama.vcxproj]
  546.  
  547.  
  548. "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\ALL_BUILD.vcxproj" (default target) (1) ->
  549. "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava.vcxproj" (default target) (4) ->
  550. (PrepareForBuild target) ->
  551. C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(544,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava.vcxproj]
  552.  
  553.  
  554. "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\ALL_BUILD.vcxproj" (default target) (1) ->
  555. "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava.vcxproj" (default target) (4) ->
  556. (ClCompile target) ->
  557. C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\examples\llava\clip.cpp(1131,9): warning C4297: 'clip_model_load': function assumed not to throw an exception but does [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava.vcxproj]
  558. C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\examples\llava\clip.cpp(1586,13): warning C4297: 'clip_model_load': function assumed not to throw an exception but does [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava.vcxproj]
  559. C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\examples\llava\clip.cpp(2820,5): warning C4297: 'clip_n_mmproj_embd': function assumed not to throw an exception but does [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava.vcxproj]
  560.  
  561.  
  562. "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\ALL_BUILD.vcxproj" (default target) (1) ->
  563. "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\common\common.vcxproj" (default target) (9) ->
  564. (PrepareForBuild target) ->
  565. C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(544,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\common\common.vcxproj]
  566.  
  567.  
  568. "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\ALL_BUILD.vcxproj" (default target) (1) ->
  569. "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava_shared.vcxproj" (default target) (13) ->
  570. C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(544,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava_shared.vcxproj]
  571.  
  572.  
  573. "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\ALL_BUILD.vcxproj" (default target) (1) ->
  574. "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava_static.vcxproj" (default target) (14) ->
  575. C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(544,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\examples\llava\llava_static.vcxproj]
  576.  
  577.  
  578. "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\ALL_BUILD.vcxproj" (default target) (1) ->
  579. "C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\common\common.vcxproj" (default target) (9) ->
  580. (ClCompile target) ->
  581. C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\common\log.cpp(28,79): error C2039: 'system_clock': is not a member of 'std::chrono' [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\common\common.vcxproj]
  582. C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\common\log.cpp(28,79): error C3083: 'system_clock': the symbol to the left of a '::' must be a type [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\common\common.vcxproj]
  583. C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\common\log.cpp(28,93): error C2039: 'now': is not a member of 'std::chrono' [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\common\common.vcxproj]
  584. C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\common\log.cpp(28,93): error C3861: 'now': identifier not found [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\common\common.vcxproj]
  585. C:\Users\MPD_01\AppData\Local\Temp\pip-req-build-jyivbl21\vendor\llama.cpp\common\log.cpp(28,25): error C2672: 'std::chrono::duration_cast': no matching overloaded function found [C:\Users\MPD_01\AppData\Local\Temp\tmpqnxcasyy\build\vendor\llama.cpp\common\common.vcxproj]
  586.  
  587. 16 Warning(s)
  588. 5 Error(s)
  589.  
  590. Time Elapsed 00:00:43.43
  591.  
  592.  
  593. *** CMake build failed
  594. [end of output]
  595.  
  596. note: This error originates from a subprocess, and is likely not a problem with pip.
  597. ERROR: Failed building wheel for llama_cpp_python
  598. Failed to build llama_cpp_python
  599. ERROR: Failed to build installable wheels for some pyproject.toml based projects (llama_cpp_python)
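 
Note on the failure (hedged reading of the log above, not part of the original output): all five hard errors come from vendor\llama.cpp\common\log.cpp(28) and say the same thing in different ways: the compiler cannot see std::chrono::system_clock or std::chrono::...::now() at that point. Earlier in the log CMake reports "Selecting Windows SDK version 10.0.18362.0" alongside MSVC 19.43, and the repeated C5105 warnings from that SDK's winbase.h suggest an old SDK paired with a much newer toolset; that mismatch, or a translation unit that never pulls in <chrono>, is the likely cause rather than anything in pip itself. A quick way to check whether the toolset/SDK combination is healthy is to compile a standalone file that uses the same std::chrono pattern the failing line appears to use. The file name and the exact expression below are assumptions based on the error messages, not the project's real log.cpp:

    // chrono_check.cpp -- minimal sketch (assumed pattern, not llama.cpp's actual code)
    // to verify that the installed MSVC toolset and Windows SDK can resolve
    // std::chrono::system_clock the way common/log.cpp tries to.
    #include <chrono>
    #include <cstdint>
    #include <cstdio>

    int main() {
        // Same shape the errors complain about: system_clock::now() fed into duration_cast.
        const int64_t t_us = std::chrono::duration_cast<std::chrono::microseconds>(
            std::chrono::system_clock::now().time_since_epoch()).count();
        std::printf("microseconds since epoch: %lld\n", static_cast<long long>(t_us));
        return 0;
    }

If this compiles cleanly from the same "x64 Native Tools Command Prompt for VS 2022" (for example: cl /EHsc /std:c++17 chrono_check.cpp), the compiler itself is fine and the problem is specific to how the wheel build configures the project. If it fails with the same C2039 error, installing a current Windows 10/11 SDK through the Visual Studio Installer, so CMake no longer falls back to 10.0.18362, is the usual next step before retrying the pip install.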