Dockerfile logs llama-cpp-python
Jun 20th, 2023
#0 7.160 × Building wheel for llama-cpp-python (pyproject.toml) did not run successfully.
#0 7.160 │ exit code: 1
#0 7.160 ╰─> [108 lines of output]
#0 7.160
#0 7.160
#0 7.160 --------------------------------------------------------------------------------
#0 7.160 -- Trying 'Ninja' generator
#0 7.160 --------------------------------
#0 7.160 ---------------------------
#0 7.160 ----------------------
#0 7.160 -----------------
#0 7.160 ------------
#0 7.160 -------
#0 7.160 --
#0 7.160 Not searching for unused variables given on the command line.
#0 7.160 -- The C compiler identification is GNU 11.3.0
#0 7.160 -- Detecting C compiler ABI info
#0 7.160 -- Detecting C compiler ABI info - done
#0 7.160 -- Check for working C compiler: /usr/bin/cc - skipped
#0 7.160 -- Detecting C compile features
#0 7.160 -- Detecting C compile features - done
#0 7.160 -- The CXX compiler identification is GNU 11.3.0
#0 7.160 -- Detecting CXX compiler ABI info
#0 7.160 -- Detecting CXX compiler ABI info - done
#0 7.160 -- Check for working CXX compiler: /usr/bin/c++ - skipped
#0 7.160 -- Detecting CXX compile features
#0 7.160 -- Detecting CXX compile features - done
#0 7.160 -- Configuring done (0.2s)
#0 7.160 -- Generating done (0.0s)
#0 7.160 -- Build files have been written to: /tmp/pip-install-miiw0h6r/llama-cpp-python_ff4becb16e044f0db17bec826aea220f/_cmake_test_compile/build
#0 7.160 --
#0 7.160 -------
#0 7.160 ------------
#0 7.160 -----------------
#0 7.160 ----------------------
#0 7.160 ---------------------------
#0 7.160 --------------------------------
#0 7.160 -- Trying 'Ninja' generator - success
#0 7.160 --------------------------------------------------------------------------------
#0 7.160
#0 7.160 Configuring Project
#0 7.160 Working directory:
#0 7.160 /tmp/pip-install-miiw0h6r/llama-cpp-python_ff4becb16e044f0db17bec826aea220f/_skbuild/linux-x86_64-3.10/cmake-build
#0 7.160 Command:
#0 7.160 /tmp/pip-build-env-n1zspwcj/overlay/lib/python3.10/site-packages/cmake/data/bin/cmake /tmp/pip-install-miiw0h6r/llama-cpp-python_ff4becb16e044f0db17bec826aea220f -G Ninja -DCMAKE_MAKE_PROGRAM:FILEPATH=/tmp/pip-build-env-n1zspwcj/overlay/lib/python3.10/site-packages/ninja/data/bin/ninja --no-warn-unused-cli -DCMAKE_INSTALL_PREFIX:PATH=/tmp/pip-install-miiw0h6r/llama-cpp-python_ff4becb16e044f0db17bec826aea220f/_skbuild/linux-x86_64-3.10/cmake-install -DPYTHON_VERSION_STRING:STRING=3.10.9 -DSKBUILD:INTERNAL=TRUE -DCMAKE_MODULE_PATH:PATH=/tmp/pip-build-env-n1zspwcj/overlay/lib/python3.10/site-packages/skbuild/resources/cmake -DPYTHON_EXECUTABLE:PATH=/opt/conda/envs/textgen/bin/python -DPYTHON_INCLUDE_DIR:PATH=/opt/conda/envs/textgen/include/python3.10 -DPYTHON_LIBRARY:PATH=/opt/conda/envs/textgen/lib/libpython3.10.so -DPython_EXECUTABLE:PATH=/opt/conda/envs/textgen/bin/python -DPython_ROOT_DIR:PATH=/opt/conda/envs/textgen -DPython_FIND_REGISTRY:STRING=NEVER -DPython_INCLUDE_DIR:PATH=/opt/conda/envs/textgen/include/python3.10 -DPython3_EXECUTABLE:PATH=/opt/conda/envs/textgen/bin/python -DPython3_ROOT_DIR:PATH=/opt/conda/envs/textgen -DPython3_FIND_REGISTRY:STRING=NEVER -DPython3_INCLUDE_DIR:PATH=/opt/conda/envs/textgen/include/python3.10 -DCMAKE_MAKE_PROGRAM:FILEPATH=/tmp/pip-build-env-n1zspwcj/overlay/lib/python3.10/site-packages/ninja/data/bin/ninja -DLLAMA_CUBLAS=on -DCMAKE_BUILD_TYPE:STRING=Release -DLLAMA_CUBLAS=on
#0 7.160
#0 7.160 Not searching for unused variables given on the command line.
#0 7.160 -- The C compiler identification is GNU 11.3.0
#0 7.160 -- The CXX compiler identification is GNU 11.3.0
#0 7.160 -- Detecting C compiler ABI info
#0 7.160 -- Detecting C compiler ABI info - done
#0 7.160 -- Check for working C compiler: /usr/bin/cc - skipped
#0 7.160 -- Detecting C compile features
#0 7.160 -- Detecting C compile features - done
#0 7.160 -- Detecting CXX compiler ABI info
#0 7.160 -- Detecting CXX compiler ABI info - done
#0 7.160 -- Check for working CXX compiler: /usr/bin/c++ - skipped
#0 7.160 -- Detecting CXX compile features
#0 7.160 -- Detecting CXX compile features - done
#0 7.160 -- Found Git: /usr/bin/git (found version "2.34.1")
#0 7.160 fatal: not a git repository (or any of the parent directories): .git
#0 7.160 fatal: not a git repository (or any of the parent directories): .git
#0 7.160 CMake Warning at vendor/llama.cpp/CMakeLists.txt:113 (message):
#0 7.160 Git repository not found; to enable automatic generation of build info,
#0 7.160 make sure Git is installed and the project is a Git repository.
#0 7.160
#0 7.160
#0 7.160 -- Performing Test CMAKE_HAVE_LIBC_PTHREAD
#0 7.160 -- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Success
#0 7.160 -- Found Threads: TRUE
#0 7.160 -- Found CUDAToolkit: /usr/local/cuda/include (found version "11.8.89")
#0 7.160 -- cuBLAS found
#0 7.160 -- The CUDA compiler identification is NVIDIA 11.8.89
#0 7.160 -- Detecting CUDA compiler ABI info
#0 7.160 -- Detecting CUDA compiler ABI info - done
#0 7.160 -- Check for working CUDA compiler: /usr/local/cuda/bin/nvcc - skipped
#0 7.160 -- Detecting CUDA compile features
#0 7.160 -- Detecting CUDA compile features - done
#0 7.160 -- CMAKE_SYSTEM_PROCESSOR: x86_64
#0 7.160 -- x86 detected
#0 7.160 -- GGML CUDA sources found, configuring CUDA architecture
#0 7.160 -- Configuring done (1.8s)
#0 7.160 CMake Error in vendor/llama.cpp/CMakeLists.txt:
#0 7.160 CUDA_ARCHITECTURES is set to "native", but no GPU was detected.
#0 7.160
#0 7.160
#0 7.160 CMake Error in vendor/llama.cpp/CMakeLists.txt:
#0 7.160 CUDA_ARCHITECTURES is set to "native", but no GPU was detected.
#0 7.160
#0 7.160
#0 7.160 CMake Error in vendor/llama.cpp/CMakeLists.txt:
#0 7.160 CUDA_ARCHITECTURES is set to "native", but no GPU was detected.
#0 7.160
#0 7.160
#0 7.160 -- Generating done (0.0s)
#0 7.160 CMake Generate step failed. Build files cannot be regenerated correctly.
#0 7.160 Traceback (most recent call last):
#0 7.160 File "/tmp/pip-build-env-n1zspwcj/overlay/lib/python3.10/site-packages/skbuild/setuptools_wrap.py", line 666, in setup
#0 7.160 env = cmkr.configure(
#0 7.160 File "/tmp/pip-build-env-n1zspwcj/overlay/lib/python3.10/site-packages/skbuild/cmaker.py", line 357, in configure
#0 7.160 raise SKBuildError(msg)
#0 7.160
#0 7.160 An error occurred while configuring with CMake.
#0 7.160 Command:
#0 7.160 /tmp/pip-build-env-n1zspwcj/overlay/lib/python3.10/site-packages/cmake/data/bin/cmake /tmp/pip-install-miiw0h6r/llama-cpp-python_ff4becb16e044f0db17bec826aea220f -G Ninja -DCMAKE_MAKE_PROGRAM:FILEPATH=/tmp/pip-build-env-n1zspwcj/overlay/lib/python3.10/site-packages/ninja/data/bin/ninja --no-warn-unused-cli -DCMAKE_INSTALL_PREFIX:PATH=/tmp/pip-install-miiw0h6r/llama-cpp-python_ff4becb16e044f0db17bec826aea220f/_skbuild/linux-x86_64-3.10/cmake-install -DPYTHON_VERSION_STRING:STRING=3.10.9 -DSKBUILD:INTERNAL=TRUE -DCMAKE_MODULE_PATH:PATH=/tmp/pip-build-env-n1zspwcj/overlay/lib/python3.10/site-packages/skbuild/resources/cmake -DPYTHON_EXECUTABLE:PATH=/opt/conda/envs/textgen/bin/python -DPYTHON_INCLUDE_DIR:PATH=/opt/conda/envs/textgen/include/python3.10 -DPYTHON_LIBRARY:PATH=/opt/conda/envs/textgen/lib/libpython3.10.so -DPython_EXECUTABLE:PATH=/opt/conda/envs/textgen/bin/python -DPython_ROOT_DIR:PATH=/opt/conda/envs/textgen -DPython_FIND_REGISTRY:STRING=NEVER -DPython_INCLUDE_DIR:PATH=/opt/conda/envs/textgen/include/python3.10 -DPython3_EXECUTABLE:PATH=/opt/conda/envs/textgen/bin/python -DPython3_ROOT_DIR:PATH=/opt/conda/envs/textgen -DPython3_FIND_REGISTRY:STRING=NEVER -DPython3_INCLUDE_DIR:PATH=/opt/conda/envs/textgen/include/python3.10 -DCMAKE_MAKE_PROGRAM:FILEPATH=/tmp/pip-build-env-n1zspwcj/overlay/lib/python3.10/site-packages/ninja/data/bin/ninja -DLLAMA_CUBLAS=on -DCMAKE_BUILD_TYPE:STRING=Release -DLLAMA_CUBLAS=on
#0 7.160 Source directory:
#0 7.160 /tmp/pip-install-miiw0h6r/llama-cpp-python_ff4becb16e044f0db17bec826aea220f
#0 7.160 Working directory:
#0 7.160 /tmp/pip-install-miiw0h6r/llama-cpp-python_ff4becb16e044f0db17bec826aea220f/_skbuild/linux-x86_64-3.10/cmake-build
#0 7.160 Please see CMake's output for more information.
#0 7.160
#0 7.160 [end of output]
#0 7.160
#0 7.160 note: This error originates from a subprocess, and is likely not a problem with pip.
#0 7.160 ERROR: Failed building wheel for llama-cpp-python
#0 7.160 ERROR: Could not build wheels for llama-cpp-python, which is required to install pyproject.toml-based projects
#0 7.160
#0 7.160 Found existing installation: llama-cpp-python 0.1.64
#0 7.160 Uninstalling llama-cpp-python-0.1.64:
#0 7.160 Successfully uninstalled llama-cpp-python-0.1.64
#0 7.160 Collecting llama-cpp-python
#0 7.160 Downloading llama_cpp_python-0.1.65.tar.gz (1.5 MB)
#0 7.160 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.5/1.5 MB 18.5 MB/s eta 0:00:00
#0 7.160 Installing build dependencies: started
#0 7.160 Installing build dependencies: finished with status 'done'
#0 7.160 Getting requirements to build wheel: started
#0 7.160 Getting requirements to build wheel: finished with status 'done'
#0 7.160 Preparing metadata (pyproject.toml): started
#0 7.160 Preparing metadata (pyproject.toml): finished with status 'done'
#0 7.160 Requirement already satisfied: typing-extensions>=4.5.0 in /opt/conda/envs/textgen/lib/python3.10/site-packages (from llama-cpp-python) (4.6.3)
#0 7.160 Requirement already satisfied: numpy>=1.20.0 in /opt/conda/envs/textgen/lib/python3.10/site-packages (from llama-cpp-python) (1.24.1)
#0 7.160 Requirement already satisfied: diskcache>=5.6.1 in /opt/conda/envs/textgen/lib/python3.10/site-packages (from llama-cpp-python) (5.6.1)
#0 7.160 Building wheels for collected packages: llama-cpp-python
#0 7.160 Building wheel for llama-cpp-python (pyproject.toml): started
#0 7.160 Building wheel for llama-cpp-python (pyproject.toml): finished with status 'error'
#0 7.160 Failed to build llama-cpp-python