Untitled — a guest, Jun 13th, 2022
#10 10.19 Building wheels for collected packages: sacremoses, tokenizers
#10 10.19 Building wheel for sacremoses (setup.py): started
#10 10.42 Building wheel for sacremoses (setup.py): finished with status 'done'
#10 10.42 Created wheel for sacremoses: filename=sacremoses-0.0.53-py3-none-any.whl size=895260 sha256=308835c7bdec1c0d3b2206c9c168981de33cd5142e6d47ea10f9a3743e5ccb87
#10 10.42 Stored in directory: /root/.cache/pip/wheels/87/39/dd/a83eeef36d0bf98e7a4d1933a4ad2d660295a40613079bafc9
#10 10.42 Building wheel for tokenizers (pyproject.toml): started
#10 10.92 Building wheel for tokenizers (pyproject.toml): finished with status 'error'
#10 10.92 error: subprocess-exited-with-error
#10 10.92
#10 10.92 × Building wheel for tokenizers (pyproject.toml) did not run successfully.
#10 10.92 │ exit code: 1
#10 10.92 ╰─> [51 lines of output]
#10 10.92 running bdist_wheel
#10 10.92 running build
#10 10.92 running build_py
#10 10.92 creating build
#10 10.92 creating build/lib.linux-aarch64-cpython-37
#10 10.92 creating build/lib.linux-aarch64-cpython-37/tokenizers
#10 10.92 copying py_src/tokenizers/__init__.py -> build/lib.linux-aarch64-cpython-37/tokenizers
#10 10.92 creating build/lib.linux-aarch64-cpython-37/tokenizers/models
#10 10.92 copying py_src/tokenizers/models/__init__.py -> build/lib.linux-aarch64-cpython-37/tokenizers/models
#10 10.92 creating build/lib.linux-aarch64-cpython-37/tokenizers/decoders
#10 10.92 copying py_src/tokenizers/decoders/__init__.py -> build/lib.linux-aarch64-cpython-37/tokenizers/decoders
#10 10.92 creating build/lib.linux-aarch64-cpython-37/tokenizers/normalizers
#10 10.92 copying py_src/tokenizers/normalizers/__init__.py -> build/lib.linux-aarch64-cpython-37/tokenizers/normalizers
#10 10.92 creating build/lib.linux-aarch64-cpython-37/tokenizers/pre_tokenizers
#10 10.92 copying py_src/tokenizers/pre_tokenizers/__init__.py -> build/lib.linux-aarch64-cpython-37/tokenizers/pre_tokenizers
#10 10.92 creating build/lib.linux-aarch64-cpython-37/tokenizers/processors
#10 10.92 copying py_src/tokenizers/processors/__init__.py -> build/lib.linux-aarch64-cpython-37/tokenizers/processors
#10 10.92 creating build/lib.linux-aarch64-cpython-37/tokenizers/trainers
#10 10.92 copying py_src/tokenizers/trainers/__init__.py -> build/lib.linux-aarch64-cpython-37/tokenizers/trainers
#10 10.92 creating build/lib.linux-aarch64-cpython-37/tokenizers/implementations
#10 10.92 copying py_src/tokenizers/implementations/base_tokenizer.py -> build/lib.linux-aarch64-cpython-37/tokenizers/implementations
#10 10.92 copying py_src/tokenizers/implementations/sentencepiece_unigram.py -> build/lib.linux-aarch64-cpython-37/tokenizers/implementations
#10 10.92 copying py_src/tokenizers/implementations/__init__.py -> build/lib.linux-aarch64-cpython-37/tokenizers/implementations
#10 10.92 copying py_src/tokenizers/implementations/sentencepiece_bpe.py -> build/lib.linux-aarch64-cpython-37/tokenizers/implementations
#10 10.92 copying py_src/tokenizers/implementations/char_level_bpe.py -> build/lib.linux-aarch64-cpython-37/tokenizers/implementations
#10 10.92 copying py_src/tokenizers/implementations/byte_level_bpe.py -> build/lib.linux-aarch64-cpython-37/tokenizers/implementations
#10 10.92 copying py_src/tokenizers/implementations/bert_wordpiece.py -> build/lib.linux-aarch64-cpython-37/tokenizers/implementations
#10 10.92 creating build/lib.linux-aarch64-cpython-37/tokenizers/tools
#10 10.92 copying py_src/tokenizers/tools/visualizer.py -> build/lib.linux-aarch64-cpython-37/tokenizers/tools
#10 10.92 copying py_src/tokenizers/tools/__init__.py -> build/lib.linux-aarch64-cpython-37/tokenizers/tools
#10 10.92 copying py_src/tokenizers/__init__.pyi -> build/lib.linux-aarch64-cpython-37/tokenizers
#10 10.92 copying py_src/tokenizers/models/__init__.pyi -> build/lib.linux-aarch64-cpython-37/tokenizers/models
#10 10.92 copying py_src/tokenizers/decoders/__init__.pyi -> build/lib.linux-aarch64-cpython-37/tokenizers/decoders
#10 10.92 copying py_src/tokenizers/normalizers/__init__.pyi -> build/lib.linux-aarch64-cpython-37/tokenizers/normalizers
#10 10.92 copying py_src/tokenizers/pre_tokenizers/__init__.pyi -> build/lib.linux-aarch64-cpython-37/tokenizers/pre_tokenizers
#10 10.92 copying py_src/tokenizers/processors/__init__.pyi -> build/lib.linux-aarch64-cpython-37/tokenizers/processors
#10 10.92 copying py_src/tokenizers/trainers/__init__.pyi -> build/lib.linux-aarch64-cpython-37/tokenizers/trainers
#10 10.92 copying py_src/tokenizers/tools/visualizer-styles.css -> build/lib.linux-aarch64-cpython-37/tokenizers/tools
#10 10.92 running build_ext
#10 10.92 running build_rust
#10 10.92 error: can't find Rust compiler
#10 10.92
#10 10.92 If you are using an outdated pip version, it is possible a prebuilt wheel is available for this package but pip is not able to install from it. Installing from the wheel would avoid the need for a Rust compiler.
#10 10.92
#10 10.92 To update pip, run:
#10 10.92
#10 10.92 pip install --upgrade pip
#10 10.92
#10 10.92 and then retry package installation.
#10 10.92
#10 10.92 If you did intend to build this package from source, try installing a Rust compiler from your system package manager and ensure it is on the PATH during installation. Alternatively, rustup (available at https://rustup.rs) is the recommended way to download and update the Rust compiler toolchain.
#10 10.92 [end of output]
#10 10.92
#10 10.92 note: This error originates from a subprocess, and is likely not a problem with pip.
#10 10.93 ERROR: Failed building wheel for tokenizers
#10 10.93 Successfully built sacremoses
#10 10.93 Failed to build tokenizers
#10 10.93 ERROR: Could not build wheels for tokenizers, which is required to install pyproject.toml-based projects
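In short: `tokenizers` ships its extension module in Rust, and on this platform (linux-aarch64, CPython 3.7) pip fell back to building from source, where `build_rust` failed because no `rustc` was on the PATH. Following the log's own advice, one way to address this in the Dockerfile is to upgrade pip first (a newer pip may be able to use a prebuilt wheel for this platform, if one exists) and otherwise install a Rust toolchain via rustup before the failing install step. This is only a sketch under assumptions not in the log — a Debian-based image with `curl` available — not the actual Dockerfile that produced this build:

```dockerfile
# A newer pip may locate a prebuilt manylinux aarch64 wheel for
# tokenizers, avoiding the source build (and the Rust requirement).
RUN pip install --upgrade pip

# If no prebuilt wheel exists for this platform/Python combination,
# install a Rust toolchain via rustup so build_rust can find rustc.
RUN curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh -s -- -y
ENV PATH="/root/.cargo/bin:${PATH}"

# Retry the install that failed in the log above.
RUN pip install tokenizers
```

If pinned to Python 3.7, checking whether the pinned `tokenizers` version publishes an aarch64 wheel on PyPI at all is worthwhile; older releases may simply not have one, in which case the Rust toolchain is required regardless of the pip version.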