#10 10.19 Building wheels for collected packages: sacremoses, tokenizers
#10 10.19 Building wheel for sacremoses (setup.py): started
#10 10.42 Building wheel for sacremoses (setup.py): finished with status 'done'
#10 10.42 Created wheel for sacremoses: filename=sacremoses-0.0.53-py3-none-any.whl size=895260 sha256=308835c7bdec1c0d3b2206c9c168981de33cd5142e6d47ea10f9a3743e5ccb87
#10 10.42 Stored in directory: /root/.cache/pip/wheels/87/39/dd/a83eeef36d0bf98e7a4d1933a4ad2d660295a40613079bafc9
#10 10.42 Building wheel for tokenizers (pyproject.toml): started
#10 10.92 Building wheel for tokenizers (pyproject.toml): finished with status 'error'
#10 10.92 error: subprocess-exited-with-error
#10 10.92
#10 10.92 × Building wheel for tokenizers (pyproject.toml) did not run successfully.
#10 10.92 │ exit code: 1
#10 10.92 ╰─> [51 lines of output]
#10 10.92 running bdist_wheel
#10 10.92 running build
#10 10.92 running build_py
#10 10.92 creating build
#10 10.92 creating build/lib.linux-aarch64-cpython-37
#10 10.92 creating build/lib.linux-aarch64-cpython-37/tokenizers
#10 10.92 copying py_src/tokenizers/__init__.py -> build/lib.linux-aarch64-cpython-37/tokenizers
#10 10.92 creating build/lib.linux-aarch64-cpython-37/tokenizers/models
#10 10.92 copying py_src/tokenizers/models/__init__.py -> build/lib.linux-aarch64-cpython-37/tokenizers/models
#10 10.92 creating build/lib.linux-aarch64-cpython-37/tokenizers/decoders
#10 10.92 copying py_src/tokenizers/decoders/__init__.py -> build/lib.linux-aarch64-cpython-37/tokenizers/decoders
#10 10.92 creating build/lib.linux-aarch64-cpython-37/tokenizers/normalizers
#10 10.92 copying py_src/tokenizers/normalizers/__init__.py -> build/lib.linux-aarch64-cpython-37/tokenizers/normalizers
#10 10.92 creating build/lib.linux-aarch64-cpython-37/tokenizers/pre_tokenizers
#10 10.92 copying py_src/tokenizers/pre_tokenizers/__init__.py -> build/lib.linux-aarch64-cpython-37/tokenizers/pre_tokenizers
#10 10.92 creating build/lib.linux-aarch64-cpython-37/tokenizers/processors
#10 10.92 copying py_src/tokenizers/processors/__init__.py -> build/lib.linux-aarch64-cpython-37/tokenizers/processors
#10 10.92 creating build/lib.linux-aarch64-cpython-37/tokenizers/trainers
#10 10.92 copying py_src/tokenizers/trainers/__init__.py -> build/lib.linux-aarch64-cpython-37/tokenizers/trainers
#10 10.92 creating build/lib.linux-aarch64-cpython-37/tokenizers/implementations
#10 10.92 copying py_src/tokenizers/implementations/base_tokenizer.py -> build/lib.linux-aarch64-cpython-37/tokenizers/implementations
#10 10.92 copying py_src/tokenizers/implementations/sentencepiece_unigram.py -> build/lib.linux-aarch64-cpython-37/tokenizers/implementations
#10 10.92 copying py_src/tokenizers/implementations/__init__.py -> build/lib.linux-aarch64-cpython-37/tokenizers/implementations
#10 10.92 copying py_src/tokenizers/implementations/sentencepiece_bpe.py -> build/lib.linux-aarch64-cpython-37/tokenizers/implementations
#10 10.92 copying py_src/tokenizers/implementations/char_level_bpe.py -> build/lib.linux-aarch64-cpython-37/tokenizers/implementations
#10 10.92 copying py_src/tokenizers/implementations/byte_level_bpe.py -> build/lib.linux-aarch64-cpython-37/tokenizers/implementations
#10 10.92 copying py_src/tokenizers/implementations/bert_wordpiece.py -> build/lib.linux-aarch64-cpython-37/tokenizers/implementations
#10 10.92 creating build/lib.linux-aarch64-cpython-37/tokenizers/tools
#10 10.92 copying py_src/tokenizers/tools/visualizer.py -> build/lib.linux-aarch64-cpython-37/tokenizers/tools
#10 10.92 copying py_src/tokenizers/tools/__init__.py -> build/lib.linux-aarch64-cpython-37/tokenizers/tools
#10 10.92 copying py_src/tokenizers/__init__.pyi -> build/lib.linux-aarch64-cpython-37/tokenizers
#10 10.92 copying py_src/tokenizers/models/__init__.pyi -> build/lib.linux-aarch64-cpython-37/tokenizers/models
#10 10.92 copying py_src/tokenizers/decoders/__init__.pyi -> build/lib.linux-aarch64-cpython-37/tokenizers/decoders
#10 10.92 copying py_src/tokenizers/normalizers/__init__.pyi -> build/lib.linux-aarch64-cpython-37/tokenizers/normalizers
#10 10.92 copying py_src/tokenizers/pre_tokenizers/__init__.pyi -> build/lib.linux-aarch64-cpython-37/tokenizers/pre_tokenizers
#10 10.92 copying py_src/tokenizers/processors/__init__.pyi -> build/lib.linux-aarch64-cpython-37/tokenizers/processors
#10 10.92 copying py_src/tokenizers/trainers/__init__.pyi -> build/lib.linux-aarch64-cpython-37/tokenizers/trainers
#10 10.92 copying py_src/tokenizers/tools/visualizer-styles.css -> build/lib.linux-aarch64-cpython-37/tokenizers/tools
#10 10.92 running build_ext
#10 10.92 running build_rust
#10 10.92 error: can't find Rust compiler
#10 10.92
#10 10.92 If you are using an outdated pip version, it is possible a prebuilt wheel is available for this package but pip is not able to install from it. Installing from the wheel would avoid the need for a Rust compiler.
#10 10.92
#10 10.92 To update pip, run:
#10 10.92
#10 10.92 pip install --upgrade pip
#10 10.92
#10 10.92 and then retry package installation.
#10 10.92
#10 10.92 If you did intend to build this package from source, try installing a Rust compiler from your system package manager and ensure it is on the PATH during installation. Alternatively, rustup (available at https://rustup.rs) is the recommended way to download and update the Rust compiler toolchain.
#10 10.92 [end of output]
#10 10.92
#10 10.92 note: This error originates from a subprocess, and is likely not a problem with pip.
#10 10.93 ERROR: Failed building wheel for tokenizers
#10 10.93 Successfully built sacremoses
#10 10.93 Failed to build tokenizers
#10 10.93 ERROR: Could not build wheels for tokenizers, which is required to install pyproject.toml-based projects
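The failure above is the usual symptom when pip falls back to building tokenizers from source (here on linux-aarch64 under CPython 3.7, a platform/interpreter combination for which a prebuilt wheel may not be published) and the image has no Rust toolchain. Below is a minimal remediation sketch covering both remedies the error message suggests. The original Dockerfile is not shown in this log, so the base image and surrounding steps here are assumptions (a Debian-based image with curl available):

# Hypothetical Dockerfile excerpt; adapt to the actual base image.
# Remedy 1: upgrade pip so it can select a prebuilt wheel, if one is
# published on PyPI for this platform tag (manylinux aarch64, cp37).
RUN pip install --upgrade pip

# Remedy 2: install a Rust toolchain via rustup (https://rustup.rs), as the
# error message recommends, so the source build of tokenizers can succeed.
# curl and a C toolchain may also need to be installed first, depending on
# the base image (e.g. via apt-get).
RUN curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh -s -- -y
ENV PATH="/root/.cargo/bin:${PATH}"

RUN pip install tokenizers

Note that upgrading pip only helps if a compatible wheel actually exists for this platform tag; tokenizers does not publish aarch64 wheels for every release/Python combination, and when none is available the Rust toolchain is required regardless.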