Compile error when installing dependencies #510

Open
Golden-Pigeon opened this issue Nov 22, 2024 · 0 comments
Hi,

I am installing the dependencies of threestudio. When I run

pip install -r requirements.txt

it reports:

      warning: variable does not need to be mutable
         --> tokenizers-lib/src/models/unigram/model.rs:265:21
          |
      265 |                 let mut target_node = &mut best_path_ends_at[key_pos];
          |                     ----^^^^^^^^^^^
          |                     |
          |                     help: remove this `mut`
          |
          = note: `#[warn(unused_mut)]` on by default
      
      warning: variable does not need to be mutable
         --> tokenizers-lib/src/models/unigram/model.rs:282:21
          |
      282 |                 let mut target_node = &mut best_path_ends_at[starts_at + mblen];
          |                     ----^^^^^^^^^^^
          |                     |
          |                     help: remove this `mut`
      
      warning: variable does not need to be mutable
         --> tokenizers-lib/src/pre_tokenizers/byte_level.rs:200:59
          |
      200 |     encoding.process_tokens_with_offsets_mut(|(i, (token, mut offsets))| {
          |                                                           ----^^^^^^^
          |                                                           |
          |                                                           help: remove this `mut`
      
      error: casting `&T` to `&mut T` is undefined behavior, even if the reference is unused, consider instead using an `UnsafeCell`
         --> tokenizers-lib/src/models/bpe/trainer.rs:526:47
          |
      522 |                     let w = &words[*i] as *const _ as *mut _;
          |                             -------------------------------- casting happend here
      ...
      526 |                         let word: &mut Word = &mut (*w);
          |                                               ^^^^^^^^^
          |
          = note: for more information, visit <https://doc.rust-lang.org/book/ch15-05-interior-mutability.html>
          = note: `#[deny(invalid_reference_casting)]` on by default
      
      warning: `tokenizers` (lib) generated 3 warnings
      error: could not compile `tokenizers` (lib) due to 1 previous error; 3 warnings emitted
      
      Caused by:
        process didn't exit successfully: `/home/goldenpigeon/.rustup/toolchains/stable-x86_64-unknown-linux-gnu/bin/rustc --crate-name tokenizers --edition=2018 tokenizers-lib/src/lib.rs --error-format=json --json=diagnostic-rendered-ansi,artifacts,future-incompat --crate-type lib --emit=dep-info,metadata,link -C opt-level=3 -C embed-bitcode=no --cfg 'feature="cached-path"' --cfg 'feature="clap"' --cfg 'feature="cli"' --cfg 'feature="default"' --cfg 'feature="dirs"' --cfg 'feature="esaxx_fast"' --cfg 'feature="http"' --cfg 'feature="indicatif"' --cfg 'feature="onig"' --cfg 'feature="progressbar"' --cfg 'feature="reqwest"' -C metadata=aaa2cd5d7dfce673 -C extra-filename=-aaa2cd5d7dfce673 --out-dir /tmp/pip-install-4bf9ti3y/tokenizers_0f51dfae9f324a0aacf39200eebf9c29/target/release/deps -C strip=debuginfo -L dependency=/tmp/pip-install-4bf9ti3y/tokenizers_0f51dfae9f324a0aacf39200eebf9c29/target/release/deps --extern aho_corasick=/tmp/pip-install-4bf9ti3y/tokenizers_0f51dfae9f324a0aacf39200eebf9c29/target/release/deps/libaho_corasick-6819f9c0f208f4d4.rmeta --extern cached_path=/tmp/pip-install-4bf9ti3y/tokenizers_0f51dfae9f324a0aacf39200eebf9c29/target/release/deps/libcached_path-95b8d18cbfe30da5.rmeta --extern clap=/tmp/pip-install-4bf9ti3y/tokenizers_0f51dfae9f324a0aacf39200eebf9c29/target/release/deps/libclap-ee8c809676573c0b.rmeta --extern derive_builder=/tmp/pip-install-4bf9ti3y/tokenizers_0f51dfae9f324a0aacf39200eebf9c29/target/release/deps/libderive_builder-2cf0c2b14b17021a.rmeta --extern dirs=/tmp/pip-install-4bf9ti3y/tokenizers_0f51dfae9f324a0aacf39200eebf9c29/target/release/deps/libdirs-f8af9d98ad42dae4.rmeta --extern esaxx_rs=/tmp/pip-install-4bf9ti3y/tokenizers_0f51dfae9f324a0aacf39200eebf9c29/target/release/deps/libesaxx_rs-bfb9fc640192a9f4.rmeta --extern getrandom=/tmp/pip-install-4bf9ti3y/tokenizers_0f51dfae9f324a0aacf39200eebf9c29/target/release/deps/libgetrandom-40d247db9d893f14.rmeta --extern 
indicatif=/tmp/pip-install-4bf9ti3y/tokenizers_0f51dfae9f324a0aacf39200eebf9c29/target/release/deps/libindicatif-1d4b60db00bacf96.rmeta --extern itertools=/tmp/pip-install-4bf9ti3y/tokenizers_0f51dfae9f324a0aacf39200eebf9c29/target/release/deps/libitertools-c10fb430a799eac1.rmeta --extern lazy_static=/tmp/pip-install-4bf9ti3y/tokenizers_0f51dfae9f324a0aacf39200eebf9c29/target/release/deps/liblazy_static-8ee1e75538e71923.rmeta --extern log=/tmp/pip-install-4bf9ti3y/tokenizers_0f51dfae9f324a0aacf39200eebf9c29/target/release/deps/liblog-ea3dcf671b4236c9.rmeta --extern macro_rules_attribute=/tmp/pip-install-4bf9ti3y/tokenizers_0f51dfae9f324a0aacf39200eebf9c29/target/release/deps/libmacro_rules_attribute-6c49530c9cf71393.rmeta --extern monostate=/tmp/pip-install-4bf9ti3y/tokenizers_0f51dfae9f324a0aacf39200eebf9c29/target/release/deps/libmonostate-8aa1309231b40c7a.rmeta --extern onig=/tmp/pip-install-4bf9ti3y/tokenizers_0f51dfae9f324a0aacf39200eebf9c29/target/release/deps/libonig-9421b43f70bfe3a6.rmeta --extern paste=/tmp/pip-install-4bf9ti3y/tokenizers_0f51dfae9f324a0aacf39200eebf9c29/target/release/deps/libpaste-e3e6d4f1f4e74923.so --extern rand=/tmp/pip-install-4bf9ti3y/tokenizers_0f51dfae9f324a0aacf39200eebf9c29/target/release/deps/librand-d20a15c16879fe79.rmeta --extern rayon=/tmp/pip-install-4bf9ti3y/tokenizers_0f51dfae9f324a0aacf39200eebf9c29/target/release/deps/librayon-da6832d0e2ce840a.rmeta --extern rayon_cond=/tmp/pip-install-4bf9ti3y/tokenizers_0f51dfae9f324a0aacf39200eebf9c29/target/release/deps/librayon_cond-513befa336542fd5.rmeta --extern regex=/tmp/pip-install-4bf9ti3y/tokenizers_0f51dfae9f324a0aacf39200eebf9c29/target/release/deps/libregex-ddf7be787b2d69c6.rmeta --extern regex_syntax=/tmp/pip-install-4bf9ti3y/tokenizers_0f51dfae9f324a0aacf39200eebf9c29/target/release/deps/libregex_syntax-5fb1cd753e21b1f1.rmeta --extern reqwest=/tmp/pip-install-4bf9ti3y/tokenizers_0f51dfae9f324a0aacf39200eebf9c29/target/release/deps/libreqwest-1f7524130098fb9a.rmeta 
--extern serde=/tmp/pip-install-4bf9ti3y/tokenizers_0f51dfae9f324a0aacf39200eebf9c29/target/release/deps/libserde-6f74e65bcbef9bb5.rmeta --extern serde_json=/tmp/pip-install-4bf9ti3y/tokenizers_0f51dfae9f324a0aacf39200eebf9c29/target/release/deps/libserde_json-e99132afb71e6487.rmeta --extern spm_precompiled=/tmp/pip-install-4bf9ti3y/tokenizers_0f51dfae9f324a0aacf39200eebf9c29/target/release/deps/libspm_precompiled-ef809ba4788e94bc.rmeta --extern thiserror=/tmp/pip-install-4bf9ti3y/tokenizers_0f51dfae9f324a0aacf39200eebf9c29/target/release/deps/libthiserror-4032830e66b4a411.rmeta --extern unicode_normalization_alignments=/tmp/pip-install-4bf9ti3y/tokenizers_0f51dfae9f324a0aacf39200eebf9c29/target/release/deps/libunicode_normalization_alignments-cf9478eb15d340a5.rmeta --extern unicode_segmentation=/tmp/pip-install-4bf9ti3y/tokenizers_0f51dfae9f324a0aacf39200eebf9c29/target/release/deps/libunicode_segmentation-951c18202e614e2d.rmeta --extern unicode_categories=/tmp/pip-install-4bf9ti3y/tokenizers_0f51dfae9f324a0aacf39200eebf9c29/target/release/deps/libunicode_categories-2313480cae691b82.rmeta -L 'native=/home/linuxbrew/.linuxbrew/Cellar/openssl@3/3.3.2/lib' -L native=/home/linuxbrew/.linuxbrew/opt/bzip2/lib -L native=/tmp/pip-install-4bf9ti3y/tokenizers_0f51dfae9f324a0aacf39200eebf9c29/target/release/build/zstd-sys-94bd49a72730d2b5/out -L native=/tmp/pip-install-4bf9ti3y/tokenizers_0f51dfae9f324a0aacf39200eebf9c29/target/release/build/esaxx-rs-f04dd04b530734ca/out -L native=/tmp/pip-install-4bf9ti3y/tokenizers_0f51dfae9f324a0aacf39200eebf9c29/target/release/build/onig_sys-794d8b35f6e60033/out` (exit status: 1)
      error: `cargo rustc --lib --message-format=json-render-diagnostics --manifest-path Cargo.toml --release -v --features pyo3/extension-module --crate-type cdylib --` failed with code 101
      [end of output]
  
  note: This error originates from a subprocess, and is likely not a problem with pip.
  ERROR: Failed building wheel for tokenizers
Successfully built nerfacc nvdiffrast envlight clip
Failed to build tokenizers
ERROR: ERROR: Failed to build installable wheels for some pyproject.toml based projects (tokenizers)

It seems the build fails with a compilation error when installing tokenizers v0.13.3, which is required by transformers==4.28.1 in requirements.txt.

Later I tried the latest version of transformers and it installed successfully. Could you update the pinned version of transformers in requirements.txt to a newer one?
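For context on why this fails: the fatal message comes from the `invalid_reference_casting` lint, which rejects the `&T`-to-`&mut T` cast in tokenizers 0.13.3's `bpe/trainer.rs`. That lint became deny-by-default in Rust 1.73, so older toolchains still compile this crate while newer ones refuse it. A minimal pre-flight check along these lines (the helper name and structure are my own sketch, not part of threestudio or tokenizers) can tell whether the local `rustc` is new enough to hit this error before pip spends time compiling:

```python
import re

def rust_too_new_for_tokenizers_0_13(version_line: str) -> bool:
    """Return True if this `rustc --version` line reports Rust >= 1.73,
    the release where `invalid_reference_casting` became deny-by-default
    and started rejecting tokenizers 0.13.3's `&T`-to-`&mut T` cast."""
    # A typical line looks like: "rustc 1.75.0 (82e1608df 2023-12-21)"
    m = re.search(r"rustc (\d+)\.(\d+)", version_line)
    if not m:
        raise ValueError(f"unrecognized rustc version line: {version_line!r}")
    return (int(m.group(1)), int(m.group(2))) >= (1, 73)
```

For example, `rust_too_new_for_tokenizers_0_13("rustc 1.75.0 (82e1608df 2023-12-21)")` returns `True`, meaning that toolchain would fail on this crate; a 1.72.x line returns `False`. If downgrading the toolchain via rustup is not an option, upgrading transformers (which pulls a newer tokenizers) avoids the compile entirely, as noted above.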
