v0.28.2

@PawelPeczek-Roboflow released this 27 Nov 14:02
🔧 Fixed issue with inference package installation

On 26.11.2024, version 0.20.4 of the tokenizers library (a transitive dependency of inference) was released, introducing a breaking change for inference clients who use Python 3.8. It causes the following errors while installing recent (and older) versions of inference:

👉 MacOS

```
Downloading tokenizers-0.20.4.tar.gz (343 kB)
Installing build dependencies ... done
Getting requirements to build wheel ... done
Preparing metadata (pyproject.toml) ... error
error: subprocess-exited-with-error

× Preparing metadata (pyproject.toml) did not run successfully.
│ exit code: 1
╰─> [6 lines of output]

    Cargo, the Rust package manager, is not installed or is not on PATH.
    This package requires Rust and Cargo to compile extensions. Install it through
    the system's package manager or via https://rustup.rs/

    Checking for Rust toolchain....
    [end of output]

note: This error originates from a subprocess, and is likely not a problem with pip.
error: metadata-generation-failed

× Encountered error while generating package metadata.
╰─> See above for output.

note: This is an issue with the package mentioned above, not pip.
hint: See above for details
```
👉 Linux

Installation succeeds, but the following error is raised at import time:

```
/opt/hostedtoolcache/Python/3.8.18/x64/lib/python3.8/site-packages/transformers/utils/import_utils.py:1778: in _get_module
  return importlib.import_module("." + module_name, self.__name__)
/opt/hostedtoolcache/Python/3.8.18/x64/lib/python3.8/importlib/__init__.py:127: in import_module
  return _bootstrap._gcd_import(name[level:], package, level)
<frozen importlib._bootstrap>:1014: in _gcd_import
  ???
<frozen importlib._bootstrap>:991: in _find_and_load
  ???
<frozen importlib._bootstrap>:961: in _find_and_load_unlocked
  ???
<frozen importlib._bootstrap>:219: in _call_with_frames_removed
  ???
<frozen importlib._bootstrap>:1014: in _gcd_import
  ???
<frozen importlib._bootstrap>:991: in _find_and_load
  ???
<frozen importlib._bootstrap>:975: in _find_and_load_unlocked
  ???
<frozen importlib._bootstrap>:671: in _load_unlocked
  ???
<frozen importlib._bootstrap_external>:843: in exec_module
  ???
<frozen importlib._bootstrap>:219: in _call_with_frames_removed
  ???
/opt/hostedtoolcache/Python/3.8.18/x64/lib/python3.8/site-packages/transformers/models/__init__.py:15: in <module>
  from . import (
/opt/hostedtoolcache/Python/3.8.18/x64/lib/python3.8/site-packages/transformers/models/mt5/__init__.py:36: in <module>
  from ..t5.tokenization_t5_fast import T5TokenizerFast
/opt/hostedtoolcache/Python/3.8.18/x64/lib/python3.8/site-packages/transformers/models/t5/tokenization_t5_fast.py:23: in <module>
  from ...tokenization_utils_fast import PreTrainedTokenizerFast
/opt/hostedtoolcache/Python/3.8.18/x64/lib/python3.8/site-packages/transformers/tokenization_utils_fast.py:26: in <module>
  import tokenizers.pre_tokenizers as pre_tokenizers_fast
/opt/hostedtoolcache/Python/3.8.18/x64/lib/python3.8/site-packages/tokenizers/__init__.py:78: in <module>
  from .tokenizers import (
E   ImportError: /opt/hostedtoolcache/Python/3.8.18/x64/lib/python3.8/site-packages/tokenizers/tokenizers.abi3.so: undefined symbol: PyInterpreterState_Get

The above exception was the direct cause of the following exception:
tests/inference/models_predictions_tests/test_owlv2.py:4: in <module>
  from inference.models.owlv2.owlv2 import OwlV2
inference/models/owlv2/owlv2.py:11: in <module>
  from transformers import Owlv2ForObjectDetection, Owlv2Processor
<frozen importlib._bootstrap>:1039: in _handle_fromlist
  ???
/opt/hostedtoolcache/Python/3.8.18/x64/lib/python3.8/site-packages/transformers/utils/import_utils.py:1766: in __getattr__
  module = self._get_module(self._class_to_module[name])
/opt/hostedtoolcache/Python/3.8.18/x64/lib/python3.8/site-packages/transformers/utils/import_utils.py:1780: in _get_module
  raise RuntimeError(
E   RuntimeError: Failed to import transformers.models.owlv2 because of the following error (look up to see its traceback):
E   /opt/hostedtoolcache/Python/3.8.18/x64/lib/python3.8/site-packages/tokenizers/tokenizers.abi3.so: undefined symbol: PyInterpreterState_Get
```

Caution

The problem is fixed in inference 0.28.2, but it cannot be fixed in older releases. The undefined symbol PyInterpreterState_Get suggests the tokenizers 0.20.4 abi3 wheels target the Python 3.9+ stable ABI (that function was added in Python 3.9), which is why they fail to load on Python 3.8. If you need to keep an older release working in your environment, modify your build so that tokenizers<=0.20.3 is installed alongside inference:

```
pip install inference "tokenizers<=0.20.3"
```
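If you install inference from a requirements file, the same pin can live there so that future rebuilds do not pick up the broken wheel; a minimal sketch (the file name is just an example):

```
# requirements.txt (example): keep tokenizers on the last release
# that still ships Python 3.8-compatible wheels
inference
tokenizers<=0.20.3
```

pip resolves the pin together with inference's own dependency ranges, so no other changes are needed.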

🔧 Fixed issue with CUDA and stream management API

While running the inference server inside a Docker container and using the stream management API to run Workflows against video, it was not possible to use CUDA due to a bug present since the very start of the feature. This release fixes it.
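To verify CUDA is actually reachable from inside the container before starting streams, a quick check like the following can help; a minimal sketch, assuming the GPU server image roboflow/roboflow-inference-server-gpu with torch available inside it (the container name and port are illustrative, adjust to your deployment):

```
# start the GPU server with the NVIDIA runtime exposed to the container
docker run -d --name inference-gpu --gpus all -p 9001:9001 \
    roboflow/roboflow-inference-server-gpu:latest

# inside the container, torch should now see the GPU
docker exec inference-gpu python -c "import torch; print(torch.cuda.is_available())"
```

If this prints False, CUDA is not visible to the container at all, which points to a host or Docker runtime configuration issue rather than the stream management bug fixed in this release.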

Full Changelog: v0.28.1...v0.28.2