
Error Node (BeamSearch_node) has input size 12 not in range [min=5, max=10] #2

Open
sergiosolorzano opened this issue Aug 10, 2023 · 1 comment


sergiosolorzano commented Aug 10, 2023

I have tried with both Python 3.10 and 3.11 to create an ONNX whisper-tiny model. I create a conda env for this and follow Microsoft's Olive repo.

For example, I first clone the Olive repo and check out a tag with git checkout tags/v0.2.0 (I tried all the tags), then:
cd Olive
create a conda env for Python 3.11
python -m pip install .

Then in examples/whisper:
python -m pip install -r requirements.txt
python -m pip uninstall -y onnxruntime ort-nightly
python -m pip install ort-nightly --index-url https://aiinfra.pkgs.visualstudio.com/PublicPackages/_packaging/ORT-Nightly/pypi/simple/

This throws an error when I build with Python 3.11:

(env_olive311) sergio@Ubuntu-2204-oai:~/PythonWorkspace/Olive/examples/whisper$ python prepare_whisper_configs.py --model_name openai/whisper-tiny.en
Traceback (most recent call last):
  File "/home/sergio/PythonWorkspace/Olive/examples/whisper/prepare_whisper_configs.py", line 231, in <module>
    main()
  File "/home/sergio/PythonWorkspace/Olive/examples/whisper/prepare_whisper_configs.py", line 39, in main
    whisper_model = get_ort_whisper_for_conditional_generation(args.model_name)
  File "/home/sergio/anaconda3/envs/env_olive311/lib/python3.11/site-packages/olive/hf_utils.py", line 59, in get_ort_whisper_for_conditional_generation
    decoder = WhisperDecoder(model, None, model.config)
TypeError: WhisperDecoder.__init__() takes 3 positional arguments but 4 were given
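The TypeError reads like a version mismatch: Olive's hf_utils calls WhisperDecoder with three arguments (plus self, four), while the WhisperDecoder class actually installed only accepts two. A minimal, purely illustrative sketch of how to confirm that before the call fails deep inside olive.hf_utils — the stand-in class below is hypothetical, not the real onnxruntime one:

```python
import inspect

# Hypothetical stand-in for the installed WhisperDecoder, whose __init__
# accepts fewer arguments than Olive v0.2.0's hf_utils passes.
class WhisperDecoder:
    def __init__(self, model, config):  # only two args besides self
        self.model = model
        self.config = config

# Count the positional parameters the installed class actually accepts.
# Passing (model, None, model.config) means 4 values including self,
# which is what triggers "takes 3 positional arguments but 4 were given".
n_params = len(inspect.signature(WhisperDecoder.__init__).parameters)
print(n_params)  # 3: self, model, config
```

If the count disagrees with what the caller passes, pinning olive and onnxruntime/ort-nightly to releases from the same time period is the usual remedy.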


If I instead manage to build the ONNX model with Python 3.10 using the same process, I get the error below when using the model in Microsoft's demo https://github.com/onnxruntime/Whisper-HybridLoop-Onnx-Demo/tree/main/AudioNoteTranscription

OnnxRuntimeException: [ErrorCode:InvalidGraph] Load model from C:/AR-VR-Github/UnitySentisStableDiffusion-And-Whisper/Assets/StreamingAssets/whisper/model.onnx failed:This is an invalid model. In Node, ("BeamSearch_node", BeamSearch, "com.microsoft", -1) : ("log_mel": tensor(float),"max_length": tensor(int32),"min_length": tensor(int32),"num_beams": tensor(int32),"num_return_sequences": tensor(int32),"length_penalty": tensor(float),"repetition_penalty": tensor(float),"","","","","",) -> ("sequences",) , Error Node (BeamSearch_node) has input size 12 not in range [min=5, max=10].
Microsoft.ML.OnnxRuntime.NativeApiStatus.VerifySuccess (System.IntPtr nativeStatus) (at <36441e0316944e7eb9fd86bf4a9a5a82>:0)
Microsoft.ML.OnnxRuntime.InferenceSession.Init (System.String modelPath, Microsoft.ML.OnnxRuntime.SessionOptions options, Microsoft.ML.OnnxRuntime.PrePackedWeightsContainer prepackedWeightsContainer) (at <36441e0316944e7eb9fd86bf4a9a5a82>:0)
Microsoft.ML.OnnxRuntime.InferenceSession..ctor (System.String modelPath, Microsoft.ML.OnnxRuntime.SessionOptions options) (at <36441e0316944e7eb9fd86bf4a9a5a82>:0)
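The load failure above is a graph-validation error: the ORT build used by the demo registers a BeamSearch contrib op that accepts between 5 and 10 inputs, but the exported node lists 12 — the trailing optional inputs show up as empty strings ("") in the error, and they still count toward the input size. A minimal sketch of that check, with the input names and bounds taken directly from the error message:

```python
# Input list of the exported BeamSearch_node, as printed in the error.
# Empty strings are unset optional inputs, but they still occupy slots.
inputs = ["log_mel", "max_length", "min_length", "num_beams",
          "num_return_sequences", "length_penalty", "repetition_penalty",
          "", "", "", "", ""]

# Bounds reported by the loader: "not in range [min=5, max=10]".
MIN_INPUTS, MAX_INPUTS = 5, 10

ok = MIN_INPUTS <= len(inputs) <= MAX_INPUTS
print(len(inputs), ok)  # 12 False -> the schema check rejects the node
```

In other words, the model was exported against a newer BeamSearch schema than the runtime loading it understands, so aligning the export-time and inference-time onnxruntime versions is what matters.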

Since this involves both repos, I have also posted this at microsoft/Olive#477


DimQ1 commented Dec 4, 2023

Resolved by using Microsoft.ML.OnnxRuntime 1.16.3 and removing the Microsoft.ML.OnnxRuntime.Azure package from the app.
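For a plain .NET project that fix would look roughly like the fragment below — a hedged sketch only, since a Unity project may manage these DLLs differently; the package names come from the comment above, the layout is an assumption:

```xml
<ItemGroup>
  <!-- Pin the ORT release whose BeamSearch schema matches the exported model -->
  <PackageReference Include="Microsoft.ML.OnnxRuntime" Version="1.16.3" />
  <!-- Microsoft.ML.OnnxRuntime.Azure removed entirely: no reference to it -->
</ItemGroup>
```

The key point is that only one onnxruntime native library ends up in the app, and its version matches the one the model was exported for.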
