
Problem converting tinyllama to onnx model with optimum-cli #2076

Open
hayyaw opened this issue Oct 22, 2024 · 0 comments
Labels
bug Something isn't working

hayyaw commented Oct 22, 2024

System Info

optimum: main branch (latest), installed locally via pip

Who can help?

@michaelbenayoun

Information

  • The official example scripts
  • My own modified scripts

Tasks

  • An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
  • My own task or dataset (give details below)

Reproduction (minimal, reproducible, runnable)

optimum-cli export onnx --model /home/wangzhiqun/TinyLlama-1.1B-Chat-v1.0 --task text-generation --batch_size 1 --sequence_length 128 tinyllama_onnx_file

Expected behavior

To fix the batch size and sequence length, I run: "optimum-cli export onnx --model /home/wangzhiqun/TinyLlama-1.1B-Chat-v1.0 --task text-generation --batch_size 1 --sequence_length 128 tinyllama_onnx_file". But the exported ONNX model still has the dynamic shape [batch_size, sequence_length]. How can I export the model with fixed dimensions?
