I pulled the latest code, and the model is reporting errors everywhere #6955
Tagged this with QNN. @yangh0597 I noticed that the executorch modules are not being called from an anaconda environment (/opt/executorch/examples/models/llama/export_llama.py), whereas runpy is (/opt/anaconda3/envs/et_qnn/lib/python3.10/runpy.py). Did you install ET into your anaconda environment after pulling the latest code?
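One quick way to check for a stale install like this is to ask Python where it is actually resolving a package from, rather than assuming it matches the source tree you just pulled. A minimal sketch, shown with the stdlib `json` package for illustration (in practice you would probe `"executorch"` instead):

```python
import importlib.util

# Look up the module spec without importing the package's code.
# spec.origin is the file path Python would actually load, which
# reveals whether it points at a conda site-packages install or
# at a freshly pulled source checkout.
spec = importlib.util.find_spec("json")  # replace with "executorch"
print(spec.origin)  # e.g. .../lib/python3.10/json/__init__.py
```

If the printed path lives under the conda environment's `site-packages` while you have been editing a separate checkout, reinstalling ET into that environment would explain and fix the mismatch.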
Yeah, it looks like canonicalize_program was removed from utils.py a couple of days ago here: 4086509#diff-0439f6a7c1a3a3cfb222cd6409b6754f17a1ce782dd231de1d12bbf957d588f7L205. But it is still imported in the llama export here: https://github.com/pytorch/executorch/blob/main/examples/models/llama/export_llama_lib.py?lines=765. @haowhsu-quic, it looks like your PR #6657 broke llama export for QNN; can you have a look? cc @cccclai
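A cheap way to confirm this kind of breakage (a symbol removed from a module that another file still imports) is to probe for the symbol at runtime instead of letting the import fail deep inside an export script. A hedged sketch, demonstrated with the stdlib `math` module since `executorch` is not assumed to be installed:

```python
import importlib

def module_has_symbol(module_name: str, symbol: str) -> bool:
    """Return True if module_name imports cleanly and defines symbol."""
    try:
        module = importlib.import_module(module_name)
    except ImportError:
        # Module itself is missing or broken.
        return False
    return hasattr(module, symbol)

# Illustration: math defines sqrt, but not canonicalize_program.
print(module_has_symbol("math", "sqrt"))                  # True
print(module_has_symbol("math", "canonicalize_program"))  # False
```

Running the same check against `"executorch.backends.qualcomm.utils.utils"` and `"canonicalize_program"` in the affected environment would show `False` after the removal, matching the traceback reported here.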
Hi @metascroy, sorry for the inconvenience. (See executorch/backends/qualcomm/utils/utils.py, line 215 in b132c96.) I wasn't aware of this part when submitting the PR; I'll file another one to fix it.
Me too
🐛 Describe the bug
I pulled the latest code, and the model is reporting errors everywhere. Two days ago it was fine. It looks like some code was left out of a commit.
The command is:

The error is:
I checked the /opt/executorch/backends/qualcomm/utils/utils.py file; it does not contain the canonicalize_program method.
Versions
main