mps failure with tts: IndexError: tuple index out of range in pytorch_utils.py #33786
Comments
Hey @ajkessel, I think this can work:
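The code block from this comment is a monkey patch for `pytorch_utils.isin_mps_friendly`. A sketch of what it likely looked like, reconstructed from the descriptions later in the thread: the name `patched_isin_mps_friendly` is from the thread, but the body below is an assumption about how it handles the failing 0-dim case, not the exact patch.

```python
import torch
from transformers import pytorch_utils

def patched_isin_mps_friendly(elements: torch.Tensor, test_elements) -> torch.Tensor:
    # torch.isin is not usable on MPS with some torch versions, so emulate it
    # with a broadcast comparison when running on an MPS device.
    if elements.device.type == "mps":
        test_elements = torch.as_tensor(test_elements, device=elements.device)
        if test_elements.ndim == 0:
            # The stock implementation indexes test_elements.shape[0], which raises
            # "IndexError: tuple index out of range" for 0-dim (scalar) tensors.
            test_elements = test_elements.unsqueeze(0)
        return elements.unsqueeze(-1).eq(test_elements.flatten()).any(dim=-1)
    return torch.isin(elements, test_elements)

# Rebind the module-level function so later calls pick up the patch.
pytorch_utils.isin_mps_friendly = patched_isin_mps_friendly
```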
It seems like everything I'm running on my MacBook Pro M1 with the transformers lib is broken now. I'm using Python 3.10. This patch fixes it! Thanks!!!
@ajkessel This seems to be broken for me on all of the official examples I've used for Llama and Qwen inference models.
I tried @Swastik-Swarup-Dash's workaround and got this error:
To the extent it's relevant:
Although at least with this workaround, setting
I'm seeing the same issue with MPS inference. CPU inference works fine. @Swastik-Swarup-Dash, maybe you could make a pull request with your patch!
@zachmayer let me give it a try
@ajkessel You can try this: set the environment variable within your script using the os module, then run the TTS model, as in the sketch below. Maybe this can work. Also make sure your macOS version is up to date; MPS support is only available in macOS 12.3 and later.
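A minimal sketch of this suggestion, assuming the coqui-ai `TTS` API from the original report; the model name is illustrative, not from the thread:

```python
import os

# Set the fallback before torch initializes the MPS backend.
os.environ["PYTORCH_ENABLE_MPS_FALLBACK"] = "1"

import torch
from TTS.api import TTS  # coqui-ai TTS

# Run the TTS model on MPS when available, otherwise on CPU.
device = "mps" if torch.backends.mps.is_available() else "cpu"
tts = TTS("tts_models/en/multi-dataset/tortoise-v2").to(device)
tts.tts_to_file(text="Hello from Apple Silicon!", file_path="out.wav")
```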
With transformers==4.45.1 and tortoise-tts, I get the same error. With transformers==4.31.0 (the version requested by tortoise-tts), I instead get a different error.
cc @eustlb seems like
For what it's worth, all of this same code works fine for me on a Windows box with CUDA (both under Linux (WSL) and native Windows). So even if it's not an MPS issue, it seems to be Mac-specific.
You’re a lifesaver! I’ve been struggling for the past few days with a Florence 2 workflow on MPS that suddenly stopped working. I encountered the same error, and by using the method you provided to patch `pytorch_utils.isin_mps_friendly`, I was able to solve it! Thank you so much!
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread. Please note that issues that do not follow the contributing guidelines are likely to be ignored.
I'm such a novice that I don't even know where to implement this line of code... anyone have any suggestions? (I'm using Florence 2 on MPS as well.)
Did you use ComfyUI_Florence2? You can find the node.py file and add `from transformers import pytorch_utils` plus the rebinding at the beginning of the code, as in the sketch below. The main purpose is to replace the original `pytorch_utils.isin_mps_friendly` with `patched_isin_mps_friendly`, so that all subsequent calls to `pytorch_utils.isin_mps_friendly` will use the patched version. Hope this helps you out!
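Concretely, assuming `patched_isin_mps_friendly` is defined (or imported) as in the sketch earlier in the thread, the lines to add would be:

```python
# At the top of ComfyUI_Florence2's node.py, per the comment above.
from transformers import pytorch_utils

# patched_isin_mps_friendly is the workaround function from earlier in this thread.
pytorch_utils.isin_mps_friendly = patched_isin_mps_friendly
```

Note that this only affects call sites that go through the `pytorch_utils` module attribute; code that imported `isin_mps_friendly` directly would need its own rebinding.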
the suggestion by @Swastik-Swarup-Dash was added to
@gante I updated Florence2, and it throws the same error; maybe I missed something. Could you help me with this?
System Info

transformers version: 4.46.0.dev0

Who can help?

No response
Information

Tasks

- An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)

Reproduction
I'm not sure if this is a transformers bug, a coqui-ai bug, or just a lack of MPS support for what I'm trying to do.
Same result whether `PYTORCH_ENABLE_MPS_FALLBACK` is set or not.

Python code:
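The original snippet is not preserved here; a minimal reconstruction based on the report (coqui-ai TTS on MPS; the model name is an assumption, based on the tortoise-tts mentions in the comments):

```python
import torch
from TTS.api import TTS

# Model name is an assumption; the report concerns coqui-ai TTS on MPS.
device = torch.device("mps")
tts = TTS("tts_models/en/multi-dataset/tortoise-v2").to(device)
tts.tts_to_file(text="hello world", file_path="hello.wav")
```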
Result: `IndexError: tuple index out of range`, raised from `pytorch_utils.py`.
I've also reported this as issue 3998 on coqui-ai.
Expected behavior
Successful execution.