IPEX v2.5.10+xpu pytorch-triton-xpu instructions do not work #754
Comments
Thanks for finding and reporting the issue. It seems to be something in the documentation about the Triton XPU version. Let me check it.
Hi @simonlui, could you please share the steps you followed? I suspect you missed the step to install and activate dpcpp.
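A quick way to verify from Python that the GPU runtime is actually visible (a generic sanity check, assuming torch >= 2.5 with XPU support and IPEX installed, not specific to this setup):

```python
# Sanity check that the XPU runtime/driver stack is visible to PyTorch.
# Assumes torch >= 2.5 with XPU support and intel_extension_for_pytorch installed.
import torch
import intel_extension_for_pytorch as ipex

print("torch:", torch.__version__)
print("ipex:", ipex.__version__)
print("xpu available:", torch.xpu.is_available())
if torch.xpu.is_available():
    print("device:", torch.xpu.get_device_name(0))
```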
I ran into it with ComfyUI while trying to include
I am pretty sure I have everything installed correctly, and I ran
Using
So yeah, I have no clue. It feels like something is up with my environment or configuration with a
Describe the bug
According to https://intel.github.io/intel-extension-for-pytorch/xpu/2.5.10+xpu/tutorials/known_issues.html, the workaround for installing triton for IPEX is this command. However, after running it to install that package and then trying to run torch.compile, this is the truncated output.
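As a point of reference, a minimal torch.compile smoke test on the XPU backend of the kind that triggers the failure might look like the sketch below; the model and tensor shapes are illustrative placeholders, not the actual ComfyUI workload from this report.

```python
# Minimal torch.compile smoke test on the XPU backend.
# The model and shapes are illustrative placeholders, not the ComfyUI workload.
import torch
import intel_extension_for_pytorch as ipex  # assumed installed per the IPEX instructions

model = torch.nn.Sequential(
    torch.nn.Linear(64, 64),
    torch.nn.ReLU(),
).to("xpu")

compiled = torch.compile(model)        # goes through the Inductor/Triton path
x = torch.randn(8, 64, device="xpu")
print(compiled(x).shape)               # first call compiles; this is where the Triton/libsycl error surfaces
```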
It seems like this command only worked with oneAPI Base Toolkit 2024 components. The oneAPI Base Toolkit 2025 components that IPEX v2.5.10+xpu installs use libsycl.so.8 instead of libsycl.so.7, as can be seen by running ldd on sycl-ls.
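One way to confirm from Python which SYCL runtime major version the dynamic loader can actually resolve (a quick environment check added here for illustration, not something from the original report):

```python
# Probe which libsycl major versions the dynamic loader can resolve.
# Purely an environment check; the soname list covers the two versions discussed above.
import ctypes

for soname in ("libsycl.so.7", "libsycl.so.8"):
    try:
        ctypes.CDLL(soname)
        print(soname, "-> found")
    except OSError:
        print(soname, "-> not found")
```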
So I am pretty sure the instructions are incorrect here, or I am doing something wrong. For reference, if you install the latest pytorch-triton-xpu by leaving out the version pin, you get the following backtrace instead.
Versions