
xFormers wasn't build with CUDA support #1358

Open · criogennn opened this issue Nov 30, 2024 · 2 comments

criogennn commented Nov 30, 2024

I'm trying to start training on Windows with an RTX 3050 8 GB and get this error:

[email protected] is not supported because:
xFormers wasn't build with CUDA support
operator wasn't built - see python -m xformers.info for more info
cutlassF-pt is not supported because:
xFormers wasn't build with CUDA support

- xformers 0.0.28.post3
- pytorch 2.5.1
- pytorch-cuda 12.1
- python 3.11.10

python -m xformers.info gives me this:

xFormers 0.0.28.post3
memory_efficient_attention.ckF: unavailable
memory_efficient_attention.ckB: unavailable
memory_efficient_attention.ck_decoderF: unavailable
memory_efficient_attention.ck_splitKF: unavailable
memory_efficient_attention.cutlassF-pt: available
memory_efficient_attention.cutlassB-pt: available
[email protected]: unavailable
[email protected]: unavailable
[email protected]: unavailable
[email protected]: unavailable
memory_efficient_attention.triton_splitKF: available
indexing.scaled_index_addF: available
indexing.scaled_index_addB: available
indexing.index_select: available
sequence_parallel_fused.write_values: available
sequence_parallel_fused.wait_values: available
sequence_parallel_fused.cuda_memset_32b_async: available
sp24.sparse24_sparsify_both_ways: available
sp24.sparse24_apply: available
sp24.sparse24_apply_dense_output: available
sp24._sparse24_gemm: available
[email protected]: available
[email protected]: available
swiglu.dual_gemm_silu: available
swiglu.gemm_fused_operand_sum: available
swiglu.fused.p.cpp: available
is_triton_available: True
pytorch.version: 2.5.1
pytorch.cuda: available
gpu.compute_capability: 8.6
gpu.name: NVIDIA GeForce RTX 3050
dcgm_profiler: unavailable
build.info: available
build.cuda_version: None
build.hip_version: None
build.python_version: 3.11.10
build.torch_version: 2.5.1
build.env.TORCH_CUDA_ARCH_LIST: None
build.env.PYTORCH_ROCM_ARCH: None
build.env.XFORMERS_BUILD_TYPE: None
build.env.XFORMERS_ENABLE_DEBUG_ASSERTIONS: None
build.env.NVCC_FLAGS: None
build.env.XFORMERS_PACKAGE_FROM: None
source.privacy: open source

How can I install or build xformers with CUDA support?
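For reference, a minimal sketch of the two usual routes, assuming the prebuilt cu121 wheels match the pytorch-cuda 12.1 / torch 2.5.1 setup above; the exact index URL and build prerequisites follow the xFormers README and should be treated as assumptions, not a confirmed fix for this setup:

# Option 1 (assumed): install a prebuilt CUDA wheel from the PyTorch package index
pip install -U xformers --index-url https://download.pytorch.org/whl/cu121

# Option 2 (assumed): build from source against the locally installed torch
# (needs a matching CUDA toolkit and a C++ compiler, e.g. MSVC on Windows)
pip install ninja
pip install -v -U git+https://github.com/facebookresearch/xformers.git@main#egg=xformers

# Afterwards, build.cuda_version in the output below should no longer be None
python -m xformers.info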

danielhanchen (Contributor) commented
Apologies on the delay - another option is to ignore xformers, i.e. pip uninstall xformers, then just use Unsloth as is.
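Concretely, that route is just the following (a sketch; the -y flag only skips the confirmation prompt):

# Remove the CPU-only xformers wheel; per the comment above, Unsloth then runs without it
pip uninstall -y xformers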

criogennn (Author) commented

Thanks for the help. I've already switched to Ubuntu and now I'm hitting this issue: #1376
