GPUAI-1250 - Flash Attention v2.04: fix so the rotary module can be used #47

Open
xiaoxiangAMD wants to merge 1 commit into flash_attention_for_rocm

Conversation

@xiaoxiangAMD commented on Mar 1, 2024

Background
After installing flash-attention on a machine or in a Docker image, we found that the module rotary_emb could not be used.

Flash-attention install description
Please refer to https://github.com/ROCm/flash-attention.

rotary_emb install description
1. Clone the repository: https://github.com/ROCm/flash-attention.git
2. Enter the directory csrc/rotary/
3. Run python setup.py install in a terminal or shell (Python 3 is required); a command sketch follows this list.
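A minimal sketch of the three steps above as shell commands (assuming git and a Python 3 environment with the ROCm build prerequisites already available):

```
# 1. Clone the ROCm flash-attention repository
git clone https://github.com/ROCm/flash-attention.git
cd flash-attention

# 2. Enter the rotary extension directory
cd csrc/rotary

# 3. Build and install the rotary_emb extension
python setup.py install
```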

Command for the test
After starting a Python interpreter, run the import:
from flash_attn.layers.rotary import apply_rotary_emb_func
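The same check as a shell one-liner (a small smoke test we suggest here; it only confirms that the module imports):

```
python -c "from flash_attn.layers.rotary import apply_rotary_emb_func; print('rotary_emb import OK')"
```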

Result expected
[screenshot: the import completes without error]
In the directory tests/ there is a test script for the rotary module; the result is as follows:
[screenshot: rotary test output, all tests passing]

The tests pass.

@xiaoxiangAMD changed the base branch from main to flash_attention_for_rocm on Mar 1, 2024, 09:20
@sabreshao (Collaborator) commented:
Please give more test results besides importing.

@sabreshao self-assigned this on Mar 1, 2024
@xiaoxiangAMD (Author) commented:
The test result for the rotary module is given above; the test source is in the file test_rotary.py.
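One way to reproduce that result from the repository root (a sketch assuming pytest is installed; the path follows the tests/ directory and the test_rotary.py file named in the comments above):

```
# Run the rotary test file referenced in this comment
pytest tests/test_rotary.py -v
```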
