
Commit 946966d

Lock version of flash attention
binkjakub committed Dec 29, 2024
1 parent 595648a commit 946966d
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion Makefile
@@ -19,7 +19,7 @@ all: check test
 install: cuda := 124
 install:
 	pip install -r requirements.txt --find-links https://download.pytorch.org/whl/cu$(cuda)
-	pip install flash-attn --no-build-isolation
+	pip install flash-attn==2.6.3 --no-build-isolation
 
 install_cpu:
 	pip install --find-links https://download.pytorch.org/whl/cpu -r requirements.txt
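
For reference, a minimal usage sketch of the targets touched by this change. It assumes GNU make semantics (a variable given on the command line overrides the target-specific "cuda := 124" default); the cu121 value below is only an illustrative override, not something defined in the repository:

    # default GPU install: cu124 PyTorch wheels plus the pinned flash-attn 2.6.3
    make install

    # a command-line variable overrides the target-specific default,
    # so a different CUDA wheel index can be used without editing the Makefile
    make install cuda=121

    # CPU-only environments install from the CPU wheel index and skip flash-attn
    make install_cpu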
