GQA Flash Attention with Attention Mask #1587

Triggered via pull request: November 5, 2023 18:29
Status: Success
Total duration: 15s
Jobs:
cuda build_x64_RelWithDebInfo (0s)
dml build_x64_RelWithDebInfo (2s)
training build_x64_RelWithDebInfo (2s)
kernelDocumentation build_x64_RelWithDebInfo (3s)