GQA Flash Attention with Attention Mask #1598

Triggered via pull request: November 7, 2023 04:34
Status: Success
Total duration: 13s
Jobs:
- cuda build_x64_RelWithDebInfo: 0s
- dml build_x64_RelWithDebInfo: 3s
- training build_x64_RelWithDebInfo: 0s
- kernelDocumentation build_x64_RelWithDebInfo: 0s