[CUDA] cuDNN Flash Attention #32434
Annotations
1 error and 10 warnings
reviewdog: Too many results (annotations) in diff.
Some annotations may be missing due to GitHub's limits on annotations created by logging commands.
Please check the GitHub Actions log console to see all results.
Limitation:
- 10 warning annotations and 10 error annotations per step
- 50 annotations per job (sum of annotations from all the steps)
- 50 annotations per run (separate from the job annotations; these annotations aren't created by users)
Source: https://github.com/orgs/community/discussions/26680#discussioncomment-3252835
|
onnxruntime/contrib_ops/cuda/bert/attention_impl.cu#L568
[cpplint] reported by reviewdog 🐶
Using deprecated casting style. Use static_cast<int>(...) instead [readability/casting] [4]
|
onnxruntime/contrib_ops/cuda/bert/attention_impl.cu#L569
[cpplint] reported by reviewdog 🐶
Using deprecated casting style. Use static_cast<int>(...) instead [readability/casting] [4]
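Both [readability/casting] findings above (attention_impl.cu lines 568-569) are the same pattern: cpplint flags C-style casts and asks for static_cast. A minimal sketch of the kind of change requested; the names below are placeholders, not the actual code at those lines:

    // Hypothetical illustration of the [readability/casting] fix; the names
    // are placeholders and not taken from attention_impl.cu.
    #include <cstdint>

    int ToInt32(int64_t sequence_length) {
      // Flagged style: return (int)sequence_length;
      return static_cast<int>(sequence_length);  // explicit, lint-clean cast
    }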
|
onnxruntime/contrib_ops/cuda/bert/cudnn_fmha/cudnn_flash_attention.cu#L30
[cpplint] reported by reviewdog 🐶
Missing username in TODO; it should look like "// TODO(my_username): Stuff." [readability/todo] [2]
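For reference, the [readability/todo] rule expects an owner in the TODO tag. A generic before/after; the username and comment text are placeholders, not the actual TODO at line 30:

    // Hypothetical illustration of the [readability/todo] format.
    //
    // Flagged form:   // TODO: Stuff.
    // Expected form:  // TODO(my_username): Stuff.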
|
onnxruntime/contrib_ops/cuda/bert/cudnn_fmha/cudnn_flash_attention.cu#L96
[cpplint] reported by reviewdog 🐶
Redundant blank line at the end of a code block should be deleted. [whitespace/blank_line] [3]
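The [whitespace/blank_line] finding refers to a blank line sitting directly before the closing brace of a block. A generic illustration, not the actual code at line 96:

    // Hypothetical illustration of [whitespace/blank_line].
    void BlockWithTrailingBlankLine() {
      int value = 0;
      (void)value;

    }  // cpplint flags the blank line just above this closing brace

    void BlockWithoutTrailingBlankLine() {
      int value = 0;
      (void)value;
    }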
|
onnxruntime/contrib_ops/cuda/bert/cudnn_fmha/cudnn_flash_attention.cu#L177
[cpplint] reported by reviewdog 🐶
Missing spaces around = [whitespace/operators] [4]
|
onnxruntime/contrib_ops/cuda/bert/cudnn_fmha/cudnn_flash_attention.cu#L178
[cpplint] reported by reviewdog 🐶
Missing spaces around = [whitespace/operators] [4]
|
onnxruntime/contrib_ops/cuda/bert/cudnn_fmha/cudnn_flash_attention.cu#L179
[cpplint] reported by reviewdog 🐶
Missing spaces around = [whitespace/operators] [4]
|
onnxruntime/contrib_ops/cuda/bert/cudnn_fmha/cudnn_flash_attention.cu#L180
[cpplint] reported by reviewdog 🐶
Missing spaces around = [whitespace/operators] [4]
|
onnxruntime/contrib_ops/cuda/bert/cudnn_fmha/cudnn_flash_attention.cu#L181
[cpplint] reported by reviewdog 🐶
Missing spaces around = [whitespace/operators] [4]
|
onnxruntime/contrib_ops/cuda/bert/cudnn_fmha/cudnn_flash_attention.cu#L182
[cpplint] reported by reviewdog 🐶
Missing spaces around = [whitespace/operators] [4]
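All six [whitespace/operators] findings (cudnn_flash_attention.cu lines 177-182) are the same formatting issue, presumably a run of consecutive assignments. A hypothetical before/after; the struct and field names are placeholders, not taken from the actual file:

    // Hypothetical illustration of the [whitespace/operators] fix.
    struct AttentionDims {
      int batch_size;
      int num_heads;
      int head_size;
    };

    void FillDims(AttentionDims& dims) {
      // Flagged style:  dims.batch_size=1;
      dims.batch_size = 1;   // spaces required around '='
      dims.num_heads = 16;
      dims.head_size = 64;
    }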
|
This job succeeded