Fix SparseAttention cos/sin cache dimension checks (#20609)
### Description
This PR fixes the dimension checks for the cos/sin caches used in the
rotary embeddings in the `SparseAttention` operator.

### Motivation and Context
This PR ports over the changes made for `GroupQueryAttention` in [this
PR](#20547) to `SparseAttention`.
kunal-vaishnavi authored May 8, 2024
1 parent 58d7b12 commit 274d162
Showing 1 changed file with 4 additions and 4 deletions: onnxruntime/contrib_ops/cuda/sparse/sparse_attention_helper.h
@@ -202,13 +202,13 @@ Status CheckInputs(void* params,
                          "head_size shall be a multiple of 16. Got head_size = ",
                          head_size);
  }
- if (cos_dims[0] < max_sequence_length) {
+ if (cos_dims[0] < total_sequence_length) {
    return ORT_MAKE_STATUS(ONNXRUNTIME, INVALID_ARGUMENT,
-                          "cos_cache dimension 0 should be of max_sequence_length.");
+                          "cos_cache dimension 0 should not be less than total_sequence_length.");
  }
- if (sin_dims[0] < max_sequence_length) {
+ if (sin_dims[0] < total_sequence_length) {
    return ORT_MAKE_STATUS(ONNXRUNTIME, INVALID_ARGUMENT,
-                          "sin_cache dimension 0 should be of max_sequence_length.");
+                          "sin_cache dimension 0 should not be less than total_sequence_length.");
  }
  if (cos_dims[1] > (head_size / 16) * 8 || cos_dims[1] % 8 != 0) {
    return ORT_MAKE_STATUS(ONNXRUNTIME, INVALID_ARGUMENT,
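To illustrate why the bound changed, here is a minimal Python sketch of the check (the function name and signature are hypothetical; this is not the ONNX Runtime implementation). The rotary embedding indexes the cos/sin caches by token position, and positions run up to `total_sequence_length - 1`, so dimension 0 of each cache must cover at least `total_sequence_length` rows; requiring `max_sequence_length` rows was stricter than necessary.

```python
def check_rotary_cache_dims(cos_dims, sin_dims, total_sequence_length, head_size):
    """Simplified mirror of the dimension checks in CheckInputs.

    cos_dims / sin_dims are (rows, rotary_half) shapes of the caches.
    """
    # The kernel reads cache rows for positions 0 .. total_sequence_length - 1,
    # so a shorter cache would be indexed out of range.
    if cos_dims[0] < total_sequence_length:
        raise ValueError(
            "cos_cache dimension 0 should not be less than total_sequence_length.")
    if sin_dims[0] < total_sequence_length:
        raise ValueError(
            "sin_cache dimension 0 should not be less than total_sequence_length.")
    # Dimension 1 holds half the rotary dimension; head_size is a multiple
    # of 16, so the largest valid value is (head_size / 16) * 8 = head_size / 2.
    if cos_dims[1] > (head_size // 16) * 8 or cos_dims[1] % 8 != 0:
        raise ValueError("cos_cache dimension 1 is invalid for this head_size.")
```

Note that the check is a lower bound, not an equality: a cache sized for `max_sequence_length` still passes, since `total_sequence_length <= max_sequence_length`.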
