Fix SparseAttention cos/sin cache dimension checks (#20609)
### Description

This PR fixes the dimension checks for the cos/sin caches used in the rotary embeddings in the `SparseAttention` operator.

### Motivation and Context

This PR ports over the same changes from [this PR](#20547) for `GroupQueryAttention`.
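To illustrate the kind of validation involved, here is a minimal sketch of a dimension check for rotary cos/sin caches. This is a hypothetical standalone helper, not the actual ONNX Runtime implementation; the assumed cache layout is `(max_sequence_length, rotary_dim / 2)`, with `sin_cache` required to match `cos_cache`.

```python
import numpy as np

def check_rotary_caches(cos_cache, sin_cache, max_seq_len, rotary_dim):
    # Hypothetical check: caches are assumed to have shape
    # (max_seq_len, rotary_dim // 2), one angle per frequency pair.
    expected = (max_seq_len, rotary_dim // 2)
    if cos_cache.shape != expected:
        raise ValueError(
            f"cos_cache shape {cos_cache.shape} does not match expected {expected}"
        )
    if sin_cache.shape != cos_cache.shape:
        raise ValueError(
            f"sin_cache shape {sin_cache.shape} must match cos_cache shape {cos_cache.shape}"
        )
    return True

# Build example caches the usual rotary-embedding way (assumed base of 10000).
max_seq_len, rotary_dim = 16, 8
inv_freq = 1.0 / (10000.0 ** (np.arange(0, rotary_dim, 2) / rotary_dim))
angles = np.arange(max_seq_len)[:, None] * inv_freq[None, :]
cos_cache, sin_cache = np.cos(angles), np.sin(angles)

check_rotary_caches(cos_cache, sin_cache, max_seq_len, rotary_dim)
```

A mismatched cache (e.g. built for a different sequence length) would fail the first check with a descriptive error rather than producing silently wrong attention outputs.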