Update multihead_attention.cu
wangyems authored Jan 23, 2024
1 parent 67be406 commit 63efcf4
Showing 1 changed file with 2 additions and 1 deletion.
3 changes: 2 additions & 1 deletion onnxruntime/contrib_ops/rocm/bert/multihead_attention.cu
@@ -123,7 +123,8 @@ Status MultiHeadAttention<T>::ComputeInternal(OpKernelContext* context) const {
 key_padding_mask, relative_position_bias,
 past_key, past_value, past_seq_len,
 &attn,
-num_heads_, is_unidirectional_, mask_filter_value_, scale_,
+num_heads_, false, /*is_unidirectional_*/
[GitHub Actions / cpplint warning on line 126 of onnxruntime/contrib_ops/rocm/bert/multihead_attention.cu: Line ends in whitespace. Consider deleting these extra spaces. [whitespace/end_of_line] [4]]
+mask_filter_value_, scale_,
 past_present_share_buffer_, false, device_prop.maxThreadsPerBlock));

 if (attn_type_ == kDecoderMaskedMultiHeadAttention && attn.sequence_length != 1) {
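For readers skimming the hunk, here is a minimal, self-contained sketch of the pattern the change applies: the is_unidirectional argument is no longer forwarded from the member is_unidirectional_ but pinned to a literal false, with a trailing comment preserving the parameter's name for readability. The helper ConfigureAttention below and its default values are hypothetical stand-ins, not ONNX Runtime code; the real callee and its full argument list lie outside this hunk.

// Sketch only: a member flag replaced by a hard-coded `false`, annotated
// with a comment naming the parameter the literal stands for.
#include <iostream>

// Hypothetical stand-in for the CheckInputs-style helper called in the hunk.
void ConfigureAttention(int num_heads, bool is_unidirectional,
                        float mask_filter_value, float scale) {
  std::cout << "heads=" << num_heads
            << " unidirectional=" << std::boolalpha << is_unidirectional
            << " mask_filter_value=" << mask_filter_value
            << " scale=" << scale << '\n';
}

int main() {
  // Placeholder values; the real members live on the MultiHeadAttention kernel.
  const int num_heads_ = 8;
  const float mask_filter_value_ = -10000.0f;
  const float scale_ = 0.0f;

  // Before the commit, the member flag was forwarded as-is:
  //   ConfigureAttention(num_heads_, is_unidirectional_, mask_filter_value_, scale_);

  // After the commit, the argument is pinned to false, with an inline comment
  // so readers still see which parameter the literal fills.
  ConfigureAttention(num_heads_, false, /*is_unidirectional_*/
                     mask_filter_value_, scale_);
  return 0;
}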
