
Commit

update docstring
kuacakuaca committed Sep 4, 2024
1 parent 83b23c9 commit c2a301f
Showing 1 changed file with 2 additions and 2 deletions.
4 changes: 2 additions & 2 deletions in i6_models/parts/conformer/mhsa_rel_pos.py

@@ -21,12 +21,12 @@ class ConformerMHSARelPosV1Config(ModelConfiguration):
     Attributes:
         input_dim: input dim and total dimension for query/key and value projections, should be divisible by `num_att_heads`
         num_att_heads: number of attention heads
-        with_bias: whether to add bias to qkv and output lienar projections
+        with_bias: whether to add bias to qkv and output linear projections
         att_weights_dropout: attention weights dropout
         learnable_pos_emb: whether to use learnable relative positional embeddings instead of fixed sinusoidal ones
         rel_pos_clip: maximal relative postion for embedding
         with_linear_pos: whether to linearly transform the positional embeddings
-        separate_pos_emb_per_head: whether to apply separate linear transformation on positional embeddings for each head
+        separate_pos_emb_per_head: whether to create head-dependent positional embeddings
         with_pos_bias: whether to add additional position bias terms to the attention scores
         pos_emb_dropout: dropout for the positional embeddings
         dropout: multi-headed self attention output dropout
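For context on the second docstring change: a minimal shape-only sketch of what `separate_pos_emb_per_head` toggles, assuming Transformer-XL-style relative attention. All names, sizes, and the projection layout here are illustrative assumptions, not the actual i6_models implementation.

```python
import numpy as np

# Hypothetical sizes, chosen only for illustration.
num_heads, head_dim, num_pos = 4, 16, 9
emb_dim = num_heads * head_dim  # 64

rng = np.random.default_rng(0)
# Relative positional embeddings, one vector per relative offset.
pos_emb = rng.standard_normal((num_pos, emb_dim))

# separate_pos_emb_per_head=False: a single linear map shared by all
# heads; every head sees the same projected embedding of size head_dim.
w_shared = rng.standard_normal((emb_dim, head_dim))
shared = pos_emb @ w_shared  # shape (num_pos, head_dim)

# separate_pos_emb_per_head=True: project to num_heads * head_dim and
# split, so each head gets its own view of the positional embeddings.
w_per_head = rng.standard_normal((emb_dim, num_heads * head_dim))
per_head = (pos_emb @ w_per_head).reshape(num_pos, num_heads, head_dim)

print(shared.shape)    # (9, 16)
print(per_head.shape)  # (9, 4, 16)
```

"Head-dependent positional embeddings" in the updated docstring thus describes the result (each head attends with its own positional representation), whereas the old wording described only the mechanism (a per-head linear transformation).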
