Also check for a complete list: https://github.com/rwth-i6/returnn_common/milestone/1

- `Sequential` module #33
- `ModuleList` module #34
- `Loop.State` incomplete #35
- `make_root_net_dict` ) #44
- `reduce(x, mode="mean", ...)` by `reduce_mean(x, ...)`, etc? #46
- `mark_as_loss`? #56
- `Conv1d`, `Conv2d`, `Conv3d` instead of `Conv`? Same for `Pool`? #61
- `activation` option from `Linear`, `Conv` #62
- `activation` function private, prefer all direct wrappers like `relu` etc #63
- `eval` function private, or rename it? #64
- `nn.dot` should have better signature #67
- `nn.dropout` should have `axis` mandatory #77
- `nn.SelfAttention` default `att_dropout`? #78
- `SelfAttentionStep` lacks `initial_state` #79
- `Loop.last` (or maybe only for states) #80
- `...Step` variants with on-seq variants? #81
- `VariableLayer`? #82
- `name_scope` layer option not used ideally #112
- `Dim` internals and API should be refactored returnn#975
- `nn.Random` for multiple ops #148
- `nn.scoped` #159
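The #46 question (one generic `reduce` dispatching on a `mode` string vs dedicated wrappers like `reduce_mean`) can be sketched in plain Python, independent of returnn_common's actual API; the function names and signatures below are hypothetical illustrations, not the library's interface:

```python
import numpy as np

# Hypothetical generic API: one entry point, dispatch on a mode string.
def reduce(x, mode="mean", axis=None):
    funcs = {"mean": np.mean, "sum": np.sum, "max": np.max}
    if mode not in funcs:
        raise ValueError(f"unknown reduce mode {mode!r}")
    return funcs[mode](x, axis=axis)

# Hypothetical dedicated wrappers: one function per operation.
# A typo in the operation name fails at name-lookup time instead of
# raising a runtime ValueError, and each wrapper can document and
# type its own arguments separately.
def reduce_mean(x, axis=None):
    return np.mean(x, axis=axis)

def reduce_sum(x, axis=None):
    return np.sum(x, axis=axis)

x = np.array([[1.0, 2.0], [3.0, 4.0]])
assert reduce(x, mode="mean") == reduce_mean(x) == 2.5
```

The trade-off is discoverability and static checking (dedicated wrappers) versus a smaller API surface (one generic function).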