Simona/inits (#163)
The [PyTorch recommendation](https://discuss.pytorch.org/t/conv-weight-data-vs-conv-weight/83047/2)
is to pass the weight/bias parameter itself to the init functions rather than
its .data attribute; using .data was the accepted pattern in the past but has
since been phased out. The stated reason is that operating on .data can cause
problems with autograd not tracking operations correctly. AFAIK this has not
caused issues for us, but it is worth changing to keep up with the current standard.
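
For illustration only (not part of the commit), a minimal sketch of the two call patterns being contrasted; `layer` is a hypothetical module:

```python
import torch.nn as nn

layer = nn.Linear(4, 4)

# Recommended: pass the Parameter itself. The nn.init functions update it
# in place (internally wrapped in torch.no_grad()), so autograd's bookkeeping
# stays consistent.
nn.init.orthogonal_(layer.weight, gain=1.0)
nn.init.constant_(layer.bias, 0.0)

# Older pattern this commit removes: operating on .data sidesteps autograd
# entirely, which PyTorch now discourages.
nn.init.orthogonal_(layer.weight.data, gain=1.0)
nn.init.constant_(layer.bias.data, 0.0)
```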
spetravic authored Aug 25, 2023
1 parent 204cb99 commit c24f990
Showing 1 changed file with 6 additions and 6 deletions.
12 changes: 6 additions & 6 deletions emote/nn/initialization.py
@@ -5,15 +5,15 @@
 
 def ortho_init_(m, gain=np.sqrt(2)):
     if isinstance(m, nn.Linear):
-        nn.init.orthogonal_(m.weight.data, gain)
-        nn.init.constant_(m.bias.data, 0.0)
+        nn.init.orthogonal_(m.weight, gain)
+        nn.init.constant_(m.bias, 0.0)
     if isinstance(m, nn.Conv2d):
-        nn.init.orthogonal_(m.weight.data, gain)
+        nn.init.orthogonal_(m.weight, gain)
         if m.bias is not None:
-            nn.init.constant_(m.bias.data, 0.0)
+            nn.init.constant_(m.bias, 0.0)
 
 
 def xavier_uniform_init_(m, gain):
     if isinstance(m, nn.Linear):
-        nn.init.xavier_uniform_(m.weight.data, gain)
-        nn.init.constant_(m.bias.data, 0.0)
+        nn.init.xavier_uniform_(m.weight, gain)
+        nn.init.constant_(m.bias, 0.0)
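
For context (not part of the commit), a minimal usage sketch of these initializers, assuming they are applied module-wide via `Module.apply`; the example network and the `functools.partial` binding of `gain` are illustrative, not from the repository:

```python
from functools import partial

import numpy as np
import torch.nn as nn

from emote.nn.initialization import ortho_init_, xavier_uniform_init_

net = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 4))

# Module.apply calls the initializer on every submodule; only Linear and
# Conv2d layers match the isinstance checks inside the functions.
net.apply(ortho_init_)

# xavier_uniform_init_ takes an explicit gain, so bind it first.
net.apply(partial(xavier_uniform_init_, gain=np.sqrt(2)))
```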
