Commit 7b46b01: pre-layernorm for attention

lucidrains committed Apr 4, 2022
1 parent 42a4c51
Showing 2 changed files with 5 additions and 1 deletion.
4 changes: 4 additions & 0 deletions palm_pytorch/palm_pytorch.py
@@ -106,6 +106,10 @@ def forward(self, x):
         n, device, h = x.shape[1], x.device, self.heads
         q, k, v = self.to_qkv(x).chunk(3, dim = -1)
 
+        # pre layernorm
+
+        x = self.norm(x)
+
         # split heads
 
         q, k, v = map(lambda t: rearrange(t, 'b n (h d) -> (b h) n d', h = h), (q, k, v))
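For context, "pre-layernorm" means the LayerNorm is applied to the block's input before the attention projections, with the residual added around the whole block. Below is a minimal, self-contained sketch of such a block, assuming torch and einops; the class name PreLNAttention and its arguments are illustrative and not taken from this repository.

# Illustrative sketch of a pre-layernorm attention block (not the repo's exact module).
from torch import nn, einsum
from einops import rearrange

class PreLNAttention(nn.Module):
    def __init__(self, dim, heads = 8, dim_head = 64):
        super().__init__()
        inner_dim = heads * dim_head
        self.heads = heads
        self.scale = dim_head ** -0.5

        self.norm = nn.LayerNorm(dim)
        self.to_qkv = nn.Linear(dim, inner_dim * 3, bias = False)
        self.to_out = nn.Linear(inner_dim, dim, bias = False)

    def forward(self, x):
        h = self.heads

        # pre layernorm: normalize the block input before projecting to q, k, v
        normed = self.norm(x)

        q, k, v = self.to_qkv(normed).chunk(3, dim = -1)
        q, k, v = map(lambda t: rearrange(t, 'b n (h d) -> (b h) n d', h = h), (q, k, v))

        # scaled dot-product attention
        sim = einsum('b i d, b j d -> b i j', q, k) * self.scale
        attn = sim.softmax(dim = -1)

        out = einsum('b i j, b j d -> b i d', attn, v)
        out = rearrange(out, '(b h) n d -> b n (h d)', h = h)

        # residual connection around the whole pre-LN block
        return self.to_out(out) + x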
2 changes: 1 addition & 1 deletion setup.py
@@ -3,7 +3,7 @@
 setup(
   name = 'PaLM-pytorch',
   packages = find_packages(exclude=[]),
-  version = '0.0.1',
+  version = '0.0.2',
   license='MIT',
   description = 'PaLM: Scaling Language Modeling with Pathways - Pytorch',
   author = 'Phil Wang',
