
How to calculate the relative positional embeddings from a row offset and column offset? #8

Open
songkq opened this issue Sep 28, 2019 · 3 comments

Comments


songkq commented Sep 28, 2019

Thanks for sharing the great idea.
While reading the paper, I had a question about how to compute the relative positional embeddings, i.e. r_(a-i, b-j), from a row offset a-i and a column offset b-j.
Is there an explicit formula for this calculation?
Looking forward to your reply.
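For reference, one common construction (a sketch under my own assumptions, not necessarily what this repository or the paper's authors do): each offset a-i lies in [-(k-1), k-1] for a k x k window, so shifting it by k-1 gives a valid index into a learnable table of 2k-1 vectors; the row and column halves are then concatenated along the channel dimension to form r_(a-i, b-j):

```python
import torch
import torch.nn as nn

# Hedged sketch (not from this repo): build the full table of
# r_{a-i, b-j} for a k x k window by indexing two learnable offset
# tables and concatenating row / column halves along channels.
k, dim = 7, 64                                # illustrative window size / channels
row_emb = nn.Embedding(2 * k - 1, dim // 2)   # one vector per possible row offset
col_emb = nn.Embedding(2 * k - 1, dim // 2)   # one vector per possible col offset

offsets = torch.arange(-(k - 1), k)           # all offsets a-i (or b-j)
idx = offsets + (k - 1)                       # shift into [0, 2k-2]
r_row = row_emb(idx)                          # (2k-1, dim/2)
r_col = col_emb(idx)                          # (2k-1, dim/2)

# r_{a-i, b-j} = concat(r_{a-i}, r_{b-j}); broadcast to the full table
table = torch.cat([
    r_row[:, None, :].expand(-1, 2 * k - 1, -1),
    r_col[None, :, :].expand(2 * k - 1, -1, -1),
], dim=-1)                                    # (2k-1, 2k-1, dim)
```

With this layout, the first dim/2 channels of table[a-i, b-j] depend only on the row offset and the last dim/2 only on the column offset, matching the "row and column offsets are associated with an embedding" description in the paper.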

leaderj1001 (Owner) commented

Thanks for your comment!
The paper is implemented with reference to the following paper:

Attention Augmented Convolutional Networks

Thank you.


Jimmy880 commented Dec 17, 2019

@leaderj1001
Hello, I wonder whether you implemented the relative position embedding in the self-attention.
I notice that you just use two random tensors to represent position in the h and w directions, while the paper says "The row and column offsets are associated with an embedding r_{a−i} and r_{b−j}".
I am confused about whether this so-called "embedding" should be implemented with the nn.Embedding operation in PyTorch.
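For what it's worth, nn.Embedding and a raw learnable tensor are interchangeable for this purpose: nn.Embedding is just a learnable matrix plus an index lookup. A minimal sketch (names and sizes illustrative, not taken from this repo):

```python
import torch
import torch.nn as nn

# Hedged sketch: two equivalent ways to hold per-offset embeddings.
k, d = 7, 32
emb = nn.Embedding(2 * k - 1, d)                 # lookup-table style
param = nn.Parameter(torch.randn(2 * k - 1, d))  # raw-tensor style

idx = torch.arange(-(k - 1), k) + (k - 1)        # offsets shifted to [0, 2k-2]
out_emb = emb(idx)                               # (2k-1, d)
out_param = param[idx]                           # same shape, same semantics
```

Both produce one learnable d-dimensional vector per relative offset and receive gradients in the same way, so the choice is a matter of style rather than correctness.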

siyuan2018 commented

@Jimmy880 Hi, I am having the same question. Did you figure out how to compute the relative position embedding? Thanks.
