Adding TranS model to scoring.py
#30
Conversation
Thank you very much for the PR, we will review it ASAP! In the meantime, I noticed that the CI failed due to formatting: just FYI, you can run these checks locally by running the formatting commands.
Thanks again for the contribution, this looks very good! I just have a few minor comments/improvements.
besskge/scoring.py (Outdated)
relation_initializer = 3 * relation_initializer
# self.relation_embedding[..., :embedding_size] : relation embedding
# self.relation_embedding[..., :embedding_size] : bar relation embedding
# self.relation_embedding[..., :embedding_size] : hat relation embedding
The comments here should probably read something like
# self.relation_embedding[..., embedding_size:2*embedding_size] : bar relation embedding
# self.relation_embedding[..., 2*embedding_size:] : hat relation embedding
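For reference, a minimal standalone sketch of how the three components would then be sliced out of the concatenated relation embedding (plain PyTorch with illustrative names, not the actual besskge attributes):

import torch

embedding_size = 4
n_relations = 2
# hypothetical concatenated relation table of shape (n_relations, 3 * embedding_size)
relation_embedding = torch.randn(n_relations, 3 * embedding_size)

r = relation_embedding[..., :embedding_size]  # relation embedding
r_bar = relation_embedding[..., embedding_size : 2 * embedding_size]  # bar relation embedding
r_hat = relation_embedding[..., 2 * embedding_size :]  # hat relation embedding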
Fixed
besskge/scoring.py (Outdated)
R_h = head_emb_main * (tail_emb_tilde + self.offset)
R_t = tail_emb_main * (head_emb_tilde + self.offset)
R_r = r_bar * head_emb_main + r + r_hat * tail_emb_main
return -self.reduce_embedding(R_h - R_t + R_r)
This is perfectly correct, but we could maybe save a couple of tensor multiplications by refactoring the scoring function as
head_emb_main * (tail_emb_tilde + r_bar + self.offset) - tail_emb_main * (head_emb_tilde - r_hat + self.offset) + r
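As a sanity check, the two formulations agree on random tensors; a minimal standalone sketch (illustrative names, with self.offset replaced by a plain scalar):

import torch

emb = 8
head_emb_main, head_emb_tilde = torch.randn(5, emb), torch.randn(5, emb)
tail_emb_main, tail_emb_tilde = torch.randn(5, emb), torch.randn(5, emb)
r, r_bar, r_hat = torch.randn(5, emb), torch.randn(5, emb), torch.randn(5, emb)
offset = 1.0

# original: three separate elementwise products
original = (
    head_emb_main * (tail_emb_tilde + offset)
    - tail_emb_main * (head_emb_tilde + offset)
    + r_bar * head_emb_main + r + r_hat * tail_emb_main
)

# refactored: r_bar and r_hat folded into the two existing products
refactored = (
    head_emb_main * (tail_emb_tilde + r_bar + offset)
    - tail_emb_main * (head_emb_tilde - r_hat + offset)
    + r
)

assert torch.allclose(original, refactored, atol=1e-5)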
besskge/scoring.py (Outdated)
R_h = head_emb_main * (tail_emb_tilde + self.offset).unsqueeze(1)
R_t = tail_emb_main.unsqueeze(1) * (head_emb_tilde + self.offset)
R_r = r_bar.unsqueeze(1) * head_emb_main + r.unsqueeze(1) + r_hat.unsqueeze(1) * tail_emb_main.unsqueeze(1)
return -self.reduce_embedding(R_h - R_t + R_r)
Same as above, this could become
head_emb_main * (tail_emb_tilde + r_bar + self.offset).unsqueeze(1) - tail_emb_main.unsqueeze(1) * (head_emb_tilde - r_hat.unsqueeze(1) + self.offset) + r.unsqueeze(1)
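The same folding goes through with the broadcast shapes used here; a quick shape check, assuming (batch, n_heads, emb) head tensors and (batch, emb) tail/relation tensors (illustrative names, scalar offset):

import torch

B, N, emb = 2, 3, 8
head_emb_main, head_emb_tilde = torch.randn(B, N, emb), torch.randn(B, N, emb)
tail_emb_main, tail_emb_tilde = torch.randn(B, emb), torch.randn(B, emb)
r, r_bar, r_hat = torch.randn(B, emb), torch.randn(B, emb), torch.randn(B, emb)
offset = 1.0

refactored = (
    head_emb_main * (tail_emb_tilde + r_bar + offset).unsqueeze(1)
    - tail_emb_main.unsqueeze(1) * (head_emb_tilde - r_hat.unsqueeze(1) + offset)
    + r.unsqueeze(1)
)
assert refactored.shape == (B, N, emb)  # one score vector per candidate head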
besskge/scoring.py (Outdated)
R_h = head_emb_main.unsqueeze(1) * (tail_emb_tilde + self.offset)
R_t = tail_emb_main * (head_emb_tilde + self.offset).unsqueeze(1)
R_r = r_bar.unsqueeze(1) * head_emb_main.unsqueeze(1) + r.unsqueeze(1) + r_hat.unsqueeze(1) * tail_emb_main
return -self.reduce_embedding(R_h - R_t + R_r)
Similar to above
docs/source/KGbib.bib (Outdated)
@@ -184,3 +184,11 @@ @article{InterHT
journal = {arXiv preprint arXiv:2202.04897},
year = {2022}
}

@inproceedings{TranS,
title ={TranS: Transition-based Knowledge Graph Embedding with Synthetic Relation Representation},
I realize we are not very consistent with this, but I would suggest putting the whole title (or at least TranS) between double braces {{...}}, otherwise the capitalization could get messed up when building the docs.
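For example, either variant would protect the capitalization:

title = {{TranS}: Transition-based Knowledge Graph Embedding with Synthetic Relation Representation},
title = {{TranS: Transition-based Knowledge Graph Embedding with Synthetic Relation Representation}},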
Great thanks @AlCatt91, I have fixed this now.
besskge/scoring.py (Outdated)
== self.relation_embedding.shape[-1] / 3
== embedding_size
), (
"Trans requires `2*embedding_size` embedding parameters for each entity"
Missing capital S in Trans
Fixed
Looks great, thanks for the fixes! One last thing: could you please format the code so that the CI formatting checks pass?
Remove whitespace
@AlCatt91 Hopefully this should be ready to run now :)
This is perfect! I'll proceed to merge, thanks again for the PR :)
Here we have added the TranS model to the BESS library.
Link to the TranS paper: https://aclanthology.org/2022.findings-emnlp.86.pdf