It seems the code author misunderstood the loss from the original paper?? #30

Open
lyancynthia opened this issue Nov 15, 2021 · 2 comments
@lyancynthia

I'm looking at EGES_model.py. With def make_skipgram_loss() written the way the code author wrote it, my understanding is that it maximizes the co-occurrence probability between node v's embedding $H_v$ and the id of node v's context node u??
But the loss in equation (8) of the original paper is $\mathcal{L}(v,u,y) = -\left[\, y \log\left(\sigma(H_v^{\top} Z_u)\right) + (1-y)\log\left(1-\sigma(H_v^{\top} Z_u)\right) \,\right]$...

Am I misunderstanding this, or did the code author get it wrong?

Also, what is the _dataset version of the .py file for?
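
For reference, Eq. (8) is just a per-pair sigmoid cross-entropy, so in TensorFlow it maps directly onto tf.nn.sigmoid_cross_entropy_with_logits. Here is a minimal sketch of what I'd expect the loss to look like (the names `H_v`, `Z_u` and the batch layout are my assumptions, not the repository's actual variables):

```python
import tensorflow as tf

def eq8_loss(H_v, Z_u, y):
    """H_v, Z_u: [batch, dim] embeddings; y: [batch] 0/1 labels."""
    logits = tf.reduce_sum(H_v * Z_u, axis=1)  # per-pair H_v^T Z_u
    # -[ y*log(sigmoid(logits)) + (1-y)*log(1-sigmoid(logits)) ]
    losses = tf.nn.sigmoid_cross_entropy_with_logits(
        labels=tf.cast(y, tf.float32), logits=logits)
    return tf.reduce_mean(losses)
```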

@nanffiy

nanffiy commented Dec 27, 2021

> Also, what is the _dataset version of the .py file for?

The _dataset version uses tf.data in the data-loading part, which makes training and experimenting on large datasets more convenient.
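
Roughly, that means the input pipeline streams pairs from disk instead of holding everything in memory. A hedged sketch of such a tf.data pipeline (the file name, delimiter, and column layout are illustrative assumptions, not the repo's actual format):

```python
import tensorflow as tf

def make_dataset(pair_file="pairs.csv", batch_size=512):
    # Each line is assumed to be "center_node_id,context_node_id".
    ds = tf.data.TextLineDataset(pair_file)
    ds = ds.map(lambda line: tf.strings.to_number(
        tf.strings.split(line, ","), tf.int64))
    # Shuffle, batch, and prefetch so training streams from disk.
    return ds.shuffle(10_000).batch(batch_size).prefetch(tf.data.AUTOTUNE)
```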

@Fyhyuky-FONTA

Fyhyuky-FONTA commented Oct 8, 2024

> Am I misunderstanding this, or did the code author get it wrong?

Yes, it is written incorrectly. The author's loss function uses an additional weight vector to assist the computation, whereas the original paper does it with the parameter matrix obtained through attention-based aggregation over the node's embeddings.
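
In other words, the paper first builds $H_v$ by attention-weighted aggregation of node v's side-information embeddings (Eqs. (6)-(7)) and only then plugs $H_v$ into Eq. (8). A hedged sketch of that aggregation step (variable names are illustrative, not the repository's):

```python
import tensorflow as tf

def aggregate_side_info(W_v, a_v):
    """W_v: [batch, num_side_info, dim] side-information embeddings of node v;
    a_v: [batch, num_side_info] learnable per-node attention logits."""
    attn = tf.nn.softmax(a_v, axis=1)  # e^{a_v^j} / sum_j e^{a_v^j}
    # Weighted sum over side-information channels -> [batch, dim]
    return tf.reduce_sum(attn[:, :, None] * W_v, axis=1)
```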
