Sorry to post this; I have a big gap in my deep-learning knowledge, so I hope you don't mind my asking.

May I get to my point? My goal is to extract the representation from a Graphormer model for use in downstream tasks. Every time I run the same input graph through the model and take the last hidden state, I get different numbers. I have also encoded the Euclidean distance in the shortest-path (spatial) encoding.

Is it normal for the last hidden state to be different on every run, or do I have to train the model first?

Thank you for reading my post, and please clarify and forgive me if I have misunderstood something.
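For context, here is a minimal PyTorch sketch of the two usual causes of this behavior: dropout being active in training mode, and freshly randomized (untrained) weights. It uses a generic `nn.TransformerEncoder` as a stand-in for Graphormer; all names, shapes, and hyperparameters are illustrative, not the actual Graphormer API.

```python
import torch
import torch.nn as nn

# Minimal stand-in for a Graphormer-style encoder (illustrative only):
# a Transformer encoder with dropout, which is the usual source of
# run-to-run variation when the model is left in training mode.
torch.manual_seed(0)  # make the random weight initialization reproducible

encoder = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model=64, nhead=4, dropout=0.1, batch_first=True),
    num_layers=2,
)
x = torch.randn(1, 10, 64)  # 1 graph, 10 nodes, 64-dim node features (placeholder input)

# In train mode, dropout is active, so two passes over the same input differ:
encoder.train()
a, b = encoder(x), encoder(x)
print(torch.allclose(a, b))  # False -- the dropout masks differ between passes

# In eval mode, dropout is disabled and the forward pass is deterministic:
encoder.eval()
with torch.no_grad():
    a, b = encoder(x), encoder(x)
print(torch.allclose(a, b))  # True -- same input, same last hidden state
```

Note that even with `eval()` making the forward pass deterministic, a randomly initialized encoder produces essentially arbitrary features, so pretrained or fine-tuned weights are needed before the representations are useful downstream.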
feyhong1112 changed the title from "[Question] Does the Transformer encoder need training to get a new representation of a graph?" to "[Question] Does the Graphormer encoder need training to get a new representation of a graph?" on Oct 31, 2024.