
Word Embeddings in Non-static Mode #37

Open
ndrmahmoudi opened this issue May 18, 2017 · 2 comments

Comments

@ndrmahmoudi

ndrmahmoudi commented May 18, 2017

Hi @yoonkim,

Thanks a lot for the code and the useful comments. I have one question about the word embeddings after the training process in non-static mode. Is there a way to export the back-propagated word vectors in your code? Or, as a more general question, is it possible to just back-propagate the word vectors on a labelled dataset to see how their dimensions change (especially for sentiment classification and the polarity of words)?

Regards,
Nader
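A sketch of one way to do this: in a Theano-style implementation the fine-tuned embedding matrix typically lives in a shared variable (often named `Words`), so after training you can pull out its value with `Words.get_value()` and dump it in word2vec text format for inspection. The names `words_matrix` and `word_idx_map` below are hypothetical stand-ins for the trained embedding matrix and the word-to-row index mapping; the toy vocabulary and random matrix are just for illustration.

```python
import numpy as np

def export_embeddings(words_matrix, word_idx_map, path):
    """Write fine-tuned vectors in word2vec text format:
    a header line "<vocab_size> <dim>", then one word per line
    followed by its vector components."""
    dim = words_matrix.shape[1]
    with open(path, "w") as f:
        f.write("%d %d\n" % (len(word_idx_map), dim))
        # emit words in index order so rows and lines stay aligned
        for word, idx in sorted(word_idx_map.items(), key=lambda kv: kv[1]):
            vec = " ".join("%.6f" % v for v in words_matrix[idx])
            f.write("%s %s\n" % (word, vec))

# Toy example: 3-word vocab, 4-dim vectors. With the real model you
# would instead pass Words.get_value() (the matrix after training).
rng = np.random.RandomState(0)
W = rng.randn(4, 4).astype("float32")  # row 0 is commonly a padding row
vocab = {"good": 1, "bad": 2, "movie": 3}
export_embeddings(W, vocab, "tuned_vecs.txt")
```

To see how the vectors moved during training, export the matrix both before and after training with the same `word_idx_map` and compare rows (e.g. cosine similarity per word); for polarity, words like "good" and "bad" would be the natural ones to track.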

@haobangpig

haobangpig commented Dec 4, 2017

Hi @ndrmahmoudi, did you solve this question? I am confused by this problem right now. Do you have any idea?

@ghost

ghost commented Jun 13, 2019

Hello. Did you guys solve this question?
