
Memory error while loading wiki-news model for incremental learning #4

Open
vdpappu opened this issue Aug 14, 2018 · 3 comments

vdpappu commented Aug 14, 2018

Hi Eric,
Thanks for the excellent enhancement. I am trying to use your repo for incremental learning, but I am getting a memory error while running the script. My machine has 32 GB of RAM, and I am otherwise able to load the pre-trained model for inference tasks.

[screenshot of the memory error]

Pre-trained model size: 6.8 GB
Command executed:
./fasttext skipgram -input /home/aaa/Downloads/datasets/nlu/sed_sof_corpus.txt -inputModel /home/aaa/Downloads/datasets/wiki-news-300d-1M-subword.bin -output sed_sof_trlearn -incr
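
For reference, a minimal load check outside the -incr path can rule out the plain model load itself; this is only a sketch using upstream fastText's public C++ API (loadModel / getDimension), assuming the fork keeps those signatures and the library headers are available:

// load_check.cc - confirm the .bin loads on its own (link against libfasttext)
#include <iostream>
#include <new>
#include "fasttext.h"

int main(int argc, char** argv) {
  if (argc < 2) {
    std::cerr << "usage: " << argv[0] << " <model.bin>" << std::endl;
    return 1;
  }
  fasttext::FastText ft;
  try {
    ft.loadModel(argv[1]);  // same .bin as passed to -inputModel
    std::cout << "loaded, dim = " << ft.getDimension() << std::endl;
  } catch (const std::bad_alloc&) {
    std::cerr << "out of memory while loading the model" << std::endl;
    return 1;
  }
  return 0;
}

If this succeeds but the -incr run still fails, the extra allocation happens in the incremental-training path rather than in the model load.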

ericxsun (Owner) commented

That's weird. I have not encountered such an error, even though my dataset is larger than yours. I'll check it.

You could try to add more logging to locate the line of code that causes the error. Any effort will be appreciated.
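
One low-effort way to add that information is to wrap each stage of the -incr run in a handler that reports std::bad_alloc; the stage names and calls below are hypothetical and would need to be adapted to the fork's main.cc:

// Sketch: report which stage ran out of memory instead of aborting silently.
#include <iostream>
#include <new>
#include <stdexcept>

template <typename Fn>
void runStage(const char* name, Fn&& fn) {
  try {
    fn();
  } catch (const std::bad_alloc&) {
    std::cerr << "out of memory during stage: " << name << std::endl;
    throw;
  } catch (const std::exception& e) {
    std::cerr << "error during stage " << name << ": " << e.what() << std::endl;
    throw;
  }
}

// hypothetical usage inside main.cc:
//   runStage("load pre-trained model", [&] { fasttext.loadModel(modelPath); });
//   runStage("incremental training",   [&] { fasttext.train(args); });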


vdpappu commented Aug 20, 2018 via email


vdpappu commented Aug 23, 2018

I am able to run the same on a Mac; maybe it was some issue with my old machine. However, it is not taking the set learning rate. It always shows 0.0000.

[screenshot of the training progress line showing lr 0.0000]
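
For context, upstream fastText prints a decayed learning rate rather than the flag value: the progress line shows lr * (1 - progress), roughly as in the sketch below (whether the -incr fork changes the underlying counters is only a guess to verify):

// Sketch of the lr shown in upstream fastText's progress line (trainThread in src/fasttext.cc).
#include <cstdint>

double displayedLr(double baseLr, int64_t tokenCount, int64_t epoch, int64_t ntokens) {
  double progress = static_cast<double>(tokenCount) / (epoch * ntokens);
  return baseLr * (1.0 - progress);  // prints as 0.0000 once progress reaches 1
}

So a constant 0.0000 could mean the progress counters already read as complete, for example if token counts carried over from the pre-trained model inflate tokenCount relative to ntokens; that is speculation and worth checking against the fork's trainThread.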
