A request for evaluation script #2
Hi,

The evaluation code should be very straightforward. For topic coherence evaluation, you could pass the output topics to the evaluation pipeline of the gensim library. For document clustering, you could refer to the README file.

I hope this helps!

Best,
Hello,

Thanks for your reply. And I called it using

Best wishes,
Hi,

The results look quite off from those reported in the paper, and I believe there are a few points you could double check:

I hope these are helpful.

Best,
Hello,

Thanks for your quick reply.

Best,
Hello,

I have fixed the issue by evaluating topic coherence with the top-5 words, but the UMass score dropped to -6.49. As for the UCI score, I cannot figure out how to evaluate the topics on Wikipedia with the gensim library. Should I pass the whole Wiki data into the

Best,
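For reference, the UMass metric discussed above can also be computed directly from document co-occurrence counts without any library. This stdlib-only sketch (the function name and toy corpus are illustrative, not from the repository) follows the standard formula, averaging log((D(wi, wj) + 1) / D(wj)) over ordered word pairs:

```python
import math

def u_mass(topic_words, documents):
    """UMass coherence of one topic: mean of log((D(wi, wj) + 1) / D(wj))
    over ordered word pairs, where D counts documents containing the words."""
    doc_sets = [set(doc) for doc in documents]

    def doc_freq(*words):
        return sum(1 for d in doc_sets if all(w in d for w in words))

    score, pairs = 0.0, 0
    for i in range(1, len(topic_words)):
        for j in range(i):
            wi, wj = topic_words[i], topic_words[j]
            dj = doc_freq(wj)
            if dj:  # skip conditioning words absent from the reference corpus
                score += math.log((doc_freq(wi, wj) + 1) / dj)
                pairs += 1
    return score / pairs if pairs else 0.0

docs = [["cat", "dog"], ["cat", "dog"], ["cat", "fish"]]
print(u_mass(["cat", "dog"], docs))
```

Because the pair count grows quadratically in the number of top words, scoring with the top 5 words versus the top 10 or 20 can shift the average considerably, which may explain part of the discrepancy.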
Hello,
Could you please share the script used for evaluation? The results obtained may differ across evaluation scripts.
Thanks a lot!
Best wishes,
Lei