
Evaluation of Code-to-Code translation #151

Open
pkuzqh opened this issue Feb 26, 2023 · 0 comments
pkuzqh commented Feb 26, 2023

Hi,
There is something a little strange in the evaluation of the code-to-code translation subtask. The evaluation script uses `split` directly to tokenize the code, which affects the BLEU score calculation. It would be more proper to use a dedicated tool (javalang, tree-sitter) to tokenize the code. I can provide the related code and the tokenized data if you need them.
