I am curious as to why this dataset is not open for contributions so that it can keep evolving. Yes, "164 hand-written programming problems" is a good start, but more is certainly better, especially since all of the problems seem to focus on algorithms. By opening the dataset up for contributions, you could crowd-source this problem. Obviously, contributions would have to meet a certain standard to avoid degrading the quality of the dataset, but that is not hard to achieve.

One issue that might arise from allowing contributions is that it could make the dataset harder to reference in papers, but surely a simple versioning system would solve that. Authors could then say something like: "We achieved X% accuracy on HumanEval version Y."
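To make the versioning idea concrete, here is a minimal sketch of how a versioned problem file could be consumed. The file name, the `added_in` field, and the `load_problems` helper are all hypothetical, purely to illustrate how contributed problems could be tagged with the version they were introduced in so that older versions remain reproducible for citation:

```python
import json


def parse_version(v: str) -> tuple:
    """Turn a version string like "1.2" into a comparable tuple (1, 2)."""
    return tuple(int(part) for part in v.split("."))


def load_problems(path: str = "HumanEval-v1.1.jsonl", version: str = "1.1") -> dict:
    """Load only the problems that existed at the requested dataset version.

    Hypothetical sketch: assumes each contributed problem carries an
    "added_in" field recording the dataset version it was added in.
    """
    requested = parse_version(version)
    problems = {}
    with open(path) as f:
        for line in f:
            problem = json.loads(line)
            # Problems added after the requested version are skipped, so a
            # paper citing "HumanEval version Y" can be re-evaluated exactly.
            if parse_version(problem.get("added_in", "1.0")) <= requested:
                problems[problem["task_id"]] = problem
    return problems


if __name__ == "__main__":
    v1_1 = load_problems(version="1.1")
    print(f"Loaded {len(v1_1)} problems for HumanEval v1.1")
```

Pinning a version at load time like this would keep reported numbers comparable even as new community-contributed problems are merged.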