
Build benchmark software / evaluator that uses historic competition games #99

Open · Vadman97 opened this issue on Apr 18, 2019 · 0 comments
Assignee: Vadman97
Labels: ai, necessary feature (This is a feature that needs implementing)


@Vadman97 (Owner) commented:

Use difficult positions that have occurred in grandmaster games and check whether the engine can find the known best moves.
Use this to time the performance of the parallel search at different depths and with different core counts (see the sketch below).

Vadman97 added the necessary feature label on Apr 18, 2019
Vadman97 added this to the Ready for Demo milestone on Apr 18, 2019
Vadman97 self-assigned this on Apr 18, 2019
Vadman97 added the ai label on Apr 18, 2019
Vadman97 removed this from the Ready for Demo milestone on Apr 22, 2019