diff --git a/CALL_FOR_SUBMISSIONS.md b/CALL_FOR_SUBMISSIONS.md
index 0e21f0e9c..c5fff46d7 100644
--- a/CALL_FOR_SUBMISSIONS.md
+++ b/CALL_FOR_SUBMISSIONS.md
@@ -17,7 +17,7 @@ Submissions can compete under two hyperparameter tuning rulesets (with separate
 
 - **Registration deadline to express non-binding intent to submit: February 28th, 2024**.\
   Please fill out the (mandatory but non-binding) [**registration form**](https://forms.gle/K7ty8MaYdi2AxJ4N8).
 - **Submission deadline: April 04th, 2024** *(moved by a week from the initial March 28th, 2024)*
-- [tentative] Announcement of all results: July 15th, 2024
+- [Announcement of all results](https://mlcommons.org/2024/08/mlc-algoperf-benchmark-competition/): August 1st, 2024
 
 For a detailed and up-to-date timeline see the [Competition Rules](/COMPETITION_RULES.md).
diff --git a/README.md b/README.md
index 0cb8b7aca..5a1f10a33 100644
--- a/README.md
+++ b/README.md
@@ -27,10 +27,7 @@
 
 ---
 
 > [!IMPORTANT]
-> Submitters are no longer required to self-report results.
-> We are currently in the process of evaluating and scoring received submissions.
-> Results coming soon!
-> For other key dates please see [Call for Submissions](CALL_FOR_SUBMISSIONS.md).
+> The results of the inaugural AlgoPerf: Training Algorithms benchmark competition have been announced. See the [MLCommons blog post](https://mlcommons.org/2024/08/mlc-algoperf-benchmark-competition/) for an overview and the [results page](https://mlcommons.org/benchmarks/algorithms/) for more details on the results. We are currently preparing an in-depth analysis of the results in the form of a paper and plan the next iteration of the benchmark competition.
 
 ## Table of Contents