Where do we store the output of the benchmark for the current main? #73
Comments
@msschwartz21 I think this is for you.
The benchmarking data from the workflows that run on main are saved in lines 664 to 733 in a8ba026. This action doesn't explicitly generate any kind of diff table like we do in the PR workflow, but that's something we could add if needed.
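A diff table like the one mentioned above could be sketched roughly as follows. This is a hypothetical illustration, not the actual PR workflow's implementation: the JSON schema (benchmark name mapped to a mean runtime in seconds) and the benchmark names are assumptions.

```python
import json

# Hypothetical payloads; the real output.json schema may differ.
main_results = json.loads('{"bench_ctc_matched": 1.20, "bench_divisions": 0.45}')
pr_results = json.loads('{"bench_ctc_matched": 1.32, "bench_divisions": 0.44}')


def diff_table(before: dict, after: dict) -> list[str]:
    """Return 'name | before | after | % change' rows for benchmarks in both runs."""
    rows = []
    for name in sorted(set(before) & set(after)):
        change = (after[name] - before[name]) / before[name] * 100
        rows.append(f"{name} | {before[name]:.2f}s | {after[name]:.2f}s | {change:+.1f}%")
    return rows


for row in diff_table(main_results, pr_results):
    print(row)
```

Computing the percentage change against the main-branch baseline is what makes regressions stand out at a glance in a PR comment.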
Ok, thanks. I think what we have in place is already great and will help us avoid introducing slowdowns. If we find ourselves working heavily on performance improvements, we might want to come back to this.
I'm wondering where we are storing the results of the benchmarks for the current main; I'm a bit lost. I would like to look at the table that the benchmark workflow produces, supposedly named output.json: https://github.com/Janelia-Trackathon-2023/traccuracy/blob/main/.github/workflows/benchmark-report.yml

Things I do find: on the gh-pages branch there is a csv file with all the benchmark results ever obtained by GitHub Actions.
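A cumulative CSV like the one on gh-pages can be reduced to the latest result per benchmark with the standard library. This is a minimal sketch under assumed column names (commit, benchmark, seconds); the actual file's columns may differ.

```python
import csv
import io

# Hypothetical append-only CSV; newer runs are appended at the bottom.
csv_text = """commit,benchmark,seconds
a8ba026,bench_ctc_matched,1.20
a8ba026,bench_divisions,0.45
b1c2d3e,bench_ctc_matched,1.18
"""


def latest_results(text: str) -> dict:
    """Keep the last row seen for each benchmark name."""
    latest = {}
    for row in csv.DictReader(io.StringIO(text)):
        latest[row["benchmark"]] = float(row["seconds"])
    return latest


print(latest_results(csv_text))
```

Because the file is append-only, iterating top to bottom and overwriting each key leaves exactly the most recent measurement for every benchmark.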