Is your feature request related to a problem? Please describe.
I run a benchmark and store the results in JSON format. I'd like to have an option to aggregate the repetitions of the same benchmark from multiple JSON files.
Describe the solution you'd like
compare.py merge a.json b.json
Describe alternatives you've considered
I can write such a script myself. But maybe such a solution already exists and I just can't find it.
I think that would be something that would need to be contributed, but I wouldn't add it to compare.py; I'd create a new aggregate.py or something.
But how would the metrics be aggregated? When you say "repetitions", I don't think you mean it in the same sense it's used in the library. Are these JSON files different versions of the code, or just reruns of the same code?
By repetitions I mean --benchmark_repetitions=N. Let's say I have one result with 10 repetitions recorded, and one result with 20 repetitions recorded. The aggregated result should have 30 repetitions.
If the result files also contain stats over the repetitions, those would need to be recomputed or discarded.
I plan to run the same code but over a long period of time, possibly interleaved with some other operations.
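For reference, here is a minimal sketch of what such an aggregate.py could look like. It assumes the usual layout produced by --benchmark_out_format=json (a "context" object plus a "benchmarks" array whose entries carry "name", "run_type", "repetitions" and "repetition_index"); field handling may need adjusting for older library versions. It concatenates the per-repetition rows and simply drops the precomputed aggregates (mean/median/stddev) instead of recomputing them:

```python
#!/usr/bin/env python3
"""Sketch: merge per-repetition results from several Google Benchmark
JSON files into one. Precomputed aggregate rows are discarded rather
than recomputed."""

import json
import sys
from collections import defaultdict


def merge(paths):
    merged = None
    per_name = defaultdict(int)  # total repetition count per benchmark name

    for path in paths:
        with open(path) as f:
            data = json.load(f)
        if merged is None:
            # Reuse the context block from the first file.
            merged = {"context": data["context"], "benchmarks": []}
        for run in data["benchmarks"]:
            # Keep only raw per-repetition rows; precomputed aggregates
            # would be stale after merging, so they are skipped here.
            if run.get("run_type", "iteration") == "aggregate":
                continue
            per_name[run["name"]] += 1
            merged["benchmarks"].append(run)

    # Renumber repetitions per benchmark name and record the new totals,
    # so downstream tools see e.g. 30 repetitions after merging 10 + 20.
    seen = defaultdict(int)
    for run in merged["benchmarks"]:
        run["repetition_index"] = seen[run["name"]]
        seen[run["name"]] += 1
        run["repetitions"] = per_name[run["name"]]
    return merged


if __name__ == "__main__":
    json.dump(merge(sys.argv[1:]), sys.stdout, indent=2)
```

Usage would be something like `python aggregate.py a.json b.json > merged.json`, after which the merged file can be fed to the existing tooling.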