openbenchmarking.org does a good job of recording system specs as benchmark metadata. http://openbenchmarking.org/result/1405134-PL-AIOLAPTOP47
I'd like to do the same for our benchmarks: either transliterate their code to Python (if it's simple enough) or vendor the PHP.
https://gitorious.org/phoronix/phoronix-test-suite/source/master:pts-core/objects/phodevi/components
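A minimal sketch of recording system specs as benchmark metadata, using only the standard-library `platform` module. The function name and the set of fields are illustrative assumptions, not an existing cheetah or phodevi API:

```python
# Collect basic hardware/OS details to store alongside benchmark results.
# Field names here are illustrative; a real port of phodevi would record
# far more detail (CPU model, memory, disk, etc.).
import json
import platform


def system_specs():
    """Return a dict of system details to save as benchmark metadata."""
    return {
        "machine": platform.machine(),      # e.g. x86_64
        "processor": platform.processor(),
        "system": platform.system(),        # e.g. Linux
        "release": platform.release(),
        "python": platform.python_version(),
    }


if __name__ == "__main__":
    print(json.dumps(system_specs(), sort_keys=True, indent=2))
```

Dumping the dict with `sort_keys=True` keeps the serialized metadata deterministic, which matters if it is later hashed to identify a machine.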
Then we'd try to get benchmarks from a particular machine (type) into separate folders. They'd be named either by a hash of the system data, or by some more human-readable subset of the system data. I'd really like to get to a place where we can detect regressions on Travis, at least when we hit previously-seen hardware.
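The two folder-naming options above could be sketched like this; the helper name and the exact spec fields are hypothetical, and the hash is truncated purely for readability:

```python
# Derive a per-machine folder name from system-spec metadata: either a
# short hash of the canonicalized data, or a human-readable subset of it.
import hashlib
import json


def machine_folder(specs, human_readable=False):
    """Return a folder name identifying this machine (type)."""
    if human_readable:
        # Readable, but a coarse subset -- distinct CPUs may collide.
        return "{0}-{1}".format(specs["system"], specs["machine"])
    # Canonical JSON (sorted keys) so the same specs always hash the same.
    blob = json.dumps(specs, sort_keys=True).encode("utf-8")
    return hashlib.sha1(blob).hexdigest()[:12]


specs = {"system": "Linux", "machine": "x86_64", "cpu": "i7-4770"}
print(machine_folder(specs, human_readable=True))  # Linux-x86_64
print(machine_folder(specs))                       # 12-hex-char hash
```

On a CI run, hashing the current machine's specs and checking whether a folder with that name already holds results is what would make "previously-seen hardware" detectable.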
The long-long-term goal would be some kind of benchmarking-helper library that we could use both in cheetah and other performance-sensitive projects.