
Use Julia package for benchmarking #1698

Open
Jingru923 opened this issue Aug 6, 2024 · 2 comments

Assignees: @visr
Labels: core (Issues related to the computational core in Julia), improvement (Improvements of the usability of existing functionality), test (Relates to unit testing)

Comments

@Jingru923 (Contributor)

In issue #462, a regression test was set up to compare the output of a run with the benchmark.

@visr suggests adding runtime to the benchmark and using one of the existing Julia packages to present the benchmark results better.

@Jingru923 added the test, core, and improvement labels Aug 6, 2024
@Jingru923 added this to Ribasim Aug 6, 2024
github-project-automation bot moved this to To do in Ribasim Aug 6, 2024

@Huite (Contributor) commented Aug 6, 2024

This one? https://github.com/JuliaCI/PkgBenchmark.jl

I'm aware of asv in Python, and it seems there's quite a lot involved in getting consistent and comparable benchmarks.
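
If we go that route, the suite would just be a BenchmarkTools `SUITE` in `benchmark/benchmarks.jl`; a minimal sketch (the group and entry here are placeholders, not actual Ribasim benchmarks):

```julia
using BenchmarkTools

# PkgBenchmark.jl looks for a `SUITE` defined in benchmark/benchmarks.jl.
const SUITE = BenchmarkGroup()

# Placeholder entry; real entries would time Ribasim model runs.
SUITE["placeholder"] = BenchmarkGroup()
SUITE["placeholder"]["sort"] = @benchmarkable sort(x) setup=(x = rand(1000))
```

Running the suite and comparing two commits would then go through PkgBenchmark's `benchmarkpkg` and `judge`, if I read its docs correctly.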

@visr (Member) commented Aug 6, 2024

PkgBenchmark.jl has been around for a while but doesn't seem very actively maintained. Also, for CI integration it recommends an unmaintained package.

PkgJogger.jl seems like a better-maintained alternative, with nicer local workflows and easier CI integration.
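
The local workflow looks roughly like this (a sketch based on its README; the Ribasim module name and the `benchmark/bench_*.jl` layout are what I'd expect, not something I've tested here):

```julia
# Benchmarks would live in benchmark/bench_*.jl files, each defining a `suite`.
using PkgJogger

@jog Ribasim                                    # generates a JogRibasim module
results = JogRibasim.benchmark()                # run the full suite locally
filename = JogRibasim.save_benchmarks(results)  # save results for later comparison
# JogRibasim.judge(new_results, old_results) would then compare two saved runs
```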

Though less mature, Lilith's work in https://chairmarks.lilithhafner.com/v1.2.2/regressions seems very nice as well. Perhaps I'd try that first.
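
Chairmarks itself is just the measurement layer; basic usage is a one-liner, and the regression workflow on that page builds on top of measurements like this (sketch, not Ribasim-specific):

```julia
using Chairmarks

# Time `sort` on a fresh random vector; the first argument is setup
# and is excluded from the measurement.
@b rand(1000) sort
```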

In terms of published benchmark tracking, there is https://github.com/SciML/SciMLBenchmarks.jl, but comparing commits over time, as https://lux.csail.mit.edu/benchmarks/ does, also looks nice.

@visr changed the title from "Investigate Julia package for benchmarking" to "Use Julia package for benchmarking" Sep 3, 2024
@visr self-assigned this Oct 10, 2024

Projects: Ribasim (Status: To do)
Development: No branches or pull requests
3 participants