
Address partly a test failing due to stochasticity #129

Merged 5 commits from review/failing_tests into joss-paper on Apr 29, 2024

Conversation

@Bachibouzouk (Collaborator)
Related to #99

The test suite needs to be run from the root of the repository.

Issue: it was not easy to find out that there were extra dependencies for testing.

Solution: mention them explicitly in CONTRIBUTING.md.

Issue: the test was failing randomly when assessing whether the sampled coincidences followed a normal distribution. This is because we apply math.ceil to random.gauss to get an integer number of appliances to be switched on simultaneously.

Solution: build an experimental probability density function, fit a normal distribution to it, and check how large the errors on the mean and standard deviation are, as a proxy for visual inspection of the graph.
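The fitting-based check described above can be sketched as follows. This is a minimal illustration, not RAMP's actual test code: mu, sigma, and the sample size are made-up values, and the "fit" is simply estimating the normal's parameters from the sample.

```python
import math
import random
import statistics

random.seed(42)
mu, sigma = 5, 2  # illustrative target distribution parameters

# Draw integer switch-on counts the way the PR describes:
# math.ceil applied to Gaussian draws.
samples = [math.ceil(random.gauss(mu, sigma)) for _ in range(100_000)]

# Fit a normal distribution by estimating its parameters from the
# sample, then measure how far the fitted mean/std drift from the targets.
fitted_mu = statistics.fmean(samples)
fitted_sigma = statistics.pstdev(samples)
mean_error = abs(fitted_mu - mu)
std_error = abs(fitted_sigma - sigma)

# math.ceil shifts the mean upward by roughly +0.5, the kind of systematic
# offset that makes a naive normality check fail intermittently.
print(f"mean error: {mean_error:.3f}, std error: {std_error:.3f}")
```

Comparing mean_error and std_error against explicit tolerances stands in for visually inspecting the fitted curve, which is what makes the test reproducible.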
@trevorb1

Hi @Bachibouzouk; thanks, the tests all worked fine for me, now! :)

@FLomb (Contributor) left a comment


The updated test looks good, but the -0.5 correction in calc_coincident_switch_on still looks wrong to me. I have attached two new detailed examples of what I get with and without it. I ran the check not only on the histogram, which we were sceptical about due to the subjectivity of the bin sizes, but also on a kernel density estimate of the same RAMP results. Both visualisations agree that performance is better without the -0.5 correction.
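For context, here is a minimal numerical sketch of what the -0.5 shift does to ceiled Gaussian draws (illustrative values only; which variant matches RAMP's intended output is exactly what the attached examples assess):

```python
import math
import random
import statistics

random.seed(0)
mu, sigma, n = 5, 2, 100_000  # made-up illustrative values

# math.ceil alone biases the mean of Gaussian draws upward by ~0.5 ...
plain = [math.ceil(random.gauss(mu, sigma)) for _ in range(n)]
# ... while subtracting 0.5 before ceiling re-centres the mean on mu.
shifted = [math.ceil(random.gauss(mu, sigma) - 0.5) for _ in range(n)]

print(statistics.fmean(plain), statistics.fmean(shifted))
```

In isolation the correction cancels the ceiling bias; the comparison against RAMP's actual output distributions is what indicated the model performs better without it.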

Review comment on ramp/core/core.py (resolved)
@Bachibouzouk force-pushed the review/failing_tests branch from b6f9043 to 4559bcf on April 29, 2024 09:52
@Bachibouzouk (Collaborator, Author)

The updated test looks good, but the -0.5 correction in calc_coincident_switch_on looks still wrong to me.

Sorry @FLomb - I had not reworked this PR since our last discussion; I have now removed the correction.

@Bachibouzouk Bachibouzouk merged commit 821749a into joss-paper Apr 29, 2024
1 check passed
@Bachibouzouk Bachibouzouk deleted the review/failing_tests branch April 29, 2024 09:56
Labels: bug (Something isn't working)
3 participants