
Flaky test on Windows for REM #2437

Closed · wants to merge 1 commit into main from rem_failure

Conversation

@purva-thakre (Contributor) commented on Jul 3, 2024:

Description

Fixes #2431 by increasing the sample size. Increasing the sample size does not increase the run time by much:

| Sample size | Time (seconds)      |
|-------------|---------------------|
| 1000        | 0.02572114800204872 |
| 10000       | 0.2219564580009319  |
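
For context, here is a minimal sketch of how timings like these can be reproduced, assuming the measured cost is dominated by the `np.random.choice` sampling step; the uniform probability vector and variable names below are illustrative and not taken from the mitiq test:

```python
import time

import numpy as np

# Illustrative setup: a uniform distribution over a handful of outcomes.
num_values = 4
probability_vector = np.full(num_values, 1.0 / num_values)

for samples in (1_000, 10_000):
    start = time.perf_counter()
    # Draw `samples` outcomes according to `probability_vector`.
    choices = np.random.choice(num_values, size=samples, p=probability_vector)
    elapsed = time.perf_counter() - start
    print(f"{samples:>6} samples: {elapsed:.6f} s")
```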

This PR also splits the docs build portion of the GitHub workflows from everything else, which lets us rerun the pytest workflows without also rerunning the docs job.


License

  • I license this contribution under the terms of the GNU GPL, version 3 and grant Unitary Fund the right to provide additional permissions as described in section 7 of the GNU GPL, version 3.


codecov bot commented on Jul 3, 2024:

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 98.70%. Comparing base (e61c058) to head (6c5f350).

Additional details and impacted files
@@           Coverage Diff           @@
##             main    #2437   +/-   ##
=======================================
  Coverage   98.70%   98.70%           
=======================================
  Files          88       88           
  Lines        4083     4083           
=======================================
  Hits         4030     4030           
  Misses         53       53           

☔ View full report in Codecov by Sentry.

@purva-thakre marked this pull request as ready for review on July 3, 2024, 13:26
@purva-thakre (Contributor, Author) commented:
I think increasing the sample size should fix the failures, i.e., this PR is technically ready.

I will try to rerun the workflow a couple of times to make sure of this. I can't test it locally because I don't have access to a Windows device.

@purva-thakre (Contributor, Author) commented:

The windows-python3.10 job is failing due to installation issues:

https://github.com/unitaryfund/mitiq/actions/runs/9778773532/job/27001295996?pr=2437#step:4:892

@purva-thakre (Contributor, Author) commented:

Running the tests 4 times did not lead to any failures like the one in the linked issue.

I noticed a macOS failure, which is different from the test in this PR:
https://github.com/unitaryfund/mitiq/actions/runs/9778773532/job/27007243754?pr=2437#step:6:4363

@purva-thakre (Contributor, Author) commented:

As discussed during the community call today, we want to wait for #2441 to be merged before rebasing this PR.

@jordandsullivan and @cosenal raised the point that we should look into what's causing this failure instead of simply raising the sample size to ensure the test passes.

@purva-thakre (Contributor, Author) commented:

The function in the failing unit test relies on np.random.choice, and the performance of NumPy on Windows is affected, as discussed in the NumPy repo.

choices = np.random.choice(num_values, size=samples, p=probability_vector)

I still cannot understand why the test would fail/pass at random on windows/python3.11 or windows/python3.12.
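
For what it's worth, here is a minimal sketch (not the mitiq test; the probability vector, tolerance-free deviation measure, and trial count are illustrative) of why a small sample size makes this kind of assertion flaky on any platform: the empirical frequencies returned by np.random.choice fluctuate around the target probability_vector with a spread that shrinks roughly as 1/sqrt(samples), so a fixed tolerance that 1000 samples only barely satisfies will occasionally be exceeded.

```python
import numpy as np

rng = np.random.default_rng(0)  # seeded generator for reproducibility

num_values = 4
probability_vector = np.array([0.1, 0.2, 0.3, 0.4])  # illustrative distribution

for samples in (1_000, 10_000):
    worst = 0.0
    for _ in range(100):
        choices = rng.choice(num_values, size=samples, p=probability_vector)
        empirical = np.bincount(choices, minlength=num_values) / samples
        # Track the worst per-outcome deviation from the target probabilities.
        worst = max(worst, float(np.max(np.abs(empirical - probability_vector))))
    print(f"{samples:>6} samples: worst deviation over 100 trials = {worst:.4f}")
```

In this sketch the worst-case deviation at 1000 samples is noticeably larger than at 10000 samples, which is consistent with a test that usually passes but occasionally trips its tolerance, and with a larger sample size making it pass reliably.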

@purva-thakre (Contributor, Author) commented:

Closing due to #2431 (comment).

@purva-thakre deleted the rem_failure branch on July 10, 2024, 01:50
Successfully merging this pull request may close this issue: Flaky test in rem/tests/test_inverse_confusion_matrix.py