
[BUG] Flaky test in permissions_spec.js in functional-tests dashboards repo #284

Closed
DarshitChanpura opened this issue Aug 31, 2022 · 7 comments · Fixed by #561
DarshitChanpura (Member) commented Aug 31, 2022

Intermittent test failure for a test in permissions_spec.js with an unknown root cause.

More specifically, cy.wait('@getPermissions') gets stuck waiting for the mocked request to finish.

It failed once during the 2.2.0 build and again recently during the 2.2.1 build.
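
For context, here is a minimal hypothetical sketch of the intercept/wait pattern involved (the route, URL, and response body below are illustrative assumptions, not copied from the actual spec). cy.wait('@getPermissions') can hang like this when the aliased request never fires, for example if the page issues the request before the intercept is registered:

// Hypothetical sketch, not the actual contents of permissions_spec.js.
// The intercept must be registered before the page triggers the request;
// otherwise cy.wait('@getPermissions') waits until it times out.
cy.intercept('GET', '**/api/permissionsinfo', {
  statusCode: 200,
  body: { data: {} },
}).as('getPermissions');

cy.visit('/app/security-dashboards-plugin#/permissions'); // the request fires during this visit
cy.wait('@getPermissions'); // hangs if the alias is never matched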

How can one reproduce the bug?
It is difficult because of the flakiness, but you can run this command to execute the tests in that file:

yarn cypress:run-plugin-tests-with-security --spec "cypress/integration/plugins/security/permissions_spec.js"

What is the expected behavior?
Tests should not have intermittent failures.

Do you have any screenshots?
security.zip

DarshitChanpura added the bug and untriaged labels on Aug 31, 2022
DarshitChanpura (Member, Author) commented:

This seems related: cypress-io/cypress#18524

peternied transferred this issue from opensearch-project/security on Sep 6, 2022
peternied (Member) commented:

This seems like it's an intermittent failure - sounds like Cypress isn't working correctly for this scenario.

RyanL1997 (Contributor) commented:

We just confirmed that this issue still exists in the current 2.5.0 release. I re-ran the test manually on my local machine, and it only seems to become unstable when running in the CI workflows.

peternied (Member) commented Feb 22, 2023

@RyanL1997 Could you include details on how you fixed the flakiness and how you confirmed it won't reproduce again?

This could look like:

Found a timing issue where multiple asynchronous actions X and Y were racing (#561). Confirmed this by running the test 10 times in a GitHub Action on my fork and saw zero failures, [link to test run report].

RyanL1997 (Contributor) commented:

@RyanL1997 Could you include details on how you fixed the flakiness and how you confirmed it won't reproduce again?

This could look like:

Found a timing issue where multiple asynchronous actions X and Y were racing (#561). Confirmed this by running the test 10 times in a GitHub Action on my fork and saw zero failures, [link to test run report].

I have run this test 30 times in a row on my local machine and none of them failed. However, since the original error is hard to reproduce, it is hard to compare the failure rate against this fix. I do have a run on my own fork (around 8 times?), and none of those runs failed because of the original flakiness; it only failed a couple of times due to the Windows runner connection issues on GitHub:

https://github.com/RyanL1997/security-dashboards-plugin/actions/runs/4230492547

Here is the experiment I did: I was trying to run it 10 times in a row, but it got cancelled on the 9th run because I was doing a rebase and it discarded the changes on main.
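
For reference, a rough sketch of how those repeated local runs can be scripted (this helper is hypothetical, not part of the repository; it just loops over the same yarn command from the reproduction steps):

// Hypothetical helper script, not part of the repository.
// Repeats the spec several times to gauge flakiness locally.
const { execSync } = require('child_process');

const RUNS = 10;
for (let i = 1; i <= RUNS; i++) {
  console.log(`Run ${i} of ${RUNS}`);
  // execSync throws on a non-zero exit code, so the loop stops at the first failing run.
  execSync(
    'yarn cypress:run-plugin-tests-with-security --spec "cypress/integration/plugins/security/permissions_spec.js"',
    { stdio: 'inherit' }
  );
}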

peternied (Member) commented:

Thanks - just in case this issue gets re-opened, we've got some context. Flaky tests are difficult to deal with at the best of times.

RyanL1997 (Contributor) commented:

Thanks - just in case this issue gets re-opened, we've got some context. Flaky tests are difficult to deal with at the best of times.

Yeah, 100% agree. I will keep tracking this.
