
Is there a build step before testing pytest-mpl? #232

Open
velle opened this issue Sep 26, 2024 · 3 comments · May be fixed by #234

Comments

@velle

velle commented Sep 26, 2024

Excuse my ignorance. On the few occasions I have cloned the source of a Python project, I have been able to run pytest or tox successfully without doing anything else.

When I clone pytest-mpl and run pytest, a lot of tests fail, with quite different types of failures. The same happens with tox.

Is there some kind of build step I should complete before running tox or pytest?

@Cadair
Contributor

Cadair commented Sep 26, 2024

What platform are you on? What tests fail?

@velle
Author

velle commented Sep 26, 2024

Ubuntu 22.04. Python 3.10.12. matplotlib version 3.5.1 (if it matters).
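
As an aside, environment details like those above can be collected with a short, illustrative snippet (this is a generic sketch for bug reports, not a pytest-mpl tool):

```python
# Sketch: gather environment details for a bug report like this one.
import platform
import sys

print("Platform:", platform.platform())
print("Python:", sys.version.split()[0])

# matplotlib may not be installed; report it only if importable.
try:
    import matplotlib
    print("matplotlib:", matplotlib.__version__)
except ImportError:
    print("matplotlib: not installed")
```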

If I simply run pytest, I get 17 failed, 95 passed, 23 skipped. See the failures below:

FAILED tests/test_baseline_path.py::test_config[dir1-None-None-dir1-True] - AssertionError: assert {'errors': 0,...pped': 0, ...} == {'errors': 0,...pped': 0, ...}
FAILED tests/test_baseline_path.py::test_config[dir1-dir2-None-dir2-True] - AssertionError: assert {'errors': 0,...pped': 0, ...} == {'errors': 0,...pped': 0, ...}
FAILED tests/test_baseline_path.py::test_config[dir1-dir2-dir3-dir3-True] - AssertionError: assert {'errors': 0,...pped': 0, ...} == {'errors': 0,...pped': 0, ...}
FAILED tests/test_baseline_path.py::test_config[None-None-dir3-dir3-True] - AssertionError: assert {'errors': 0,...pped': 0, ...} == {'errors': 0,...pped': 0, ...}
FAILED tests/test_pytest_mpl.py::test_formats[png-True-False] - AssertionError: assert {'errors': 0,...pped': 0, ...} == {'errors': 0,...pped': 0, ...}
FAILED tests/test_results_always.py::test_config[None-None-False] - AssertionError: assert {'errors': 0,...pped': 0, ...} == {'errors': 0,...pped': 0, ...}
FAILED tests/test_results_always.py::test_config[True-None-True] - AssertionError: assert {'errors': 0,...pped': 0, ...} == {'errors': 0,...pped': 0, ...}
FAILED tests/test_results_always.py::test_config[False-None-False] - AssertionError: assert {'errors': 0,...pped': 0, ...} == {'errors': 0,...pped': 0, ...}
FAILED tests/test_results_always.py::test_config[False-True-True] - AssertionError: assert {'errors': 0,...pped': 0, ...} == {'errors': 0,...pped': 0, ...}
FAILED tests/test_results_always.py::test_config[True-True-True] - AssertionError: assert {'errors': 0,...pped': 0, ...} == {'errors': 0,...pped': 0, ...}
FAILED tests/test_use_full_test_name.py::test_config[None-None-test_mpl-True] - AssertionError: assert {'errors': 0,...pped': 0, ...} == {'errors': 0,...pped': 0, ...}
FAILED tests/test_use_full_test_name.py::test_config[False-None-test_mpl-True] - AssertionError: assert {'errors': 0,...pped': 0, ...} == {'errors': 0,...pped': 0, ...}
FAILED tests/test_use_full_test_name.py::test_config[True-None-test_config.TestClass.test_mpl-True] - AssertionError: assert {'errors': 0,...pped': 0, ...} == {'errors': 0,...pped': 0, ...}
FAILED tests/test_use_full_test_name.py::test_config[False-True-test_config.TestClass.test_mpl-True] - AssertionError: assert {'errors': 0,...pped': 0, ...} == {'errors': 0,...pped': 0, ...}
FAILED tests/test_use_full_test_name.py::test_config[None-True-test_config.TestClass.test_mpl-True] - AssertionError: assert {'errors': 0,...pped': 0, ...} == {'errors': 0,...pped': 0, ...}
FAILED tests/subtests/test_subtest.py::test_default - subtests.helpers.MatchError: Summary item result_image for subtests.subtest.test_special.test_hdiff_imatch_savefig does not match.
FAILED tests/subtests/test_subtest.py::test_html_images_only - subtests.helpers.MatchError: Summary item status_msg for subtests.subtest.test_special.test_hdiff_imatch_savefig does not match.

results.json

@Cadair
Contributor

Cadair commented Sep 26, 2024

It looks like some of the CI is upset as well, though not with the same errors. There shouldn't be anything you have to do setup-wise, especially with tox, which builds its own environments. Things might just need a cleanup to get all the tests passing again.
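
For anyone debugging failures like the parametrized ones listed above, pytest can re-run a single case by its node ID, including the bracketed parameter set from the failure summary. A generic sketch (using a hypothetical throwaway test file, not pytest-mpl's own suite):

```shell
# Generic sketch: select one parametrized test by its node ID.
# The bracketed part is the parameter set shown in the failure summary.
cat > test_demo.py <<'EOF'
import pytest

@pytest.mark.parametrize("value", ["dir1", "dir2"])
def test_config(value):
    assert value.startswith("dir")
EOF

# Run only the [dir1] parameter set; the other case is deselected.
pytest "test_demo.py::test_config[dir1]" -q
```

Quoting the node ID matters, since the square brackets would otherwise be interpreted by the shell as a glob pattern.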
