This is an interesting plugin, thank you! I've recently started adding more TAP-based tests to a Python project, and unifying test results for reporting via TAP is a great prospect.
However, I wanted to discuss the option of outputting additional test information, which might conflict with TAP-based reporting; I am not sure at this point.
For example, running tests from this template repo usually prints a bunch more information:
> make test
pre-commit run pytest --hook-stage push --files tests/
Run unit tests...........................................................Passed
- hook id: pytest
- duration: 1.25s
============================= test session starts ==============================
platform darwin -- Python 3.13.0, pytest-8.3.3, pluggy-1.5.0 -- /Volumes/Dev/python-package-template/.venv/bin/python
cachedir: .pytest_cache
hypothesis profile 'default-with-verbose-verbosity' -> max_examples=500, verbosity=Verbosity.verbose, database=DirectoryBasedExampleDatabase(PosixPath('/Volumes/Dev/python-package-template/.hypothesis/examples'))
rootdir: /Volumes/Dev/python-package-template
configfile: pyproject.toml
plugins: cov-5.0.0, hypothesis-6.111.2, env-1.1.3, custom-exit-code-0.3.0, tap-3.4, doctestplus-1.2.1
collected 3 items
src/package/something.py::package.something.Something.do_something PASSED [ 33%]
tests/test_something.py::test_something PASSED [ 66%]
docs/source/index.rst::index.rst PASSED [100%]
---------- coverage: platform darwin, python 3.13.0-final-0 ----------
Name                       Stmts   Miss Branch BrPart  Cover   Missing
----------------------------------------------------------------------
src/package/__init__.py        1      0      0      0   100%
src/package/something.py       4      0      0      0   100%
----------------------------------------------------------------------
TOTAL                          5      0      0      0   100%
Required test coverage of 100.0% reached. Total coverage: 100.00%
============================ Hypothesis Statistics =============================
tests/test_something.py::test_something:
- during generate phase (0.00 seconds):
- Typical runtimes: ~ 0-1 ms, of which < 1ms in data generation
- 2 passing examples, 0 failing examples, 0 invalid examples
- Stopped because nothing left to do
============================== 3 passed in 0.08s ===============================
Notice the Hypothesis and Coverage statistics here. In contrast, running with the --tap option gives me this:
> make test
pre-commit run pytest --hook-stage push --files tests/
Run unit tests...........................................................Passed
- hook id: pytest
- duration: 0.72s
TAP version 13
1..3
ok 1 src/package/something.py::[doctest] package.something.Something.do_something
ok 2 tests/test_something.py::test_something
ok 3 docs/source/index.rst::[doctest] index.rst
Here the additional Hypothesis and Coverage information is missing.
Looking at the TAP v13 spec, it seems that YAML blocks or comments would come in handy here?
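For illustration, something like the following could carry per-test details in a YAML block and run-level numbers as comment lines. The keys (duration_ms, hypothesis, the coverage comment) are just made up for the example, not anything the plugin currently emits:

TAP version 13
1..1
ok 1 tests/test_something.py::test_something
  ---
  duration_ms: 0.8
  hypothesis:
    passing_examples: 2
    failing_examples: 0
    invalid_examples: 0
  ...
# coverage: TOTAL 100.00%

As I read the spec, a YAML block attaches to the preceding test point, whereas comment lines (# ...) are free-form diagnostics that most TAP consumers simply pass through.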
I'm not sure where pytest is storing/writing that informational data. At the most basic level, the reporter could be updated to convert that information into diagnostics to show in the TAP stream.
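As a rough sketch of what "convert into diagnostics" could mean (the stats dict and its keys are invented for the example; the plugin would still need to collect real values from pytest, pytest-cov, and Hypothesis):

# Hypothetical sketch: render run-level statistics as TAP comment-line
# diagnostics. Nothing here reflects pytest-tap's actual internals.
def emit_tap_diagnostics(stats: dict[str, object], write=print) -> None:
    for key, value in stats.items():
        write(f"# {key}: {value}")

emit_tap_diagnostics({
    "coverage.total": "100.00%",
    "hypothesis.test_something.passing_examples": 2,
})
# prints:
# # coverage.total: 100.00%
# # hypothesis.test_something.passing_examples: 2

Comment lines like these should be safe, since TAP consumers generally ignore diagnostics they don't recognize; attaching per-test YAML blocks would need more plumbing in the reporter.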