Parameterized tests cannot be ordered #55

Open
fnahuelc opened this issue Nov 12, 2021 · 12 comments
Labels: enhancement (New feature or request)

Comments

@fnahuelc

Considering the example in the pytest documentation:
https://pytest.org/en/6.2.x/example/parametrize.html#a-quick-port-of-testscenarios

If I include ordering:

import pytest

scenario1 = ("basic", {"attribute": "value"})
scenario2 = ("advanced", {"attribute": "value2"})


class TestSampleWithScenarios:
    scenarios = [scenario1, scenario2]

    @pytest.mark.order(2)
    def test_demo1(self, attribute):
        assert isinstance(attribute, str)

    @pytest.mark.order(1)
    def test_demo2(self, attribute):
        assert isinstance(attribute, str)
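
For completeness, this relies on the pytest_generate_tests hook from the linked documentation, which turns the scenarios list into class-scoped parametrization:

def pytest_generate_tests(metafunc):
    idlist = []
    argvalues = []
    for scenario in metafunc.cls.scenarios:
        idlist.append(scenario[0])
        items = scenario[1].items()
        argnames = [x[0] for x in items]
        argvalues.append([x[1] for x in items])
    metafunc.parametrize(argnames, argvalues, ids=idlist, scope="class")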

I would expect this order:

test_scenarios.py::TestSampleWithScenarios::test_demo2[basic] PASSED     [ 25%]
test_scenarios.py::TestSampleWithScenarios::test_demo1[basic] PASSED     [ 50%]
test_scenarios.py::TestSampleWithScenarios::test_demo2[advanced] PASSED  [ 75%]
test_scenarios.py::TestSampleWithScenarios::test_demo1[advanced] PASSED  [100%]

But I get instead:

test_scenarios.py::TestSampleWithScenarios::test_demo2[basic] PASSED     [ 25%]
test_scenarios.py::TestSampleWithScenarios::test_demo2[advanced] PASSED  [ 50%]
test_scenarios.py::TestSampleWithScenarios::test_demo1[basic] PASSED     [ 75%]
test_scenarios.py::TestSampleWithScenarios::test_demo1[advanced] PASSED  [100%]

I have tried with --order-scope=class and --order-group-scope=class, but they don't fix this.

@mrbean-bremen (Member)

This is actually the expected behavior. The order marker relates to the test where it is defined, not to the test scenarios, i.e. it groups all tests of the same name together. There is currently no possibility I can think of that will produce what you want, but I will have another look later.

@mrbean-bremen added the enhancement (New feature or request) label on Nov 12, 2021
@mrbean-bremen (Member)

I will see if I can add some functionality that provides that, as it really makes sense. This may take a while though, as I'm currently busy with other stuff.

@fnahuelc (Author)

Thanks a lot for considering!
Yes, it would be very useful. I looked for other plugins to manage tests across different scenarios, but I could not find any that support this.
I think a previous version of pytest-ordering ordered the tests the way I showed in the example.

@mrbean-bremen (Member)

> I think a previous version of pytest-ordering ordered the tests the way I showed in the example.

I just checked, but I couldn't find a version of pytest-order or pytest-ordering that behaves that way. Out of interest: which version did you use that behaved like that? Provided you still have that information, of course...

@mrbean-bremen (Member)

I have looked at this some more, and I don't think there is a good way to implement this, simply because it is a somewhat special case. What happens is that parametrized tests with class or module scope within the same class or module that have exactly the same argument names are grouped together and sorted by the parameters only, not by the test name. In principle this could be done for sorted tests if they end up adjacent (e.g. with order numbers 1 and 2, but not with 1 and 3 and another test in between), but that would complicate the code quite a bit.

I will leave this open to see if this is wanted by more people, and I may reconsider in that case, but at the moment I tend not to implement this behavior.
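
For illustration, here is a rough conftest-level sketch of the grouping described above, as a standalone hook without the plugin. This is not pytest-order's actual implementation, just a hypothetical pytest_collection_modifyitems hook that groups collected items by parameter id first (alphabetically, not in parametrize order) and by their order marker second:

# hypothetical sketch, not pytest-order's implementation:
# sort collected items by parameter id first, order marker second
def pytest_collection_modifyitems(items):
    def sort_key(item):
        mark = item.get_closest_marker("order")
        order = mark.args[0] if mark and mark.args else 0
        # parametrized items carry a callspec with the parameter id
        param_id = item.callspec.id if hasattr(item, "callspec") else ""
        return (param_id, order)
    items.sort(key=sort_key)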

@ipfilip commented Nov 15, 2022

Hello, I think I have hit a wall here with pytest-order.
I am using a module-/session-scoped parametrized fixture together with ordering inside a test class:

import pytest

@pytest.fixture(scope="session", params=["using first", "using second", "using third"])
def setup(pytestconfig, request):
    # this fixture needs a long time to setup
    return str(request.param) + " setup"

class TestClass:

    @pytest.mark.order(1)
    def test_step_1(self, setup):
        assert isinstance(setup, str)

    @pytest.mark.order(2)
    def test_step_2(self, setup):
        assert isinstance(setup, str)

    @pytest.mark.order(3)
    def test_step_3(self, setup):
        assert isinstance(setup, str)

if __name__ == "__main__":
    pytest.main(["-v", __file__])

This results in:

test_try.py::TestClass::test_step_1[using first] PASSED       [ 11%] 
test_try.py::TestClass::test_step_1[using second] PASSED      [ 22%]
test_try.py::TestClass::test_step_1[using third] PASSED       [ 33%]
test_try.py::TestClass::test_step_2[using first] PASSED       [ 44%] 
test_try.py::TestClass::test_step_2[using second] PASSED      [ 55%]
test_try.py::TestClass::test_step_2[using third] PASSED       [ 66%] 
test_try.py::TestClass::test_step_3[using first] PASSED       [ 77%]
test_try.py::TestClass::test_step_3[using second] PASSED      [ 88%] 
test_try.py::TestClass::test_step_3[using third] PASSED       [100%]

This requires the setup fixture to be set up 9 times (once per test instead of once per parameter), as opposed to 3 times if the tests were run as:

test_try.py::TestClass::test_step_1[using first] PASSED       [ 11%]
test_try.py::TestClass::test_step_2[using first] PASSED       [ 22%] 
test_try.py::TestClass::test_step_3[using first] PASSED       [ 33%]
test_try.py::TestClass::test_step_1[using second] PASSED      [ 44%]
test_try.py::TestClass::test_step_2[using second] PASSED      [ 55%] 
test_try.py::TestClass::test_step_3[using second] PASSED      [ 66%] 
test_try.py::TestClass::test_step_1[using third] PASSED       [ 77%] 
test_try.py::TestClass::test_step_2[using third] PASSED       [ 88%]
test_try.py::TestClass::test_step_3[using third] PASSED       [100%] 

One would expect a session-scoped fixture to be set up only once per session, but that is not the case when an instance of that fixture with a different parameter is needed. In that case the fixture is torn down and the other one is set up, as mentioned in pytest-dev/pytest#3161 (comment).

@mrbean-bremen (Member)

@ipfilip - thanks, this is a valid point. I still don't have a good idea how to change this, but I may get back to this.

@ipfilip commented Nov 18, 2022

Thanks for the reply.
I expected --order-scope=class to rearrange only tests within their own class, without mixing up instances of the same class. I guess that would require reading the scope of the fixtures used, getting their parameter ids, and grouping the tests accordingly.
Sure sounds tough.
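
For reference, the parametrization of a collected item can be inspected roughly like this (a sketch; the callspec attribute is only present on parametrized items):

# a sketch: print each parametrized item's id and parameter mapping
def pytest_collection_modifyitems(items):
    for item in items:
        callspec = getattr(item, "callspec", None)
        if callspec is not None:
            # callspec.params maps argument names to parameter values
            print(item.nodeid, callspec.id, callspec.params)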

@sergiy-kozak

> @ipfilip - thanks, this is a valid point. I still don't have a good idea how to change this, but I may get back to this.

Hi, I'm now bumping into the above issue, where a session-scoped fixture is set up multiple times for the same parameter value, e.g. using first in the example above. Is there any known workaround that achieves both ordering and the default pytest session fixture lifecycle?

@mrbean-bremen (Member)

I don't know of a workaround, sorry. It looks like I have to revisit this issue...

@sergiy-kozak commented Jun 7, 2024

> I don't know of a workaround, sorry. It looks like I have to revisit this issue...

No problem, I think this is understandable enough. Reordering tests like that (with a session- or module-scoped parametrized fixture) actually goes against pytest's design for the fixture value lifecycle in some ways. You would probably need to dig deeper into pytest's internals to figure out whether this is possible at all as a "standard" pytest plugin. Right now my workaround is a module-level, lazily populated cache of fixture values keyed by the parameter set (the parameters are used as the dict key, which also restricts how the fixture value can be produced and which parameter values are allowed). The fixture itself looks like this:

import pytest

# module-level cache of fixture values, keyed by the parameter set
fixture_value_cache = {}

@pytest.fixture(scope="module", params=[p1, p2, ...])
def fixt_value_p(request):
    param = request.param
    fixt_value_cache_k = compute_key(param)  # user-defined key derivation
    if fixt_value_cache_k not in fixture_value_cache:
        # the expensive value creation runs only once per parameter set
        fixture_value_cache[fixt_value_cache_k] = compute_value(param)
    yield fixture_value_cache[fixt_value_cache_k]
    # optional post-use steps

pytest keeps following its own lifecycle, but the resulting fixture value ends up being one and the same object for a given parameter set.
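
A minimal runnable version of this workaround, reusing the parameters from the earlier example and a sleep as a hypothetical stand-in for the expensive setup:

import time

import pytest

_cache = {}

@pytest.fixture(scope="session", params=["using first", "using second", "using third"])
def setup(request):
    # the parameter itself works as the cache key in this simple case
    if request.param not in _cache:
        time.sleep(2)  # stand-in for the expensive setup
        _cache[request.param] = str(request.param) + " setup"
    yield _cache[request.param]

Even when ordering forces pytest to tear the fixture down and set it up again for the same parameter, the expensive work runs only once per parameter.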

@mrbean-bremen (Member)

> You would probably need to dig deeper into pytest's internals to figure out whether this is possible at all as a "standard" pytest plugin

Yes, that is what I was also thinking. I'm not sure yet if I'll find a good way to do this, but I see the need for a solution.
Good that you found a workaround, even if it essentially has to repeat the work that pytest fixtures are supposed to do.
