
Fixture execution order: higher-scoped fixtures are not necessarily run before lower-scoped fixtures #2405

Closed
jaraco opened this Issue May 12, 2017 · 19 comments

@jaraco
Contributor

jaraco commented May 12, 2017

I've encountered what appears to be a violation of the expectation that higher-scoped fixtures are run before lower-scoped fixtures.

In conftest.py, I have a module-scoped fixture.

@pytest.fixture(scope='module')
def clean_data():
    ...

In the test suite, I wish to reference that fixture:

pytestmark = pytest.mark.usefixtures('clean_data')

And an autouse fixture at the class scope:

class TestStuff:
    @pytest.fixture(scope='class', autouse=True)
    def add_data(cls):
        ...

But when the tests run, it appears that add_data runs before clean_data. Perhaps that's because of the autouse feature. Is it expected that all autouse fixtures run before marked fixtures, even when the autouse fixture has a finer scope?

I was able to work around the issue by removing the pytestmark and instead specifying clean_data as a parameter to each add_data across several classes in the module. I don't like this workaround because it violates the goal of signaling the cleanup exactly once at the beginning of the module (akin to a setup_module method).

If this fixture execution order is by design, is there a better way to achieve what I'm after (essentially the setup_module and setup_class behavior from unittest)?

@jaraco

Contributor

jaraco commented May 12, 2017

I'm using pytest 3.0.7.

@nicoddemus

Member

nicoddemus commented May 12, 2017

I believe it is a bug, I also would expect the same behavior you describe.

It might just be that autouse fixtures are added to the test item's fixture list without sorting by scope.

Actually, according to this code it should be doing that already, so it's definitely a bug IMO.

@jaraco

Contributor

jaraco commented May 12, 2017

Okay. Thanks for the review. I think I owe you a reproducible sample. I'll work on that next.

@jaraco

Contributor

jaraco commented May 13, 2017

Turns out I was able to put together a reproduction script from the description above:

import pytest

data = {}

@pytest.fixture(scope='module')
def clean_data():
    data.clear()

pytestmark = pytest.mark.usefixtures('clean_data')

class TestDataValues:
    @pytest.fixture(scope='class', autouse=True)
    def add_data(cls):
        data.update(value=True)

    def test_value(self):
        assert data.get('value')

Adding a no-op test before the TestDataValues works around the issue:

def test_nothing():
    pass

Or adding clean_data as a parameter to the test_value method also works around the issue.

@nicoddemus

Member

nicoddemus commented Jul 5, 2017

Did some more digging, sorry for the delay.

The culprit IIUC is how mark transfer works (@RonnyPfannschmidt will love this).

pytestmark = pytest.mark.usefixtures('clean_data') doesn't declare that clean_data should be auto-used at the module level, as it might suggest; instead it transfers the marker to every test object in the module at collection time.

In other words, pytestmark = ... is equivalent to writing this:

    @pytest.mark.usefixtures('clean_data')
    def test_value(self):

Which is equivalent to:

    def test_value(self, clean_data):

(btw you mention that the above fixes the issue for you, but I obtain the same error in my tests)

autouse fixtures are added to the beginning of the "active" fixtures of an item when its fixture closure is determined here, so the fixtures end up being executed as if the test were written:

    def test_value(self, add_data, clean_data):

Which explains the behavior: first data is added and right afterwards removed, so the test fails.
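To make the ordering concrete, here is a small standalone sketch (not pytest's actual code; the function names and scope table are invented for illustration) of how prepending autouse fixtures without re-sorting by scope produces the observed order:

```python
# Rank scopes from outermost to innermost.
SCOPE_RANK = {"session": 0, "module": 1, "class": 2, "function": 3}

def closure(autouse, requested):
    # current behavior: autouse fixtures are prepended, regardless of scope
    return autouse + requested

def expected(fixtures):
    # expected behavior: outer (higher) scopes first; sorted() is stable,
    # so fixtures of equal scope keep their relative order
    return sorted(fixtures, key=lambda f: SCOPE_RANK[f[1]])

fixtures = closure([("add_data", "class")], [("clean_data", "module")])
print(fixtures)            # [('add_data', 'class'), ('clean_data', 'module')]
print(expected(fixtures))  # [('clean_data', 'module'), ('add_data', 'class')]
```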

Now if you change the clean_data fixture to autouse, then things work, because the node location of the fixture declaration is taken into account and clean_data gets executed first, as one would expect at first glance.

So it all boils down to how markers are currently transferred. @RonnyPfannschmidt is leading a quest to sort out all this marker business and to replace the marker-transfer mechanism with an approach where markers are attached to the collection node instead, so you don't lose the marker hierarchy when marking classes or using pytestmark in modules.

Back to the workaround, I realize that clean_data is probably declared elsewhere and you want to mark all tests in that module to use it, so this is another solution:

@pytest.fixture(scope='module', autouse=True)
def _clean_data(clean_data):
    """Ensure clean_data runs for every test in this module."""

This will attach the autouse fixture to the module node, which is what one would expect pytestmark to do in the first place. It's not pretty, of course, but at least the name and docstring clearly signal a workaround involving the clean_data fixture and can point to this issue. Also, if someone removes clean_data in the future they will be forced to remove the _clean_data workaround as well.

@RonnyPfannschmidt if you agree with my diagnosis, I believe we should label this as mark-issue.

@RonnyPfannschmidt

Member

RonnyPfannschmidt commented Jul 6, 2017

@nicoddemus I won't be able to follow any in-depth diagnosis tasks for the next 2 weeks, but I trust your judgement.

@nicoddemus

Member

nicoddemus commented Jul 6, 2017

next 2 weeks

Hooray. 😁

but i trust your judgement

OK I will change the labels, you can revisit this whenever you have the time.

@davidhalter

davidhalter commented Nov 24, 2017

I've got a similar issue with pytest-django; however, IMO it is caused by pytest itself. I can also open a new issue for it. Just let me know if you don't think they are related.

This is the test code:

import pytest


@pytest.fixture(scope='function', autouse=True)
def autouse_func():
    print('in_func')
    yield
    print('out_func')


@pytest.fixture(scope='session')
def session():
    print('in_session')
    yield
    print('out_session')


def test_autouse(session):
    pass

And this the result:

in_func
in_session
.out_func
out_session

IMO this should never happen: session-scoped fixtures should always be set up first, otherwise you can run into serious problems with e.g. transactions, especially when autouse is involved. It's also worth noting that, in general, I think session fixtures should execute first even if they are requested in a different order, because the session fixture might already have been used by another test function.
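Not mentioned in the thread, but a common way to force this ordering today is to have the lower-scoped fixture request the higher-scoped one, since a fixture is always set up after (and torn down before) anything it depends on. A minimal sketch, reusing the fixture bodies from the example above:

```python
import pytest


@pytest.fixture(scope='session')
def session():
    print('in_session')
    yield
    print('out_session')


# Requesting `session` here guarantees it is set up before this fixture
# and finalized after it, regardless of autouse.
@pytest.fixture(scope='function', autouse=True)
def autouse_func(session):
    print('in_func')
    yield
    print('out_func')


def test_autouse():
    pass
```

With this dependency in place, the output becomes in_session, in_func, out_func, out_session.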

@RonnyPfannschmidt

Member

RonnyPfannschmidt commented Nov 24, 2017

@nicoddemus I came to the conclusion that this is more likely a fixture issue, because we sort autouse fixtures first even though the scopes require a different order. It shouldn't matter where the usage is defined: a fixture with a higher scope should be executed first.

@pavelrad

pavelrad commented Feb 26, 2018

Is this going to be fixed?

@The-Compiler

Member

The-Compiler commented Feb 26, 2018

@pavelrad If someone finds the time to work on it, sure - are you volunteering? 😉

@nicoddemus

Member

nicoddemus commented Feb 26, 2018

I still hope (fingers crossed) that this is related to marks, so that when we land the full mark refactoring it will be solved; otherwise we will need to find some time to investigate this further.

@RonnyPfannschmidt

Member

RonnyPfannschmidt commented Feb 27, 2018

@nicoddemus I believe it's related to fixtures only.

@nicoddemus

Member

nicoddemus commented Feb 27, 2018

Hmm I see, so you think just sorting autouse fixtures by scope would be sufficient in this case?

@RonnyPfannschmidt

Member

RonnyPfannschmidt commented Feb 27, 2018

Offhand, no idea; I'm currently not fluent in that bit of the codebase. I recall that the last time we tried to change something there it ended horribly.

@nicoddemus

Member

nicoddemus commented Feb 28, 2018

@jaraco I just realized a workaround for your original problem: change your clean_data fixture to:

@pytest.fixture(scope='module')
def clean_data():
    yield
    data.clear()

IOW, make data.clear() execute during the teardown stage of the fixture, which I think is what was intended in the first place?
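To see why moving the clearing into teardown sidesteps the ordering bug, here is a standalone sketch that drives plain generators by hand the way pytest drives yield fixtures (names reused from the reproduction above; this is an illustration, not pytest internals):

```python
data = {}

def clean_data():
    # module-scoped fixture: cleanup now runs at teardown
    yield
    data.clear()

def add_data():
    # class-scoped autouse fixture
    data.update(value=True)
    yield

# Simulate setup in the (buggy) observed order: add_data before clean_data.
cd_gen = clean_data()
ad_gen = add_data()
next(ad_gen)   # add_data setup
next(cd_gen)   # clean_data setup: no longer clears anything here

assert data.get('value')  # the test body now passes either way

# Teardown runs in reverse order of setup.
for gen in (cd_gen, ad_gen):
    try:
        next(gen)
    except StopIteration:
        pass

print(data)  # {} -- cleared during clean_data's teardown
```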

@nicoddemus nicoddemus changed the title from Unexpected fixture execution order to Fixture execution order: higher-scoped fixtures are not necessarily run before lower-scoped fixtures Feb 28, 2018

@nicoddemus

Member

nicoddemus commented Feb 28, 2018

Btw: the original assumption that higher-scoped fixtures should run before lower-scoped ones is still valid and should be fixed (I updated the title of the issue to reflect that).

@jaraco

Contributor

jaraco commented Feb 28, 2018

execute during the teardown stage of the fixture (which I think is what it was intended in the first place)

Well, no. The intention is to ensure, before a test is run, that no state lingers from a previous test, possibly from another module. But it's probably a better design to remove the state as a teardown operation.

I appreciate the work on this as it appears to be both subtle and tricky.

@nicoddemus

Member

nicoddemus commented Feb 28, 2018

Well, no. The intention is to ensure before a test is run that no state lingers from a previous test, possibly from another module

I see thanks!

nicoddemus added a commit to nicoddemus/pytest that referenced this issue Mar 20, 2018

@nicoddemus nicoddemus closed this Mar 21, 2018
