Dynamically creating pytest test fixtures at runtime

July 7, 2017    

(In a hurry? Go to solution)
At work we have started to redesign our test environment, which we run with Python 3 and pytest. The test environment is built for large scale system testing in simulated real life environments.

We have a yaml configuration file that describes the resources that will be used in the test environment (it can, for example, describe two server providers: OpenStack and bare metal).

In our current environment, whenever we want to use one of the resources, we have a pytest fixture called env that we include in a test case. In env you can find all necessary resources tucked away in lists and dicts. For example:

def test_case(env):
    print(env.providers['openstack_provider'])

As our tests get more complex, it gets harder to understand which resources a test case requires, and because a test case and its corresponding yaml file are separate, you need good knowledge of which resources are used in a specific test case.

We also have an issue with some resources needing to be set up and torn down more often than once per test session. Our current test environment doesn’t allow for that.

 

So we set out to modify our test environment to provide us with better control over our dynamic resources. The idea was to specify test cases the following way:

def test_case(openstack_provider, ubuntu_server1, coreos1):
    pass

Maximum dependency injection, with the added benefit that you can easily see which resources need to be defined in the environment yaml file.
How to achieve this with pytest then?

 

Pytest fixtures and Python decorators

As mentioned earlier, the way pytest does dependency injection is through test fixtures.
A test fixture is, as described on Wikipedia, something used to consistently test some item, device, or piece of software.
In pytest, fixtures are most often defined using Python decorators in this form:

@pytest.fixture(scope='session')
def env(request):
    pass

This is unfortunately a very static way of defining a fixture, and since the yaml file can contain any number of resources, we need to create fixtures dynamically. To do so, we need to look at how a Python decorator works.
The above Python decorator applied to the function env is rewritten by Python into the following code (more info on this can be found here):

def env(request):
    pass
env = pytest.fixture(scope='session')(env)
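The same rewriting rule applies to any decorator, not just pytest's. A minimal runnable illustration, using a made-up shout decorator (not part of pytest):

```python
# A toy decorator, unrelated to pytest, just to demonstrate the rewriting rule.
def shout(func):
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs).upper()
    return wrapper

@shout
def greet(name):
    return 'hello {}'.format(name)

# The decorator syntax above is equivalent to applying the decorator by hand:
def greet2(name):
    return 'hello {}'.format(name)
greet2 = shout(greet2)

print(greet('world'))   # HELLO WORLD
print(greet2('world'))  # HELLO WORLD
```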

This allows us to dynamically create new fixtures! Let's try it then!

servers = ['server1', 'server2']
def create_server(env, request):
    s = env.servers[request.param]
    s.setup()
    yield s
    s.destroy()

for srv in servers:
    pytest.fixture(scope='session', params=[srv], name=srv)(create_server)

Looks easy enough, right? Well, we are almost there, but not quite. When we ran this, we realised that only the last fixture (server2 in this case) was available to our test cases.

After some digging in the pytest source code, we found out that pytest stores metadata in a special attribute on the decorated function. In our case, executing pytest.fixture on the same function twice overwrote the old metadata, which made the first fixture disappear.
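The effect can be sketched with a toy stand-in for pytest.fixture. This is not pytest's actual implementation, just an illustration of what happens when metadata is stored on the function object itself:

```python
def fixture_like(name):
    """Toy stand-in for pytest.fixture: stores its metadata on the function itself."""
    def deco(func):
        func._fixture_name = name  # pytest similarly attaches a marker object to func
        return func
    return deco

def create_server(env, request):
    pass

# Decorating the same function object twice...
fixture_like('server1')(create_server)
fixture_like('server2')(create_server)

# ...means the second call overwrites the first one's metadata:
print(create_server._fixture_name)  # server2
```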

The solution we came up with resembles the decorator pattern described in the Stack Overflow question linked earlier in this post. We call them function factories (the inner functions they return are closures), and they are a handy feature in Python. The pattern also allowed us to inject the resource identifier without passing it via pytest parameters.

servers = ['server1', 'server2']
def create_server_factory(server):
    def create_server(env, request):
        s = env.servers[server]
        s.setup()
        yield s
        s.destroy()
    return create_server

for srv in servers:
    pytest.fixture(scope='session', name=srv)(create_server_factory(srv))

This works better, but now pytest can’t find any of the fixtures. Why? Because of how pytest scans for fixtures: they need to be part of certain modules that pytest scans, such as conftest.py files or test case files.
We solve this by injecting the function into the current module’s (conftest.py in our case) internal variable dict. This gives us the final solution below.

The final solution

import sys
import pytest
servers = ['server1', 'server2']
def create_server_factory(server):
    def create_server(env, request):
        s = env.servers[server]
        s.setup()
        yield s
        s.destroy()
    return create_server

for srv in servers:
    fn = pytest.fixture(scope='session', name=srv)(create_server_factory(srv))
    setattr(sys.modules[__name__], "{}_func".format(srv), fn)

def test_case(server1, server2):
    print(server1, server2)

Some other things to consider

In our old solution, we constructed our environment (reading the yaml and constructing objects) in a fixture called env. This was no longer possible, as it is too late to introduce new fixtures once a fixture has already started running. We solved this by moving all setup out to the pytest hook pytest_configure, which is called much earlier in the process.


  • Or Carmi

    Interesting read, thanks!

  • Arun Kaliraja Baskaran

    Nice.. i am trying to do something similar to one of my projects.. couple of questions though..

    In my case i want to create a workflow.. which wil be translated into a chain of fixtures executing one after the other which denotes the various preconditions that needs to be satisfied for the testcase i am writing.. now i have more than one chain of fixtures with testcases using one of those chains..
    So i dont need to dynamicaly create fixtures instead just add them to the testcase at run time..

    I was thinking if i will just mark the testcase to just say which chain to use (say with string saying the usecase i am executing..) and then at the pytest_generate_test phase or pytest_configure phase i can read the specified marker parameter and then dynamically add the fixture to the Testcases.. Is it possible to do so?