Using pytest asyncio to Reduce Test Execution Time

In this article, learn different ways of managing test execution time and how pytest asyncio helps reduce test execution time.

Author

Milos Kajkut

January 11, 2026

Longer test execution times can hinder development cycles, slow down continuous integration pipelines, and impede overall responsiveness in the software development process. Therefore, optimizing test execution time is critical in achieving streamlined, agile, and high-quality software development processes.

When discussing how to decrease execution time for automated test cases, the first thought is usually parallel execution. But is parallel execution the only solution, or is the real bottleneck somewhere inside the test automation framework itself? We should ask ourselves this question whenever we want to reduce test execution time.

Overview

pytest-asyncio is a pytest plugin that runs asynchronous tests, allowing concurrent I/O operations and improving overall test execution speed for async Python code.

How to Use asyncio With pytest?

To implement asynchronous testing with pytest, follow these structured steps that allow concurrent execution of preconditions and efficient test management within async-based Python testing frameworks.

  • Create Async Functions: Define two asynchronous precondition functions that perform setup tasks concurrently. Use await statements to manage non-blocking execution and ensure tasks finish independently without sequential delays.
  • Initialize Fixture: Create an asynchronous fixture responsible for setup and teardown processes. Include precondition calls and use asyncio.gather() to run them simultaneously within the fixture context.
  • Run Preconditions Concurrently: Use asyncio.gather() to launch multiple coroutines together. This ensures both preconditions execute in parallel, saving time compared to running each task sequentially.
  • Implement Async Tests: Write asynchronous test functions that consume the prepared fixture. These functions should await necessary operations and verify outputs after the preconditions complete successfully.
  • Use Asyncio Marker: Apply @pytest.mark.asyncio to enable async test execution. This decorator ensures pytest recognizes the test as asynchronous and handles event loops properly.
  • Enable Auto Mode: Configure asyncio_mode=auto in pytest.ini to simplify async handling. It removes the need for manually adding the @pytest.mark.asyncio decorator.
  • Execute Tests: Run pytest normally through the command line or IDE. The asynchronous coroutines execute concurrently, producing faster and more efficient test execution results overall.
  • Analyze Results: Review test logs to confirm that precondition functions run simultaneously. You’ll observe improved runtime performance and reduced total execution time due to concurrent execution.
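The steps above can be sketched in one compact example. Function names and sleep durations here are illustrative, and running this file as an actual test requires the pytest-asyncio plugin:

```python
import asyncio

import pytest


async def first_precondition():
    # stands in for an I/O-bound setup task (API call, DB query, ...)
    await asyncio.sleep(0.1)
    return "first done"


async def second_precondition():
    await asyncio.sleep(0.1)
    return "second done"


@pytest.mark.asyncio  # can be omitted when asyncio_mode=auto is set in pytest.ini
async def test_preconditions_run_concurrently():
    # asyncio.gather() starts both coroutines together instead of one after the other
    results = await asyncio.gather(first_precondition(), second_precondition())
    assert results == ["first done", "second done"]
```

With asyncio_mode=auto configured in pytest.ini, the marker line can be dropped entirely.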

Methods to Reduce Test Execution Time

When it comes to managing and optimizing test execution time in pytest, two main approaches are prominent: the classical setup and teardown mechanism and harnessing the potential of asynchronous programming using asyncio.

  • Using Setup and Teardown mechanism
  • Using asyncio mechanism

First, let’s talk about automated test cases. Each automated test case contains three main parts: Setup, Actual Test Case, and Teardown. In the Setup phase, we perform all actions necessary to prepare the environment for test execution: reading or writing data in a database, performing API requests, reading or writing specific files, and so on. Once setup completes, the test case itself runs, executing the test steps and assertions. As the final stage, Teardown reverts all changes made during setup and test execution.

Setup and Teardown in pytest

The setup and teardown process can be handled by fixtures. Fixtures are one of pytest's most powerful features: by definition, a fixture is a Python function that pytest executes before (and, optionally, after) the actual test function. Let's look at how to create a fixture:

import pytest

@pytest.fixture()
def first_fixture():
    """This fixture will return the result of 2 + 2."""
    return 2 + 2

Fixture implementation is fully flexible (as the example above shows), but a fair question is: “Where are the before and after parts in this fixture?” For the setup and teardown process, we need to implement the fixture as a generator function using the yield keyword. Everything before the yield keyword is treated as setup, and everything after it as teardown. This is how it looks in our example:

import pytest

@pytest.fixture()
def first_fixture():
    """Calculate 2 + 2 as the setup process and print a message as the teardown process."""
    result = 2 + 2  # before the actual test function (setup)
    yield result
    print("Start Teardown")  # after the actual test function (teardown)
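Under the hood, a yield fixture is simply a generator: pytest advances it once to run the setup and obtain the yielded value, runs the test, and then resumes it so the teardown code executes. A rough illustration in plain Python, with no pytest involved:

```python
def fixture_like():
    result = 2 + 2                    # setup: runs before the test
    yield result
    print("Start Teardown")           # teardown: runs after the test

gen = fixture_like()
value = next(gen)                     # pytest: run setup, capture the yielded value
print(f"test body sees {value}")      # the actual test would run here
try:
    next(gen)                         # pytest: resume the generator -> teardown runs
except StopIteration:
    pass                              # generator is exhausted after teardown
```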

After implementing the Fixture, we are ready to write the first test case:

@pytest.mark.usefixtures("first_fixture")
def test_first(first_fixture):
    """First test"""
    result_from_fixture = first_fixture
    assert result_from_fixture == 4

So far, we have implemented one Fixture with setup and teardown process and one test case that uses the implemented Fixture. From this point, we can set up a problem that can be solved with pytest asyncio. Imagine a situation where you implemented 200 automated test cases and ran them in parallel, but still, test execution takes too long.

The first step is to detect bottlenecks in the framework, or, in other words, to answer the question: “Where do we have time leakage?” For this purpose, we will set up the problem situation as follows.

We will add one more test case and extend both tests with time.sleep() to simulate test execution time:

import time

class TestBlog:
    @pytest.mark.usefixtures("first_fixture")
    def test_first(self, first_fixture):
        """First test"""
        result_from_fixture = first_fixture
        time.sleep(2)  # simulate test execution time
        assert result_from_fixture == 4

    @pytest.mark.usefixtures("first_fixture")
    def test_second(self, first_fixture):
        """Second test"""
        result_from_fixture = first_fixture
        time.sleep(2)  # simulate test execution time
        assert result_from_fixture == 4

Also, to simulate precondition functions, we will add two functions whose execution time is again simulated with time.sleep():

import logging
import time

def first_precondition_function():
    logging.info("First Precondition function called")
    time.sleep(3)
    logging.info("First Precondition function finished")


def second_precondition_function():
    logging.info("Second Precondition function called")
    time.sleep(3)
    logging.info("Second Precondition function finished")

Finally, let's modify the fixture to execute the precondition functions during setup:

@pytest.fixture()
def first_fixture():
    """Calculate 2 + 2 as the setup process and print a message as the teardown process."""
    result = 2 + 2  # before the actual test function

    # precondition functions
    logging.info("Precondition started")
    first_precondition_function()
    second_precondition_function()
    logging.info("Precondition finished")

    yield result

    logging.info("Start Teardown")  # after the actual test function

From the execution report, we can see the following results:

============================= 2 passed in 16.19s ==============================
2023-11-05 14:40:19,102 [INFO] Precondition started
2023-11-05 14:40:19,103 [INFO] First Precondition function called
2023-11-05 14:40:22,103 [INFO] First Precondition function finished
2023-11-05 14:40:22,103 [INFO] Second Precondition function called
2023-11-05 14:40:25,104 [INFO] Second Precondition function finished
2023-11-05 14:40:25,104 [INFO] Precondition finished
PASSED                            	[ 50%]2023-11-05 14:40:27,107 [INFO] Start Teardown
2023-11-05 14:40:27,109 [INFO] Precondition started
2023-11-05 14:40:27,109 [INFO] First Precondition function called
2023-11-05 14:40:30,110 [INFO] First Precondition function finished
2023-11-05 14:40:30,110 [INFO] Second Precondition function called
2023-11-05 14:40:33,111 [INFO] Second Precondition function finished
2023-11-05 14:40:33,111 [INFO] Precondition finished
PASSED                           	[100%]2023-11-05 14:40:35,114 [INFO] Start Teardown

As a starting point, we should identify and count the bottlenecks in the framework:

  • 1st Bottleneck: Assume 2 seconds per test body is the best time we can achieve with the current test implementation. Executing two test cases therefore needs 4 seconds.
  • 2nd Bottleneck: Assume 3 seconds per precondition function is the best time we can achieve with the current implementation. Executing both precondition functions needs 6 seconds per test case.

This matches the report above: 2 × (6 s of preconditions + 2 s of test body) ≈ 16 seconds total.

We have set up the problem situation and can discuss possible solutions for the bottlenecks. The solution to the first bottleneck is obvious: introduce parallel execution, which cuts the test-body time from 4 seconds to 2 seconds. But parallel execution does not solve the second bottleneck: the precondition functions still add 6 seconds to the total execution time.
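The [gw0] and [gw1] worker tags in the report below come from pytest-xdist (an assumption based on the log format); assuming that plugin is installed, a two-worker parallel run can be started like this:

```shell
# install the parallelization plugin, then run the suite on two workers
pip install pytest-xdist
pytest -n 2 test_blog.py
```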

============================== 2 passed in 9.54s ==============================

test_blog.py::TestBlog::test_second 2023-11-05 14:57:38,953 [INFO] Precondition started
2023-11-05 14:57:38,953 [INFO] First Precondition function called
2023-11-05 14:57:41,954 [INFO] First Precondition function finished
2023-11-05 14:57:41,954 [INFO] Second Precondition function called
2023-11-05 14:57:44,956 [INFO] Second Precondition function finished
2023-11-05 14:57:44,956 [INFO] Precondition finished

[gw0] [ 50%] PASSED test_blog.py::TestBlog::test_first 2023-11-05 14:57:46,959 [INFO] Start Teardown
2023-11-05 14:57:38,964 [INFO] Precondition started
2023-11-05 14:57:38,965 [INFO] First Precondition function called
2023-11-05 14:57:41,966 [INFO] First Precondition function finished
2023-11-05 14:57:41,966 [INFO] Second Precondition function called
2023-11-05 14:57:44,967 [INFO] Second Precondition function finished
2023-11-05 14:57:44,967 [INFO] Precondition finished

[gw1] [100%] PASSED test_blog.py::TestBlog::test_second 2023-11-05 14:57:46,970 [INFO] Start Teardown

Asyncio Module in Python

Asyncio is a powerful Python module that enables developers to write asynchronous code by leveraging the intuitive async/await syntax. This module is especially valuable when dealing with I/O-bound operations, including network requests, file operations, and other tasks requiring external resources. Introduced in Python 3.4, the asyncio module has emerged as an integral component of Python’s robust asynchronous programming capabilities. It provides a structured and consistent approach to asynchronous programming using coroutines, the event loop, and other related concepts.
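To see the benefit in isolation, here is a minimal, pytest-free sketch: two simulated I/O waits of one second each complete in roughly one second total when launched through asyncio.gather() (names and durations are illustrative):

```python
import asyncio
import time


async def io_task(name: str, seconds: float) -> str:
    # simulate an I/O-bound wait (network request, file read, ...)
    await asyncio.sleep(seconds)
    return name


async def main():
    start = time.perf_counter()
    # both coroutines run concurrently on the event loop
    results = await asyncio.gather(io_task("first", 1.0), io_task("second", 1.0))
    elapsed = time.perf_counter() - start
    return results, elapsed


results, elapsed = asyncio.run(main())
print(results, f"{elapsed:.1f}s")  # total is ~1s, not 2s
```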

How to Use asyncio With pytest?

Our goal with pytest asyncio is to run two precondition functions concurrently. To achieve such a goal, it is necessary to create coroutines from precondition functions:

import asyncio
import logging

async def first_precondition_function():
    logging.info("First Precondition function called")
    await asyncio.sleep(3)
    logging.info("First Precondition function finished")


async def second_precondition_function():
    logging.info("Second Precondition function called")
    await asyncio.sleep(3)
    logging.info("Second Precondition function finished")

To run coroutines concurrently, we need to use await asyncio.gather:

@pytest.fixture()
async def first_fixture():
    """Calculate 2 + 2 as the setup process and print a message as the teardown process."""
    # with asyncio_mode=auto, pytest-asyncio drives this async fixture
    result = 2 + 2  # before the actual test function

    # precondition functions executed concurrently
    logging.info("Precondition started")
    await asyncio.gather(first_precondition_function(), second_precondition_function())
    logging.info("Precondition finished")

    yield result

    logging.info("Start Teardown")  # after the actual test function

Finally, we need to make the test functions coroutines as well. pytest-asyncio provides two ways to do that:

  • The first option is to mark the test functions with @pytest.mark.asyncio and declare them with the async keyword:

# inside the TestBlog class
@pytest.mark.asyncio
@pytest.mark.usefixtures("first_fixture")
async def test_first(self, first_fixture):
    """First test"""
    result_from_fixture = first_fixture
    time.sleep(2)
    assert result_from_fixture == 4


@pytest.mark.asyncio
@pytest.mark.usefixtures("first_fixture")
async def test_second(self, first_fixture):
    """Second test"""
    result_from_fixture = first_fixture
    time.sleep(2)
    assert result_from_fixture == 4

  • The second option is to set asyncio_mode=auto in the pytest.ini file. In that case, we do not need to decorate the test functions with the @pytest.mark.asyncio marker:

[pytest]
asyncio_mode = auto

Now, let's look at the results:

============================== 2 passed in 6.50s ==============================

test_blog.py::TestBlog::test_first 2023-11-07 22:04:22,972 [INFO] Precondition started
2023-11-07 22:04:22,972 [INFO] First Precondition function called
2023-11-07 22:04:22,973 [INFO] Second Precondition function called
2023-11-07 22:04:25,985 [INFO] First Precondition function finished
2023-11-07 22:04:25,985 [INFO] Second Precondition function finished
2023-11-07 22:04:25,986 [INFO] Precondition finished

[gw0] [ 50%] PASSED test_blog.py::TestBlog::test_first
[gw1] [100%] PASSED test_blog.py::TestBlog::test_second 2023-11-07 22:04:27,989 [INFO] Start Teardown
2023-11-07 22:04:22,970 [INFO] Precondition started
2023-11-07 22:04:22,971 [INFO] First Precondition function called
2023-11-07 22:04:22,971 [INFO] Second Precondition function called
2023-11-07 22:04:25,969 [INFO] First Precondition function finished
2023-11-07 22:04:25,969 [INFO] Second Precondition function finished
2023-11-07 22:04:25,970 [INFO] Precondition finished
2023-11-07 22:04:27,973 [INFO] Start Teardown

From the results, we can see that the concurrent approach reduced execution time by 3 seconds compared to the parallel run (9.54 s → 6.50 s), and by roughly 10 seconds compared to the original sequential run (16.19 s → 6.50 s).

...

Conclusion

In conclusion, we delved into two powerful strategies, each tailored to optimize the efficiency of test execution in pytest. The approach of setup and teardown provides a structured means of preparing the testing environment before execution and cleaning up afterward, ensuring a consistent and controlled context for each test. Adopting asynchronous programming through pytest asyncio introduces a dynamic paradigm, particularly advantageous for handling I/O-bound tasks and achieving concurrent execution.

By exploring both these approaches, you can balance meticulous test preparation and swift execution, ultimately reducing test execution time and enhancing the overall effectiveness of the testing process. Whether leveraging traditional setup practices or embracing the asynchronous capabilities of asyncio, pytest offers a robust framework for optimizing test workflows in diverse testing scenarios.

Author

Miloš Kajkut is a Test Automation Engineer with 6+ years of experience in manual and automated software testing across enterprise, web, mobile, and AI-driven systems. He specializes in test automation using Python, Pytest, Selenium, Appium, and Squish, and has built and refactored automation frameworks using Page Object and data pipeline–based designs. Miloš currently works on testing GenAI and LLM systems, focusing on evaluation frameworks, prompt validation, and AI reliability testing. He holds ISTQB Foundation certification and a Master’s degree in Engineering.
