Creating performance tests for pytest
Performance tests for pytest can be created using pytest-codspeed. This extension will automatically enable the CodSpeed engine on your benchmarks and allow reporting to CodSpeed.
pytest-codspeed is backward compatible with the pytest-benchmark API, so if you already have benchmarks written with it, you can start using CodSpeed right away!

First, install pytest-codspeed as a development dependency:
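For example, with pip (adapt the command to your own package manager):

```sh
pip install pytest-codspeed
```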
pytest-codspeed offers two approaches to create performance benchmarks that integrate seamlessly with your existing test suite.
Use @pytest.mark.benchmark to measure entire test functions automatically:
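Here is a minimal sketch of a marked test; the function body and expected value are purely illustrative:

```python
import pytest


@pytest.mark.benchmark
def test_sum_of_squares():
    # The whole test function is measured when running with --codspeed
    assert sum(i * i for i in range(1_000)) == 332_833_500
```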
Alternatively, use the benchmark fixture for precise control over what code gets measured:
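A sketch using the pytest-benchmark-compatible fixture API; the parsing workload is illustrative:

```python
def test_parse_integers(benchmark):
    # Setup code: runs outside the measured section
    data = ",".join(str(i) for i in range(1_000))

    # Only the callable passed to the benchmark fixture is measured
    result = benchmark(lambda: sum(int(x) for x in data.split(",")))

    assert result == 499_500
```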
Run the benchmarks locally with the --codspeed pytest flag:
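For example (the tests/ path is illustrative):

```sh
pytest tests/ --codspeed
```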
Running pytest-codspeed locally will not produce any performance reporting. It's only useful for making sure that your benchmarks are working as expected. If you want to get performance reporting, you should run the benchmarks in your CI.

Here is an example of a GitHub Actions workflow that runs the benchmarks on the main branch and every pull request:
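Below is a sketch of such a workflow, assuming the CodSpeedHQ/action GitHub Action with a CODSPEED_TOKEN repository secret; the action and Python versions, dependency installation step, and tests/ path are illustrative and should be adapted to your project:

```yaml
name: CodSpeed Benchmarks

on:
  push:
    branches:
      - "main"
  pull_request:

jobs:
  benchmarks:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - uses: actions/setup-python@v5
        with:
          python-version: "3.12"

      - name: Install dependencies
        run: pip install pytest pytest-codspeed

      - name: Run benchmarks
        uses: CodSpeedHQ/action@v3
        with:
          token: ${{ secrets.CODSPEED_TOKEN }}
          run: pytest tests/ --codspeed
```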
If you are using uv, add pytest-codspeed as a development dependency:
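For example, assuming a uv-managed project:

```sh
uv add --dev pytest-codspeed
```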
In the workflow, using actions/setup-python to install Python, and not uv, is critical for tracing to work properly.

To split your benchmark execution across several CI jobs, you can use pytest-test-groups, a pytest plugin built for exactly that purpose. Install pytest-test-groups as a development dependency:
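For example, with pip:

```sh
pip install pytest-test-groups
```

Each CI job can then run one slice of the suite. A sketch, assuming pytest-test-groups' --test-group-count and --test-group options; the group count is illustrative:

```sh
pytest tests/ --codspeed --test-group-count 4 --test-group 1
```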
pytest-codspeed is compatible with pytest-xdist, a pytest plugin that distributes test execution across multiple processes. You can simply enable pytest-xdist on top of pytest-codspeed to run your benchmarks in parallel across multiple processes.

First, install pytest-xdist as a development dependency:
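For example, with pip:

```sh
pip install pytest-xdist
```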
Then, run the benchmarks with the -n flag from pytest-xdist:
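For example, letting pytest-xdist pick the number of worker processes (the tests/ path is illustrative):

```sh
pytest tests/ --codspeed -n auto
```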
You can also use pytest-codspeed with Nox, a Python automation tool that allows you to automate the execution of Python code across multiple environments. Here is an example configuration file to run benchmarks with pytest-codspeed using Nox:
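A minimal sketch of such a noxfile.py; the session name, Python versions, and tests/ path are illustrative:

```python
# noxfile.py
import nox


@nox.session(python=["3.11", "3.12"])
def benchmarks(session: nox.Session) -> None:
    # Install the project and the benchmark tooling into the session environment
    session.install(".", "pytest", "pytest-codspeed")
    # Run the benchmarks with the CodSpeed instrumentation enabled
    session.run("pytest", "tests/", "--codspeed")
```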