Quickstart
Get started with CodSpeed and performance testing in a few minutes
CodSpeed is a performance testing tool that helps you consistently measure the performance of your codebase, both locally and in your CI/CD pipeline. For more details on how CodSpeed works, see What is CodSpeed?
This guide is a quick start to get a sample Python repository on GitHub up and running with CodSpeed:
- Create Performance Tests
- Connect your Repository
- Run the Tests in your CI
- Introduce a Performance Regression
Create Performance Tests
- Install the CodSpeed plugin for pytest (the install command is shown after this list).
- Write a performance test using the @pytest.mark.benchmark marker in tests/test_sum_squares.py (see the example after this list).
- Run your performance tests locally (see the command after this list).
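The plugin is published on PyPI as pytest-codspeed, so in a standard pip-based setup the install is a one-liner:

```bash
pip install pytest-codspeed
```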
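Here is a minimal sketch of what tests/test_sum_squares.py could look like. The sum_squares implementation (a closed-form formula here) and the asserted value are assumptions for illustration, not the exact code from the sample repository:

```python
# tests/test_sum_squares.py
import pytest


def sum_squares(n: int) -> int:
    # Closed-form formula for 1² + 2² + ... + n², computed in constant time.
    return n * (n + 1) * (2 * n + 1) // 6


@pytest.mark.benchmark
def test_sum_squares():
    # The @pytest.mark.benchmark marker tells pytest-codspeed to measure the whole test body.
    assert sum_squares(1_000) == 333_833_500
```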
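With the plugin installed, the benchmarks run through pytest. Assuming the tests live under tests/, the local run could look like:

```bash
pytest tests/ --codspeed
```

The --codspeed flag enables the plugin's instrumentation; without it, the tests run as plain pytest tests.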
Great! We’ve set up our first performance test. Now, let’s track its performance in CI with CodSpeed!
Connect your Repository
- Go to the CodSpeed settings and install the CodSpeed GitHub App by clicking on the “Import” button.
- Select the organization or user and add the repositories you want to use with CodSpeed.
- Back in the CodSpeed settings, enable your repositories by clicking on the “Enable” button.
- Copy your CODSPEED_TOKEN and add it to your GitHub repository secrets (one way to do this from the command line is shown after this list).
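If you prefer the command line, one way to add the secret is with the GitHub CLI, assuming gh is installed and authenticated for your repository:

```bash
# Prompts for the secret value; paste the token copied from the CodSpeed settings
gh secret set CODSPEED_TOKEN
```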
Run the Tests in your CI
- Create a new GitHub Actions workflow file, .github/workflows/codspeed.yml, to run your benchmarks (a sketch of this file is shown after the steps below).
- Create a Pull Request adding the workflow to the repository and wait for the report in the comments.
- Merge it and congrats 🎉, CodSpeed is installed!
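A minimal sketch of the workflow file is shown below. The action version, Python version, and dependency installation step are assumptions; check the CodSpeed documentation for the exact configuration recommended for your setup:

```yaml
# .github/workflows/codspeed.yml
name: CodSpeed Benchmarks

on:
  push:
    branches: [main]
  pull_request:

jobs:
  benchmarks:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.12"
      - name: Install dependencies
        run: pip install pytest pytest-codspeed
      - name: Run benchmarks
        uses: CodSpeedHQ/action@v3
        with:
          token: ${{ secrets.CODSPEED_TOKEN }}
          run: pytest tests/ --codspeed
```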
Introduce a Performance Regression
- Let’s change the implementation of sum_squares in tests/test_sum_squares.py to a more concise and elegant one (a sketch of such a change follows this list).
- Open a Pull Request and wait for the CodSpeed report.
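As an illustration of such a change (the actual diff in the sample repository may differ), replacing the closed-form formula with a generator expression reads nicely but turns an O(1) computation into an O(n) loop, which is exactly the kind of regression CodSpeed flags:

```python
# tests/test_sum_squares.py
def sum_squares(n: int) -> int:
    # More concise and arguably more Pythonic, but now O(n) instead of O(1):
    # the benchmark will report a slowdown.
    return sum(i * i for i in range(1, n + 1))
```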
Before merging, we can see that even though it’s more concise, this new implementation is slower. Merging it would have introduced a performance regression, but we caught it before shipping!
Next Steps
What is CodSpeed?
Find out more details on how CodSpeed works
Explore the Performance Metrics
Understand the performance metrics generated by CodSpeed
Enforce Performance Checks
Make sure you or team members never merge unexpected performance regressions
Dive into Trace Generation
Get detailed performance traces for your benchmarks