astral-sh/ruff
[ty] Fix docstring rendering for literal blocks after doctests (#22676, Merged)
Comparing claude/fix-issue-2497-j5GU2 (863995f) with main (336413b)
CodSpeed Performance Gauge: 0%
Benchmarks: 23 untouched, 30 skipped
Mode: CPU Simulation

Skipped benchmarks (30 total)
The benchmarks below were skipped, so their baseline results are used instead. If they were deleted in your codebase, archive them to remove them from the performance reports.
| Benchmark | Benchmark path | Status | Time |
| --- | --- | --- | --- |
| linter/default-rules[pydantic/types.py] | crates/ruff_benchmark/benches/linter.rs::default_rules::benchmark_default_rules | Skipped | 2.2 ms * |
| formatter[large/dataset.py] | crates/ruff_benchmark/benches/formatter.rs::formatter::benchmark_formatter | Skipped | 9.4 ms * |
| linter/default-rules[unicode/pypinyin.py] | crates/ruff_benchmark/benches/linter.rs::default_rules::benchmark_default_rules | Skipped | 406.3 µs * |
| formatter[numpy/globals.py] | crates/ruff_benchmark/benches/formatter.rs::formatter::benchmark_formatter | Skipped | 249.5 µs * |
| lexer[numpy/globals.py] | crates/ruff_benchmark/benches/lexer.rs::lexer::benchmark_lexer | Skipped | 30.2 µs * |
| formatter[numpy/ctypeslib.py] | crates/ruff_benchmark/benches/formatter.rs::formatter::benchmark_formatter | Skipped | 1.9 ms * |
| linter/default-rules[numpy/ctypeslib.py] | crates/ruff_benchmark/benches/linter.rs::default_rules::benchmark_default_rules | Skipped | 1 ms * |
| linter/all-with-preview-rules[numpy/ctypeslib.py] | crates/ruff_benchmark/benches/linter.rs::preview_rules::benchmark_preview_rules | Skipped | 5.2 ms * |
| lexer[large/dataset.py] | crates/ruff_benchmark/benches/lexer.rs::lexer::benchmark_lexer | Skipped | 1.2 ms * |
| formatter[pydantic/types.py] | crates/ruff_benchmark/benches/formatter.rs::formatter::benchmark_formatter | Skipped | 3.6 ms * |
| linter/default-rules[numpy/globals.py] | crates/ruff_benchmark/benches/linter.rs::default_rules::benchmark_default_rules | Skipped | 208.2 µs * |
| formatter[unicode/pypinyin.py] | crates/ruff_benchmark/benches/formatter.rs::formatter::benchmark_formatter | Skipped | 690 µs * |
| lexer[pydantic/types.py] | crates/ruff_benchmark/benches/lexer.rs::lexer::benchmark_lexer | Skipped | 510 µs * |
| lexer[numpy/ctypeslib.py] | crates/ruff_benchmark/benches/lexer.rs::lexer::benchmark_lexer | Skipped | 227.9 µs * |
| linter/all-with-preview-rules[numpy/globals.py] | crates/ruff_benchmark/benches/linter.rs::preview_rules::benchmark_preview_rules | Skipped | 832.1 µs * |
| parser[numpy/globals.py] | crates/ruff_benchmark/benches/parser.rs::parser::benchmark_parser | Skipped | 109.2 µs * |
| parser[numpy/ctypeslib.py] | crates/ruff_benchmark/benches/parser.rs::parser::benchmark_parser | Skipped | 958.5 µs * |
| parser[large/dataset.py] | crates/ruff_benchmark/benches/parser.rs::parser::benchmark_parser | Skipped | 5.2 ms * |
| lexer[unicode/pypinyin.py] | crates/ruff_benchmark/benches/lexer.rs::lexer::benchmark_lexer | Skipped | 79.1 µs * |
| linter/all-rules[numpy/globals.py] | crates/ruff_benchmark/benches/linter.rs::all_rules::benchmark_all_rules | Skipped | 730.2 µs * |
| linter/all-rules[numpy/ctypeslib.py] | crates/ruff_benchmark/benches/linter.rs::all_rules::benchmark_all_rules | Skipped | 4.3 ms * |
| linter/all-rules[large/dataset.py] | crates/ruff_benchmark/benches/linter.rs::all_rules::benchmark_all_rules | Skipped | 18.5 ms * |
| linter/all-rules[pydantic/types.py] | crates/ruff_benchmark/benches/linter.rs::all_rules::benchmark_all_rules | Skipped | 8.5 ms * |
| linter/all-with-preview-rules[unicode/pypinyin.py] | crates/ruff_benchmark/benches/linter.rs::preview_rules::benchmark_preview_rules | Skipped | 2.2 ms * |
| linter/all-with-preview-rules[large/dataset.py] | crates/ruff_benchmark/benches/linter.rs::preview_rules::benchmark_preview_rules | Skipped | 22.3 ms * |

* Metrics are collected with the CPU Simulation instrument; the time shown is the baseline result, since the benchmark was skipped in this run.
Page 1 of 2 (25 of the 30 skipped benchmarks are shown above).
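The benchmark IDs in the table come from a Rust benchmark harness under crates/ruff_benchmark/benches/, and each ID appears to follow a file::group::function pattern. As a rough illustration only (a minimal sketch assuming a plain criterion setup; the count_tokens tokenizer and the inline source string are hypothetical stand-ins, not Ruff's actual lexer or harness), a benchmark such as lexer.rs::lexer::benchmark_lexer could be wired up like this:

```rust
// Illustrative sketch only: assumes the `criterion` crate. The tokenizer below
// is a trivial stand-in for the code under test, not Ruff's real lexer.
use std::hint::black_box;

use criterion::{criterion_group, criterion_main, Criterion};

/// Hypothetical stand-in for the measured routine: counts whitespace-separated tokens.
fn count_tokens(source: &str) -> usize {
    source.split_whitespace().count()
}

/// Registers one benchmark per input file, mirroring IDs like `lexer[numpy/globals.py]`.
fn benchmark_lexer(c: &mut Criterion) {
    // In a real harness these would be the test files named in the table above.
    let cases = [("numpy/globals.py", "GLOBAL_A = 1\nGLOBAL_B = 2\n")];
    for (name, source) in cases {
        c.bench_function(&format!("lexer[{name}]"), |b| {
            b.iter(|| count_tokens(black_box(source)))
        });
    }
}

// The group name and the benchmark function name together would form the
// `lexer::benchmark_lexer` portion of the reported ID.
criterion_group!(lexer, benchmark_lexer);
criterion_main!(lexer);
```

Under a setup like this, the report's per-benchmark times would come from the CPU Simulation instrument described in the footnote above rather than from wall-clock measurements.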
Commits
Base: main (336413b)
863995f: [ty] Fix in_doctest flag not being reset when ending doctest (0%, by claude, 1 month ago)