langchain-ai / langchain
fix(nomic,openai,perplexity): update pillow version to >= 12.1.1, <13.0.0 (#35254)
Comparing to-curiosity:ffix/update-pillow-version (2d8ed01) with master (fb0233c)
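For context on the change itself: the PR tightens the pillow requirement in the affected partner packages to >= 12.1.1, <13.0.0. Below is a minimal sketch of what that range admits, using the third-party packaging library; the library and the specific versions checked are illustrative assumptions, not part of this report.

```python
# Illustrative only: checks which pillow versions satisfy the range from the PR title.
from packaging.specifiers import SpecifierSet
from packaging.version import Version

pillow_range = SpecifierSet(">=12.1.1,<13.0.0")

assert Version("12.1.1") in pillow_range      # lower bound is inclusive
assert Version("12.2.0") in pillow_range      # later 12.x releases are allowed
assert Version("12.1.0") not in pillow_range  # older patch releases are excluded
assert Version("13.0.0") not in pillow_range  # the next major is excluded
```

The constraint itself lives in each package's pyproject.toml (see the "Update pyproject.toml" commit listed at the bottom of this report).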
Summary: 1 improvement (×17), 7 untouched, 27 skipped; 35 benchmarks in total.

Benchmarks
All rows in the two tables below were collected with the CPU Simulation instrument.

Improvement

| Benchmark | File | Change | Base (master) | Head (PR) |
|---|---|---|---|---|
| test_nomic_embeddings_init_time | libs/partners/nomic/tests/unit_tests/test_standard.py | ×17 | 1,493.4 µs * | 86.2 µs |

Untouched

| Benchmark | File | Change | Base (master) | Head (PR) |
|---|---|---|---|---|
| test_stream_time | libs/partners/openai/tests/integration_tests/chat_models/test_base_standard.py::TestOpenAIStandard | 0% | 1.1 s | 1.1 s |
| test_stream_time | libs/partners/openai/tests/integration_tests/chat_models/test_responses_standard.py::TestOpenAIStandard | 0% | 1.1 s | 1.1 s |
| test_stream_time | libs/partners/openai/tests/integration_tests/chat_models/test_responses_standard.py::TestOpenAIResponses | 0% | 823.1 ms | 822.7 ms |
| test_init_time | libs/partners/openai/tests/unit_tests/chat_models/test_responses_standard.py::TestOpenAIResponses | 0% | 12.3 ms | 12.3 ms |
| test_init_time | libs/partners/openai/tests/unit_tests/chat_models/test_azure_standard.py::TestOpenAIStandard | 0% | 1.7 s | 1.7 s |
| test_init_time | libs/partners/perplexity/tests/unit_tests/test_chat_models_standard.py::TestPerplexityStandard | 0% | 1.7 s * | 1.7 s |
| test_init_time | libs/partners/openai/tests/unit_tests/chat_models/test_base_standard.py::TestOpenAIStandard | 0% | 12.2 ms | 12.2 ms |
The benchmarks below were skipped, so their baseline results are used instead. If they were deleted in your codebase, archive them to remove them from the performance reports.
Learn more about archiving benchmarks
Skipped

17 of the 27 skipped benchmarks are shown here; the remainder are on the report's second page.

| Benchmark | File | Instrument | Baseline |
|---|---|---|---|
| test_create_chat_prompt_init_time | libs/partners/prompty/tests/unit_tests/test_standard.py | CPU Simulation | 311.9 µs * |
| test_chroma_init_time | libs/partners/chroma/tests/unit_tests/test_standard.py | CPU Simulation | 60 ms * |
| test_init_time | libs/partners/groq/tests/unit_tests/test_standard.py::TestGroqStandard | CPU Simulation | 1.6 s * |
| test_init_time | libs/partners/deepseek/tests/unit_tests/test_chat_models.py::TestChatDeepSeekUnit | CPU Simulation | 1.6 s * |
| test_qdrant_vectorstore_init_time | libs/partners/qdrant/tests/unit_tests/test_standard.py | CPU Simulation | 225 ms * |
| test_init_time | libs/partners/xai/tests/unit_tests/test_chat_models_standard.py::TestXAIStandard | CPU Simulation | 3.3 s * |
| test_exa_retriever_init_time | libs/partners/exa/tests/unit_tests/test_standard.py | CPU Simulation | 316.4 µs * |
| test_import_time[PydanticOutputParser] | libs/core/tests/benchmarks/test_imports.py | Wall Time | 563.5 ms * |
| test_import_time[LangChainTracer] | libs/core/tests/benchmarks/test_imports.py | Wall Time | 472.1 ms * |
| test_import_time[InMemoryVectorStore] | libs/core/tests/benchmarks/test_imports.py | Wall Time | 626.8 ms * |
| test_import_time[BaseChatModel] | libs/core/tests/benchmarks/test_imports.py | Wall Time | 558.8 ms * |
| test_import_time[CallbackManager] | libs/core/tests/benchmarks/test_imports.py | Wall Time | 488.6 ms * |
| test_import_time[tool] | libs/core/tests/benchmarks/test_imports.py | Wall Time | 542.3 ms * |
| test_import_time[HumanMessage] | libs/core/tests/benchmarks/test_imports.py | Wall Time | 269.4 ms * |
| test_import_time[Runnable] | libs/core/tests/benchmarks/test_imports.py | Wall Time | 520.1 ms * |
| test_import_time[InMemoryRateLimiter] | libs/core/tests/benchmarks/test_imports.py | Wall Time | 178.5 ms * |
| test_import_time[Document] | libs/core/tests/benchmarks/test_imports.py | Wall Time | 191.6 ms * |
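For orientation, the init-time rows above appear to time nothing more than constructing the relevant client or model class, and the import-time rows appear to time a bare import. A hypothetical sketch of that shape, assuming the pytest-codspeed/pytest-benchmark style benchmark fixture (the fixture usage, model name, and dummy key are assumptions; this is not the repository's actual standard test):

```python
# Hypothetical sketch of an init-time benchmark; not the real test_init_time.
from langchain_openai import ChatOpenAI


def test_init_time_sketch(benchmark) -> None:
    # The benchmark fixture calls the function and records its cost with
    # whichever CodSpeed instrument the run uses (CPU Simulation or Wall Time).
    benchmark(lambda: ChatOpenAI(model="gpt-4o-mini", api_key="not-a-real-key"))
```

A dependency pin like this one would typically only move such numbers through changed import or initialization cost, which is consistent with most rows being untouched.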
Commits

Base: master (fb0233c)
Head: Update pyproject.toml (2d8ed01), 5 hours ago, by to-curiosity