langchain-ai
langchain

Performance History

Latest Results

fix(core): preserve chunk `additional_kwargs` across v3 stream assembly (#37435)

The v3 streaming path drops `additional_kwargs` from per-chunk `AIMessageChunk`s during assembly: `chunks_to_events` emits no event field for them, and `ChatModelStream._assemble_message` constructs the final `AIMessage` without an `additional_kwargs` argument. Non-streaming `ainvoke` returns the provider message unchanged, so streaming and non-streaming diverge for any provider that uses `additional_kwargs` to carry data outside the typed protocol blocks.

## How this surfaces

The concrete failure mode is Gemini's `__gemini_function_call_thought_signatures__` — a per-tool-call signature blob the Google GenAI integration places in `additional_kwargs`, keyed by `tool_call_id`. Gemini requires that signature on follow-up turns to replay the prior thought trace; without it, multi-turn streaming flows lose thought continuity (and may regenerate thinking, charging additional reasoning tokens, or in some cases refuse). Other providers that use `additional_kwargs` (e.g. older `function_call` accumulators, custom routing metadata) hit the same gap; the fix is intentionally provider-agnostic.

## Fix

Provider-agnostic, applied at two seams:

- `_compat_bridge` accumulates `msg.additional_kwargs` across chunks with `merge_dicts` (matching `AIMessageChunk`'s own merge semantics for fields that accumulate, like `function_call`) and emits the merged dict on the `message-finish` event as an off-spec extension. The bridge already uses one such extension (`metadata` on `MessageFinishData`); this PR follows the same pattern for `additional_kwargs`.
- `ChatModelStream._finish` reads the new field; `_assemble_message` threads it onto the final `AIMessage` only when non-empty, preserving today's behavior of leaving `additional_kwargs` empty when no provider data needs to ride on it.
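A minimal sketch of the accumulation described above, using a simplified stand-in for `merge_dicts` (nested dicts merge key-by-key, strings concatenate). The Gemini signature payload shown is illustrative; the real value is an opaque blob keyed by `tool_call_id`.

```python
def merge_dicts(left: dict, right: dict) -> dict:
    """Simplified stand-in for the recursive dict merge used when
    AIMessageChunk fields accumulate: nested dicts merge key-by-key,
    strings concatenate, new keys are added as-is."""
    merged = dict(left)
    for key, value in right.items():
        if key not in merged or merged[key] is None:
            merged[key] = value
        elif isinstance(merged[key], dict) and isinstance(value, dict):
            merged[key] = merge_dicts(merged[key], value)
        elif isinstance(merged[key], str) and isinstance(value, str):
            merged[key] = merged[key] + value
        else:
            merged[key] = value
    return merged

# Per-chunk additional_kwargs as a Gemini-style stream might carry them
# (the signature value here is a made-up placeholder).
chunks = [
    {"__gemini_function_call_thought_signatures__": {"call_1": "sig-"}},
    {"__gemini_function_call_thought_signatures__": {"call_1": "abc"}},
    {},  # most chunks carry no additional_kwargs at all
]

accumulated: dict = {}
for kwargs in chunks:
    accumulated = merge_dicts(accumulated, kwargs)
# accumulated now holds the merged signature dict, ready to ride on the
# message-finish event and onto the final AIMessage when non-empty.
```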
master
42 minutes ago
fix(core): apply ruff format to _compat_bridge additional_kwargs accumulator
nh/stream-events-v3-additional-kwargs-dropped
46 minutes ago
fix(core): preserve reasoning blocks alongside tool_call in v3 stream (#37434)

Closes #37420

---

`stream_events(version="v3")` (and the `astream_events` async twin) silently dropped reasoning content from the final assembled `AIMessage` whenever the same message also produced a tool_call. The bug reproduces against Gemini 2.5 Pro with `include_thoughts=True`: reasoning streams correctly through `ChatModelStream.reasoning`, but the persisted message in the final graph state carries only the `tool_call` block.

## Root cause

`_iter_protocol_blocks` in the compat bridge groups per-chunk content blocks by source-side identifier. When a provider doesn't supply an `index` field on its content blocks — which the Google GenAI translator does not for either `reasoning` or `tool_call` blocks — the bridge falls back to positional `i` as the bucket key. Because Gemini typically emits one block per chunk, every reasoning chunk and the later tool_call chunk all key to `0`, and the type mismatch trips `_accumulate`'s self-contained `else` branch. That branch clears accumulated reasoning state and replaces it with the incoming tool_call, so reasoning never reaches `content-block-finish`.

## Fix

When a block has no source-side `index`, key it by `("__lc_no_index__", block_type, positional_i)` instead of bare `i`. Same-type chunks at the same position still share a bucket and merge cleanly (streaming text and reasoning unchanged); different-type chunks at the same position now occupy distinct wire blocks and both reach `content-block-finish`. Providers that supply explicit indices (Anthropic, OpenAI Responses) are unaffected.

## Verification

Unit-tested at the compat-bridge layer for both sync (`chunks_to_events`) and async (`achunks_to_events`) paths. Verified live against Gemini 2.5 Pro (`gemini-2.5-pro`) with `thinking_budget=2048`, `include_thoughts=True`, and a single `get_weather` tool. Pre-fix: `final_state.messages[tool_calling_ai_message].content == [{type: tool_call, ...}]`. Post-fix: `[..., {type: reasoning, reasoning: "..."}, {type: tool_call, ...}]`, matching the shape `ainvoke` returns on the same input.
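The keying change can be sketched as follows. This is a self-contained illustration of the bucketing behavior described above, not the actual `_iter_protocol_blocks` implementation; the block shapes are simplified stand-ins.

```python
def bucket_key(block: dict, i: int):
    """Fixed fallback: a provider-supplied index wins; otherwise key by
    (sentinel, block type, position) so different-type blocks at the same
    position land in separate buckets instead of clobbering each other."""
    if "index" in block:
        return block["index"]
    return ("__lc_no_index__", block.get("type"), i)

# Gemini-style stream: one content block per chunk, no index field.
stream = [
    [{"type": "reasoning", "reasoning": "thinking..."}],
    [{"type": "reasoning", "reasoning": " more"}],
    [{"type": "tool_call", "name": "get_weather"}],
]

buckets: dict = {}
for chunk in stream:
    for i, block in enumerate(chunk):
        buckets.setdefault(bucket_key(block, i), []).append(block)

# Pre-fix, bare positional keys put all three blocks in bucket 0 and the
# tool_call replaced the accumulated reasoning. With the typed key, the
# two reasoning chunks share one bucket and the tool_call gets its own.
```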
master
50 minutes ago
fix(core): apply ruff format to _compat_bridge additional_kwargs accumulator
nh/stream-events-v3-additional-kwargs-dropped
1 hour ago
test(core): apply ruff format to no-index reasoning test
nh/stream-events-v3-reasoning-dropped
1 hour ago
fix(core): preserve chunk additional_kwargs across v3 stream assembly

The v3 streaming path drops `additional_kwargs` from per-chunk `AIMessageChunk`s during assembly: `chunks_to_events` emits no event field for them, and `ChatModelStream._assemble_message` constructs the final `AIMessage` without an `additional_kwargs` argument. Non-streaming `ainvoke` returns the provider message unchanged, so streaming and non-streaming diverge for any provider that uses `additional_kwargs` to carry data outside the typed protocol blocks.

The concrete failure mode that surfaced this is Gemini's `__gemini_function_call_thought_signatures__` — a per-tool-call signature blob the Google GenAI integration places in `additional_kwargs`, keyed by `tool_call_id`. Gemini requires that signature on follow-up turns to replay the prior thought trace; without it, multi-turn streaming flows lose thought continuity. Other providers that use `additional_kwargs` (e.g. older `function_call` shapes, custom routing metadata) hit the same gap.

Fix is provider-agnostic:

- `_compat_bridge` accumulates `msg.additional_kwargs` across chunks with `merge_dicts` (matching `AIMessageChunk`'s own merge semantics for accumulating fields like `function_call`) and emits the merged dict on the `message-finish` event as an off-spec extension parallel to the existing `metadata` extension.
- `ChatModelStream._finish` reads the field; `_assemble_message` threads it onto the final `AIMessage` only when non-empty (preserving the existing behavior of leaving `additional_kwargs` empty when no provider data needs to ride on it).

Sync and async paths covered.

This contribution was prepared with help from an AI agent; an author has reviewed and is responsible for the change.
nh/stream-events-v3-additional-kwargs-dropped
1 hour ago
fix(core): preserve reasoning blocks alongside tool_call in v3 stream

Closes #37420

`stream_events(version="v3")` was dropping reasoning content from the final assembled `AIMessage` whenever the same message also produced a `tool_call`. The issue reproduced against Gemini 2.5 Pro with `include_thoughts=True`; reasoning streamed live through `ChatModelStream.reasoning`, but the persisted message in the final graph state carried only the `tool_call` content block.

The bug lives in `_compat_bridge._iter_protocol_blocks`. When a provider emits per-chunk content blocks without an `index` field — which the Google GenAI translator does for both `reasoning` and `tool_call` blocks — the bridge falls back to positional `i` as the bucket key. Because Gemini typically emits one block per chunk, every reasoning chunk and every later tool_call chunk all key to `0`, and the type mismatch trips `_accumulate`'s self-contained `else` branch, which clears the accumulated reasoning state and replaces it with the incoming tool_call.

Fix: when a block has no source-side `index`, key it by `("__lc_no_index__", block_type, positional_i)` instead of bare `i`. Same-type chunks at the same position still share a bucket and merge cleanly (streaming text / reasoning unchanged); different-type chunks at the same position now occupy distinct wire blocks and both reach `content-block-finish`.

Verified live against Gemini 2.5 Pro: the final-state tool-calling `AIMessage.content` now carries both the `reasoning` and `tool_call` blocks, matching what `ainvoke` would have returned.
nh/stream-events-v3-reasoning-dropped
1 hour ago

Latest Branches

Performance gauge: -1%
fix(core): preserve chunk `additional_kwargs` across v3 stream assembly (#37435)
1 hour ago
ab1d155
nh/stream-events-v3-additional-kwargs-dropped
Performance gauge: +1%
1 hour ago
f9e33b8
nh/stream-events-v3-reasoning-dropped
Performance gauge: 0%
chore: bump langsmith from 0.7.31 to 0.8.4 in /libs/core (#37395)
21 hours ago
6923c94
dependabot/uv/libs/core/langsmith-0.8.4
© 2026 CodSpeed Technology