fix(core): preserve chunk `additional_kwargs` across v3 stream assembly (#37435)
The v3 streaming path drops `additional_kwargs` from per-chunk
`AIMessageChunk`s during assembly: `chunks_to_events` emits no event
field for them, and `ChatModelStream._assemble_message` constructs the
final `AIMessage` without an `additional_kwargs` argument. Non-streaming
`ainvoke` returns the provider message unchanged, so streaming and
non-streaming diverge for any provider that uses `additional_kwargs` to
carry data outside the typed protocol blocks.
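A minimal sketch of the accumulation the streaming path is missing. This is not the `langchain_core` implementation; the `merge_dicts` helper below is a simplified stand-in that mirrors the stated merge semantics (streamed strings concatenate, nested dicts deep-merge) to show what assembly should produce:

```python
# Simplified stand-in for merge_dicts-style accumulation of per-chunk
# additional_kwargs. Streamed string values concatenate; nested dicts
# merge recursively; other collisions take the newer value.
def merge_dicts(left: dict, right: dict) -> dict:
    merged = dict(left)
    for key, value in right.items():
        if key not in merged or merged[key] is None:
            merged[key] = value
        elif isinstance(merged[key], str) and isinstance(value, str):
            merged[key] = merged[key] + value  # streamed strings concatenate
        elif isinstance(merged[key], dict) and isinstance(value, dict):
            merged[key] = merge_dicts(merged[key], value)
        else:
            merged[key] = value
    return merged

# Three chunks as a provider might stream them (values invented):
chunks = [
    {"function_call": {"name": "get_weather", "arguments": '{"city": '}},
    {"function_call": {"arguments": '"Paris"}'}},
    {"signatures": {"call_1": "abc123"}},
]
acc: dict = {}
for chunk in chunks:
    acc = merge_dicts(acc, chunk)
# acc now holds the fully merged additional_kwargs that the final
# AIMessage should carry; pre-fix, v3 assembly discarded it.
```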
## How this surfaces
The concrete failure mode is Gemini's
`__gemini_function_call_thought_signatures__` — a per-tool-call
signature blob the Google GenAI integration places in
`additional_kwargs`, keyed by `tool_call_id`. Gemini requires that
signature on follow-up turns to replay the prior thought trace; without
it, multi-turn streaming flows lose thought continuity (the model may
regenerate its thinking, incurring additional reasoning-token charges,
or in some cases refuse the request). Other providers that use
`additional_kwargs` (e.g. older
`function_call` accumulators, custom routing metadata) hit the same gap;
the fix is intentionally provider-agnostic.
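For concreteness, the shape involved looks roughly like the dict below. The key name is the one the PR cites; the `tool_call_id` and signature value are invented for illustration:

```python
# Illustrative shape only (tool_call_id and signature blob are made up):
# the Google GenAI integration keys per-tool-call thought signatures by
# tool_call_id inside additional_kwargs, and follow-up turns must send
# the signature back to replay the prior thought trace.
additional_kwargs = {
    "__gemini_function_call_thought_signatures__": {
        "call_abc123": "CpcBAXLI...",  # opaque per-tool-call signature
    }
}

# On a follow-up turn, the integration looks the signature up by the
# tool call's id:
signature = additional_kwargs[
    "__gemini_function_call_thought_signatures__"
].get("call_abc123")
```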
## Fix
Provider-agnostic, two seams:
- `_compat_bridge` accumulates `msg.additional_kwargs` across chunks
with `merge_dicts` (matching `AIMessageChunk`'s own merge semantics for
fields that accumulate, like `function_call`) and emits the merged dict
on the `message-finish` event as an off-spec extension. The bridge
already uses one such extension (`metadata` on `MessageFinishData`);
this PR follows the same pattern for `additional_kwargs`.
- `ChatModelStream._finish` reads the new field; `_assemble_message`
threads it onto the final `AIMessage` only when non-empty, preserving
today's behavior of leaving `additional_kwargs` empty when no provider
data needs to ride on it.

fix(core): preserve reasoning blocks alongside tool_call in v3 stream (#37434)
Closes #37420
---
`stream_events(version="v3")` (and the `astream_events` async twin)
silently dropped reasoning content from the final assembled `AIMessage`
whenever the same message also produced a tool_call. The bug reproduces
against Gemini 2.5 Pro with `include_thoughts=True`: reasoning streams
correctly through `ChatModelStream.reasoning`, but the persisted message
in the final graph state carries only the `tool_call` block.
## Root cause
`_iter_protocol_blocks` in the compat bridge groups per-chunk content
blocks by source-side identifier. When a provider doesn't supply an
`index` field on its content blocks — which the Google GenAI translator
does not for either `reasoning` or `tool_call` blocks — the bridge falls
back to positional `i` as the bucket key. Because Gemini typically emits
one block per chunk, every reasoning chunk and the later tool_call chunk
all key to `0`, and the type mismatch trips `_accumulate`'s
self-contained `else` branch. That branch clears accumulated reasoning
state and replaces it with the incoming tool_call, so reasoning never
reaches `content-block-finish`.
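The collision can be sketched in a few lines. This is a simplified model of the bridge's bucketing, not the actual `_iter_protocol_blocks`/`_accumulate` code; the helper name and merge logic are invented to show the failure mode:

```python
# Simplified model of the pre-fix bucketing: with no source-side `index`,
# the positional i becomes the bucket key, so a reasoning chunk and a
# later tool_call chunk (one block per chunk, as Gemini emits) both key
# to 0, and the type-mismatch branch clobbers the accumulated reasoning.
def assemble(chunks: list[list[dict]]) -> list[dict]:
    buckets: dict = {}
    for chunk in chunks:
        for i, block in enumerate(chunk):
            key = block.get("index", i)  # positional fallback
            prev = buckets.get(key)
            if prev is not None and prev["type"] == block["type"]:
                # same type: merge (streamed text concatenates)
                prev["text"] = prev.get("text", "") + block.get("text", "")
            else:
                # type mismatch: replaces the accumulated block entirely
                buckets[key] = dict(block)
    return list(buckets.values())

chunks = [
    [{"type": "reasoning", "text": "Need the forecast first."}],
    [{"type": "tool_call", "name": "get_weather"}],
]
blocks = assemble(chunks)
# Only the tool_call survives; the reasoning block never reaches
# content-block-finish.
```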
## Fix
When a block has no source-side `index`, key it by `("__lc_no_index__",
block_type, positional_i)` instead of bare `i`. Same-type chunks at the
same position still share a bucket and merge cleanly (streaming text and
reasoning unchanged); different-type chunks at the same position now
occupy distinct wire blocks and both reach `content-block-finish`.
Providers that supply explicit indices (Anthropic, OpenAI Responses) are
unaffected.
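The new keying can be sketched as follows. The sentinel string is the one the PR names; the function shape around it is an assumption for illustration:

```python
# Sketch of the fixed bucket key (function shape assumed): blocks that
# carry an explicit source-side index keep it; blocks without one key on
# (sentinel, block type, position), so same-type blocks at a position
# still share a bucket while different-type blocks get distinct ones.
def bucket_key(block: dict, i: int):
    if "index" in block:
        return block["index"]  # explicit-index providers: unchanged
    return ("__lc_no_index__", block["type"], i)

k_reasoning = bucket_key({"type": "reasoning", "text": "step 1"}, 0)
k_tool_call = bucket_key({"type": "tool_call", "name": "get_weather"}, 0)
k_reasoning2 = bucket_key({"type": "reasoning", "text": "step 2"}, 0)
# k_reasoning != k_tool_call: the two block types now occupy distinct
# wire blocks. k_reasoning == k_reasoning2: same-type chunks at the same
# position still merge cleanly.
```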
## Verification
Unit-tested at the compat-bridge layer for both sync
(`chunks_to_events`) and async (`achunks_to_events`) paths.
Verified live against Gemini 2.5 Pro (`gemini-2.5-pro`) with
`thinking_budget=2048`, `include_thoughts=True`, and a single
`get_weather` tool. Pre-fix:
`final_state.messages[tool_calling_ai_message].content == [{type:
tool_call, ...}]`. Post-fix: `[..., {type: reasoning, reasoning: "..."},
{type: tool_call, ...}]`, matching the shape `ainvoke` returns on the
same input.