langchain-ai/langchain
CodSpeed performance report (Instrumentation / Wall Time)

feat: Allow passing tokenizer template values to HuggingFace chat models (#31489)

Comparing Aristote-code:feat-huggingface-chat-template-values (eb262b1) with master (ece9e31).
Overall change: 0%

Improvements: 0 | Regressions: 0 | Untouched: 13 | New: 0 | Dropped: 0 | Ignored: 0

Benchmarks

Passed (13 benchmarks, all 0% change)

Test (pytest node ID) | Change | Times
libs/partners/mistralai/tests/unit_tests/test_standard.py::TestMistralStandard::test_init_time | 0% | 13 ms / 12.9 ms
libs/partners/anthropic/tests/integration_tests/test_standard.py::TestAnthropicStandard::test_stream_time | 0% | 15.6 ms / 15.6 ms
libs/partners/openai/tests/integration_tests/chat_models/test_base_standard.py::TestOpenAIStandard::test_stream_time | 0% | 1.1 s / 1.1 s
libs/partners/xai/tests/unit_tests/test_chat_models_standard.py::TestXAIStandard::test_init_time | 0% | 3.3 s / 3.3 s
libs/partners/openai/tests/integration_tests/chat_models/test_responses_standard.py::TestOpenAIStandard::test_stream_time | 0% | 1.1 s / 1.1 s
libs/partners/groq/tests/unit_tests/test_standard.py::TestGroqStandard::test_init_time | 0% | 1.6 s / 1.6 s
libs/partners/openai/tests/unit_tests/chat_models/test_azure_standard.py::TestOpenAIStandard::test_init_time | 0% | 1.7 s / 1.7 s
libs/partners/fireworks/tests/unit_tests/test_standard.py::TestFireworksStandard::test_init_time | 0% | 6.6 s / 6.6 s
libs/partners/deepseek/tests/unit_tests/test_chat_models.py::TestChatDeepSeekUnit::test_init_time | 0% | 1.6 s / 1.6 s
libs/partners/openai/tests/integration_tests/chat_models/test_responses_standard.py::TestOpenAIResponses::test_stream_time | 0% | 481.3 ms / 481.4 ms
libs/partners/anthropic/tests/unit_tests/test_standard.py::TestAnthropicStandard::test_init_time | 0% | 3.8 ms / 3.8 ms
libs/partners/openai/tests/unit_tests/chat_models/test_responses_standard.py::TestOpenAIResponses::test_init_time | 0% | 16.8 ms / 16.9 ms
libs/partners/openai/tests/unit_tests/chat_models/test_base_standard.py::TestOpenAIStandard::test_init_time | 0% | 16.7 ms / 16.8 ms
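
The benchmarks above are ordinary pytest tests that CodSpeed collects and times. As a rough illustration only, here is a minimal sketch of how a wall-time init benchmark of this kind can be written and run with pytest-codspeed; it is not langchain's actual test code, and the `ChatMistralAI` arguments (including the placeholder API key) are illustrative assumptions.

```python
# Illustrative sketch only, assuming pytest-codspeed; not langchain's real
# test_init_time implementation. Run with: pytest --codspeed
import pytest
from langchain_mistralai import ChatMistralAI


@pytest.mark.benchmark  # CodSpeed times the whole marked test
def test_init_time() -> None:
    # Init-time benchmarks measure how long model construction takes.
    for _ in range(10):
        ChatMistralAI(model="mistral-small-latest", api_key="placeholder")
```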

Commits

Base: master (ece9e31)
d61626a | 20 days ago | by google-labs-jules[bot] | 0%

feat: Allow passing tokenizer template values to HuggingFace chat models

This commit introduces two ways for you to customize chat templating when using `ChatHuggingFace`:

1. **Custom Chat Template String**: A new `chat_template` parameter has been added to the `ChatHuggingFace` constructor. This allows you to provide a custom Jinja template string, which will be assigned to `tokenizer.chat_template` after the tokenizer is loaded. This gives you full control over the chat prompt formatting if the default template associated with a model is not suitable or if you want to experiment with different prompt structures.

2. **Dynamic Template Variables via `**kwargs`**: The `_to_chat_prompt` method in `ChatHuggingFace` (which is responsible for formatting messages using `tokenizer.apply_chat_template`) has been modified to accept arbitrary keyword arguments (`**kwargs`). These `kwargs` are then passed directly to `tokenizer.apply_chat_template`. This allows you to define variables in your Jinja chat templates (either the default one or a custom one) and provide values for these variables dynamically during calls to `invoke`, `stream`, `generate`, etc.

Unit tests have been added to verify these new functionalities, including setting custom templates and passing keyword arguments to `apply_chat_template`. Documentation has been updated in the `ChatHuggingFace` class docstring and in the HuggingFace integration notebook (`docs/docs/integrations/chat/huggingface.ipynb`) to explain these new features with examples.

This change addresses issue #31470 by providing a flexible way for you to pass tokenizer template values, interpreted as HuggingFace chat template strings and variables for those Jinja templates.
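
The commit message above describes two new hooks. Below is a minimal, hedged sketch of how they might be used together, assuming the branch's `chat_template` constructor parameter and the invoke-time kwarg pass-through to `tokenizer.apply_chat_template` behave as described (neither is guaranteed to exist in a released `langchain-huggingface`); the model id and the extra `persona` template variable are illustrative choices, not part of the PR.

```python
# Sketch based on this PR's description; `chat_template=` and the extra
# invoke() kwargs may not exist in released langchain-huggingface versions.
from langchain_huggingface import ChatHuggingFace, HuggingFacePipeline

# Any chat-capable model id would do; this one is only an example.
llm = HuggingFacePipeline.from_model_id(
    model_id="Qwen/Qwen2.5-0.5B-Instruct",
    task="text-generation",
    pipeline_kwargs={"max_new_tokens": 64},
)

# 1. Custom chat template string: a Jinja template that also reads a
#    non-standard `persona` variable.
custom_template = (
    "{% for message in messages %}"
    "{{ persona }} {{ message['role'] }}: {{ message['content'] }}\n"
    "{% endfor %}"
    "{% if add_generation_prompt %}assistant:{% endif %}"
)

# Per the PR, this string is assigned to tokenizer.chat_template after loading.
chat = ChatHuggingFace(llm=llm, chat_template=custom_template)

# 2. Dynamic template variables via **kwargs: extra keyword arguments are
#    forwarded to tokenizer.apply_chat_template, so `persona` is available
#    when the prompt is rendered.
response = chat.invoke("What is LangChain?", persona="[pirate]")
print(response.content)
```

The template keeps the `messages` and `add_generation_prompt` variables that `apply_chat_template` always supplies, and only adds the extra `persona` value passed at call time.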
a822322 | 20 days ago | by google-labs-jules[bot] | 0%

Same commit message as d61626a above, plus: "Linting errors (E501 Line too long) found in a previous CI run have been fixed."
b64ef6f | 20 days ago | by google-labs-jules[bot] | 0%

Same commit message as d61626a above, plus: "Linting errors (E501 Line too long) found in a previous CI run have been fixed by reformatting dictionary comprehensions and other long lines."
fddbb29 | 20 days ago | by google-labs-jules[bot] | 0%

Same commit message as d61626a above, plus: "Further fixes for E501 (Line too long) linting errors based on CI feedback."
02a1bdb | 20 days ago | by google-labs-jules[bot] | 0%

Same commit message as d61626a above, plus: "This commit also includes comprehensive linting and formatting fixes for the `libs/partners/huggingface` directory to align with project standards, including `pyupgrade`, `ruff`, `black`, and `isort` changes."
5aba416 | 20 days ago | by google-labs-jules[bot] | 0%

Same commit message as d61626a above, plus: "This commit also includes comprehensive linting and formatting fixes for the `libs/partners/huggingface` directory to align with project standards, including `pyupgrade`, `ruff --fix`, `black`, `isort`, and `ruff format` changes."
eb262b1 | 17 days ago | by google-labs-jules[bot] | 0%

Same commit message as d61626a above, plus: "This commit also includes comprehensive linting and formatting fixes for the `libs/partners/huggingface` directory to align with project standards. This includes: fixing E501 (line too long) errors; correcting an invalid `type: ignore[import]` comment in `llms/huggingface_endpoint.py`; resolving an `AttributeError` in mypy for `tests/unit_tests/test_chat_models.py`; and ensuring `pyupgrade`, `ruff`, `black`, and `isort` checks pass."