From acfabe4da92cbc3eb75417736239d3d0a48269ae Mon Sep 17 00:00:00 2001
From: Akhil Reddy <69634931+akhilreddy097@users.noreply.github.com>
Date: Tue, 18 Nov 2025 23:08:55 -0800
Subject: [PATCH] Add HuggingFace chat model integration examples

This adds HuggingFace chat model integration examples to the
chat-model-tabs.mdx snippet file.

Includes:
- Installation instructions for langchain[huggingface]
- An init_chat_model example using the HuggingFace provider
- Direct model class usage with HuggingFaceEndpoint and ChatHuggingFace
- microsoft/Phi-3-mini-4k-instruct as the example model

Resolves issue #28226
---
 src/snippets/chat-model-tabs.mdx | 38 ++++++++++++++++++++++++++++++++++++++
 1 file changed, 38 insertions(+)

diff --git a/src/snippets/chat-model-tabs.mdx b/src/snippets/chat-model-tabs.mdx
index 242f395913..a2fc9cae07 100644
--- a/src/snippets/chat-model-tabs.mdx
+++ b/src/snippets/chat-model-tabs.mdx
@@ -134,5 +134,43 @@ model = ChatBedrock(model="anthropic.claude-3-5-sonnet-20240620-v1:0")
 ```
+
+
+👉 Read the [HuggingFace chat model integration docs](/oss/python/integrations/chat/huggingface/)
+
+```shell
+pip install -U "langchain[huggingface]"
+```
+
+
+```python init_chat_model
+import os
+from langchain.chat_models import init_chat_model
+
+os.environ["HUGGINGFACEHUB_API_TOKEN"] = "hf_..."
+
+model = init_chat_model(
+    "microsoft/Phi-3-mini-4k-instruct",
+    model_provider="huggingface",
+    temperature=0.7,
+    max_tokens=1024,
+)
+```
+
+```python Model Class
+import os
+from langchain_huggingface import ChatHuggingFace, HuggingFaceEndpoint
+
+os.environ["HUGGINGFACEHUB_API_TOKEN"] = "hf_..."
+
+llm = HuggingFaceEndpoint(
+    repo_id="microsoft/Phi-3-mini-4k-instruct",
+    temperature=0.7,
+    max_new_tokens=1024,
+)
+model = ChatHuggingFace(llm=llm)
+```
+
+
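
Below is a minimal usage sketch (not part of the patch) for trying the added snippet end to end. It assumes a valid `HUGGINGFACEHUB_API_TOKEN`, that the Hugging Face Inference API can serve `microsoft/Phi-3-mini-4k-instruct`, and an illustrative prompt string; only the model construction mirrors the "Model Class" example above.

```python
# Usage sketch for the snippet added in this patch. Assumes a valid
# HUGGINGFACEHUB_API_TOKEN and Inference API access to the example model;
# the prompt string is illustrative only.
import os

from langchain_huggingface import ChatHuggingFace, HuggingFaceEndpoint

os.environ["HUGGINGFACEHUB_API_TOKEN"] = "hf_..."

# Build the chat model the same way the "Model Class" example does.
llm = HuggingFaceEndpoint(
    repo_id="microsoft/Phi-3-mini-4k-instruct",
    temperature=0.7,
    max_new_tokens=1024,
)
model = ChatHuggingFace(llm=llm)

# Chat models accept a plain string (coerced to a human message) or a list
# of messages; invoke() returns an AIMessage whose text is in .content.
response = model.invoke("In one sentence, what is LangChain?")
print(response.content)
```

The same `invoke` call works on the `model` returned by `init_chat_model` in the first example, since both paths yield a standard LangChain chat model.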