From f2c60002fa7eb1ca1df4b8e5394dc806ccb80526 Mon Sep 17 00:00:00 2001
From: Celina Hanouti
Date: Fri, 12 Sep 2025 10:08:43 +0200
Subject: [PATCH] add vs code version requirement

---
 docs/inference-providers/guides/vscode.md | 6 ++++++
 1 file changed, 6 insertions(+)

diff --git a/docs/inference-providers/guides/vscode.md b/docs/inference-providers/guides/vscode.md
index 3df068eda..7d62e8453 100644
--- a/docs/inference-providers/guides/vscode.md
+++ b/docs/inference-providers/guides/vscode.md
@@ -13,6 +13,12 @@ Use frontier open LLMs like Kimi K2, DeepSeek V3.1, GLM 4.5 and more in VS Code
 5. Enter your Hugging Face Token. You can get one from your [settings page](https://huggingface.co/settings/tokens/new?ownUserPermissions=inference.serverless.write&tokenType=fineGrained).
 6. Choose the models you want to add to the model picker. 🥳
 
+<Tip>
+
+VS Code 1.104.0+ is required to install the HF Copilot Chat extension. If "Hugging Face" doesn't appear in the Copilot provider list, update VS Code, then reload.
+
+</Tip>
+
 ## ✨ Why use the Hugging Face provider in Copilot
 
 - Access [SoTA open‑source LLMs](https://huggingface.co/models?pipeline_tag=text-generation&inference_provider=cerebras,together,fireworks-ai,nebius,novita,sambanova,groq,hyperbolic,nscale,fal-ai,cohere,replicate,scaleway,black-forest-labs,ovhcloud&sort=trending) with tool calling capabilities.
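
Reviewer note (not part of the patch above): the fine-grained token mentioned in step 5 of the guide can be sanity-checked against Inference Providers before it is entered in VS Code. Below is a minimal Python sketch, assuming the token is exported as `HF_TOKEN` and that `deepseek-ai/DeepSeek-V3.1` is merely an example model served by one of the listed providers.

```python
import os

from huggingface_hub import InferenceClient

# Assumption: a fine-grained token with the inference.serverless scope was
# exported beforehand, e.g.  export HF_TOKEN=hf_xxx
client = InferenceClient(token=os.environ["HF_TOKEN"])

# Example model ID (assumption): any text-generation model served by an
# Inference Provider from the list in the guide should work the same way.
response = client.chat.completions.create(
    model="deepseek-ai/DeepSeek-V3.1",
    messages=[{"role": "user", "content": "Reply with one short sentence."}],
    max_tokens=32,
)
print(response.choices[0].message.content)
```

If this call succeeds, the token should also be accepted when the extension asks for it; if it is rejected, regenerating the token via the settings link in step 5 is the likely fix.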