diff --git a/docs/inference-providers/_toctree.yml b/docs/inference-providers/_toctree.yml
index cbd2f4cb6..0895871c8 100644
--- a/docs/inference-providers/_toctree.yml
+++ b/docs/inference-providers/_toctree.yml
@@ -28,6 +28,19 @@
   - local: guides/vscode
     title: VS Code with GitHub Copilot
+- title: Integrations
+  sections:
+    - local: integrations/index-simple
+      title: Integrations Overview (Simple)
+    - local: integrations/index
+      title: Integrations Overview (Full)
+    - local: integrations/adding-integration
+      title: Add Your Integration
+    - local: integrations/open-code
+      title: OpenCode
+    - local: integrations/macwhisper
+      title: MacWhisper
+
 - local: tasks/index
   title: Inference Tasks
   sections:
diff --git a/docs/inference-providers/integrations/adding-integration.md b/docs/inference-providers/integrations/adding-integration.md
new file mode 100644
index 000000000..4dfcdcf25
--- /dev/null
+++ b/docs/inference-providers/integrations/adding-integration.md
@@ -0,0 +1,59 @@
+# Add Your Integration
+
+Building a tool that works with Hugging Face Inference Providers? We'd love to feature it in our integrations directory!
+
+## Requirements
+
+To be listed, your integration should:
+
+- ✅ **Work with HF Inference Providers** via our API or OpenAI-compatible endpoints
+- ✅ **Be actively maintained** with recent commits or releases
+- ✅ **Have clear documentation** showing how to connect to HF
+
+## How to Submit
+
+1. **Test your integration** with Hugging Face Inference Providers
+2. **Fork the repository** at [github.com/huggingface/hub-docs](https://github.com/huggingface/hub-docs)
+3. **Add your integration page** in `docs/inference-providers/integrations/`
+4. **Update the index** in `docs/inference-providers/integrations/index.md`
+5. **Submit a Pull Request** with your changes
+
+## Integration Page Template
+
+Create a file named `your-tool-name.md` with this structure:
+
+```markdown
+# Your Tool Name
+
+Brief description of what your tool does.
+
+## Overview
+
+How your tool integrates with Hugging Face Inference Providers.
+
+## Prerequisites
+
+- Your tool installed
+- HF account with [API token](https://huggingface.co/settings/tokens)
+
+## Configuration
+
+Step-by-step setup instructions with code examples.
+
+## Resources
+
+- [Your Tool Documentation](https://yourtool.com/docs)
+- [HF Integration Guide](link-to-your-guide)
+```
+
+## Updating the Index
+
+Add your tool to the table in `integrations/index.md`:
+
+```markdown
+| [Your Tool](./your-tool) | Brief description | [Documentation](https://yourtool.com/docs) |
+```
+
+## Questions?
+
+Need help with your integration? Visit the [Hugging Face Forums](https://discuss.huggingface.co/) or open an issue in the [hub-docs repository](https://github.com/huggingface/hub-docs/issues).
diff --git a/docs/inference-providers/integrations/index-simple.md b/docs/inference-providers/integrations/index-simple.md
new file mode 100644
index 000000000..c74d55c32
--- /dev/null
+++ b/docs/inference-providers/integrations/index-simple.md
@@ -0,0 +1,56 @@
+# Integrations
+
+Connect your favorite tools with Hugging Face Inference Providers.
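Most of the tools listed below connect the same way: an OpenAI-compatible client pointed at the Inference Providers router with a Hugging Face token. A minimal sketch of that wire format, using only the Python standard library (the model ID is an example, and the request is built but deliberately not sent, so the snippet runs without a valid token):

```python
import json
import os
import urllib.request

# Any OpenAI-compatible tool can target the Inference Providers router.
BASE_URL = "https://router.huggingface.co/v1"
token = os.environ.get("HF_TOKEN", "hf_YOUR_TOKEN")  # placeholder if unset

payload = {
    "model": "meta-llama/Llama-3.3-70B-Instruct",  # example model ID
    "messages": [{"role": "user", "content": "Hello!"}],
}

req = urllib.request.Request(
    f"{BASE_URL}/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    },
)

# urllib.request.urlopen(req) would send the request; it is omitted here
# so the sketch runs without a valid token.
print(req.full_url)
```

This is exactly the shape the integrations below configure for you — they only differ in where you paste the base URL and token.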
+
+## Featured Integrations
+
+These integrations have detailed guides to help you get started:
+
+- **[OpenCode](./open-code)** - AI coding agent for your terminal
+- **[MacWhisper](./macwhisper)** - Transcribe audio on macOS with Whisper
+
+## All Integrations
+
+### Development Tools
+
+- [Continue](https://continue.dev/docs/reference/model-providers/huggingface) - AI code assistant for IDEs
+- [Cursor](https://cursor.sh/docs) - AI-first code editor
+- [Codeium](https://codeium.com/docs) - Free code completion
+- [Roo Code](https://docs.roocode.com/providers/huggingface) - Enterprise code generation
+
+### Observability
+
+- [Langfuse](https://langfuse.com/docs/integrations/huggingface) - LLM observability platform
+- [UK AISI Inspect](https://inspect.aisi.org.uk/docs) - AI safety evaluation
+
+### Frameworks
+
+- [LangChain](https://python.langchain.com/docs/integrations/platforms/huggingface) - LLM application framework
+- [Haystack](https://docs.haystack.deepset.ai/docs/huggingfaceapichatgenerator) - Open-source LLM framework
+- [LlamaIndex](https://docs.llamaindex.ai/en/stable/examples/llm/huggingface/) - Data framework for LLMs
+- [CrewAI](https://docs.crewai.com/core-concepts/LLMs/) - Multi-agent orchestration
+- [AutoGen](https://microsoft.github.io/autogen/docs/Use-Cases/agent_chat/) - Multi-agent conversations
+
+### Applications
+
+- [Open WebUI](https://docs.openwebui.com/getting-started/) - Self-hosted LLM interface
+- [TypingMind](https://docs.typingmind.com/) - Enhanced ChatGPT UI
+
+### API Clients
+
+- [OpenAI SDK](https://github.com/openai/openai-python) - Works with our OpenAI-compatible endpoints
+- [LiteLLM](https://docs.litellm.ai/docs/providers/huggingface) - Unified LLM interface
+- [Portkey](https://docs.portkey.ai/) - AI gateway with advanced features
+
+## OpenAI-Compatible Endpoints
+
+Any tool that supports OpenAI can work with Hugging Face:
+
+```python
+base_url = "https://router.huggingface.co/v1"
+api_key = "hf_YOUR_TOKEN"
+``` + +## Add Your Integration + +Building something? [Let us know](./adding-integration) and we'll add it to the list. diff --git a/docs/inference-providers/integrations/index.md b/docs/inference-providers/integrations/index.md new file mode 100644 index 000000000..e73aa2c68 --- /dev/null +++ b/docs/inference-providers/integrations/index.md @@ -0,0 +1,23 @@ +# Integrations Overview + +Hugging Face Inference Providers works with a growing ecosystem of developer tools, frameworks, and platforms. These integrations let you use state-of-the-art models in your existing workflows and development environments. + +## Why Use Integrations? + +- **Keep your existing tools**: Use Inference Providers with tools you already know +- **Access 17+ providers**: Switch between providers without changing your code +- **Zero markup pricing**: Get the same rates as going direct to providers +- **Single API token**: One HF token for all providers and models + +## Available Integrations + +| Integration | Description | Official Documentation | +| --------------------------------- | -------------------------------------------------------------------- | ----------------------------------------------------------- | +| [Haystack](./haystack) | Open-source LLM framework for building production-ready applications | [Documentation](https://docs.haystack.deepset.ai/) | +| [Langfuse](./langfuse) | Open-source LLM engineering platform for observability | [Documentation](https://langfuse.com/docs) | +| [MacWhisper](./macwhisper) | Speech-to-text application for macOS | [Product Page](https://goodsnooze.gumroad.com/l/macwhisper) | +| [OpenCode](./open-code) | AI coding agent built for the terminal | [Documentation](https://opencode.ai/docs) | +| [Roo Code](./roo-code) | AI-powered code generation and refactoring | [Documentation](https://docs.roocode.com/) | +| [UK AISI Inspect](./aisi-inspect) | AI safety evaluation framework | [Documentation](https://inspect.aisi.org.uk/) | + +More integrations 
+coming soon! Want to add yours? See [how to add your integration](./adding-integration).
diff --git a/docs/inference-providers/integrations/macwhisper.md b/docs/inference-providers/integrations/macwhisper.md
new file mode 100644
index 000000000..582b11e53
--- /dev/null
+++ b/docs/inference-providers/integrations/macwhisper.md
@@ -0,0 +1,29 @@
+# MacWhisper
+
+[MacWhisper](https://goodsnooze.gumroad.com/l/macwhisper) lets you run Whisper locally on your Mac without having to install anything else.
+
+## Overview
+
+You can use MacWhisper with Hugging Face Inference Providers to access a wider range of models and take advantage of zero-markup pricing.
+
+### How can I use MacWhisper with Hugging Face Inference Providers?
+
+MacWhisper lets you set up AI services that work with the outputs of your transcriptions. For example, you can set up a prompt to clean up dictations or translate transcriptions into another language.
+
+You can use Hugging Face Inference Providers as the backend for these AI services, giving you access to open models from a range of providers.
+
+## Prerequisites
+
+- MacWhisper installed ([installation guide](https://goodsnooze.gumroad.com/l/macwhisper))
+- A Hugging Face account with an [API token](https://huggingface.co/settings/tokens/new?ownUserPermissions=inference.serverless.write&tokenType=fineGrained) (needs "Make calls to Inference Providers" permission)
+
+## Configuration
+
+1. Create a Hugging Face token with Inference Providers permissions at [huggingface.co/settings/tokens](https://huggingface.co/settings/tokens/new?ownUserPermissions=inference.serverless.write&tokenType=fineGrained).
+2. Open MacWhisper and go to **Settings > AI > Service**.
+3. Select **Hugging Face Inference Providers** as the service.
+4. Enter your Hugging Face API token in the provided field.
+5. Add the model ID for the model you want to use.
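Model IDs on the Hub follow an `owner/name` pattern, and Inference Providers additionally accepts an optional `:provider` suffix to pin a specific provider (otherwise one is selected automatically). To make step 5 concrete, here is a tiny sketch of that shape — the validator is a hypothetical helper for illustration only, not part of MacWhisper:

```python
import re

# Hypothetical helper: checks the shape of a model ID as accepted by
# Inference Providers, e.g. "meta-llama/Llama-3.3-70B-Instruct" or
# "deepseek-ai/DeepSeek-V3:fireworks-ai" (provider pinned explicitly).
MODEL_ID_RE = re.compile(r"^[\w.-]+/[\w.-]+(:[\w-]+)?$")


def looks_like_model_id(model_id: str) -> bool:
    """Return True if the string matches the owner/name[:provider] shape."""
    return MODEL_ID_RE.fullmatch(model_id) is not None


print(looks_like_model_id("meta-llama/Llama-3.3-70B-Instruct"))  # True
print(looks_like_model_id("not-a-model-id"))                     # False
```

You can copy the ID from any model page on the Hub; the shape check above is only meant to show what a well-formed ID looks like.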
+
diff --git a/docs/inference-providers/integrations/open-code.md b/docs/inference-providers/integrations/open-code.md
new file mode 100644
index 000000000..913871996
--- /dev/null
+++ b/docs/inference-providers/integrations/open-code.md
@@ -0,0 +1,66 @@
+# OpenCode
+
+
+[OpenCode](https://opencode.ai/) is an AI coding agent built for the terminal that helps with code review, refactoring, testing, and general development tasks.
+
+## Overview
+
+OpenCode natively supports Hugging Face Inference Providers, giving you access to open models from 17+ providers through a single interface.
+
+## Prerequisites
+
+- OpenCode installed ([installation guide](https://opencode.ai/docs))
+- A Hugging Face account with an [API token](https://huggingface.co/settings/tokens/new?ownUserPermissions=inference.serverless.write&tokenType=fineGrained) (needs "Make calls to Inference Providers" permission)
+
+## Configuration
+
+### Quick Setup
+
+1. Create a Hugging Face token with Inference Providers permissions at [huggingface.co/settings/tokens](https://huggingface.co/settings/tokens/new?ownUserPermissions=inference.serverless.write&tokenType=fineGrained)
+
+2. Run `opencode auth login` and select **Hugging Face**:
+
+```bash
+$ opencode auth login
+
+┌ Add credential
+│
+◆ Select provider
+│ ● Hugging Face
+│ ...
+└
+```
+
+3. Enter your Hugging Face token when prompted:
+
+```bash
+┌ Add credential
+│
+◇ Select provider
+│ Hugging Face
+│
+◇ Enter your API key
+│ hf_...
+└
+```
+
+4. Run the `/models` command in OpenCode to select a model.
+
+Once configured, OpenCode uses your selected model for all operations. You can switch models at any time with the `/models` command in the OpenCode TUI (terminal user interface).
+
+## GitHub Actions Integration
+
+You can also use OpenCode to run open models in GitHub Actions via Inference Providers. See our [GitHub Actions guide](../guides/github-actions-code-review) for setting up automated PR reviews.
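As a rough sketch of what such a workflow can look like (the workflow name, step names, and the `HF_TOKEN` secret name are assumptions for illustration; the install command comes from the OpenCode docs, and the linked guide is the authoritative reference):

```yaml
name: PR review
on:
  pull_request:

jobs:
  review:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Install OpenCode
        run: curl -fsSL https://opencode.ai/install | bash
      - name: Review the changes
        env:
          # A token with "Make calls to Inference Providers" permission,
          # stored as a repository secret.
          HF_TOKEN: ${{ secrets.HF_TOKEN }}
        run: opencode run "Review the changes in this pull request"
```

In CI there is no interactive `/models` step, so the model is picked up from your OpenCode configuration; see the guide for the full setup, including posting results back to the PR.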
+
+## Resources
+
+- [OpenCode Documentation](https://opencode.ai/docs)
+- [OpenCode Provider Configuration](https://opencode.ai/docs/providers/#hugging-face)
+- [GitHub Actions Integration Guide](../guides/github-actions-code-review)
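For reproducible setups (e.g. dotfiles or CI), OpenCode can also read settings from an `opencode.json` config file instead of the interactive `opencode auth login` flow. The fragment below is a sketch under assumptions about the schema (the `$schema` URL and the `provider/model` form of the `model` field); check the provider configuration link above for the supported fields:

```json
{
  "$schema": "https://opencode.ai/config.json",
  "model": "huggingface/moonshotai/Kimi-K2-Instruct"
}
```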