18 commits
- `c2645e7` feat/integrations: add initial documentation for available integrations (davanstrien, Nov 3, 2025)
- `91e4a4f` feat(open-code): add initial documentation for OpenCode integration w… (davanstrien, Nov 3, 2025)
- `fbcc4cf` fix(open-code): clarify GitHub Actions integration usage in documenta… (davanstrien, Nov 3, 2025)
- `e89f905` feat(integrations): add Inference Provider Integrations section to do… (davanstrien, Nov 3, 2025)
- `794cece` feat(integrations): add OpenCode section to Inference Provider Integr… (davanstrien, Nov 3, 2025)
- `55ce637` fix(integrations): correct spelling and formatting in Integrations se… (davanstrien, Nov 3, 2025)
- `ed4587f` try different organization (davanstrien, Nov 3, 2025)
- `21e1e21` temp example intergration page (davanstrien, Nov 3, 2025)
- `1113a5d` fix(integrations): update OpenCode logo URLs to use absolute paths (davanstrien, Nov 3, 2025)
- `7c14a24` feat(integrations): add MacWhisper integration documentation (davanstrien, Nov 3, 2025)
- `7175c29` fix(integrations): comment out OpenCode logo HTML for clarity (davanstrien, Nov 3, 2025)
- `c7afa96` refactor(integrations): enhance overview and structure of integration… (davanstrien, Nov 3, 2025)
- `5c86e6f` typo (davanstrien, Nov 4, 2025)
- `1d96831` feat(integrations): add initial integrations overview documentation (davanstrien, Nov 4, 2025)
- `65a509a` simple index version (davanstrien, Nov 6, 2025)
- `44513d4` feat(integrations): add simple version of integrations overview (davanstrien, Nov 6, 2025)
- `9c79d19` feat(integrations): update integrations section and remove outdated d… (davanstrien, Nov 6, 2025)
- `45caf3d` tidy poc (davanstrien, Nov 6, 2025)
13 changes: 13 additions & 0 deletions docs/inference-providers/_toctree.yml
@@ -28,6 +28,19 @@
    - local: guides/vscode
      title: VS Code with GitHub Copilot

- title: Integrations
  sections:
    - local: integrations/index-simple
      title: Integrations Overview (Simple)
    - local: integrations/index
      title: Integrations Overview (Full)
    - local: integrations/adding-integration
      title: Add Your Integration
    - local: integrations/open-code
      title: OpenCode
    - local: integrations/macwhisper
      title: MacWhisper

- local: tasks/index
  title: Inference Tasks
  sections:
59 changes: 59 additions & 0 deletions docs/inference-providers/integrations/adding-integration.md
@@ -0,0 +1,59 @@
# Add Your Integration

Building a tool that works with Hugging Face Inference Providers? We'd love to feature it in our integrations directory!

## Requirements

To be listed, your integration should:

- ✅ **Work with HF Inference Providers** via our API or OpenAI-compatible endpoints
- ✅ **Be actively maintained** with recent commits or releases
- ✅ **Have clear documentation** showing how to connect to HF

## How to Submit

1. **Test your integration** with Hugging Face Inference Providers
2. **Fork the repository** at [github.com/huggingface/hub-docs](https://github.com/huggingface/hub-docs)
3. **Add your integration page** in `docs/inference-providers/integrations/`
4. **Update the index** in `docs/inference-providers/integrations/index.md`
5. **Submit a Pull Request** with your changes

## Integration Page Template

Create a file named `your-tool-name.md` with this structure:

```markdown
# Your Tool Name

Brief description of what your tool does.

## Overview

How your tool integrates with Hugging Face Inference Providers.

## Prerequisites

- Your tool installed
- HF account with [API token](https://huggingface.co/settings/tokens)

## Configuration

Step-by-step setup instructions with code examples.

## Resources

- [Your Tool Documentation](https://yourtool.com/docs)
- [HF Integration Guide](link-to-your-guide)
```

## Updating the Index

Add your tool to the table in `integrations/index.md`:

```markdown
| [Your Tool](./your-tool) | Brief description | [Documentation](https://yourtool.com/docs) |
```

## Questions?

Need help with your integration? Visit the [Hugging Face Forums](https://discuss.huggingface.co/) or open an issue in the [hub-docs repository](https://github.com/huggingface/hub-docs/issues).
56 changes: 56 additions & 0 deletions docs/inference-providers/integrations/index-simple.md
@@ -0,0 +1,56 @@
# Integrations

Connect your favorite tools with Hugging Face Inference Providers.

## Featured Integrations

These integrations have detailed guides to help you get started:

- **[OpenCode](./open-code)** - AI coding agent for your terminal
- **[MacWhisper](./macwhisper)** - Transcribe audio on macOS with Whisper

## All Integrations

### Development Tools

- [Continue](https://continue.dev/docs/reference/model-providers/huggingface) - AI code assistant for IDEs
- [Cursor](https://cursor.sh/docs) - AI-first code editor
- [Codeium](https://codeium.com/docs) - Free code completion
- [Roo Code](https://docs.roocode.com/providers/huggingface) - Enterprise code generation

### Observability

- [Langfuse](https://langfuse.com/docs/integrations/huggingface) - LLM observability platform
- [UK AISI Inspect](https://inspect.aisi.org.uk/docs) - AI safety evaluation

### Frameworks

- [LangChain](https://python.langchain.com/docs/integrations/platforms/huggingface) - LLM application framework
- [Haystack](https://docs.haystack.deepset.ai/docs/huggingfaceapichatgenerator) - Open-source LLM framework
- [LlamaIndex](https://docs.llamaindex.ai/en/stable/examples/llm/huggingface/) - Data framework for LLMs
- [CrewAI](https://docs.crewai.com/core-concepts/LLMs/) - Multi-agent orchestration
- [AutoGen](https://microsoft.github.io/autogen/docs/Use-Cases/agent_chat/) - Multi-agent conversations

### Applications

- [Open WebUI](https://docs.openwebui.com/getting-started/) - Self-hosted LLM interface
- [TypingMind](https://docs.typingmind.com/) - Enhanced ChatGPT UI

### API Clients

- [OpenAI SDK](https://github.com/openai/openai-python) - Works with our OpenAI-compatible endpoints
- [LiteLLM](https://docs.litellm.ai/docs/providers/huggingface) - Unified LLM interface
- [Portkey](https://docs.portkey.ai/) - AI gateway with advanced features

## OpenAI-Compatible Endpoints

Any tool that supports the OpenAI API can work with Hugging Face Inference Providers by pointing it at our OpenAI-compatible endpoint:

```python
# OpenAI-compatible endpoint for Hugging Face Inference Providers
base_url = "https://router.huggingface.co/v1"
api_key = "hf_YOUR_TOKEN"  # your Hugging Face token
```
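
For example, here is a minimal sketch using the official OpenAI Python SDK pointed at that endpoint. The model ID is illustrative only; substitute any chat model available through Inference Providers:

```python
from openai import OpenAI

# Standard OpenAI client, pointed at the Inference Providers router
client = OpenAI(
    base_url="https://router.huggingface.co/v1",
    api_key="hf_YOUR_TOKEN",  # HF token with Inference Providers permission
)

# Illustrative model ID; any chat model served by Inference Providers works
response = client.chat.completions.create(
    model="meta-llama/Llama-3.1-8B-Instruct",
    messages=[{"role": "user", "content": "Say hello from Inference Providers!"}],
)
print(response.choices[0].message.content)
```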

## Add Your Integration

Building something? [Let us know](./adding-integration) and we'll add it to the list.
23 changes: 23 additions & 0 deletions docs/inference-providers/integrations/index.md
@@ -0,0 +1,23 @@
# Integrations Overview

Hugging Face Inference Providers works with a growing ecosystem of developer tools, frameworks, and platforms. These integrations let you use state-of-the-art models in your existing workflows and development environments.

## Why Use Integrations?

- **Keep your existing tools**: Use Inference Providers with tools you already know
- **Access 17+ providers**: Switch between providers without changing your code
- **Zero markup pricing**: Get the same rates as going direct to providers
- **Single API token**: One HF token for all providers and models
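
As a quick sketch of what this looks like in practice (the provider name and model ID below are illustrative assumptions, not recommendations), switching providers with the `huggingface_hub` client is a single-argument change:

```python
from huggingface_hub import InferenceClient

# One HF token works across providers; change only the provider argument
# (provider and model below are illustrative examples)
client = InferenceClient(provider="together", api_key="hf_YOUR_TOKEN")

response = client.chat_completion(
    model="meta-llama/Llama-3.1-8B-Instruct",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```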

## Available Integrations

| Integration | Description | Official Documentation |
| --------------------------------- | -------------------------------------------------------------------- | ----------------------------------------------------------- |
| [Haystack](./haystack) | Open-source LLM framework for building production-ready applications | [Documentation](https://docs.haystack.deepset.ai/) |
| [Langfuse](./langfuse) | Open-source LLM engineering platform for observability | [Documentation](https://langfuse.com/docs) |
| [MacWhisper](./macwhisper) | Speech-to-text application for macOS | [Product Page](https://goodsnooze.gumroad.com/l/macwhisper) |
| [OpenCode](./open-code) | AI coding agent built for the terminal | [Documentation](https://opencode.ai/docs) |
| [Roo Code](./roo-code) | AI-powered code generation and refactoring | [Documentation](https://docs.roocode.com/) |
| [UK AISI Inspect](./aisi-inspect) | AI safety evaluation framework | [Documentation](https://inspect.aisi.org.uk/) |

More integrations coming soon! Want to add yours? See [how to add your integration](./adding-integration).
29 changes: 29 additions & 0 deletions docs/inference-providers/integrations/macwhisper.md
@@ -0,0 +1,29 @@
# MacWhisper

[MacWhisper](https://goodsnooze.gumroad.com/l/macwhisper) lets you run Whisper locally on your Mac without having to install anything else.

## Overview

You can use MacWhisper with Hugging Face Inference Providers to access a wider range of models and take advantage of zero-markup pricing.

### How can I use MacWhisper with Hugging Face Inference Providers?

MacWhisper lets you set up AI services that work with the output of your transcriptions. For example, you can set up a prompt to clean up dictations or translate transcriptions into another language.

You can use Hugging Face Inference Providers as the backend for these AI services, giving you access to open models from a range of providers.

## Prerequisites

- MacWhisper installed ([installation guide](https://goodsnooze.gumroad.com/l/macwhisper))
- A Hugging Face account with an [API token](https://huggingface.co/settings/tokens/new?ownUserPermissions=inference.serverless.write&tokenType=fineGrained) (needs "Make calls to Inference Providers" permission)

## Configuration

1. Create a Hugging Face token with Inference Providers permissions at [huggingface.co/settings/tokens](https://huggingface.co/settings/tokens/new?ownUserPermissions=inference.serverless.write&tokenType=fineGrained)
2. Open MacWhisper and go to **Settings > AI > Service**.
3. Select **Hugging Face Inference Providers** as the service.
4. Enter your Hugging Face API token in the provided field.
5. Add the ID of the model you want to use.

Tip
66 changes: 66 additions & 0 deletions docs/inference-providers/integrations/open-code.md
@@ -0,0 +1,66 @@
# OpenCode

<!--
<div class="flex justify-center">
<a href="https://opencode.ai/" target="_blank">
<img class="block dark:hidden" src="https://opencode.ai/_build/assets/preview-opencode-wordmark-light-nzmKQT2r.png" alt="OpenCode">
<img class="hidden dark:block" src="https://opencode.ai/_build/assets/preview-opencode-wordmark-dark-tZ1Y3VXe.png" alt="OpenCode"/>
</a>
</div> -->

[OpenCode](https://opencode.ai/) is an AI coding agent built for the terminal that helps with code review, refactoring, testing, and general development tasks.

## Overview

OpenCode natively supports Hugging Face Inference Providers, giving you access to open models from 17+ providers through a single interface.

## Prerequisites

- OpenCode installed ([installation guide](https://opencode.ai/docs))
- A Hugging Face account with an [API token](https://huggingface.co/settings/tokens/new?ownUserPermissions=inference.serverless.write&tokenType=fineGrained) (needs "Make calls to Inference Providers" permission)

## Configuration

### Quick Setup

1. Create a Hugging Face token with Inference Providers permissions at [huggingface.co/settings/tokens](https://huggingface.co/settings/tokens/new?ownUserPermissions=inference.serverless.write&tokenType=fineGrained)

2. Run `opencode auth login` and select **Hugging Face**:

```bash
$ opencode auth login

┌ Add credential
◆ Select provider
│ ● Hugging Face
│ ...
```

3. Enter your Hugging Face token when prompted:

```bash
┌ Add credential
◇ Select provider
│ Hugging Face
◇ Enter your API key
│ hf_...
```

4. Run the `/models` command in OpenCode to select a model.

Once configured, OpenCode will use your selected model for all operations. You can switch models anytime using the `/models` command in the OpenCode TUI (Terminal User Interface).

## GitHub Actions Integration

OpenCode can also be used to run open models in GitHub Actions via Inference Providers. See our [GitHub Actions guide](../guides/github-actions-code-review) for setting up automated PR reviews.

## Resources

- [OpenCode Documentation](https://opencode.ai/docs)
- [OpenCode Provider Configuration](https://opencode.ai/docs/providers/#hugging-face)
- [GitHub Actions Integration Guide](../guides/github-actions-code-review)