Instantly

A unified interface for Hugging Face Inference Providers and Google AI with OpenAI API compatibility.

Installation

pip install instantly

Configuration

Set your API keys as environment variables:

export HF_TOKEN=your_huggingface_token
export GEMINI_API_KEY=your_google_ai_token

Or use a .env file:

HF_TOKEN=your_huggingface_token
GEMINI_API_KEY=your_google_ai_token
OPENAI_BASE_URL=https://router.huggingface.co/v1
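
If the keys live in a .env file, load them before constructing a client. A minimal sketch assuming python-dotenv is installed (it is not necessarily a dependency of instantly):

import os
from dotenv import load_dotenv
from instantly import OpenAIClient

# Read HF_TOKEN, GEMINI_API_KEY, and OPENAI_BASE_URL from a local .env file
load_dotenv()
client = OpenAIClient(api_key=os.environ["HF_TOKEN"])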

Usage

OpenAI-Compatible Interface

from instantly import OpenAIClient

# Chat completion via the OpenAI-compatible interface
client = OpenAIClient(api_key="your_huggingface_token")
response = client.chat_completion(
    model="moonshotai/Kimi-K2-Instruct",
    messages=[{"role": "user", "content": "Hello"}]
)
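
The return type isn't documented in this README; assuming an OpenAI-style response object, the generated text could be read as in the sketch below:

# Assuming an OpenAI-compatible response shape (choices/message/content)
print(response.choices[0].message.content)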

Hugging Face Direct Interface

from instantly import InferenceClient

# Generate an image from a text prompt
client = InferenceClient(api_key="your_huggingface_token")
image = client.text_to_image(
    prompt="A landscape",
    model="black-forest-labs/FLUX.1-dev"
)
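
If text_to_image returns a PIL image, as the underlying Hugging Face inference client does, it can be saved directly. The return type is an assumption here:

# Assuming a PIL.Image return value
image.save("landscape.png")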

Google AI Interface

from instantly import GoogleAIClient

# Generate content with a Gemini model
client = GoogleAIClient(api_key="your_gemini_key")
response = client.generate_content(
    model="gemini-2.5-flash-image-preview",
    prompt="Hello, how are you?"
)
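
Assuming the response mirrors the Google AI SDK and exposes generated text through a .text attribute (not confirmed by this README):

# Assuming a .text attribute as in the Google AI SDK
print(response.text)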

Search Tools

from instantly import DuckDuckGoSearchTool, WebSearchTool, VisitWebpageTool

# DuckDuckGo search
search = DuckDuckGoSearchTool(max_results=5)
results = search("Hugging Face")

# Web search with configurable engine
web_search = WebSearchTool(max_results=10, engine="duckduckgo")
results = web_search("Machine Learning")

# Visit and process webpage
webpage = VisitWebpageTool()
content = webpage("https://example.com")
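
The search tools compose with the chat interface, for example to ground a prompt in fresh results. A sketch using only the classes shown above; the prompt format is illustrative:

from instantly import DuckDuckGoSearchTool, OpenAIClient

search = DuckDuckGoSearchTool(max_results=3)
client = OpenAIClient(api_key="your_huggingface_token")

# Pass the search results to the model as context (illustrative pattern)
results = search("Hugging Face Inference Providers")
response = client.chat_completion(
    model="moonshotai/Kimi-K2-Instruct",
    messages=[{
        "role": "user",
        "content": f"Summarize these search results:\n{results}"
    }]
)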

Development

  1. Clone the repository:
     git clone https://github.com/yourusername/instantly.git
     cd instantly
  2. Install development dependencies:
     pip install -e ".[dev]"
  3. Run tests:
     python -m pytest tests/

License

MIT License
