Berri AI (@BerriAI)

The fastest way to take your LLM app to production

Pinned

  1. reliableGPT Public

    Get 100% uptime and reliability from OpenAI by handling rate-limit, timeout, API, and key errors.

    Python · 645 stars · 46 forks
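As a rough illustration of the idea (retry transient OpenAI errors, then fall back to another model), here is a hypothetical sketch. The `call_with_fallbacks` helper, the model list, and the use of the legacy openai 0.x module API are assumptions for illustration, not reliableGPT's actual interface:

```python
import time
import openai  # assumes the legacy openai<1.0 module-level API; purely illustrative


def call_with_fallbacks(messages, models=("gpt-4", "gpt-3.5-turbo"), retries=3):
    """Hypothetical helper showing the retry-then-fallback pattern."""
    last_error = None
    for model in models:                    # fall back to the next model if one keeps failing
        for attempt in range(retries):      # retry transient errors (rate limits, timeouts)
            try:
                return openai.ChatCompletion.create(model=model, messages=messages)
            except Exception as err:        # rate limit, timeout, auth/key errors, etc.
                last_error = err
                time.sleep(2 ** attempt)    # exponential backoff before the next attempt
    raise last_error
```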

Repositories

A selection of the organization's 45 repositories:
  • litellm Public

    Python SDK and Proxy Server (LLM Gateway) to call 100+ LLM APIs in the OpenAI format [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, SageMaker, HuggingFace, Replicate, Groq]

    Python · 18,706 stars · 2,319 forks · 1,054 open issues (30 need help) · 322 open pull requests · Updated Mar 12, 2025
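To illustrate the "OpenAI format" claim, a minimal sketch using litellm's `completion` function; the model names are examples, and provider API keys are expected in environment variables (e.g. OPENAI_API_KEY, ANTHROPIC_API_KEY):

```python
from litellm import completion

messages = [{"role": "user", "content": "Hello, how are you?"}]

# Same call shape for different providers; litellm translates the request
# to each provider's native API and returns an OpenAI-style response object.
openai_response = completion(model="gpt-4o", messages=messages)
claude_response = completion(model="anthropic/claude-3-5-sonnet-20240620", messages=messages)

print(openai_response.choices[0].message.content)
print(claude_response.choices[0].message.content)
```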
  • example_anthropic_endpoint Public

    An example Anthropic API endpoint

    Python · 3 stars · MIT license · 0 forks · Updated Mar 1, 2025
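A mock endpoint of this kind might look roughly like the FastAPI sketch below. This is an assumption about the repo's purpose rather than its actual code; only the response fields follow the Anthropic Messages API shape:

```python
from fastapi import FastAPI, Request

app = FastAPI()


@app.post("/v1/messages")
async def messages(request: Request):
    body = await request.json()
    # Return a fixed response in the Anthropic Messages API shape.
    return {
        "id": "msg_mock_123",
        "type": "message",
        "role": "assistant",
        "model": body.get("model", "claude-3-5-sonnet-20240620"),
        "content": [{"type": "text", "text": "This is a mock response."}],
        "stop_reason": "end_turn",
        "usage": {"input_tokens": 10, "output_tokens": 8},
    }
```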
  • example_openai_endpoint Public

    An example OpenAI /chat/completions endpoint

    Python · 9 stars · 5 forks · 0 open issues · 1 open pull request · Updated Feb 14, 2025
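Likewise, a minimal sketch of a mock /chat/completions endpoint, assuming a FastAPI implementation (the repo's actual code may differ); the response follows the OpenAI chat completion shape:

```python
import time
from fastapi import FastAPI, Request

app = FastAPI()


@app.post("/chat/completions")
async def chat_completions(request: Request):
    body = await request.json()
    # Return a fixed response in the OpenAI chat completion shape.
    return {
        "id": "chatcmpl-mock-123",
        "object": "chat.completion",
        "created": int(time.time()),
        "model": body.get("model", "gpt-3.5-turbo"),
        "choices": [
            {
                "index": 0,
                "message": {"role": "assistant", "content": "This is a mock response."},
                "finish_reason": "stop",
            }
        ],
        "usage": {"prompt_tokens": 10, "completion_tokens": 8, "total_tokens": 18},
    }
```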
  • provider-litellm-http Public (forked from crossplane-contrib/provider-http)

    A Crossplane provider for making LiteLLM HTTP requests as managed resources.

    Go · 0 stars · Apache-2.0 license · 16 forks · Updated Oct 23, 2024
  • provider-litellm Public

    A Crossplane provider for the LiteLLM Gateway (Proxy).

    Go · 0 stars · Apache-2.0 license · 0 forks · Updated Oct 21, 2024
  • example_litellm_gcp_cloud_run Public

    Example repo for deploying the LiteLLM Proxy (AI Gateway) on GCP Cloud Run.

    Dockerfile · 2 stars · MIT license · 13 forks · Updated Oct 16, 2024

People

This organization has no public members.
