
Add “Custom” LLM Provider for Third-Party APIs (e.g., NVIDIA, Groq) #170

@ragultv

Description


BrowserOS Agent currently supports a fixed set of predefined providers (OpenAI, Anthropic, Gemini, Ollama, OpenRouter, LM Studio, etc.).
However, there is no option to add other third-party hosted providers (e.g., NVIDIA NIM, Groq, Together.ai, DeepInfra, Mistral API) that expose OpenAI-compatible APIs.
This feature would introduce a “Custom” provider type in the “Add Provider” modal, allowing users to connect to any external LLM service that uses the /v1/completions or /v1/chat/completions API format.
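For context, “OpenAI-compatible” here means the provider accepts the same request payload at POST {baseUrl}/chat/completions as OpenAI does. A minimal sketch of that payload shape (the model name is illustrative; the helper is not an existing BrowserOS function):

```typescript
// Shape of an OpenAI-compatible chat completion request body.
interface ChatMessage {
  role: "system" | "user" | "assistant"
  content: string
}
interface ChatRequest {
  model: string
  messages: ChatMessage[]
}

// Build the body a custom provider would receive at POST {baseUrl}/chat/completions.
// Hypothetical helper for illustration only.
function buildChatRequest(model: string, prompt: string): ChatRequest {
  return { model, messages: [{ role: "user", content: prompt }] }
}
```

Because Groq, NVIDIA NIM, Together.ai, and similar services all accept this shape, a single “Custom” provider type can cover them by swapping only the base URL and API key.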

In regions like India and other developing countries, many students, independent developers, and researchers cannot afford paid APIs from providers like OpenAI, Anthropic (Claude), or Google Gemini.

However, several newer providers, such as NVIDIA NIM, Groq Cloud, Together.ai, and Mistral, now offer free or low-cost OpenAI-compatible APIs.
By allowing users to add custom providers, BrowserOS would empower these users to access powerful LLMs without financial barriers.

This feature directly supports BrowserOS’s open-source mission by:

  1. Making AI development tools more accessible globally
  2. Encouraging educational and community adoption
  3. Enabling users to experiment, learn, and build even without paid API keys

“Custom providers bridge the affordability gap — making BrowserOS truly open and useful for everyone.”

Steps to Reproduce:

  1. Go to Settings → BrowserOS AI → Add custom provider
  2. Open the “Provider Type” dropdown
  3. Only the built-in list appears (OpenAI, Anthropic, Gemini, Ollama, OpenRouter, LM Studio)
  4. There’s no way to add a new hosted API provider like Groq or NVIDIA.

Expected Behavior:

When adding a new provider:

  • Provider Type includes: Custom
  • Selecting Custom shows:
      • Provider Name (e.g., “Groq Cloud”)
      • Base URL (e.g., https://api.groq.com/openai/v1)
      • API Key (required)
  • Once an API key is entered, a Fetch Models action requests /models and dynamically populates the Model ID dropdown
  • The provider is saved in the LLM Provider list
  • Users can Test, Edit, or Delete the provider
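The fetch-models step above can be sketched against the OpenAI-compatible GET /models response shape. Helper names (joinModelsUrl, parseModelIds, fetchModels) are illustrative, not existing BrowserOS APIs:

```typescript
// OpenAI-compatible GET /models response shape.
interface OpenAIModelList {
  data: { id: string }[]
}

// Normalize the user-supplied base URL and append the /models path.
function joinModelsUrl(baseUrl: string): string {
  return baseUrl.replace(/\/+$/, "") + "/models"
}

// Extract model IDs from an OpenAI-compatible /models response body.
function parseModelIds(body: OpenAIModelList): string[] {
  return body.data.map(m => m.id)
}

// Fetch the model list once an API key has been entered,
// e.g. fetchModels("https://api.groq.com/openai/v1", key).
async function fetchModels(baseUrl: string, apiKey: string): Promise<string[]> {
  const res = await fetch(joinModelsUrl(baseUrl), {
    headers: { Authorization: `Bearer ${apiKey}` },
  })
  if (!res.ok) throw new Error(`Model fetch failed: HTTP ${res.status}`)
  return parseModelIds(await res.json() as OpenAIModelList)
}
```

The returned IDs would feed the Model ID dropdown directly, so no provider-specific model list needs to be hard-coded.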

In the Agent UI:

  • The provider dropdown lists all custom providers
  • When a custom provider is selected, its models appear in the model dropdown
  • The model can be changed dynamically without re-saving the provider
  • Fetched model lists are cached locally per provider ID

[ Provider ▼ Groq ] [ Model ▼ llama3-70b ]

[ Prompt Input Box ................................................. ]
[ Run Button ] [ Stop Button ]
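The per-provider caching described above could be as simple as a TTL map keyed by provider ID. A sketch under assumed names and a 10-minute TTL (ModelCache is not an existing BrowserOS class):

```typescript
// Minimal per-provider model-list cache with a time-to-live.
interface CacheEntry {
  models: string[]
  fetchedAt: number  // epoch milliseconds
}

class ModelCache {
  private entries = new Map<string, CacheEntry>()

  // TTL is an assumption; 10 minutes keeps dropdowns fresh without hammering /models.
  constructor(private ttlMs: number = 10 * 60 * 1000) {}

  // Store a freshly fetched model list for a provider.
  put(providerId: string, models: string[], now: number = Date.now()): void {
    this.entries.set(providerId, { models, fetchedAt: now })
  }

  // Return the cached list, or undefined if missing or expired.
  get(providerId: string, now: number = Date.now()): string[] | undefined {
    const e = this.entries.get(providerId)
    if (!e || now - e.fetchedAt > this.ttlMs) return undefined
    return e.models
  }
}
```

Switching models in the Agent UI would then read from this cache, hitting /models again only after the TTL expires or on an explicit refresh.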

Actual Behavior:

  1. The provider dropdown is limited to a fixed vendor list.
  2. New OpenAI-compatible hosted APIs like NVIDIA or Groq cannot be added.
  3. Users must manually modify config or code to experiment with these APIs.
