
feat(provider): add LiteLLM provider support #22008

Closed

tobias-weiss-ai-xr wants to merge 1 commit into anomalyco:dev from tobias-weiss-ai-xr:feature/litellm-provider

Conversation

@tobias-weiss-ai-xr

Summary

  • Add litellm as a built-in provider option using @ai-sdk/openai-compatible
  • LiteLLM acts as a unified proxy for 100+ LLM providers — this lets users point opencode at a LiteLLM server
  • Configuration: set baseURL to the LiteLLM server address (default http://localhost:4000)
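
As a rough illustration of the summary above, a provider entry in opencode.json might look like the following sketch. The exact schema, the `options` key, and the `{env:LITELLM_API_KEY}` substitution syntax are assumptions not confirmed by this PR; only the `provider.litellm` path, the `baseURL` default, and the `LITELLM_API_KEY` env var are stated in the description.

```json
{
  "provider": {
    "litellm": {
      "options": {
        "baseURL": "http://localhost:4000",
        "apiKey": "{env:LITELLM_API_KEY}"
      }
    }
  }
}
```

Per the commit message, the provider autoloads once `baseURL` is present under `provider.litellm`, so no further registration step should be needed.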

Test plan

  • Configure a LiteLLM provider in opencode config — verify it connects and completes a chat
  • Verify models list correctly from the LiteLLM proxy
  • Verify error handling when LiteLLM server is not running

Add litellm entry to the custom() provider map. Uses the bundled
@ai-sdk/openai-compatible SDK with user-configured baseURL. Supports
LITELLM_API_KEY env var and apiKey in provider options. Autoloads when
baseURL is configured in opencode.json under provider.litellm.
@github-actions
Contributor

This PR does not fully meet our contributing guidelines or follow the PR template.

What needs to be fixed:

  • PR description is missing required template sections. Please use the PR template.

Please edit this PR description to address the above within 2 hours, or it will be automatically closed.

If you believe this was flagged incorrectly, please let a maintainer know.

@github-actions github-actions bot added the needs:compliance label (the issue will auto-close after 2 hours) on Apr 11, 2026
@github-actions
Contributor

The following comment was generated by an LLM and may be inaccurate:

Based on my search, I found three potentially related PRs that may be duplicates or should be reviewed together:

  1. PR #14468: feat(opencode): add LiteLLM provider with auto model discovery

  2. PR #13896: feat(opencode): add auto loading models for litellm providers

  3. PR #8497: fix: handle dangling tool_use blocks for LiteLLM proxy compatibility

These older PRs (#13896, #14468) suggest that LiteLLM provider support may have already been attempted or implemented. You should verify whether PR #22008 is:

  • A reimplementation/redesign of existing work
  • An update to previous attempts
  • Or if those older PRs were closed/abandoned without merging

@github-actions
Contributor

This pull request has been automatically closed because it was not updated to meet our contributing guidelines within the 2-hour window.

Feel free to open a new pull request that follows our guidelines.

@github-actions github-actions bot removed the needs:compliance label on Apr 11, 2026
@github-actions github-actions bot closed this Apr 11, 2026