
Cost tracking shows $0.00 for custom providers using @ai-sdk/openai-compatible even when cost fields are configured #24113

@fabienbarbaud

Description


When using a custom provider configured via opencode.json with @ai-sdk/openai-compatible, the UI does not track or display $ Spent. The cost remains at $0.00 regardless of token usage, even when cost fields are explicitly configured per the config schema.
A similar issue was reported on the popular fork: anomalyco/opencode#17223.

Plugins

No response

OpenCode version

1.14.22

Steps to reproduce

  1. Configure a custom provider in opencode.json with @ai-sdk/openai-compatible and cost fields specified:

```json
{
  "provider": {
    "my-proxy": {
      "name": "My LiteLLM Proxy",
      "npm": "@ai-sdk/openai-compatible",
      "options": {
        "baseURL": "https://proxy.example.com/v1",
        "apiKey": "sk-..."
      },
      "models": {
        "my-model": {
          "name": "My Model",
          "cost": {
            "input": 5.5e-07,
            "output": 2.75e-06,
            "cache": {
              "read": 5.5e-08,
              "write": 6.875e-07
            }
          },
          "limit": {
            "context": 200000,
            "output": 64000
          }
        }
      }
    }
  }
}
```

  2. Start a session using the custom provider model.
  3. Send messages and observe the cost display in the status bar.

Expected Behavior

The UI should calculate and display a running cost using either:

  • The cost values defined in the model config combined with the token usage from the API response (usage.prompt_tokens, usage.completion_tokens), OR
  • The cost header returned by the proxy (e.g., x-litellm-response-cost).
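For the first option, the arithmetic is simple. A minimal sketch of what that calculation could look like (the `computeCost` helper and its types are illustrative, not OpenCode's actual internals):

```typescript
// Illustrative only: derive a dollar cost from per-token rates in the
// model config and the usage block of an OpenAI-compatible response.
interface CostConfig {
  input: number;  // $ per input (prompt) token
  output: number; // $ per output (completion) token
}

interface Usage {
  prompt_tokens: number;
  completion_tokens: number;
}

function computeCost(cost: CostConfig, usage: Usage): number {
  return usage.prompt_tokens * cost.input +
         usage.completion_tokens * cost.output;
}

// Rates and usage taken from this report:
// 10 * 5.5e-7 + 52 * 2.75e-6 = 0.0001485
const spent = computeCost(
  { input: 5.5e-7, output: 2.75e-6 },
  { prompt_tokens: 10, completion_tokens: 52 },
);
```

With the `cache` rates applied to cached prompt tokens in the same way, the config plus the API response already contain everything needed for a running total.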

Actual Behavior

  • $ Spent remains at $0.00 for the entire session.
  • The cost config fields on custom provider models have no observable effect.
  • The upstream API (LiteLLM proxy) correctly returns token usage in the response body:

```
usage: {
    prompt_tokens: 10,
    completion_tokens: 52,
    total_tokens: 62
  }
```
  • The upstream API also returns a cost header: x-litellm-response-cost: 0.0002015
  • OpenCode appears to neither calculate cost from token counts nor read the x-litellm-response-cost header.
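For reference, the second option amounts to parsing a single response header. A hypothetical sketch (the function name and fallback behavior are assumptions, not OpenCode code):

```typescript
// Illustrative only: read LiteLLM's cost header from a response's
// Headers object, returning undefined when absent or malformed.
function costFromHeaders(headers: Headers): number | undefined {
  const raw = headers.get("x-litellm-response-cost");
  if (raw === null) return undefined;
  const value = Number(raw);
  return Number.isFinite(value) ? value : undefined;
}

// Header value observed in this report:
const observed = costFromHeaders(
  new Headers({ "x-litellm-response-cost": "0.0002015" }),
);
```

Note that the header reflects the proxy's own pricing tables, so it may differ from a value computed locally from the config's per-token rates.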

Screenshot and/or share link

No response

Operating System

Ubuntu 24

Terminal

No response
