Provide API Endpoint for Model Pricing #2074

@rubinjoshua

Description

Confirm this is a feature request for the Python library and not the underlying OpenAI API.

  • This is a feature request for the Python library

Describe the feature or improvement you're requesting

Issue

Currently, OpenAI does not offer a way to retrieve model pricing dynamically through the API. Developers must hardcode pricing information and manually update their configurations whenever OpenAI changes its prices. This approach is error-prone and impractical; most comparable modern API services expose pricing through an endpoint.
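
As an illustration of the status quo, a cost estimate today has to rely on a hand-maintained price table along the lines of the sketch below (the model names and per-1K rates shown are examples, not an authoritative list):

# Illustration only: a hand-maintained pricing table that must be edited
# by hand whenever OpenAI changes its published prices.
PRICING_PER_1K_TOKENS = {
    "gpt-4-1106-preview": {"input": 0.01, "output": 0.03},
    "gpt-3.5-turbo": {"input": 0.0015, "output": 0.002},
}

def estimate_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Estimate the cost of one request from the hardcoded table above."""
    rates = PRICING_PER_1K_TOKENS[model]
    return (input_tokens / 1000) * rates["input"] + (output_tokens / 1000) * rates["output"]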

Proposed Solution

Please provide an official API endpoint that returns current model pricing, including:
• Per-token costs for input and output
• Pricing per model (GPT-4, GPT-3.5, etc.)
• Any upcoming changes (if applicable)

Why This Matters
• Prevents manual updates and errors in cost calculations
• Ensures accurate cost tracking for dynamic workloads
• Helps enterprise users avoid unexpected cost changes
• Reduces reliance on scraping OpenAI’s website

Expected Format (Example Response)

A simple JSON response like:

{
  "gpt-4-1106-turbo": {
    "input_cost_per_1k_tokens": 0.01,
    "output_cost_per_1k_tokens": 0.03
  },
  "gpt-3.5-turbo": {
    "input_cost_per_1k_tokens": 0.0015,
    "output_cost_per_1k_tokens": 0.002
  }
}

This would let developers dynamically fetch pricing without hardcoding values.
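
For context, here is a minimal client-side sketch of how developers could consume such an endpoint. The /v1/pricing path and the helper functions are hypothetical; the only assumption is the example response format shown above.

# Sketch only: "/v1/pricing" is a hypothetical endpoint and the response
# shape mirrors the example above; neither exists in the API today.
import os

import requests

def fetch_pricing() -> dict:
    """Fetch the current pricing table from the (hypothetical) endpoint."""
    response = requests.get(
        "https://api.openai.com/v1/pricing",  # hypothetical endpoint
        headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
        timeout=10,
    )
    response.raise_for_status()
    return response.json()

def cost_for_request(pricing: dict, model: str, input_tokens: int, output_tokens: int) -> float:
    """Compute the cost of one request from dynamically fetched pricing."""
    rates = pricing[model]
    return (
        (input_tokens / 1000) * rates["input_cost_per_1k_tokens"]
        + (output_tokens / 1000) * rates["output_cost_per_1k_tokens"]
    )

With an endpoint like this, pricing could be fetched at startup or cached with a short TTL instead of being baked into application code.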

Conclusion

A pricing API would be a small but high-impact addition, aligning OpenAI’s developer experience with modern cloud services like AWS and Google Cloud.

Additional context

No response
