
Conversation

@sauravpanda (Member) commented May 28, 2024

PR Description

This pull request adds support for custom token pricing by introducing changes to the configuration and provider files.

Changes Made

  • In the config.json file, the language model's model attribute has been replaced with a nested object containing name, input_token_cost, and output_token_cost properties (a config sketch follows this list).
  • In the provider.py file, default input and output token costs have been added to the LLMProvider class, and the chat_completion method has been updated to use the new token cost values (see the provider sketch after the summary below).
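
A minimal sketch of the new config.json shape. Only the name, input_token_cost, and output_token_cost fields are named in this PR; the surrounding key layout and the values shown are assumptions for illustration:

```json
{
  "language_model": {
    "model": {
      "name": "gpt-4o",
      "input_token_cost": 0.000005,
      "output_token_cost": 0.000015
    }
  }
}
```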

These modifications enable the customization of token pricing for the language model, providing more flexibility and control over the cost associated with input and output tokens.
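
A hedged sketch of what the provider.py changes might look like. LLMProvider, chat_completion, and the cost fields come from the description above; the default values, the config plumbing, and the litellm call (suggested by the linked issue) are assumptions, not code from this PR:

```python
import litellm

# Assumed default per-token costs (USD); values are illustrative.
DEFAULT_INPUT_TOKEN_COST = 0.000005
DEFAULT_OUTPUT_TOKEN_COST = 0.000015


class LLMProvider:
    def __init__(self, config: dict):
        model_cfg = config["language_model"]["model"]
        self.model_name = model_cfg["name"]
        # Fall back to the defaults when the config omits a cost.
        self.input_token_cost = model_cfg.get("input_token_cost", DEFAULT_INPUT_TOKEN_COST)
        self.output_token_cost = model_cfg.get("output_token_cost", DEFAULT_OUTPUT_TOKEN_COST)

    def chat_completion(self, messages: list[dict]) -> tuple[str, float]:
        response = litellm.completion(model=self.model_name, messages=messages)
        usage = response.usage
        # Price the call with the configured per-token costs.
        cost = (usage.prompt_tokens * self.input_token_cost
                + usage.completion_tokens * self.output_token_cost)
        return response.choices[0].message.content, cost
```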

✨ Generated with love by Kaizen ❤️

Original Description: None

@sauravpanda linked an issue May 28, 2024 that may be closed by this pull request: pass model -> cost via config to be used by litellm

kaizen-bot (Contributor) commented May 28, 2024

Code Review

Code Quality

[moderate] -> Using constants can improve code readability and maintainability. kaizen/llms/provider.py | 6 - 14

Potential Issues

[important] -> Missing error handling can lead to unexpected runtime errors. kaizen/llms/provider.py | 33 - 49

Improvements

[important] -> Input validation can prevent unexpected behavior due to invalid input. kaizen/llms/provider.py | 33 - 49
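
A sketch of how the three findings above (named constants, error handling, input validation) might be applied to chat_completion; the exception type and method body are assumptions, not code from this PR:

```python
import litellm


class LLMProvider:
    # ... __init__ as sketched in the PR description above ...

    def chat_completion(self, messages: list[dict]) -> tuple[str, float]:
        # Input validation: reject malformed message lists up front.
        if not messages or not all("role" in m and "content" in m for m in messages):
            raise ValueError("messages must be a non-empty list of {role, content} dicts")
        try:
            response = litellm.completion(model=self.model_name, messages=messages)
        except litellm.exceptions.APIError as exc:
            # Error handling: wrap provider failures with useful context.
            raise RuntimeError(f"chat completion failed for model {self.model_name!r}") from exc
        usage = response.usage
        # Per-token costs are backed by named module-level constants
        # (DEFAULT_INPUT_TOKEN_COST / DEFAULT_OUTPUT_TOKEN_COST) rather than magic numbers.
        cost = (usage.prompt_tokens * self.input_token_cost
                + usage.completion_tokens * self.output_token_cost)
        return response.choices[0].message.content, cost
```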

✨ Generated with love by Kaizen ❤️

@sauravpanda merged commit d7ae240 into main May 28, 2024

