
added calculate_cost_by_tokens - the case when token count is available #25

Merged
areibman merged 1 commit into AgentOps-AI:main from nikilp:adds-calculate_cost_by_tokens
Feb 6, 2024

Conversation

@nikilp nikilp commented Feb 1, 2024

This covers the case when the response from the LLM already contains the total token count, so it doesn't need to be recalculated from the prompt or the completion.

This is especially useful when the underlying embeddings model is not available for a correct calculation, but the response already carries the counts.
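The idea above can be sketched as a small helper that multiplies a known token count by a per-token price, skipping tokenization entirely. This is a minimal illustrative sketch, not the library's actual implementation: the function name matches the PR title, but the `TOKEN_COSTS` table, the model name, and the per-token prices here are hypothetical placeholders.

```python
# Hypothetical per-token prices in USD; real pricing lives in the library's
# own cost table, not here.
TOKEN_COSTS = {
    "example-model": {"prompt": 0.5e-6, "completion": 1.5e-6},
}


def calculate_cost_by_tokens(num_tokens: int, model: str, token_type: str) -> float:
    """Return the cost for an already-known token count.

    Because the token count comes straight from the LLM response, no
    tokenizer or embeddings model is needed to compute the cost.
    """
    if model not in TOKEN_COSTS:
        raise KeyError(f"Unknown model: {model}")
    if token_type not in ("prompt", "completion"):
        raise ValueError("token_type must be 'prompt' or 'completion'")
    return num_tokens * TOKEN_COSTS[model][token_type]
```

Usage would look like `calculate_cost_by_tokens(response.usage.total_tokens, "example-model", "prompt")`, reading the count directly from the API response's usage field.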

Contributor

@areibman areibman left a comment


Looks good, thanks for the suggestion, @nikilp!

@areibman areibman merged commit cde75dc into AgentOps-AI:main Feb 6, 2024


2 participants