Remove response.usage.completion_tokens, response.usage.total_tokens, response.usage.prompt_tokens from LlmEmbedding and LlmChatCompletionSummary
#2057
Closed
bizob2828 opened this issue
Feb 29, 2024
· 1 comment
· Fixed by #2093
Once the AIM UI has been updated to read token counts from the token_count key (see #2056), we should remove the legacy attributes response.usage.completion_tokens, response.usage.total_tokens, and response.usage.prompt_tokens from openai and langchain. We also want to remove transaction_id from all events, as it will no longer be necessary; events will be queried by trace_id.
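The change amounts to dropping a fixed set of legacy keys from each event's attribute map before it is recorded. A minimal sketch of that filtering step, assuming a plain object of attributes (the function name and shape are illustrative, not the agent's actual API):

```javascript
// Legacy attributes slated for removal per this issue.
const LEGACY_ATTRIBUTES = [
  'response.usage.completion_tokens',
  'response.usage.total_tokens',
  'response.usage.prompt_tokens',
  'transaction_id'
]

// Return a copy of the event attributes with the legacy keys removed,
// keeping the replacements (token_count, trace_id) intact.
function stripLegacyAttributes(attributes) {
  const cleaned = { ...attributes }
  for (const key of LEGACY_ATTRIBUTES) {
    delete cleaned[key]
  }
  return cleaned
}

const event = stripLegacyAttributes({
  token_count: 12,
  trace_id: 'abc123',
  'response.usage.total_tokens': 12,
  transaction_id: 'tx-1'
})
// Only token_count and trace_id remain.
console.log(event)
```

In the actual agent the removal would happen where LlmEmbedding and LlmChatCompletionSummary events are constructed, so the attributes are simply never set rather than filtered afterward.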