Hi!
OpenAI added the ability to receive token usage while streaming, via the `stream_options.include_usage` parameter: https://platform.openai.com/docs/api-reference/chat/create#chat-create-stream_options-include_usage.
The current implementation simply ignores these chunks (it hits a `continue;` in the stream loop), since they are not part of the delta content.
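For reference, when `include_usage` is enabled, the final chunk of the stream carries the usage numbers and has an empty `choices` array, which is why the current delta-only loop skips it. It looks roughly like this (values are illustrative):

```json
{
  "id": "chatcmpl-...",
  "object": "chat.completion.chunk",
  "choices": [],
  "usage": {
    "prompt_tokens": 11,
    "completion_tokens": 42,
    "total_tokens": 53
  }
}
```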
I’ve been working on an implementation that parses the usage object and yields it through the generator, but I’m not sure this is the best approach. Alternatively, I could use an event listener, or add a usage property on the StreamResult that becomes accessible once the stream has been consumed (a rough sketch of the latter follows the snippet below).
So far, this is how the PR looks in action:
```php
$options = [
    'stream' => true,
    'stream_options' => [
        'include_usage' => true,
    ],
];

$stream = $agent->call($messages, $options)->getContent();

if (!is_iterable($stream)) {
    throw new \RuntimeException('Invalid result type, expected iterable');
}

foreach ($stream as $chunk) {
    // Regular delta content is still yielded as plain strings
    if (is_string($chunk)) {
        $context->setDelta($chunk);

        continue;
    }

    // The final usage chunk is yielded as a TokenUsage object
    if ($chunk instanceof TokenUsage) {
        $context->addUsedTokens(
            $chunk->totalTokens ?: 0,
            $chunk->cachedTokens ?: 0,
        );
    }
}
```
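For comparison, here is a minimal sketch of the other idea: exposing the usage on the result object after the stream has been consumed. Everything here is hypothetical, the `StreamResult`, `TokenUsage`, and `getUsage()` names are illustrative and not the component's actual API:

```php
/** Hypothetical value object for the usage numbers reported by the API. */
final class TokenUsage
{
    public function __construct(
        public readonly int $totalTokens = 0,
        public readonly int $cachedTokens = 0,
    ) {
    }
}

/**
 * Hypothetical stream result that captures the usage chunk
 * once the generator has been fully consumed.
 */
final class StreamResult
{
    private ?TokenUsage $usage = null;

    public function __construct(private readonly \Generator $generator)
    {
    }

    /** Yields only the string deltas; the TokenUsage chunk is kept back. */
    public function getContent(): \Generator
    {
        foreach ($this->generator as $chunk) {
            if ($chunk instanceof TokenUsage) {
                $this->usage = $chunk;

                continue;
            }

            yield $chunk;
        }
    }

    /** Available once getContent() has been fully iterated, null before. */
    public function getUsage(): ?TokenUsage
    {
        return $this->usage;
    }
}

// Usage: iterate the deltas first, read the usage afterwards.
$result = new StreamResult((function (): \Generator {
    yield 'Hello ';
    yield 'world';
    yield new TokenUsage(totalTokens: 53);
})());

foreach ($result->getContent() as $delta) {
    echo $delta;
}

$usage = $result->getUsage();
```

The upside of this variant is that consumers that only care about the text don't need the `instanceof` check; the downside is that the usage is only reliable after the generator has been fully drained.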
I'd be glad to hear your opinion on these approaches, or any other alternatives.
Thanks!