
Store the last response object from OpenAI #116

Merged — 7 commits merged into theodo-group:main on May 12, 2024

Conversation

ezimuel
Collaborator

@ezimuel commented May 7, 2024

This PR adds the ability to store the last response object from OpenAI, making advanced information such as token usage available, and fixes #109.

Here is an example:

$config = new OpenAIConfig();
$chat = new OpenAIChat($config);

$answer = $chat->generateText('here the question');
$response = $chat->getLastResponse();
printf("Tokens in prompt: %d\n", $response->usage->promptTokens);
printf("Tokens in response: %d\n", $response->usage->completionTokens);
printf("Total tokens: %d\n", $response->usage->totalTokens);

All the JSON properties from the original OpenAI HTTP response are available. Here is an example of an HTTP response (taken from the official OpenAI documentation):

{
    "id": "chatcmpl-123",
    "object": "chat.completion",
    "created": 1677652288,
    "model": "gpt-3.5-turbo-0125",
    "system_fingerprint": "fp_44709d6fcb",
    "choices": [{
      "index": 0,
      "message": {
        "role": "assistant",
        "content": "\n\nHello there, how may I assist you today?"
      },
      "logprobs": null,
      "finish_reason": "stop"
    }],
    "usage": {
      "prompt_tokens": 9,
      "completion_tokens": 12,
      "total_tokens": 21
    }
}

@ezimuel mentioned this pull request May 7, 2024
@MaximeThoonsen
Collaborator

Hey @ezimuel, did you see this PR from @samuelgjekic?
Maybe having some typing/objects for the usage is a good idea; for the rest, your approach is pragmatic.

@ezimuel
Collaborator Author

ezimuel commented May 8, 2024

Oops! I didn't see PR #110, sorry about that :-( I'll have a look and check the differences.

@ezimuel
Collaborator Author

ezimuel commented May 8, 2024

@MaximeThoonsen, @samuelgjekic I took a look at #110 and the approach is similar. My proposal stores the full response object from OpenAI, not only the token usage. Moreover, I didn't add a specific class like TokenUsage, since the types are already defined in the openai-php/client library.

Another difference: I stored the OpenAI response in generateText and generateTextOrReturnFunctionCalled, while PR #110 stores it in generate and generateChat. I think the latter approach is better, since generate is used by both generateText and generateTextOrReturnFunctionCalled. I'll change this.

@MaximeThoonsen do you think it makes sense to store the entire OpenAI response instead of just the token usage? Do you prefer having dedicated types, like an OpenAIResponse class that maps all the properties, or reusing the existing OpenAI\Responses\Chat\CreateResponse class?

@samuelgjekic
Contributor

@ezimuel after looking at your solution, I have to say you took the better approach with the lastResponse solution, as it is clearer to the user what it does. @MaximeThoonsen can decide what to do here 🙂

@ezimuel
Collaborator Author

ezimuel commented May 8, 2024

Thanks @samuelgjekic for your feedback. In the meantime, I moved the lastResponse into the generate() and generateText() functions.
If @MaximeThoonsen approves the approach proposed in this PR, I'll resolve the conflicts with PR #110.
We still need to add documentation for this feature. Should I add it in docusaurus/docs/usage.md?

@MaximeThoonsen
Collaborator

OK to store it in a lastResponse object. Maybe add a small helper so that people can get the usage in an easy way?

> Should I add it in docusaurus/docs/usage.md?

This would be perfect

@ezimuel
Collaborator Author

ezimuel commented May 10, 2024

I added a getTotalTokens() function that accumulates the total token usage across sequential calls.
Here is an example (from the included documentation):

$chat = new OpenAIChat();

$answer = $chat->generateText('what is one + one ?');
printf("%s\n", $answer); # One plus one equals two
printf("Total tokens usage: %d\n", $chat->getTotalTokens()); # 19

$answer = $chat->generateText('And what is two + two ?');
printf("%s\n", $answer); # Two plus two equals four
printf("Total tokens usage: %d\n", $chat->getTotalTokens()); # 39

I think this function can be useful to keep track of the OpenAI API cost.
I also added the documentation and removed the TokenUsage class introduced in #110.
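For readers curious how such a running counter could work, here is a minimal, hypothetical sketch of accumulating token usage across calls. This is not the PR's actual implementation; the TokenCounter class name and its methods are assumptions for illustration only.

```php
<?php

// Hypothetical sketch (NOT the PR's code): accumulate total token usage
// across sequential chat calls, mirroring what getTotalTokens() reports.
class TokenCounter
{
    private int $totalTokens = 0;

    // Record the "total_tokens" value from one response's usage block.
    public function addUsage(int $tokens): void
    {
        $this->totalTokens += $tokens;
    }

    // Running total across all recorded calls.
    public function getTotalTokens(): int
    {
        return $this->totalTokens;
    }
}

$counter = new TokenCounter();
$counter->addUsage(19); // first call used 19 tokens
$counter->addUsage(20); // second call used 20 more tokens
printf("Total tokens usage: %d\n", $counter->getTotalTokens()); // 39
```

The design choice here matches the example above: each call's usage is added to a single running total, so the value only ever grows over the lifetime of the chat object.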

@MaximeThoonsen and @samuelgjekic let me know WDYT, thanks!

@MaximeThoonsen merged commit 51459a3 into theodo-group:main May 12, 2024
4 checks passed
@MaximeThoonsen
Collaborator

Thanks a lot @ezimuel and @samuelgjekic. I really like the solution!
