Store the last response object from OpenAI #116
Conversation
Hey @ezimuel, did you see this PR from @samuelgjekic?
Oops! I didn't see PR #110, sorry for that :-( I'll have a look and check the differences.
@MaximeThoonsen, @samuelgjekic I had a look at #110 and the approach is similar. My proposal stores the full response object from OpenAI, not only the token usage. Moreover, I didn't add a specific class like the one in #110. Another difference is that I stored the OpenAI response in lastResponse. @MaximeThoonsen, do you think it makes sense to store the entire OpenAI response instead of just the token usage? Do you prefer having dedicated types?
@ezimuel Well, after looking at your solution I have to say that I feel you took the better approach with the lastResponse solution, as it is clearer to the user what it does. @MaximeThoonsen can decide what to do here 🙂
Thanks @samuelgjekic for your feedback. In the meantime, I moved the
Ok to store in a lastResponse object. Maybe add a small helper so that people can get the usage in an easy way?
This would be perfect.
I added a getTotalTokens() helper. Here is an example:

```php
$chat = new OpenAIChat();

$answer = $chat->generateText('what is one + one ?');
printf("%s\n", $answer); # One plus one equals two
printf("Total tokens usage: %d\n", $chat->getTotalTokens()); # 19

$answer = $chat->generateText('And what is two + two ?');
printf("%s\n", $answer); # Two plus two equals four
printf("Total tokens usage: %d\n", $chat->getTotalTokens()); # 39
```

I think this function can be useful to keep track of the API cost from OpenAI. @MaximeThoonsen and @samuelgjekic, let me know WDYT, thanks!
Thanks a lot @ezimuel and @samuelgjekic. I really like the solution!
This PR adds the possibility to store the last response from OpenAI to get advanced information like the token usage, fixing #109.
Here is an example:
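Below is a minimal sketch of the intended usage. The getLastResponse() accessor name is an assumption for illustration only (the discussion above only settles on storing the response in a lastResponse object); getTotalTokens() is the helper added in this PR:

```php
$chat = new OpenAIChat();

$answer = $chat->generateText('what is one + one ?');
printf("%s\n", $answer);

// Hypothetical accessor: returns the last raw response object received
// from OpenAI, so every JSON property of the response can be inspected.
$response = $chat->getLastResponse();

// Helper added in this PR: reads the token usage from the stored
// response, useful for tracking API cost across calls.
printf("Total tokens usage: %d\n", $chat->getTotalTokens());
```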
All the JSON properties from the original OpenAI HTTP response are available. Here is an example of an HTTP response (taken from the original OpenAI documentation):
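The values below are illustrative; the structure follows the chat completion response format shown in OpenAI's documentation:

```json
{
  "id": "chatcmpl-123",
  "object": "chat.completion",
  "created": 1677652288,
  "model": "gpt-3.5-turbo",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "content": "Hello there, how may I assist you today?"
      },
      "finish_reason": "stop"
    }
  ],
  "usage": {
    "prompt_tokens": 9,
    "completion_tokens": 12,
    "total_tokens": 21
  }
}
```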