add testing ability #71
Conversation
Co-authored-by: Nuno Maduro <enunomaduro@gmail.com>
Conflicts: tests/Arch.php
@gehrisandro thank you very much for putting in the work for this PR. Personally, I don't think it's an issue that the testing implementation lives inside the library. There are two things I noticed while implementing this:

- While testing, I didn't specify all of the choices responses, and I saw the default fixture values like "this is indeed a test". Those defaults made me wonder for a second whether I was making actual API requests, so I think it might be better to either leave them out entirely or use more verbose defaults that clearly indicate that this is a fake response.
- I have to add some tests to ensure that error handling is working correctly. Errors can have a couple of different causes. I thought that I could just create a fake response with an error, but that didn't work as the

Other than that, this is perfect! :)
I would like to add, on top of @mpociot's feedback, that we need to adjust this pull request for the new streaming feature coming out.
Hi @mpociot, thank you for your valuable feedback.
@nunomaduro I will start adjusting to the stream responses. I hope to finish by Monday evening.
Force-pushed from bf9f84f to 347a2de
@mpociot Exception testing should work now:

```php
$client = new ClientFake([
    new \OpenAI\Exceptions\ErrorException([
        'message' => 'The model `gpt-1` does not exist',
        'type' => 'invalid_request_error',
        'code' => null,
    ]),
]);

// the `ErrorException` will be thrown
$completion = $client->completions()->create([
    'model' => 'text-davinci-003',
    'prompt' => 'PHP is ',
]);
```
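In a test case, the snippet above could be wrapped in a PHPUnit expectation. This is a sketch only: the `ClientFake` constructor and `ErrorException` payload are taken from the comment above, while the test class and method names are illustrative assumptions.

```php
<?php

use OpenAI\Exceptions\ErrorException;
use OpenAI\Testing\ClientFake;
use PHPUnit\Framework\TestCase;

// Hypothetical test class name; any PHPUnit test case works the same way.
class CompletionErrorTest extends TestCase
{
    public function test_it_throws_the_faked_error(): void
    {
        // Queue a faked error instead of a response.
        $client = new ClientFake([
            new ErrorException([
                'message' => 'The model `gpt-1` does not exist',
                'type' => 'invalid_request_error',
                'code' => null,
            ]),
        ]);

        // The faked exception is expected to bubble up from the request.
        $this->expectException(ErrorException::class);

        $client->completions()->create([
            'model' => 'text-davinci-003',
            'prompt' => 'PHP is ',
        ]);
    }
}
```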
This PR makes it easier to write tests in your application using this library.
This is related to openai-php/laravel#23
@nunomaduro @mpociot I have decided to create most of the testing logic within the client library itself and not in the Laravel wrapper. The main reason for this decision is to provide the same testing options to users who use the client directly.
Additionally, it should be fairly easy to provide the same testing options in other wrappers too (@GromNaN).
The PR for the Laravel wrapper is here: openai-php/laravel#27
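As a rough sketch of the intended usage described above, assuming the testing API introduced in this PR (the `CreateResponse::fake()` helper and `assertSent()` method are assumptions based on the PR's description, not confirmed signatures):

```php
<?php

use OpenAI\Resources\Completions;
use OpenAI\Responses\Completions\CreateResponse;
use OpenAI\Testing\ClientFake;

// Queue a fake response instead of hitting the real API.
$client = new ClientFake([
    CreateResponse::fake([
        'choices' => [
            ['text' => 'PHP is awesome!'],
        ],
    ]),
]);

$completion = $client->completions()->create([
    'model' => 'text-davinci-003',
    'prompt' => 'PHP is ',
]);

// Assert the resource was called with the expected parameters.
$client->assertSent(Completions::class, function (string $method, array $parameters): bool {
    return $method === 'create'
        && $parameters['model'] === 'text-davinci-003';
});
```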
There are multiple things to discuss:

- the `OpenAI\Testing` namespace
- the ignores from phpstan, because it's not really possible to add proper doc types, as the return type could possibly be every available response

Todos:

- A test using a fake response without providing all parameters
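That todo could look something like the following sketch. It assumes the `CreateResponse::fake()` helper merges the given overrides into default fixture values (the fixture behaviour and field names here are assumptions, not confirmed defaults):

```php
<?php

use OpenAI\Responses\Completions\CreateResponse;
use OpenAI\Testing\ClientFake;

// Fake a response while overriding only a single field;
// all remaining fields fall back to the fixture defaults.
$client = new ClientFake([
    CreateResponse::fake([
        'model' => 'text-davinci-003',
    ]),
]);

$completion = $client->completions()->create([
    'model' => 'text-davinci-003',
    'prompt' => 'PHP is ',
]);

// Only `model` was provided explicitly; everything else
// (id, choices, usage, ...) comes from the default fixture.
assert($completion->model === 'text-davinci-003');
```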