Support for OpenAI's .with_raw_response. #1138
Conversation
🦙 MegaLinter status: ❌ ERROR
See detailed report in MegaLinter reports.
Codecov Report — Attention: Patch coverage is
Additional details and impacted files

@@ Coverage Diff @@
##             main    #1138      +/-   ##
==========================================
- Coverage   81.67%   81.66%   -0.02%
==========================================
  Files         193      193
  Lines       21334    21341       +7
  Branches     3717     3721       +4
==========================================
+ Hits        17425    17428       +3
+ Misses       2826     2825       -1
- Partials     1083     1088       +5

☔ View full report in Codecov by Sentry.
Just had one requested change to the implementation; otherwise it looks good.
I tested it manually and things show up in the UI. There are a couple of additional fixes, but after that I think we're good to merge this one.
This PR adds support for prepending .with_raw_response. to HTTP method calls in OpenAI. Currently, the New Relic instrumentation will instrument these calls, but during the collection of custom AI event attributes, the modified response type (LegacyAPIResponse) will cause a crash.
This is part 1 of two PRs. The second part will cover the streaming version of this functionality, .with_streaming_response.
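To illustrate the problem being fixed, here is a minimal sketch of why attribute collection crashes on a raw response. The FakeLegacyAPIResponse class and extract_model helper below are hypothetical stand-ins, not the agent's actual code; the only assumption taken from the OpenAI library is that its raw-response wrapper exposes a parse() method returning the typed object instead of exposing the object's fields directly.

```python
class FakeLegacyAPIResponse:
    """Hypothetical stand-in for openai's LegacyAPIResponse wrapper.

    Assumption: the real wrapper does not expose the parsed object's
    fields directly; callers must unwrap it via .parse().
    """

    def __init__(self, parsed):
        self._parsed = parsed

    def parse(self):
        return self._parsed


def extract_model(response):
    """Hypothetical attribute-collection step.

    Instrumentation that reads attributes off the response must first
    unwrap a raw-response object; reading fields off the wrapper
    directly is what crashed before this PR.
    """
    if hasattr(response, "parse"):
        response = response.parse()
    return response["model"]


wrapped = FakeLegacyAPIResponse({"model": "gpt-4", "id": "chatcmpl-123"})
print(extract_model(wrapped))  # -> gpt-4
```

A caller opts into this wrapper by inserting .with_raw_response. before the HTTP method, e.g. client.chat.completions.with_raw_response.create(...), which is the call shape this PR teaches the instrumentation to handle.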