[Feature]: Azure, OpenAI - Log Response Headers #1724
Comments
Probably wouldn't hurt to do for other providers as well. Off the top of my head, I don't think any popular LLM API is returning private/sensitive content in their response headers.
Starting with just OpenAI/Azure because we're going to need to move all OpenAI python calls from
Tradeoffs of this move: openai/openai-python#416 (comment). It looks like
Another approach is using httpx event hooks: https://www.python-httpx.org/advanced/#event-hooks
Useful details here: langchain-ai/langchain#9601
Feedback from Langfuse's Marc (when I asked if he thought it was a good idea to also feed upstream traces to a different project):
I'm inclined to agree with him; it's not a good idea if it's just me asking. For logging the request and response metadata, can we do it like this? https://github.com/orgs/langfuse/discussions/1070#discussioncomment-8369918

```json
{
  "request": {
    "headers": {
      "host": "example.com"
    },
    "url": "https://api.openai.com/v1/chat/completions"
  },
  "response": {
    "headers": {
      "age": "1337"
    },
    "status": 200
  }
}
```
Attempt 1 PR: #1873
Hmm, I noticed that Sentry is able to log the response object/headers. Maybe we can borrow their method of doing it?
This is possible to do with: https://til.simonwillison.net/httpx/openai-log-requests-responses
We do this now.
The Feature
User request
Motivation, pitch
cc @Manouchehri