Expose complete response metadata from chat model via .invoke/.batch/.stream #16403
Open
Labels: 03 enhancement (Enhancement of existing functionality), Ɑ: models (Related to LLMs or chat model modules)
Comments
cc @baskaryan
Another discussion: #16030
dosubot bot added the Ɑ: models (Related to LLMs or chat model modules) and 🤖:bug (Related to a bug, vulnerability, unexpected error with an existing feature) labels on Jan 22, 2024
eyurtsev added the 03 enhancement (Enhancement of existing functionality) label and removed the 🤖:bug (Related to a bug, vulnerability, unexpected error with an existing feature) label on Jan 22, 2024
@eyurtsev Is there an update here? I'm having trouble with the lack of reproducibility of the output.
This was referenced Mar 1, 2024
baskaryan added a commit that referenced this issue on Mar 12, 2024
Inspired by #16030 (reply in thread)
bechbd pushed a commit to bechbd/langchain that referenced this issue on Mar 29, 2024
gkorland pushed a commit to FalkorDB/langchain that referenced this issue on Mar 30, 2024
hinthornw pushed a commit that referenced this issue on Apr 26, 2024
Privileged issue
Issue Content
Impossible to access system_fingerprint from OpenAI responses. See: #13170 (reply in thread)
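A minimal sketch of the behavior this issue asks for: carry provider-level metadata such as OpenAI's `system_fingerprint` through to the message a chat model returns from `.invoke`, instead of discarding it. The `AIMessageSketch` class, `response_metadata` attribute, and `parse_raw_completion` helper below are hypothetical names for illustration, not the actual LangChain implementation; the raw dict only mimics the shape of an OpenAI chat completion payload.

```python
from dataclasses import dataclass, field


@dataclass
class AIMessageSketch:
    """Stand-in for a chat model's returned message (hypothetical)."""
    content: str
    response_metadata: dict = field(default_factory=dict)


def parse_raw_completion(raw: dict) -> AIMessageSketch:
    """Build a message and copy every non-content field from the raw
    provider response onto it, so callers of .invoke can inspect
    fields like system_fingerprint without touching the raw payload."""
    msg = AIMessageSketch(content=raw["choices"][0]["message"]["content"])
    msg.response_metadata = {k: v for k, v in raw.items() if k != "choices"}
    return msg


# Mimics the shape of an OpenAI chat completion response.
raw = {
    "id": "chatcmpl-123",
    "model": "gpt-4-0125-preview",
    "system_fingerprint": "fp_abc123",
    "choices": [{"message": {"role": "assistant", "content": "Hello!"}}],
}

message = parse_raw_completion(raw)
print(message.response_metadata["system_fingerprint"])  # -> fp_abc123
```

Attaching the metadata to the message itself (rather than to a separate return value) keeps the `.invoke`/`.batch`/`.stream` signatures unchanged, which is one reason this shape is attractive for the request above.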