BaseChatModel.astream() does not pass the run_manager object to the BaseChatModel._astream() method #21327
Labels: 🤖:bug, Ɑ: core, investigate
Example Code
The bug was introduced in langchain/libs/core/langchain_core/language_models/chat_models.py (link to master) in v0.1.14.

First, note the definition of the BaseChatModel._astream() method (link to master): it accepts an optional run_manager: Optional[AsyncCallbackManagerForLLMRun] parameter.

This is how BaseChatModel.astream() currently calls BaseChatModel._astream() (link to master); note that it never passes the run_manager parameter:
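The calling pattern described above can be sketched as follows. This is a hypothetical, simplified illustration, not the actual langchain-core source: ChatModelSketch and RunManager are stand-in names.

```python
# Hypothetical sketch of the calling pattern described in this issue; this is
# NOT the actual langchain-core source. RunManager stands in for
# AsyncCallbackManagerForLLMRun.
import asyncio
from typing import AsyncIterator, Optional


class RunManager:
    """Stand-in for AsyncCallbackManagerForLLMRun."""


class ChatModelSketch:
    async def _astream(
        self, prompt: str, run_manager: Optional[RunManager] = None
    ) -> AsyncIterator[Optional[RunManager]]:
        # Overrides of _astream may rely on run_manager being set by the caller.
        yield run_manager

    async def astream(self, prompt: str) -> AsyncIterator[Optional[RunManager]]:
        run_manager = RunManager()
        # Bug: run_manager is created but never forwarded, so the override
        # always receives None. The v0.1.13-style call would instead be:
        #   self._astream(prompt, run_manager=run_manager)
        async for chunk in self._astream(prompt):
            yield chunk


async def demo() -> list:
    return [chunk async for chunk in ChatModelSketch().astream("hi")]


print(asyncio.run(demo()))  # → [None]: run_manager never reaches _astream
```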
This is how BaseChatModel.astream() used to call BaseChatModel._astream() in v0.1.13; note that the run_manager object was passed properly:

As stated in the current docs, the BaseChatModel._astream() method can be overridden by users, so preserving the same API across minor version updates is important. That's why I consider this a bug.

Error Message and Stack Trace (if applicable)
No response
Description
I am using a RunnableConfig object to pass the chat ID through the whole pipeline for logging purposes:

Then I try to get the chat ID back in my custom model derived from BaseChatModel like this:

This code worked perfectly fine in v0.1.13.
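A self-contained sketch of the pattern I'm describing (this is not my exact code and does not import langchain-core; RunManagerStub and the metadata attribute are simplified stand-ins):

```python
# Hypothetical, self-contained sketch of the pattern described above; it does
# NOT import langchain-core, and the names here are simplified stand-ins.
from typing import AsyncIterator, Optional


class RunManagerStub:
    """Stand-in for AsyncCallbackManagerForLLMRun carrying config metadata."""

    def __init__(self, metadata: dict):
        self.metadata = metadata


class MyChatModel:
    async def _astream(
        self, prompt: str, run_manager: Optional[RunManagerStub] = None
    ) -> AsyncIterator[str]:
        # The chat ID travels through the pipeline in the run manager's
        # metadata (populated from the RunnableConfig).
        if run_manager is None:
            raise ValueError("run_manager was not passed to _astream()")
        chat_id = run_manager.metadata["chat_id"]
        yield f"[chat {chat_id}] echo: {prompt}"
```

When astream() forwards the run manager (as in v0.1.13), the chat ID is recovered; when it does not (v0.1.14+), the ValueError below is raised.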
Starting with v0.1.14 and up to the current master, the run_manager object is never passed into the BaseChatModel._astream() method. Hence, my code always sees run_manager == None and fails with the ValueError exception.

System Info
LangChain libs versions:
Platform: MacOS
Python: 3.11.7