Bug/4299 cant instantiate simplechatmodel subclass without defining agenerate #4300
Closed: JacobFV wants to merge 10 commits into langchain-ai:master from JacobFV:bug/4299-cant-instantiate-simplechatmodel-subclass-without-defining-agenerate
Commits (10):
0cfe4cb Added facade classes: LLMFacade, ChatModelFacade (limboid-inc)
5d4dae5 Added messages.serialize_messages for Chat facade (limboid-inc)
6338cd6 fixed imports (limboid-inc)
1375961 add __init__ to utils (limboid-inc)
922bd36 moved utils to existing utils.py (limboid-inc)
fdb9590 added tests (limboid-inc)
923168b fixed bug (limboid-inc)
9022e7b Merge branch 'hwchase17:master' into bug/4299-cant-instantiate-simple… (JacobFV)
a7cf1c0 Merge branch 'hwchase17:master' into bug/4299-cant-instantiate-simple… (JacobFV)
b70f334 Merge branch 'hwchase17:master' into bug/4299-cant-instantiate-simple… (JacobFV)
langchain/wrappers/__init__.py (new file, +7):

from langchain.wrappers.chat_model_facade import ChatModelFacade
from langchain.wrappers.llm_facade import LLMFacade

__all__ = [
    "ChatModelFacade",
    "LLMFacade",
]
langchain/wrappers/chat_model_facade.py (new file, +33):

from __future__ import annotations

from typing import List, Optional

from langchain.chat_models.base import BaseChatModel, SimpleChatModel
from langchain.llms.base import BaseLanguageModel
from langchain.schema import BaseMessage
from langchain.utils import serialize_msgs


class ChatModelFacade(SimpleChatModel):
    llm: BaseLanguageModel

    def _call(self, messages: List[BaseMessage], stop: Optional[List[str]] = None) -> str:
        if isinstance(self.llm, BaseChatModel):
            return self.llm(messages, stop=stop).content
        elif isinstance(self.llm, BaseLanguageModel):
            return self.llm(serialize_msgs(messages), stop=stop)
        else:
            raise ValueError(
                f"Invalid llm type: {type(self.llm)}. Must be a chat model or language model."
            )

    @classmethod
    def of(cls, llm):
        if isinstance(llm, BaseChatModel):
            return llm
        elif isinstance(llm, BaseLanguageModel):
            # Pydantic models must be constructed with keyword arguments.
            return cls(llm=llm)
        else:
            raise ValueError(
                f"Invalid llm type: {type(llm)}. Must be a chat model or language model."
            )
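The idea behind ChatModelFacade can be illustrated with a stdlib-only sketch. All names below (ChatFacade, serialize_msgs, fake_llm, the (role, content) tuple type) are hypothetical stand-ins for illustration, not the langchain API: a chat-style interface is layered over a plain text-completion callable by serializing the message list into a single prompt.

```python
from typing import Callable, List, Tuple

# Stand-in for BaseMessage: a (role, content) pair.
Message = Tuple[str, str]


def serialize_msgs(messages: List[Message]) -> str:
    # Flatten chat messages into one prompt string, "role: content" per line.
    return "\n".join(f"{role}: {content}" for role, content in messages)


class ChatFacade:
    """Hypothetical stand-in for ChatModelFacade: exposes a chat interface
    over a plain completion callable."""

    def __init__(self, llm: Callable[[str], str]):
        self.llm = llm

    def __call__(self, messages: List[Message]) -> Message:
        # Serialize the conversation, call the completion model,
        # and wrap the reply as an "ai" message.
        return ("ai", self.llm(serialize_msgs(messages)))


# A fake completion model, analogous to FakeListLLM in the PR's test.
def fake_llm(prompt: str) -> str:
    return "hello"


chat = ChatFacade(fake_llm)
role, content = chat([("system", "hello")])
print(role, content)  # ai hello
```

This mirrors what the PR's `_call` does when the wrapped model is a plain language model rather than a chat model.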
langchain/wrappers/llm_facade.py (new file, +37):

from __future__ import annotations

from typing import Any, List, Mapping, Optional

from langchain.chat_models.base import BaseChatModel
from langchain.llms.base import LLM, BaseLanguageModel


class LLMFacade(LLM):
    chat_model: BaseChatModel

    @property
    def _llm_type(self) -> str:
        return self.chat_model._llm_type

    def _call(
        self,
        prompt: str,
        stop: Optional[List[str]] = None,
    ) -> str:
        return self.chat_model.call_as_llm(prompt, stop=stop)

    @property
    def _identifying_params(self) -> Mapping[str, Any]:
        """Get the identifying parameters."""
        # Was `self._chat._identifying_params`; no `_chat` attribute exists.
        return self.chat_model._identifying_params

    @staticmethod
    def of(llm) -> LLMFacade:
        if isinstance(llm, BaseChatModel):
            # Pydantic models must be constructed with keyword arguments.
            return LLMFacade(chat_model=llm)
        elif isinstance(llm, BaseLanguageModel):
            return llm
        else:
            raise ValueError(
                f"Invalid llm type: {type(llm)}. Must be a chat model or language model."
            )
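LLMFacade is the inverse direction: a plain prompt-to-text interface over a chat model. A stdlib-only sketch (LLMOverChat and fake_chat are hypothetical stand-ins for illustration, not the langchain classes):

```python
from typing import Callable, List, Tuple

Message = Tuple[str, str]  # (role, content), a stand-in for BaseMessage


class LLMOverChat:
    """Hypothetical stand-in for LLMFacade: exposes a prompt -> text
    interface over a chat-style callable."""

    def __init__(self, chat_model: Callable[[List[Message]], Message]):
        self.chat_model = chat_model

    def __call__(self, prompt: str) -> str:
        # Wrap the raw prompt as a single human message, in the spirit of
        # call_as_llm, and return only the reply's text content.
        _, content = self.chat_model([("human", prompt)])
        return content


# A fake chat model that echoes the last message's content.
def fake_chat(messages: List[Message]) -> Message:
    return ("ai", "echo: " + messages[-1][1])


llm = LLMOverChat(fake_chat)
print(llm("hi"))  # echo: hi
```

Together the two facades let chat models and completion models be used interchangeably, which is what `of` dispatches on.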
Empty file.
test_chat_model_facade.py (new file, +12):

from langchain.llms.fake import FakeListLLM
from langchain.schema import SystemMessage
from langchain.wrappers.chat_model_facade import ChatModelFacade


def test_chat_model_facade():
    llm = FakeListLLM(responses=["hello", "goodbye"])
    chat_model = ChatModelFacade.of(llm)
    input_message = SystemMessage(content="hello")
    output_message = chat_model([input_message])
    assert output_message.content == "hello"
    assert output_message.type == "ai"
test_llm_facade.py (new file, +2):

def test_llm_facade():
    pass
Thanks for the PR!
This will have to be an async method. Will defer to @agola11 on whether it makes sense to have it default to
Yeah, this needs to be async and we should not be calling blocking code in async methods. A good solution is having the default implementation for async def _agenerate call run_in_executor, as @vowelparrot suggested.
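The suggested default can be sketched with stdlib asyncio. This is illustrative only (SimpleModel and its methods are hypothetical, not the real SimpleChatModel): the default async _agenerate delegates the blocking _generate to a thread pool via run_in_executor, so subclasses that only define the sync method still work without blocking the event loop.

```python
import asyncio
from functools import partial


class SimpleModel:
    """Toy model with a blocking _generate; names are illustrative."""

    def _generate(self, prompt: str) -> str:
        # Pretend this is a slow, blocking call (e.g. a sync HTTP request).
        return prompt.upper()

    async def _agenerate(self, prompt: str) -> str:
        # Default async implementation: run the blocking _generate in the
        # default executor so the event loop stays responsive.
        loop = asyncio.get_running_loop()
        return await loop.run_in_executor(None, partial(self._generate, prompt))


result = asyncio.run(SimpleModel()._agenerate("hello"))
print(result)  # HELLO
```

With this default in place, a subclass only has to implement the sync method, which is exactly the gap issue #4299 describes.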
#4701 how does this look?