Check for single prompt in __call__ method of the BaseLLM class #4892
Conversation
Thanks for adding the runtime check! What do you think about flipping the logic so it'll catch more cases?
langchain/llms/base.py
Outdated
@@ -287,6 +287,11 @@ def __call__(
         self, prompt: str, stop: Optional[List[str]] = None, callbacks: Callbacks = None
     ) -> str:
         """Check Cache and run the LLM on the given prompt and input."""
+        if isinstance(prompt, list):
It's common to flip the logic:

if not isinstance(prompt, str):

This will then also catch cases where prompt is a tuple or a set.
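For illustration, a minimal sketch of the difference (the helper function below is hypothetical, not part of LangChain):

```python
# Hypothetical helper, illustrating why the flipped check catches more cases.
def check_single_prompt(prompt) -> None:
    # `isinstance(prompt, list)` would miss tuples and sets;
    # rejecting anything that is not a string catches them all.
    if not isinstance(prompt, str):
        raise ValueError(
            f"Argument `prompt` is expected to be a string. Instead found {type(prompt)}."
        )


check_single_prompt("Tell me a joke.")           # OK: single string
for bad_prompt in (["a", "b"], ("a", "b"), {"a", "b"}):
    try:
        check_single_prompt(bad_prompt)          # list, tuple, and set are all rejected
    except ValueError as err:
        print(err)
```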
Hi @eyurtsev, agree, makes more sense that way 👍
langchain/llms/base.py
Outdated
@@ -287,6 +287,11 @@ def __call__(
         self, prompt: str, stop: Optional[List[str]] = None, callbacks: Callbacks = None
     ) -> str:
         """Check Cache and run the LLM on the given prompt and input."""
+        if isinstance(prompt, list):
+            raise ValueError(
+                "Argument `prompt` is expected to be a single string, not a list. If "
"Argument `prompt` is expected to be a single string, not a list. If " | |
f"Argument `prompt` is expected to be a string. Instead found {type(prompt)}. If " |
lgtm! thanks for adding!
Ensuring that users pass a single prompt when calling a LLM
This PR adds a check to the `__call__` method of the `BaseLLM` class to ensure that it is called with a single prompt. It raises a `ValueError` if users try to call an LLM with a list of prompts and instructs them to use the `generate` method instead.

Why this could be useful
I stumbled across this by accident. I accidentally called the OpenAI LLM with a list of prompts instead of a single string and still got a result:
It might be better to catch such a scenario, preventing unnecessary costs and irritation for the user.
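For illustration, a hedged sketch of that kind of accidental call (the model name and prompts are made up, and the import assumes the `langchain.llms` layout this PR targets):

```python
# Hypothetical reproduction of the accidental call described above.
# Before this PR, passing a list to `__call__` still returned a result;
# with the new check, it raises a ValueError pointing users to `generate`.
from langchain.llms import OpenAI

llm = OpenAI(model_name="text-davinci-003")  # model name is illustrative

prompts = ["Tell me a joke.", "Tell me another joke."]

# Accidental usage: `__call__` expects a single string, not a list.
# llm(prompts)  # raises ValueError once this PR is merged

# Intended usage for multiple prompts:
result = llm.generate(prompts)
single = llm("Tell me a joke.")  # `__call__` with a single string still works
```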
Proposed behaviour
Before submitting
Who can review?
Community members can review the PR once tests pass. Tag maintainers/contributors who might be interested: