Check for single prompt in __call__ method of the BaseLLM class #4892

Merged (2 commits) on May 19, 2023
Conversation

mwinterde (Contributor)

Ensuring that users pass a single prompt when calling an LLM

  • This PR adds a check to the __call__ method of the BaseLLM class to ensure that it is called with a single prompt
  • Raises a ValueError if users try to call an LLM with a list of prompts and instructs them to use the generate method instead

Why this could be useful

I stumbled across this by accident. I accidentally called the OpenAI LLM with a list of prompts instead of a single string and still got a result:

>>> from langchain.llms import OpenAI
>>> llm = OpenAI()
>>> llm(["Tell a joke"]*2)
"\n\nQ: Why don't scientists trust atoms?\nA: Because they make up everything!"

It might be better to catch such a scenario early, preventing unnecessary costs and irritation for the user.

Proposed behaviour

>>> from langchain.llms import OpenAI
>>> llm = OpenAI()
>>> llm(["Tell a joke"]*2)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/Users/marcus/Projects/langchain/langchain/llms/base.py", line 291, in __call__
    raise ValueError(
ValueError: Argument `prompt` is expected to be a single string, not a list. If you want to run the LLM on multiple prompts, use `generate` instead.
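The check that produces this error can be sketched as follows. This is a simplified stand-in, not the real BaseLLM: the class body, the `_call` placeholder, and its return value are illustrative assumptions, and the real `__call__` also accepts `stop` and `callbacks` arguments.

```python
from typing import List, Optional


class BaseLLM:
    """Simplified sketch of the proposed check (not the real langchain class)."""

    def __call__(self, prompt: str, stop: Optional[List[str]] = None) -> str:
        """Check cache and run the LLM on the given prompt and input."""
        # Reject a list of prompts: __call__ is for a single prompt only.
        if isinstance(prompt, list):
            raise ValueError(
                "Argument `prompt` is expected to be a single string, not a list. "
                "If you want to run the LLM on multiple prompts, use `generate` instead."
            )
        return self._call(prompt, stop=stop)

    def _call(self, prompt: str, stop: Optional[List[str]] = None) -> str:
        # Placeholder for the actual model call.
        return f"response to: {prompt}"
```

With this in place, `BaseLLM()("Tell a joke")` runs normally, while `BaseLLM()(["Tell a joke"] * 2)` raises the ValueError shown above instead of silently joining the list into one prompt.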

Who can review?

Community members can review the PR once tests pass. Tag maintainers/contributors who might be interested:

@eyurtsev (Collaborator) left a comment:

Thanks for adding the run time check! What do you think about flipping the logic so it'll catch more cases?

    @@ -287,6 +287,11 @@ def __call__(
        self, prompt: str, stop: Optional[List[str]] = None, callbacks: Callbacks = None
    ) -> str:
        """Check Cache and run the LLM on the given prompt and input."""
        if isinstance(prompt, list):
@eyurtsev (Collaborator) commented inline:

It's common to flip the logic:

    if not isinstance(prompt, str):

this will then also catch the cases where prompt is a tuple or a set

@mwinterde (Contributor, Author) replied:


Hi @eyurtsev, agree, makes more sense that way 👍

    @@ -287,6 +287,11 @@ def __call__(
        self, prompt: str, stop: Optional[List[str]] = None, callbacks: Callbacks = None
    ) -> str:
        """Check Cache and run the LLM on the given prompt and input."""
        if isinstance(prompt, list):
            raise ValueError(
                "Argument `prompt` is expected to be a single string, not a list. If "
A collaborator suggested a change:

    - "Argument `prompt` is expected to be a single string, not a list. If "
    + f"Argument `prompt` is expected to be a string. Instead found {type(prompt)}. If "
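For reference, the suggested f-string interpolates the runtime type of the argument into the message. A minimal sketch (the helper name is hypothetical; only the message format comes from the suggestion):

```python
def build_error_message(prompt) -> str:
    # Mirror the suggested change: report the actual type that was passed.
    return f"Argument `prompt` is expected to be a string. Instead found {type(prompt)}."


print(build_error_message(["Tell a joke"]))
# For a list argument, the message includes "<class 'list'>".
```

This makes the error self-diagnosing: a user who passed a tuple or set sees exactly which type was rejected.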

@hwchase17 (Contributor) left a comment:


lgtm! thanks for adding!

@hwchase17 hwchase17 added the lgtm PR looks good. Use to confirm that a PR is ready for merging. label May 19, 2023
@dev2049 dev2049 merged commit 2aa3754 into langchain-ai:master May 19, 2023
12 checks passed
@danielchalef danielchalef mentioned this pull request Jun 5, 2023
This was referenced Jun 25, 2023