
prompt ergonomics #7799

Merged
hwchase17 merged 8 commits into master from harrison/prompt-ergonomics on Jul 22, 2023
Conversation

hwchase17
Contributor

No description provided.


dosubot (bot) added the 🤖:docs label (Changes to documentation and examples, like .md, .rst, .ipynb files. Changes to the docs/ folder) on Jul 16, 2023
@@ -77,6 +77,13 @@ def lc_serializable(self) -> bool:
"""Whether this class is LangChain serializable."""
return True

# Cannot type this since it returns something we can't import
Collaborator

could maybe gate on TYPE_CHECKING but maybe not

Contributor Author

yeah good call
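
For readers unfamiliar with the pattern being discussed, here is a minimal, hypothetical sketch of gating an import on TYPE_CHECKING so a return type can be annotated without importing it at runtime. The module and class names below are placeholders, not the actual LangChain code.

from __future__ import annotations

from typing import TYPE_CHECKING

if TYPE_CHECKING:
    # Evaluated only by static type checkers (mypy, pyright); never executed
    # at runtime, so the un-importable return type stops being a problem.
    from hypothetical_module import PipelineResult  # placeholder import


class Serializable:
    def as_pipeline(self) -> PipelineResult:
        # With `from __future__ import annotations`, the annotation above is
        # stored as a string and resolved only by the type checker.
        raise NotImplementedError("illustrative stub only")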

@@ -164,6 +168,18 @@ class ChatPromptTemplate(BaseChatPromptTemplate, ABC):
input_variables: List[str]
messages: List[Union[BaseMessagePromptTemplate, BaseMessage]]

def __or__(self, other: Any) -> ChatPromptTemplate:
Collaborator

feels weird to use `or` for this; wouldn't `add` make more sense, like adding strings?

Contributor Author

yeah that's fair
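
As a rough illustration of the `__add__` spelling (not the actual LangChain implementation), a chat-style template could append message lists and take the union of input variables. The SimpleChatPrompt class below is an invented stand-in:

from __future__ import annotations

from dataclasses import dataclass, field
from typing import List, Tuple


@dataclass
class SimpleChatPrompt:
    # Invented stand-in for ChatPromptTemplate: (role, template) pairs.
    messages: List[Tuple[str, str]] = field(default_factory=list)
    input_variables: List[str] = field(default_factory=list)

    def __add__(self, other: SimpleChatPrompt) -> SimpleChatPrompt:
        # "Adding" two chat prompts appends the messages and merges the
        # variable names, mirroring how string concatenation reads.
        return SimpleChatPrompt(
            messages=self.messages + other.messages,
            input_variables=sorted(
                set(self.input_variables) | set(other.input_variables)
            ),
        )


system = SimpleChatPrompt([("system", "You are a {persona} assistant.")], ["persona"])
human = SimpleChatPrompt([("human", "{question}")], ["question"])
combined = system + human
assert combined.input_variables == ["persona", "question"]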

Collaborator

For piping it seems like it'd make more sense to do it between a prompt and an LLM (basically decompose an llm chain)

parsed_result = my_prompt | HuggingFace() | my_parser
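
As an aside, that piping spelling could be supported by overloading `__or__` to build a pipeline of callables. The sketch below is a self-contained toy: the Pipe class and the fake LLM/parser are invented for illustration and are not LangChain's API.

from __future__ import annotations

from typing import Any, Callable, List, Union


class Pipe:
    # Toy pipeline: `a | b | c` runs a, then b, then c on the result.
    def __init__(self, steps: List[Callable[[Any], Any]]) -> None:
        self.steps = steps

    def __or__(self, other: Union[Pipe, Callable[[Any], Any]]) -> Pipe:
        more = other.steps if isinstance(other, Pipe) else [other]
        return Pipe(self.steps + more)

    def __call__(self, value: Any) -> Any:
        for step in self.steps:
            value = step(value)
        return value


# Invented stand-ins for a prompt, a model, and an output parser.
my_prompt = Pipe([lambda topic: f"Tell me a joke about {topic}"])

def fake_llm(prompt: str) -> str:
    return f"LLM says: {prompt}"

def my_parser(text: str) -> str:
    return text.upper()

chain = my_prompt | fake_llm | my_parser
print(chain("parrots"))  # LLM SAYS: TELL ME A JOKE ABOUT PARROTS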

input_variables = list(
    set(self.input_variables) | set(other.input_variables)
)
template = self.template + "\n\n" + other.template
hinthornw (Collaborator) commented Jul 17, 2023

It feels a bit limiting to always insert the double newline, especially if we can do concatenation with strings directly. I think I'd rather be able to do

prompt = (
    PromptTemplate.from_template("Tell me a joke about {topic}")
    + "\n\nmake it funny"
    + ", and in {language}"
)

And be explicit rather than have that be magical or restrictive
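
A rough sketch of that behaviour, using an invented SimplePrompt stand-in rather than the real PromptTemplate: `__add__` concatenates templates verbatim (no implicit double newline), accepts plain strings on the right-hand side, and unions the input variables.

from __future__ import annotations

import re
from dataclasses import dataclass
from typing import List, Union


@dataclass
class SimplePrompt:
    # Invented stand-in for a string-based PromptTemplate.
    template: str
    input_variables: List[str]

    @classmethod
    def from_template(cls, template: str) -> SimplePrompt:
        # Naively treat every {name} placeholder as an input variable.
        return cls(template, sorted(set(re.findall(r"{(\w+)}", template))))

    def __add__(self, other: Union[SimplePrompt, str]) -> SimplePrompt:
        if isinstance(other, str):
            other = SimplePrompt.from_template(other)
        # Concatenate verbatim: the caller decides whether to add "\n\n".
        return SimplePrompt(
            template=self.template + other.template,
            input_variables=sorted(
                set(self.input_variables) | set(other.input_variables)
            ),
        )


prompt = (
    SimplePrompt.from_template("Tell me a joke about {topic}")
    + "\n\nmake it funny"
    + ", and in {language}"
)
assert prompt.input_variables == ["language", "topic"]
assert prompt.template == "Tell me a joke about {topic}\n\nmake it funny, and in {language}"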

hwchase17 merged commit cbf2fc8 into master on Jul 22, 2023
23 checks passed
hwchase17 deleted the harrison/prompt-ergonomics branch on July 22, 2023 at 21:19
aerrober pushed a commit to aerrober/langchain-fork that referenced this pull request Jul 24, 2023