
add concept of prompt collection #1507

Merged: 7 commits merged into master from harrison/promptcollections on Mar 8, 2023

Conversation

@hwchase17 (Contributor)

No description provided.

@nfcampos (Collaborator) left a comment

Just one comment, LGTM otherwise

Tuple[Callable[[BaseLanguageModel], bool], BasePromptTemplate]
] = Field(default_factory=list)

def get_default_prompt(self, llm: BaseLanguageModel) -> BasePromptTemplate:

Should this just be called get_prompt?
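For readers skimming the thread, here is a minimal sketch of what a conditional prompt selector along these lines could look like. The class name, import paths, and fall-back-to-default behavior are assumptions inferred from the snippet above, not necessarily the code that was merged:

```python
from typing import Callable, List, Tuple

from pydantic import BaseModel, Field

from langchain.prompts.base import BasePromptTemplate
from langchain.schema import BaseLanguageModel


class ConditionalPromptSelector(BaseModel):
    """Pick a prompt based on properties of the language model (sketch)."""

    default_prompt: BasePromptTemplate
    conditionals: List[
        Tuple[Callable[[BaseLanguageModel], bool], BasePromptTemplate]
    ] = Field(default_factory=list)

    def get_default_prompt(self, llm: BaseLanguageModel) -> BasePromptTemplate:
        # Return the first prompt whose condition matches this model;
        # otherwise fall back to the default prompt.
        for condition, prompt in self.conditionals:
            if condition(llm):
                return prompt
        return self.default_prompt
```

Read this way, the method does more than return the default prompt, which may be the motivation behind the `get_prompt` naming question above.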

@agola11 (Collaborator) commented Mar 7, 2023

Can we add an example for ConversationChain too? I think one of the benefits of prompt collection is to handle creating the ChatPromptTemplate with MessagesPlaceHolder already in the right place.

@hwchase17 (Contributor, Author)

> Can we add an example for ConversationChain too? I think one of the benefits of prompt collection is to handle creating the ChatPromptTemplate with MessagesPlaceHolder already in the right place.

yes ill add a lot more
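For reference, the kind of ConversationChain chat prompt agola11 is describing could be built like this. This is a hedged sketch using LangChain's chat prompt classes; the exact system message and variable names are illustrative, not taken from the PR:

```python
from langchain.prompts.chat import (
    ChatPromptTemplate,
    HumanMessagePromptTemplate,
    MessagesPlaceholder,
    SystemMessagePromptTemplate,
)

# A conversation-style chat prompt with the chat history placeholder
# already positioned between the system message and the latest human input.
chat_prompt = ChatPromptTemplate.from_messages(
    [
        SystemMessagePromptTemplate.from_template(
            "The following is a friendly conversation between a human and an AI."
        ),
        MessagesPlaceholder(variable_name="history"),
        HumanMessagePromptTemplate.from_template("{input}"),
    ]
)
```

A prompt collection could then return this template when the chain is given a chat model, while keeping the existing completion-style prompt as the default.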

hwchase17 merged commit c4a557b into master on Mar 8, 2023
hwchase17 deleted the harrison/promptcollections branch on March 8, 2023 at 16:31