
How can I configure the final prompt of ConversationalRetrievalChain? #9818

Answered by dosubot bot
matthiasroder asked this question in Q&A

🤖

Hello,

Based on the information you provided and the context from the LangChain repository, there are a couple of ways you can change the final prompt of the ConversationalRetrievalChain without modifying the LangChain source code.

  1. You can change the main prompt of ConversationalRetrievalChain by passing it in via combine_docs_chain_kwargs when you instantiate the chain with from_llm. Here is the call, and a fuller sketch follows this list:

     ConversationalRetrievalChain.from_llm(llm, retriever, combine_docs_chain_kwargs={"prompt": your_prompt})
  2. Alternatively, you can use load_qa_chain with memory and a custom prompt. The template has the following shape, and a fuller sketch with the memory wiring also follows this list:

     from langchain.chains.question_answering import load_qa_chain

     template = """
     {Your_Prompt}
     """
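
For the first approach, here is a minimal end-to-end sketch. The model choice (ChatOpenAI), the prompt text, and the pre-existing vectorstore are illustrative assumptions; the part the answer describes is passing your prompt through combine_docs_chain_kwargs.

    from langchain.chains import ConversationalRetrievalChain
    from langchain.chat_models import ChatOpenAI
    from langchain.memory import ConversationBufferMemory
    from langchain.prompts import PromptTemplate

    # Custom final (combine-docs) prompt: the default "stuff" chain fills
    # {context} with the retrieved documents and {question} with the user question.
    qa_prompt = PromptTemplate(
        input_variables=["context", "question"],
        template=(
            "Answer the question using only the context below.\n\n"
            "Context:\n{context}\n\n"
            "Question: {question}\n"
            "Answer:"
        ),
    )

    memory = ConversationBufferMemory(memory_key="chat_history", return_messages=True)

    # `vectorstore` is assumed to be an existing vector store (FAISS, Chroma, ...).
    qa_chain = ConversationalRetrievalChain.from_llm(
        ChatOpenAI(temperature=0),
        retriever=vectorstore.as_retriever(),
        memory=memory,
        combine_docs_chain_kwargs={"prompt": qa_prompt},
    )

    result = qa_chain({"question": "What does the document say about X?"})
    print(result["answer"])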

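For the second approach, here is a sketch of load_qa_chain with memory and a custom prompt. It assumes you fetch the documents yourself (docs) before calling the chain; the prompt text and the ConversationBufferMemory settings are illustrative.

    from langchain.chains.question_answering import load_qa_chain
    from langchain.chat_models import ChatOpenAI
    from langchain.memory import ConversationBufferMemory
    from langchain.prompts import PromptTemplate

    # The "stuff" chain fills {context} with the supplied documents; {chat_history}
    # is populated from memory, and {question} is the current user question.
    template = """Use the context and the chat history to answer the question.

    Context:
    {context}

    Chat history:
    {chat_history}

    Question: {question}
    Answer:"""

    prompt = PromptTemplate(
        input_variables=["context", "chat_history", "question"],
        template=template,
    )

    # input_key="question" keeps the memory from trying to store input_documents.
    memory = ConversationBufferMemory(memory_key="chat_history", input_key="question")

    chain = load_qa_chain(ChatOpenAI(temperature=0), chain_type="stuff", memory=memory, prompt=prompt)

    # `docs` is assumed to come from your own retriever or vector store lookup.
    result = chain({"input_documents": docs, "question": "What does the document say about X?"})
    print(result["output_text"])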

Answer selected by matthiasroder