Error when overriding default prompt template of ConversationChain #1800
Yes, you're correct: ConversationChain currently only allows for a single input (the input key), plus history (coming from memory).
How should I handle this situation if I want to add context?
I'm looking into this issue as well. Is it possible to add additional context to the system prompt for ConversationChain?
I am facing a similar issue. There are use cases where we need to customize the prompt in ConversationChain. The current implementation does not support that, and I really hope LangChain can support custom prompts and make ConversationChain more flexible, ideally even allowing the prompt to change as the conversation goes on. As I write this, it sounds more like an Agent than a Chain. Is there an agent class capable of using a custom prompt, or a conversation agent that can use different prompts and pass along the chat history as the conversation goes?
I've opened a PR that I believe addresses this issue. My understanding is that currently the ConversationChain's memory does not inherit the conversation chain's prompt template, e.g.:

```python
system_msg_template = SystemMessagePromptTemplate.from_template(
    template="You are a translator helping me in translating from {input_language} to {output_language}. "
    + "Please translate the messages I type."
)
```

What do you think @hwchase17?
I'm looking into this issue as well.
Any update on this issue?
I need this too.
Same for me; it would make it a lot easier to get good results for languages other than English and to improve the first response.
Any fix? I'm only using the two variables and it still doesn't work.
??
What's the current workaround for this? Using a single prompt template that has an input, and putting the SystemMessage and the HumanMessage there?
This is very weird. I can get ConversationChain to work with multiple inputs in the JS library, but it fails in Python.
I also encounter the same problem. It seems that you can't use a customized variable to replace the "input" placeholder.
Do we have any updates on this one?
I too am waiting on this.
same here :)
Same here
As a workaround I'm just subclassing the memory and using that instead, like this...
...then initializing the chain with an instance of that memory class.
After that I just add the "context" (and any other extra variables) by passing them via the input when calling the chain:
I'm not sure whether this is a good way to do it, but it works fine in my case, so maybe it will also work for others.
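The code for the workaround above was lost in this copy of the thread. As a rough sketch of the pattern being described, a minimal stand-in base class is used below so the snippet runs without LangChain installed; in real code you would subclass something like `langchain.memory.ConversationBufferMemory` and override the same two members. The variable name "context" is just an illustrative example.

```python
class BufferMemoryStub:
    """Minimal stand-in for a LangChain buffer memory class."""
    memory_key = "history"
    buffer = ""

    @property
    def memory_variables(self):
        return [self.memory_key]

    def load_memory_variables(self, inputs):
        return {self.memory_key: self.buffer}


class ExtendedMemory(BufferMemoryStub):
    # Extra variables the prompt may use in addition to "history".
    extra_variables = ("context",)

    @property
    def memory_variables(self):
        # Declare the extra variables so the chain's validation accepts them.
        return [self.memory_key, *self.extra_variables]

    def load_memory_variables(self, inputs):
        variables = super().load_memory_variables(inputs)
        # Pass the extra inputs straight through to the prompt.
        variables.update({k: inputs.get(k, "") for k in self.extra_variables})
        return variables


memory = ExtendedMemory()
print(memory.load_memory_variables({"input": "hi", "context": "some docs"}))
# → {'history': '', 'context': 'some docs'}
```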
Hi! Is this what you all are looking for? I was able to use this to set a new system message.
Hi! Even though @Bananenfraese's solution might work, I'd let template classes do template things. As someone said above, ConversationChain only allows 'history' and 'input' as input variables for the PromptTemplate, nothing more, nothing less. If you're interested in the related code (and in understanding why @Bananenfraese's solution works), check this method: https://github.com/hwchase17/langchain/blob/fcb3a647997c6275e3d341abb032e5106ea39cac/langchain/chains/conversation/base.py#L44 However, you can just pass those values to the PromptTemplate and let it use them. Here is the example from @universe6666 modified:
Disclaimer: not tested, but enough for you to know a clean way to solve this.
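The modified example was lost in this copy of the thread, but the "let template classes do template things" idea can be illustrated in plain Python: pre-fill the extra variables so that the template the chain sees only expects {history} and {input}. In LangChain this corresponds to pre-filling a prompt's variables (e.g. via `PromptTemplate.partial`); a tiny stand-in class is used below so the snippet runs without the library, and all names here are illustrative.

```python
import string


class MiniTemplate:
    """Stand-in for a prompt template that supports pre-filled variables."""

    def __init__(self, template, partial_vars=None):
        self.template = template
        self.partial_vars = dict(partial_vars or {})

    @property
    def input_variables(self):
        # All {placeholders} in the template, minus the pre-filled ones.
        names = {field for _, field, _, _ in string.Formatter().parse(self.template) if field}
        return sorted(names - set(self.partial_vars))

    def partial(self, **kwargs):
        # Return a copy with some variables pre-filled.
        return MiniTemplate(self.template, {**self.partial_vars, **kwargs})

    def format(self, **kwargs):
        return self.template.format(**self.partial_vars, **kwargs)


prompt = MiniTemplate("Context: {context}\n{history}\nHuman: {input}")
chain_prompt = prompt.partial(context="some background")
print(chain_prompt.input_variables)  # → ['history', 'input']
```

Because the pre-filled template only exposes 'history' and 'input', it satisfies the check ConversationChain runs on its prompt.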
Glad I saw this; I thought I was missing something obvious.
@tezer @universe6666 For what it's worth, I use this pattern, but I inject contextual data into my main prompt (a Jinja2 template in a YAML file) and then make that the initial system message. Then you have the interaction frame and instructions, any contextual data, the chat history, then the next input. You can:

```python
default_chat_prompt = ChatPromptTemplate.from_messages([
    # SystemMessage, contains the prompt with context data injected above.
    smpt,
    # Placeholder for chat history.
    MessagesPlaceholder(variable_name="history"),
    # Incoming message from user.
    HumanMessagePromptTemplate.from_template("{input}"),
])
```
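The definition of `smpt` was not included in this copy of the comment. As a purely hypothetical sketch of how its content might be produced, the main prompt is rendered with the contextual data before being used as the system message text; `str.format` stands in for the Jinja2 rendering here, and the prompt wording is invented for illustration.

```python
# Illustrative main prompt; in the comment above this would come from a
# Jinja2 template stored in a YAML file.
main_prompt = (
    "You are a helpful assistant.\n"
    "Use the following context when answering:\n"
    "{context}"
)

# Render the contextual data into the prompt; the rendered text would then
# become the content of the initial system message (`smpt` above).
system_text = main_prompt.format(context="Docs retrieved for this session.")
print(system_text.splitlines()[-1])  # → Docs retrieved for this session.
```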
Hi, @universe6666, I'm helping the LangChain team manage their backlog and am marking this issue as stale. From what I understand, the issue you raised pertains to the mismatch between the expected input variables in the prompt template and the actual input received in ConversationChain. It has garnered significant attention from the community, with discussions on potential solutions and alternative approaches. Notably, ulucinar has opened a pull request addressing the issue. Could you please confirm whether this issue is still relevant to the latest version of the LangChain repository? If it is, please let the LangChain team know by commenting on the issue. Otherwise, feel free to close the issue yourself, or it will be automatically closed in 7 days. Thank you for your understanding and contribution to LangChain!
It worked for me. Does anybody know whether they fixed this issue (putting context into the ConversationChain prompt on every run)?
Hi, does anyone know how to override the prompt template of ConversationChain? I am creating a custom prompt template that takes in an additional input variable,
but got the following error:
Is my understanding correct that currently ConversationChain can only support a prompt template that takes in "history" and "input" as the input variables?
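Yes, that matches the check linked in one of the comments above: the prompt's input variables must be exactly the memory variables ("history") plus the input key ("input"). A pure-Python stand-in for that validation is sketched below; the error message is paraphrased, not the library's exact text.

```python
def validate_prompt_variables(prompt_variables, memory_keys=("history",), input_key="input"):
    """Raise if the prompt expects anything beyond memory keys + input key."""
    expected = set(memory_keys) | {input_key}
    if set(prompt_variables) != expected:
        raise ValueError(
            "Got unexpected prompt input variables: "
            f"{sorted(prompt_variables)} (expected {sorted(expected)})"
        )


validate_prompt_variables(["history", "input"])  # passes silently
try:
    # A prompt with an extra "context" variable trips the check.
    validate_prompt_variables(["history", "input", "context"])
except ValueError as exc:
    print(type(exc).__name__)  # → ValueError
```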