
Error when overriding default prompt template of ConversationChain #1800

Closed
universe6666 opened this issue Mar 19, 2023 · 27 comments


universe6666 commented Mar 19, 2023

Hi, does anyone know how to override the prompt template of ConversationChain? I am creating a custom prompt template that takes in an additional input variable:

PROMPT_TEMPLATE = """ {my_info}
{history}
Human: {input}
AI:"""

PROMPT = PromptTemplate(
    input_variables=["history", "input", "my_info"], template=PROMPT_TEMPLATE
)

conversation_chain = ConversationChain(
    prompt=PROMPT,
    llm=OpenAI(temperature=0.7), 
    verbose=True, 
    memory=ConversationBufferMemory()
)

but got the following error:

Got unexpected prompt input variables. The prompt expects ['history', 'input', 'my_info'], but got ['history'] as inputs from memory, and input as the normal input key. (type=value_error)

Is my understanding correct that ConversationChain currently only supports prompt templates that take "history" and "input" as the input variables?

@hwchase17 (Contributor)

Yes, you're correct - ConversationChain currently only allows a single input (the input key), plus history (coming from memory).
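
For illustration, a minimal sketch (my assumptions: classic LangChain imports and the default ConversationBufferMemory) of the only prompt shape the chain accepts out of the box - exactly "history" plus "input":

from langchain.chains import ConversationChain
from langchain.llms import OpenAI
from langchain.prompts import PromptTemplate

# The default-supported shape: prompt variables == memory key + single input key.
TWO_VAR_TEMPLATE = """{history}
Human: {input}
AI:"""

chain = ConversationChain(
    llm=OpenAI(temperature=0.7),
    prompt=PromptTemplate(
        input_variables=["history", "input"], template=TWO_VAR_TEMPLATE
    ),
)
chain.predict(input="Hello!")  # passes validation; anything beyond these two fails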


tezer commented Mar 20, 2023

How should I handle this situation if I want to add context?

system_template="""Use the following pieces of context to answer the users question. 
If you don't know the answer, just say that you don't know, don't try to make up an answer.
----------------
{context}"""
messages = [
    SystemMessagePromptTemplate.from_template(system_template),
    MessagesPlaceholder(variable_name="history"),
    HumanMessagePromptTemplate.from_template("{input}")
]


mrbende commented Mar 29, 2023

I'm looking into this issue as well. Is it possible to add additional context to the system prompt for ConversationChain?


SoulEvill commented Apr 5, 2023

I am facing a similar issue. There are use cases where we need to customize the prompt in ConversationChain. The current implementation does not support that, and I really hope LangChain can support custom prompts to make ConversationChain more flexible - even better would be supporting different prompts as the conversation goes. As I write this, it sounds more like an Agent than a Chain: is there an agent class capable of using a custom prompt, or a conversation agent that can use different prompts while passing along chat history as the conversation goes?


ulucinar commented Apr 6, 2023

I've opened a PR that I believe addresses this issue. My understanding is that currently the conversation chain's memory does not inherit the chain's input_key, so we try to deduce it with get_prompt_input_key, assuming the prompt contains only memory, input, and stop variables. The PR proposes that the memory of the conversation chain inherit the chain's input_key.
As exemplified in the PR, with the proposed change we can use a system prompt template such as the following:

system_msg_template = SystemMessagePromptTemplate.from_template(
    template="You are a translator helping me in translating from {input_language} "
    "to {output_language}. Please translate the messages I type."
)

What do you think @hwchase17?
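
For reference, the memory classes already expose an input_key field today; a minimal sketch (untested, and it does not by itself relax ConversationChain's prompt validation) of setting it explicitly so the memory doesn't have to guess the input variable via get_prompt_input_key:

from langchain.memory import ConversationBufferMemory

# Setting input_key explicitly skips get_prompt_input_key's deduction,
# which otherwise assumes only memory, input, and stop variables exist.
memory = ConversationBufferMemory(memory_key="history", input_key="input")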


notedit commented May 1, 2023

I'm looking into this issue as well.

@firaskudsy

Any update on this issue?

@GianottiGustavo

I need this case too.

@Gogoboy123

Same for me; it would make it a lot easier to get good results in languages other than English and to improve the first response.


Bonobo791 commented May 27, 2023

Any fix? I'm only using the two variables and it still doesn't work.

@hussainwali74

??


edmondop commented Jun 4, 2023

What's the current workaround for this? Using a single prompt template that has one input, and putting the SystemMessage and the HumanMessage there?


ragzman commented Jun 4, 2023

This is very weird. I can get ConversationChain to work with multiple inputs in the JS library, but it fails in Python.
I moved my whole app to Python to make it faster... and now this. :-(

@usamimeri

I also encountered the same problem. It seems that you can't use a customized variable to replace the "input" placeholder.
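
An untested sketch of one way around the fixed "input" name: ConversationChain declares input_key = "input" as an ordinary field, so overriding it should let you rename the placeholder, provided the prompt's variables still equal the memory key plus that input key ("question" below is illustrative):

from langchain.chains import ConversationChain
from langchain.llms import OpenAI
from langchain.memory import ConversationBufferMemory
from langchain.prompts import PromptTemplate

# Untested: rename the user-input placeholder from "input" to "question".
# Validation should pass because {history, question} == memory keys + [input_key].
prompt = PromptTemplate(
    input_variables=["history", "question"],
    template="{history}\nHuman: {question}\nAI:",
)
chain = ConversationChain(
    llm=OpenAI(temperature=0.7),
    prompt=prompt,
    input_key="question",
    memory=ConversationBufferMemory(memory_key="history"),
)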

@edugargar

Do we have any updates on this one?

@Bonobo791

@hwchase17

@unmotivatedgene

I too am waiting on this.

@zxdream64230

same here :)

@Hallimede

Same here


vickwv commented Jun 19, 2023

Same here

@Bananenfraese

As a workaround I'm just subclassing the memory and using that instead, like this...

from typing import Any, Dict, List

from langchain.memory import ConversationBufferMemory


class ExtendedConversationBufferMemory(ConversationBufferMemory):
    extra_variables: List[str] = []

    @property
    def memory_variables(self) -> List[str]:
        """Will always return list of memory variables."""
        return [self.memory_key] + self.extra_variables

    def load_memory_variables(self, inputs: Dict[str, Any]) -> Dict[str, Any]:
        """Return buffer with history and extra variables."""
        d = super().load_memory_variables(inputs)
        d.update({k: inputs.get(k) for k in self.extra_variables})
        return d

...then initialize the chain with an instance of that memory class

llm_chain = ConversationChain(
    llm=llm,
    prompt=prompt,
    memory=ExtendedConversationBufferMemory(extra_variables=["context"]),
)

After that I just add the "context" (and any other extra variables) by passing them via the input when calling the chain:

result = llm_chain({"input": "some input", "context": "whatever context"})

I'm not sure whether this is a good way to do it, but it works fine in my case, so maybe it will also work for others.


njc-ai commented Jun 20, 2023

Hi!

Is this what you all are looking for?

#5462

I was able to use this to set a new system message.


aigloss commented Jun 27, 2023

Hi! Even though @Bananenfraese's solution might work, I'd let template classes do template things. As someone said above, ConversationChain only allows 'history' and 'input' as input variables for the PromptTemplate, nothing more, nothing less. If you're interested in the related code (and in understanding why @Bananenfraese's solution works), check this method: https://github.com/hwchase17/langchain/blob/fcb3a647997c6275e3d341abb032e5106ea39cac/langchain/chains/conversation/base.py#L44

However, you can just pass those values to the PromptTemplate and let it use them. Here is the example from @universe6666 modified:

PROMPT_TEMPLATE = """ {my_info}
{history}
Human: {input}
AI:"""


# define a custom PromptTemplate that supports your new variables
class CustomPromptTemplate(StringPromptTemplate):
   my_info: str

  def format(self, **kwargs) -> str:
    kwargs['my_info']=self.my_info
    return self.template.format(**kwargs)


# make sure you feed the PromptTemplate with the new variables
PROMPT = CustomPromptTemplate(
    input_variables=["history", "input"], template=PROMPT_TEMPLATE, my_info="whatever"
)


conversation_chain = ConversationChain(
    prompt=PROMPT,
    llm=OpenAI(temperature=0.7), 
    verbose=True, 
    memory=ConversationBufferMemory()
)

Disclaimer: not tested, but enough to show a clean way to solve this.
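
A quick usage sketch to go with it (equally untested): because the prompt still declares only "history" and "input", the chain's validation passes, and my_info is injected inside format():

# "history" comes from memory, "input" from the call; my_info is baked in.
print(conversation_chain.predict(input="Hi, what do you know about me?"))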

@ColinTitahi

Glad I saw this - I thought I was missing something obvious.
My horrid kludge is to prepend what I need to the beginning of the template,
e.g. PROMPT.template = f'The current date is {todayFormat}.' + PROMPT.template
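
Spelled out as a runnable sketch (assuming the default prompt is importable from langchain.chains.conversation.prompt, as in classic LangChain; todayFormat is just an illustrative date string):

from datetime import date

from langchain.chains.conversation.prompt import PROMPT

# Prepend static context directly onto the default template string.
todayFormat = date.today().strftime("%d %B %Y")
PROMPT.template = f"The current date is {todayFormat}. " + PROMPT.template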

wnmurphy (Contributor) commented Aug 5, 2023

How should I handle this situation if I want to add context?

system_template="""Use the following pieces of context to answer the users question. 
If you don't know the answer, just say that you don't know, don't try to make up an answer.
----------------
{context}"""
messages = [
    SystemMessagePromptTemplate.from_template(system_template),
    MessagesPlaceholder(variable_name="history"),
    HumanMessagePromptTemplate.from_template("{input}")
]

@tezer @universe6666 For what it's worth, I use this pattern, but I inject contextual data into my main prompt (a Jinja2 template in a YAML file) and then make that the initial system message. Then you have the interaction frame and instructions, any contextual data, the chat history, and then the next input.

You can:

  1. Have your main prompt take any number of contextual data variables.
  2. Create a SystemMessagePromptTemplate from a Jinja2 template like: smpt = SystemMessagePromptTemplate.from_template(my_prompt_template, template_format="jinja2")
  3. Format the prompt with your data like: smpt = smpt.format(**context_data)
  4. Use it in a ChatPromptTemplate like this:
default_chat_prompt = ChatPromptTemplate.from_messages([
    # SystemMessage, contains the prompt with context data injected above.
    smpt,
    # Placeholder for chat history.
    MessagesPlaceholder(variable_name="history"),
    # Incoming message from user.
    HumanMessagePromptTemplate.from_template("{input}"),
])
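
To close the loop (my addition, untested): the resulting chat prompt can then drive a ConversationChain, as long as the only unformatted variables left are "history" and "input", and the memory returns message objects rather than a string:

from langchain.chains import ConversationChain
from langchain.chat_models import ChatOpenAI
from langchain.memory import ConversationBufferMemory

# After smpt.format(**context_data), only "history" and "input" remain unfilled,
# so ConversationChain's prompt validation passes.
chain = ConversationChain(
    llm=ChatOpenAI(temperature=0.7),
    prompt=default_chat_prompt,
    memory=ConversationBufferMemory(return_messages=True),  # MessagesPlaceholder needs messages
)
chain.predict(input="Hello!")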


dosubot bot commented Dec 13, 2023

Hi, @universe6666,

I'm helping the LangChain team manage their backlog and am marking this issue as stale. From what I understand, the issue you raised pertains to the mismatch between expected input variables in the prompt template and the actual input received in ConversationChain. It has garnered significant attention from the community, with discussions on potential solutions and alternative approaches. Notably, ulucinar has opened a pull request addressing the issue.

Could you please confirm if this issue is still relevant to the latest version of the LangChain repository? If it is, please let the LangChain team know by commenting on the issue. Otherwise, feel free to close the issue yourself, or it will be automatically closed in 7 days.

Thank you for your understanding and contribution to LangChain!

dosubot added the stale label (issue has not had recent activity or appears to be solved) on Dec 13, 2023
dosubot closed this as not planned (won't fix, can't repro, duplicate, stale) on Dec 20, 2023
dosubot removed the stale label on Dec 20, 2023

fitlemon commented Feb 8, 2024

extra_variables=["context"]

It worked for me. Does anybody know whether they fixed this issue (injecting context into the ConversationChain prompt on every run)?
