
ConversationChain Validation Error #2024

Closed
hussamsayeed opened this issue Mar 27, 2023 · 6 comments
@hussamsayeed

I'm using some variable data in my prompt and passing that prompt to ConversationChain, where I'm getting a validation error.
What changes can I make so it works as expected?
[Screenshot (1225) attached]

@L4rryFisherman
Contributor

In your memory function, you can define:

  • memory_key (the identifier for the memory)
  • input_key (the identifier for the 'human')
  • output_key (the identifier for the AI)

It looks like your prompt's input_key: input does not match your memory's input_key: history.

You can adjust your memory to match, like so:

memory = ConversationTokenBufferMemory(
    ...
    memory_key="chat_history_lines",
    input_key="input",
    ...
)
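For context on why the keys must line up: ConversationChain's validator requires that the prompt's input variables exactly match the memory's variables plus the chain's own input key. Below is a minimal pure-Python sketch of that check — it mimics the idea behind LangChain's validation rather than reproducing its actual code, and the function name and message wording here are illustrative:

```python
# Sketch of the key check behind the validation error.
# `memory_variables` is what the memory exposes (its memory_key);
# `input_key` is the chain's human-input key (default "input").
def validate_prompt_input_variables(prompt_input_variables,
                                    memory_variables,
                                    input_key="input"):
    expected = set(memory_variables) | {input_key}
    if set(prompt_input_variables) != expected:
        raise ValueError(
            f"Got unexpected prompt input variables: "
            f"{sorted(prompt_input_variables)}; expected "
            f"{sorted(expected)} (memory variables plus the input key)."
        )

# Matching keys pass silently:
validate_prompt_input_variables(["history", "input"], ["history"])

# Any extra prompt variable (e.g. "npc_name") fails:
try:
    validate_prompt_input_variables(["history", "input", "npc_name"], ["history"])
except ValueError as e:
    print(e)
```

So the error fires whenever the prompt declares a variable that is neither the memory key nor the chain's input key, regardless of how the memory itself is configured.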

@hussamsayeed
Author

> In your memory function, you can define:
>
>   • memory_key (the identifier for the memory)
>   • input_key (the identifier for the 'human')
>   • output_key (the identifier for the AI)
>
> It looks like your prompt's input_key: input does not match your memory's input_key: history.
>
> You can either adjust your memory, like so:
>
> memory = ConversationTokenBufferMemory(
>     ...
>     memory_key="chat_history_lines",
>     input_key="input",
>     ...
> )

Yep, it is defined exactly the same way, but the error still persists.

@edom18

edom18 commented Apr 24, 2023

I have the same issue. I've dug into the class to understand it. I also use ChatPromptTemplate to create prompts with the ChatOpenAI class.

I've noticed that the ConversationChain validator collects all input keys from the prompts, but it only accepts its own input key and memory key.

In other words, how can I add the extra keys held by the prompts to the ConversationChain class?

If the class accepted input keys as an array, I could adjust the keys to what I want.
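One workaround, rather than the chain accepting an array of input keys, is to pre-fill the extra variables so that only the keys the validator expects (input and history) remain. This mirrors the idea behind LangChain's partial variables; the sketch below is plain Python with an illustrative helper name, not LangChain's API:

```python
# Sketch: pre-bind static variables so the remaining template
# only needs the keys the chain validator expects.
def make_partial(template: str, **bound):
    """Return a formatter that only needs the not-yet-bound keys."""
    def fmt(**runtime):
        # Runtime values are merged over the pre-bound ones.
        return template.format(**{**bound, **runtime})
    return fmt

template = "You are {npc_name}.\n{history}\nHuman: {input}"
prompt = make_partial(template, npc_name="Grog")

# Only "history" and "input" are left for the chain to supply:
print(prompt(history="Human: hi\nAI: hello", input="who are you?"))
```

With the extra keys bound up front, the prompt's remaining input variables match the memory key plus the input key, and the validation error should not fire.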

@bigrig2212

bigrig2212 commented May 13, 2023

Me too - all kinds of issues trying to get memory to work with a prompt template that has lots of variables. I'm generally finding it easier to skip the langchain helpers, do all of the prompt compilation myself, and just send the precompiled prompt into a very simple chain call.

Below is one of many variations I've tried. It gives "variables not found" errors until I add both input and history to chatPrompt.format, and then it finally throws:

>  TypeError: this.prompt.formatPromptValue is not a function
>      at ConversationChain._call (file:///Users/benjaminrigby/Documents/GitHub/eggreat/functions/node_modules/langchain/dist/chains/llm_chain.js:79:47) 
const chat = new ChatOpenAI({ 
      openAIApiKey: config.openai.apiKey, 
      temperature: 0.0,
      model_name: "gpt-3.5-turbo",
      verbose: true
  });

  const chatPrompt = ChatPromptTemplate.fromPromptMessages([
      SystemMessagePromptTemplate.fromTemplate(
          system_prompt_template
      ),
      new MessagesPlaceholder("history"),
      HumanMessagePromptTemplate.fromTemplate("{input}"),
    ]);

  let pastMessages = [];
  if (short_term_memory_array != null){
      for (let i = 0; i < short_term_memory_array.length; i++) {
          let message = short_term_memory_array[i];
          pastMessages.push(new HumanChatMessage(message.player_msg));
          let this_npc_msg = `1. ${message.npc_msg}.\n2. ${message.npc_action}`;
          pastMessages.push(new AIChatMessage(this_npc_msg));
      }
  }
  console.log(pastMessages)

  const memory = new BufferMemory({
      returnMessages: true,
      memoryKey: "history",
      chatHistory: new ChatMessageHistory(pastMessages)
  });

  const chain = new ConversationChain({
      memory: memory,
      prompt: await chatPrompt.format({
          npc_name: prompt_variables.npc_name,
          npc_persona: prompt_variables.npc_persona,
          long_term_memory: prompt_variables.long_term_memory,
          action_list: prompt_variables.action_list,
          player_name: prompt_variables.player_name,
          input: new_human_input
      }),
      llm: chat,
      verbose: true
  });

  const response = await chain.call();
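The TypeError above is likely because chatPrompt.format(...) returns a plain string, while ConversationChain expects a prompt template object that still has a formatPromptValue method, so it can format the prompt itself on every call after merging memory into history. The toy Python sketch below illustrates the distinction; the class and function names are illustrative stand-ins, not LangChain's API:

```python
class ToyPromptTemplate:
    """Stands in for a prompt template the chain can format per call."""
    def __init__(self, template: str):
        self.template = template

    def format_prompt(self, **kwargs):
        return self.template.format(**kwargs)

def toy_conversation_call(prompt, **inputs):
    # Like ConversationChain, the chain formats the prompt itself,
    # so it needs the template object, not a pre-formatted string.
    return prompt.format_prompt(**inputs)

template = ToyPromptTemplate("{history}\nHuman: {input}")
print(toy_conversation_call(template, history="(empty)", input="hi"))

# Passing a pre-formatted string fails, analogous to
# "this.prompt.formatPromptValue is not a function":
try:
    toy_conversation_call("already formatted prompt", input="hi")
except AttributeError as e:
    print("AttributeError:", e)
```

The upshot: pre-bind the static variables (npc_name, npc_persona, etc.) into the template, pass the template object itself as prompt, and let the chain fill input and history at call time.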

@bigrig2212

Instead of all of that above, I'll precompile long/short-term memory into a system prompt and just do this:

const NpcChat = async function (system_prompt, human_prompt) {
    const model = new ChatOpenAI({ 
        openAIApiKey: config.openai.apiKey, 
        temperature: 0.0,
        model_name: "gpt-3.5-turbo",
        verbose: true
    });

    const response = await model.call([
        new SystemChatMessage(system_prompt),
        new HumanChatMessage(human_prompt)
      ]);

    return (response)
}

I keep going back to langchain to try to make all the helpers work because I feel there will be a long-term benefit to using the framework... but I keep running into these various problems.

@dosubot

dosubot bot commented Sep 19, 2023

Hi, @hussamsayeed! I'm Dosu, and I'm helping the LangChain team manage their backlog. I wanted to let you know that we are marking this issue as stale.

From what I understand, the issue is about a validation error when using variable data in the prompt and sending it to ConversationChain. L4rryFisherman suggested adjusting the memory keys to resolve the issue, but you mentioned that the keys are already defined correctly. edom18 also reported facing the same issue and suggested adding input keys as an array to the ConversationChain class. Additionally, bigrig2212 shared their experience and suggested customizing the prompt compilation instead of relying on langchain helpers, along with an alternative approach using precompiled prompts.

Before we close this issue, we wanted to check if it is still relevant to the latest version of the LangChain repository. If it is, please let us know by commenting on the issue. Otherwise, feel free to close the issue yourself or it will be automatically closed in 7 days.

Thank you for your understanding and contributions to the LangChain repository!

@dosubot dosubot bot added the stale Issue has not had recent activity or appears to be solved. Stale issues will be automatically closed label Sep 19, 2023
@dosubot dosubot bot closed this as not planned Won't fix, can't repro, duplicate, stale Sep 26, 2023
@dosubot dosubot bot removed the stale Issue has not had recent activity or appears to be solved. Stale issues will be automatically closed label Sep 26, 2023