Issue providing LLMChain with multiple variables when memory is used #8710

Closed
dreysco opened this issue Aug 3, 2023 · 7 comments
Labels
🤖:bug - Related to a bug, vulnerability, unexpected error with an existing feature
Ɑ: memory - Related to memory module
🤖:question - A specific question about the codebase, product, project, or how to use a feature

Comments

@dreysco

dreysco commented Aug 3, 2023

Issue you'd like to raise.

I'm having an issue providing the LLMChain class with multiple variables when I give it a memory object. It works fine when no memory is attached. I followed the example given in this document: LLM Chain Multiple Inputs

Here is the code that I used; it is mostly based on the example from the documentation above, except that I've added memory.

# Multiple inputs example
from langchain.chains import LLMChain
from langchain.llms import OpenAI
from langchain.memory import ConversationKGMemory
from langchain.prompts import PromptTemplate

template = """Tell me a {adjective} joke about {subject}."""
prompt = PromptTemplate(template=template, input_variables=["adjective", "subject"])
llm = OpenAI(temperature=0)
memory = ConversationKGMemory(llm=llm)
# memory must be attached to the chain for the error below to occur
llm_chain = LLMChain(prompt=prompt, llm=OpenAI(temperature=0), memory=memory)

llm_chain.predict(adjective="sad", subject="ducks")

With the above code, I get the following error:

ValueError: One input key expected got ['adjective', 'subject']

Python version: 3.11
LangChain version: 0.0.250

Suggestion:

Is there support for using multiple input variables when memory is involved?

@dosubot dosubot bot added Ɑ: memory Related to memory module 🤖:bug Related to a bug, vulnerability, unexpected error with an existing feature 🤖:question A specific question about the codebase, product, project, or how to use a feature labels Aug 3, 2023
@keenborder786
Contributor

I am not sure why you are using ConversationKGMemory in this case, but to clarify: ConversationKGMemory needs a single input key. Since your prompt has two input keys, ConversationKGMemory raises a ValueError. You therefore need to specify which single input key ConversationKGMemory should use:

template = """Tell me a {adjective} joke about {subject}."""
prompt = PromptTemplate(
    template=template, input_variables=["adjective", "subject"]
)
llm = OpenAI(temperature=0)
memory = ConversationKGMemory(llm=llm, input_key="adjective")
llm_chain = LLMChain(prompt=prompt, llm=OpenAI(temperature=0), memory=memory)

print(llm_chain.predict(adjective="sad", subject="ducks"))
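
As a side note, once input_key is set you can inspect what the memory actually records (a minimal sketch; this load_memory_variables call is my addition, not part of the comment above):

# The memory extracts knowledge-graph facts from whichever input
# input_key selects ("adjective" here) and returns them under its
# memory_key, which defaults to "history".
print(memory.load_memory_variables({"adjective": "sad"}))
# -> {'history': ...}  (whatever KG facts have been stored so far)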

@dreysco
Author

dreysco commented Aug 3, 2023

Just to clarify: whenever any Memory is used, will it accept at most a single input key?

And when no Memory is used, will it accept multiple input keys?

@keenborder786
Contributor

No, that's not right. The number of input keys and Memory are independent, but the Memory needs a way to access its buffer through the input keys. That is why you need to specify an input key for certain types of memory, such as VectorBasedMemory: the input key tells the memory which specific input to compare against the documents in the vector DB.
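
For reference, the ValueError in the original post is raised by LangChain's input-key resolution when a memory cannot decide which prompt input to use. Here is a minimal sketch of roughly what langchain.memory.utils.get_prompt_input_key does in the 0.0.x line (paraphrased, not a verbatim copy of the source):

from typing import Any, Dict, List

def get_prompt_input_key(inputs: Dict[str, Any], memory_variables: List[str]) -> str:
    # Drop the keys the memory itself injects (e.g. "history") and the
    # reserved "stop" key; exactly one prompt input must remain, or the
    # memory cannot tell which input to record.
    prompt_input_keys = list(set(inputs) - set(memory_variables) - {"stop"})
    if len(prompt_input_keys) != 1:
        raise ValueError(f"One input key expected got {prompt_input_keys}")
    return prompt_input_keys[0]

Passing input_key explicitly, as in the snippet above, bypasses this resolution entirely.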

@keenborder786
Contributor

If you are satisfied, please close the issue.

@AndrewMacho

AndrewMacho commented Aug 15, 2023

I faced a similar issue while using ConversationBufferWindowMemory.

langchain.__version__
'0.0.264'

After specifying the history key in the memory, it worked:

template:

{history}

custom key: {custom_key}
Human: {human_input}
Assistant:

chain:

LLMChain(
    llm=ChatOpenAI(temperature=temperature, model=model),
    prompt=prompt,
    # verbose=True,
    memory=ConversationBufferWindowMemory(k=2, input_key='history'),
)
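
For completeness, a minimal end-to-end sketch of the setup above (the imports, the prompt construction, and the predict call are my additions; I also point input_key at the human input, which is the more conventional choice than the history key used in the snippet above):

from langchain.chains import LLMChain
from langchain.chat_models import ChatOpenAI
from langchain.memory import ConversationBufferWindowMemory
from langchain.prompts import PromptTemplate

template = """{history}

custom key: {custom_key}
Human: {human_input}
Assistant:"""
prompt = PromptTemplate(
    template=template,
    input_variables=["history", "custom_key", "human_input"],
)
chain = LLMChain(
    llm=ChatOpenAI(temperature=0),
    prompt=prompt,
    # input_key tells the window memory which input is the user turn;
    # the memory itself supplies {history} at prompt-formatting time.
    memory=ConversationBufferWindowMemory(k=2, input_key="human_input"),
)
print(chain.predict(custom_key="some value", human_input="Hi there"))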

@nikitacorp

nikitacorp commented Sep 12, 2023

Oh, ok, I got it... I was facing the same issue, but adding input_key='input' resolved my problem. So if anyone needs to create an LLMChain and then pass it into a "custom" agent, here is a slice of my code:

...
memory = ConversationBufferMemory(memory_key="chat_history", input_key='input')
readonlymemory = ReadOnlySharedMemory(memory=memory)

prompt = PromptTemplate(input_variables=["input", "chat_history", "custom_variable"], template=template)

filter_chain = LLMChain(
    llm=OpenAI(),
    prompt=prompt,
    verbose=True,
    memory=readonlymemory
)
llm_chain = LLMChain(llm=ChatOpenAI(temperature=0), prompt=prompt)
agent_test = ZeroShotAgent(llm_chain=filter_chain, tools=tools, verbose=True)
agent_chain_test = AgentExecutor.from_agent_and_tools(
    agent=agent_test, tools=tools, verbose=True, memory=memory
)
question = "What is ...?"
custom_variable = define_custom_variable(question)
res_agent = agent_chain_test.run(input=question, custom_variable=custom_variable)


dosubot bot commented Dec 13, 2023

Hi, @dreysco

I'm helping the LangChain team manage their backlog and am marking this issue as stale. From what I understand, the issue you raised involved using the LLMChain class with multiple variables when using a memory object, resulting in a ValueError indicating that only one input key is expected. keenborder786 clarified that certain types of memory, such as VectorBasedMemory, require specifying an input key to access their buffers. Other users also shared their experiences and solutions with different memory types.

Could you please confirm if this issue is still relevant to the latest version of the LangChain repository? If it is, please let the LangChain team know by commenting on the issue. Otherwise, feel free to close the issue yourself, or it will be automatically closed in 7 days.

@dosubot dosubot bot added the stale Issue has not had recent activity or appears to be solved. Stale issues will be automatically closed label Dec 13, 2023
@dosubot dosubot bot closed this as not planned Won't fix, can't repro, duplicate, stale Dec 20, 2023
@dosubot dosubot bot removed the stale Issue has not had recent activity or appears to be solved. Stale issues will be automatically closed label Dec 20, 2023