
Trying to pass custom prompt in load_qa_with_sources_chain results in error #2858

Closed
momegas opened this issue Apr 13, 2023 · 11 comments

Comments

@momegas

momegas commented Apr 13, 2023

Running the code below produces the following error: document_variable_name summaries was not found in llm_chain input_variables: ['name'] (type=value_error)

Any ideas?

Code:

def use_prompt(self, template: str, variables: List[str], verbose: bool = False):
    prompt_template = PromptTemplate(
        template=template,
        input_variables=variables,
    )

    self.chain = load_qa_with_sources_chain(
        llm=self.llm,
        prompt=prompt_template,
        verbose=verbose,
    )

use_prompt(template="Only answer the question 'What is my name?' by replying with only the name. My name is {name}", variables=["name"])
@skeretna
Copy link

skeretna commented Apr 17, 2023

That should work. It seems to be expecting a {summaries} variable.
Try this; it should work:

def use_prompt(template: str, variables: List[str], verbose: bool = False):
    prompt_template = PromptTemplate(
        template=template,
        input_variables=variables,
    )
    chain = load_qa_with_sources_chain(
        llm=llm,  # assumes an `llm` instance is already in scope
        prompt=prompt_template,
        verbose=verbose,
    )

use_prompt(template="""Only answer the question 'What is my name?' by replying with only the name. My name is {name}
    =========
    {summaries}
    =========
    Final Answer:""", variables=["summaries", "name"])
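For context, the error in the original report comes from a validation step inside the chain: the prompt must expose a slot named after the chain's document variable. The sketch below is a dependency-free approximation of that check (it is not LangChain's actual StuffDocumentsChain code, just an illustration of the behavior):

```python
# Sketch (assumption: this approximates the StuffDocumentsChain validator,
# which requires the prompt to have a slot for the combined documents).
def validate_prompt(input_variables, document_variable_name="summaries"):
    if document_variable_name not in input_variables:
        raise ValueError(
            f"document_variable_name {document_variable_name} was not found "
            f"in llm_chain input_variables: {input_variables}"
        )

validate_prompt(["summaries", "name"])  # passes: the prompt has a {summaries} slot
# validate_prompt(["name"]) would raise the error from the original report
```

This is why adding "summaries" to input_variables makes the chain load.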

@momegas
Author

momegas commented Apr 17, 2023

I may really be doing something wrong. I get the error ValueError: Missing some input keys: {'name'} with the code below. I pass exactly what you provided, right? Am I missing something? Could this be a bug?

from langchain import PromptTemplate
from langchain.chains.qa_with_sources import load_qa_with_sources_chain
from langchain.chat_models import ChatOpenAI
from dotenv import load_dotenv

load_dotenv()

def get_chain(template: str, variables, verbose: bool = False):
    llm = ChatOpenAI()
    
    prompt_template = PromptTemplate(
        template=template,
        input_variables=variables,
    )
    return load_qa_with_sources_chain(
        llm=llm,
        prompt=prompt_template,
        verbose=verbose,
    )
        
chain = get_chain(template="""Only answer the question 'What is my name?' by replying with only the name. My name is {name}
    =========
    {summaries}
    =========
    Final Answer:""", variables=["summaries", "name"])

question = "test question?"
answer = chain.run(input_documents="", question=question)
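The new error is consistent with how chains validate inputs: every variable declared in input_variables must be supplied at run time, and run() here only passes input_documents and question, so name is never provided. A dependency-free sketch of that check (an approximation, not LangChain's actual code):

```python
# Sketch (assumption: chains verify that every declared prompt variable
# is supplied when the chain runs; this mirrors the reported message).
def check_inputs(input_variables, provided):
    missing = set(input_variables) - set(provided)
    if missing:
        raise ValueError(f"Missing some input keys: {missing}")

check_inputs(["summaries", "question"], {"summaries": "", "question": "hi"})  # ok
# check_inputs(["summaries", "name", "question"], {"summaries": "", "question": "hi"})
# -> ValueError: Missing some input keys: {'name'}
```

So an extra variable like {name} either has to be passed to run() as a keyword argument, or removed from the template.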

@momegas momegas closed this as completed Apr 17, 2023
@momegas momegas reopened this Apr 17, 2023
@skeretna

skeretna commented Apr 19, 2023

I am a bit confused about how you're defining "name" in the prompt. Actually, I don't exactly understand the prompt, since you don't take the name as input anywhere. What do you want to do, exactly?

This works for me:

from langchain import PromptTemplate
from langchain.chains.qa_with_sources import load_qa_with_sources_chain
from langchain.chat_models import ChatOpenAI
from dotenv import load_dotenv

load_dotenv()

def get_chain(template: str, variables, verbose: bool = False):
    llm = ChatOpenAI(engine=deployment_name)  # assumes deployment_name is defined (Azure-style setup)
    
    prompt_template = PromptTemplate(
        template=template,
        input_variables=variables,
    )
    return load_qa_with_sources_chain(
        llm=llm,
        prompt=prompt_template,
        verbose=verbose,
    )
        
chain = get_chain(template="""Only answer the question 'What is my name?' by replying with only the name. My name is name
    =========
    {summaries}
    =========
    Final Answer:""", variables=["summaries"])

question = "test question?"
answer = chain.run(input_documents="", question=question)

print(answer)

@amirgamil

How would the code you shared work if you're not passing the summaries variable when you call run?

@wmbutler

wmbutler commented Jun 2, 2023

Another question about this. Why is {summaries} required at all? Shouldn't I be able to create a template that takes only the inputs I want it to take?

@vibha0411

I have the same question... why is {summaries} even required?
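A likely answer to both questions: load_qa_with_sources_chain builds a combine-documents ("stuff") chain, so its prompt must reserve a slot where the retrieved document texts are inserted, and by default that slot is named summaries. A rough sketch of the idea (not LangChain's actual implementation):

```python
# Sketch (assumption: the "stuff" step joins the retrieved document texts
# and formats them into the prompt's {summaries} slot).
def stuff_documents(docs, template):
    summaries = "\n\n".join(docs)
    return template.format(summaries=summaries)

prompt = "Answer using the sources below.\n=========\n{summaries}\n=========\nFinal Answer:"
filled = stuff_documents(["Doc one text.", "Doc two text."], prompt)
```

If the prompt had no {summaries} slot, the chain would have nowhere to put the retrieved documents, which is presumably why it rejects such prompts up front.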

@aiquick

aiquick commented Jul 19, 2023

Is there an example of making it work with RetrievalQA?

The code examples above do not work with RetrievalQA.

@stepkurniawan

I have the same issue...
Other Google results pointed out that I need to create a variable called "context", but I don't need it.

@Stephen-Strosko

Have there been any resolutions here? This issue still persists. I am getting the same error when I purposefully pass a custom prompt with no variables associated with the f-string, following the documentation directly from LangChain:
https://python.langchain.com/docs/modules/model_io/prompts/prompt_templates/

@RubensZimbres

RubensZimbres commented Dec 29, 2023

Maybe this helps:

from langchain.chains import ConversationalRetrievalChain
from langchain.prompts import ChatPromptTemplate, HumanMessagePromptTemplate, SystemMessagePromptTemplate

# assumes promptTemplate, llm, retriever, and memory are already defined
messages = [
    SystemMessagePromptTemplate.from_template(promptTemplate),
    HumanMessagePromptTemplate.from_template("{question}"),
]
qa_prompt = ChatPromptTemplate.from_messages(messages)

qa_chain = ConversationalRetrievalChain.from_llm(
    llm,
    retriever,
    memory=memory,
    get_chat_history=lambda h: h,
    combine_docs_chain_kwargs={"prompt": qa_prompt},
)

@bga41

bga41 commented Mar 4, 2024

I have the same issue... Other Google results pointed out that I need to create a variable called "context", but I don't need it.

Did you ever solve it? What did you do?

dosubot bot added the stale label Jun 3, 2024, and closed this as not planned Jun 10, 2024
10 participants