LLMChain throwing error > value is not a valid #4281

Closed
TechnoRahmon opened this issue May 7, 2023 · 2 comments

Comments

@TechnoRahmon

System Info

I am getting an error when using LLMChain with an OpenAI model. Here is the code:

    from langchain.chains import LLMChain
    from langchain.llms import OpenAI
    from langchain.prompts import PromptTemplate

    # prepare the prompt
    prompt = PromptTemplate(
        input_variables=give_assistance_input_variables,
        template=give_assistance_prompt
    )
    prompt = prompt.format(command=query, context="this is test context")

    tokens = tiktoken_len(prompt)
    print(f"prompt  : {prompt}")
    print(f"prompt tokens : {tokens}")

    llm = OpenAI(
        model_name="text-davinci-003",
        temperature=0,
        #max_tokens=256,
        #top_p=1.0,
        #n=1,
        #best_of=1
    )

    # connect to the LLM
    llm_chain = LLMChain(prompt=prompt, llm=llm)

The issue is with this line:

    # connect to the LLM
    llm_chain = LLMChain(prompt=prompt, llm=llm)

Error:

    llm_chain = LLMChain(prompt=prompt, llm=llm)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    File "pydantic\main.py", line 341, in pydantic.main.BaseModel.__init__
    pydantic.error_wrappers.ValidationError: 1 validation error for LLMChain
    prompt
      value is not a valid dict (type=type_error.dict)

Any idea how to solve this?

Who can help?

@hwchase17
@agola

Information

  • The official example notebooks/scripts
  • My own modified scripts

Related Components

  • LLMs/Chat Models
  • Embedding Models
  • Prompts / Prompt Templates / Prompt Selectors
  • Output Parsers
  • Document Loaders
  • Vector Stores / Retrievers
  • Memory
  • Agents / Agent Executors
  • Tools / Toolkits
  • Chains
  • Callbacks/Tracing
  • Async

Reproduction

from langchain.chains import LLMChain
from langchain.llms import OpenAI
from langchain.prompts import PromptTemplate

# prepare the prompt
prompt = PromptTemplate(
    input_variables=give_assistance_input_variables,
    template=give_assistance_prompt
)
prompt = prompt.format(command=query, context="this is test context")

tokens = tiktoken_len(prompt)
print(f"prompt  : {prompt}")
print(f"prompt tokens : {tokens}")

llm = OpenAI(
    model_name="text-davinci-003",
    temperature=0,
    #max_tokens=256,
    #top_p=1.0,
    #n=1,
    #best_of=1
)

# connect to the LLM
llm_chain = LLMChain(prompt=prompt, llm=llm)
response = llm_chain.run()

Expected behavior

I should get a response from the OpenAI API.

@vanillechai

vanillechai commented May 7, 2023

    prompt = PromptTemplate(
        input_variables=give_assistance_input_variables,
        template=give_assistance_prompt
    )

Assuming that give_assistance_prompt is a template string and give_assistance_input_variables lists its variable names, you now have a PromptTemplate in the variable prompt.

    prompt = prompt.format(command=query, context="this is test context")

Assuming that query is a string, you have now replaced the PromptTemplate with a plain string in prompt.

    llm_chain = LLMChain(prompt=prompt, llm=llm)

LLMChain does not accept a string as its prompt, hence the error.

See https://python.langchain.com/en/latest/modules/chains/getting_started.html

@TechnoRahmon
Author


Thank you @vanillechai.
I passed the PromptTemplate itself instead of the formatted string, and it worked.
