
create_sql_agent: Prompt missing required variables: {'tools', 'tool_names'} #17939

Open · 4 tasks done
NikhilKosare opened this issue Feb 22, 2024 · 9 comments

Labels
Ɑ: agent (Related to agents module) · 🤖:bug (Related to a bug, vulnerability, unexpected error with an existing feature) · 🔌: openai (Primarily related to OpenAI integrations)

Comments


NikhilKosare commented Feb 22, 2024

Checked other resources

  • I added a very descriptive title to this issue.
  • I searched the LangChain documentation with the integrated search.
  • I used the GitHub search to find a similar question and didn't find it.
  • I am sure that this is a bug in LangChain rather than my code.

Example Code

# Imports assumed for langchain 0.1.x (examples, system_prefix, llm, and db are defined earlier in the reporter's code)
from langchain_community.agent_toolkits import create_sql_agent
from langchain_core.prompts import (
    ChatPromptTemplate,
    FewShotPromptTemplate,
    MessagesPlaceholder,
    PromptTemplate,
    SystemMessagePromptTemplate,
)

few_shot_prompt = FewShotPromptTemplate(
    examples=examples,
    example_prompt=PromptTemplate.from_template(
        "User input: {input}\nSQL query: {query}"
    ),
    input_variables=["input", "dialect", "top_k"],
    prefix=system_prefix,
    suffix="User input: {input}\nSQL query: ",
)

full_prompt = ChatPromptTemplate.from_messages(
    [
        SystemMessagePromptTemplate(prompt=few_shot_prompt),
        ("human", "{input}"),
        MessagesPlaceholder("agent_scratchpad"),
    ]
)

agent = create_sql_agent(
    llm=llm,
    db=db,
    prompt=full_prompt,
    verbose=True
)

Error Message and Stack Trace (if applicable)


ValueError Traceback (most recent call last)
Cell In[32], line 1
----> 1 agent = create_sql_agent(
2 llm=llm,
3 db=db,
4 prompt=full_prompt,
5 verbose=True
6 )

File ~\anaconda3\Lib\site-packages\langchain_community\agent_toolkits\sql\base.py:182, in create_sql_agent(llm, toolkit, agent_type, callback_manager, prefix, suffix, format_instructions, input_variables, top_k, max_iterations, max_execution_time, early_stopping_method, verbose, agent_executor_kwargs, extra_tools, db, prompt, **kwargs)
172 template = "\n\n".join(
173 [
174 react_prompt.PREFIX,
(...)
178 ]
179 )
180 prompt = PromptTemplate.from_template(template)
181 agent = RunnableAgent(
--> 182 runnable=create_react_agent(llm, tools, prompt),
183 input_keys_arg=["input"],
184 return_keys_arg=["output"],
185 )
187 elif agent_type == AgentType.OPENAI_FUNCTIONS:
188 if prompt is None:

File ~\anaconda3\Lib\site-packages\langchain\agents\react\agent.py:97, in create_react_agent(llm, tools, prompt)
93 missing_vars = {"tools", "tool_names", "agent_scratchpad"}.difference(
94 prompt.input_variables
95 )
96 if missing_vars:
---> 97 raise ValueError(f"Prompt missing required variables: {missing_vars}")
99 prompt = prompt.partial(
100 tools=render_text_description(list(tools)),
101 tool_names=", ".join([t.name for t in tools]),
102 )
103 llm_with_stop = llm.bind(stop=["\nObservation"])

ValueError: Prompt missing required variables: {'tools', 'tool_names'}

Description

create_sql_agent throws the ValueError above when the custom full_prompt from the example code is passed in, complaining that the prompt is missing the {tools} and {tool_names} variables.

System Info

langchain 0.1.8
langchain-community 0.0.21
langchain-core 0.1.25
langchain-experimental 0.0.52
langchain-openai 0.0.6

liugddx (Contributor) commented Feb 23, 2024

Use the openai-tools agent type and try again (with the default agent type, create_sql_agent builds a ReAct agent, whose prompt must contain {tools} and {tool_names}):

agent = create_sql_agent(
    llm=llm,
    db=db,
    prompt=full_prompt,
    agent_type="openai-tools",
    verbose=True,
)

NikhilKosare (Author)

@liugddx - Thank you for your inputs!

I tried that and am getting the error below:
NotFoundError: Error code: 404 - {'error': {'message': 'Unrecognized request argument supplied: tools', 'type': 'invalid_request_error', 'param': None, 'code': None}}

I am bound to the resources available on the Azure platform, so I am not sure whether I can use "openai-tools" as the agent_type.

This is how I am initializing the model:

llm = AzureChatOpenAI(
    azure_deployment='abc',
    openai_api_key=aoai_api_key,
    azure_endpoint=aoai_endpoint,
    openai_api_version=aoai_api_version,
)

RadhikaBansal97 (Contributor)

Hi @NikhilKosare,

You can use openai-tools with the AzureOpenAI endpoint.

Use the latest langchain and langchain_openai versions and initialize the LLM like below:

from langchain_openai import AzureChatOpenAI

os.environ["AZURE_OPENAI_API_KEY"] = " "
os.environ["AZURE_OPENAI_ENDPOINT"] = " "

llm = AzureChatOpenAI(
    openai_api_version="2023-05-15",
    azure_deployment="gpt-35-turbo",
)
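
For reference, a minimal sketch of passing this LLM into create_sql_agent with the tools agent type (the connection string, deployment name, and API version below are placeholders; this assumes a deployment and API version that support tool calling):

from langchain_community.agent_toolkits import create_sql_agent
from langchain_community.utilities import SQLDatabase
from langchain_openai import AzureChatOpenAI

# Placeholder database and deployment; swap in your own
db = SQLDatabase.from_uri("sqlite:///novels.db")
llm = AzureChatOpenAI(
    openai_api_version="2023-12-01-preview",  # assumed; use a version your deployment supports for tool calling
    azure_deployment="gpt-35-turbo",
)

agent = create_sql_agent(
    llm=llm,
    db=db,
    agent_type="openai-tools",
    verbose=True,
)
agent.invoke({"input": "List all the characters in Anna Karenina"})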

You can follow this article for more info: https://python.langchain.com/docs/use_cases/sql/agents#using-a-dynamic-few-shot-prompt

Let me know if this works for you.

sridharangopal

I am trying to modify the posted example to use Ollama + Mistral instead of OpenAI. I am able to get it working if I do not supply a prompt to create_sql_agent, but I get the posted error when trying to use the dynamic few-shot prompt as described in the link. Any help is much appreciated.



97k commented Apr 7, 2024

Getting the same error!

langonifelipe

Guys, the key to adapting it is to create the full prompt from scratch, following this format:

[screenshot of the create_sql_agent source: when no prompt is provided, it builds the default ReAct prompt by joining PREFIX, "{tools}", the format instructions, and SUFFIX]

This is a screenshot from the original source of the LangChain create_sql_agent method: it creates the prompt only if no prompt is provided, but if one is given, the prompt must be complete and in the proper form, different from the "openai_tools" method, which takes your prompt and inserts it into the ReAct prompt somehow. When using a different LLM, however, we have to build it from scratch.

You can find the prefix and suffix used in create_sql_agent here: https://github.com/langchain-ai/langchain/blob/master/libs/langchain/langchain/agents/mrkl/prompt.py

I suggest putting the SQL prompt "system_prefix" from the tutorial (https://python.langchain.com/docs/use_cases/sql/agents/) in the prefix, putting the "Here are some examples of user inputs and their corresponding SQL queries:" part at the end of format_instructions, and then creating the few-shot object like this:

example_prompt = ChatPromptTemplate.from_messages(
    [
        ("human", "{input}"),
        ("ai", "{output}"),
    ]
)

few_shot_prompt = FewShotChatMessagePromptTemplate(
    example_prompt=example_prompt,
    examples=examples,
)

And after this, call it in the template using .format():

template = "\n\n".join(
    [
        PREFIX,
        "{tools}",
        format_instructions,
        few_shot_prompt.format(),
        SUFFIX,
    ]
)

prompt = PromptTemplate.from_template(template)
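
If it helps, those constants live in the mrkl prompt module linked above, so (assuming that module path is unchanged) they can be imported directly and the few-shot note appended, e.g.:

from langchain.agents.mrkl.prompt import FORMAT_INSTRUCTIONS, PREFIX, SUFFIX

# Append the few-shot introduction to the standard ReAct format instructions
format_instructions = (
    FORMAT_INSTRUCTIONS
    + "\nHere are some examples of user inputs and their corresponding SQL queries:"
)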

avisionh

@langonifelipe - Thank you so much for this! I am using Ollama locally and have followed your advice, and got it to work. For the avoidance of any doubt, the prompt should look like the below:

from langchain_core.prompts import (
    ChatPromptTemplate,
    FewShotChatMessagePromptTemplate,
    PromptTemplate,
)

from langchain.agents.mrkl import prompt as react_prompt

examples = [
  {"input": "List all the characters in Anna Karenina",
   "query": "SELECT DISTINCT characterNames FROM novels WHERE novelName = 'Anna Karenina'"},
]

system_prefix = """
    You are an agent designed to interact with a SQL database.
    Given an input question, create a syntactically correct query to run, then look at the results of the query and
    return the answer. Unless the user specifies a specific number of examples they wish to obtain,
    you can order the results by a relevant column to return the most interesting examples in the database.
    Never query for all the columns from a specific table, only ask for the relevant columns given the question.
    You have access to tools for interacting with the database.
    Only use the given tools. Only use the information returned by the tools to construct your final answer.
    You MUST double check your query before executing it. If you get an error while executing a query, rewrite the query 
    and try again.

    DO NOT make any DML statements (INSERT, UPDATE, DELETE, DROP etc.) to the database.

    If the question does not seem related to the database, just return "I don't know" as the answer.
"""

basic_suffix = """
    Begin!

    Question: {input}
    Thought: I should look at the tables in the database to see what I can query. Then I should query the schema of the most relevant tables.
    {agent_scratchpad}

"""

example_prompt = ChatPromptTemplate.from_messages(
    messages=[
        ('human', "{input}"),
        ('ai', "{query}")
    ]
)

few_shot_prompt = FewShotChatMessagePromptTemplate(
    examples=examples,
    example_prompt=example_prompt,
    input_variables=["input",
                     "agent_scratchpad"],
)

format_instructions = f"{react_prompt.FORMAT_INSTRUCTIONS}\n " \
                      f"Here are some examples of user inputs and " \
                      f"their corresponding SQL queries:\n"

template = "\n\n".join(
    [
        system_prefix,
        "{tools}",
        format_instructions,
        few_shot_prompt.format(),
        basic_suffix
    ]
)

prompt = PromptTemplate.from_template(template=template)
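
A minimal sketch of feeding this prompt into create_sql_agent with a local Ollama model (the model name and connection string below are placeholders, and the default ReAct-style agent type is assumed):

from langchain_community.agent_toolkits import create_sql_agent
from langchain_community.llms import Ollama
from langchain_community.utilities import SQLDatabase

# Placeholder model and database; swap in your own
llm = Ollama(model="mistral")
db = SQLDatabase.from_uri("sqlite:///novels.db")

agent = create_sql_agent(
    llm=llm,
    db=db,
    prompt=prompt,  # the PromptTemplate built above, containing {tools}, {tool_names}, {input}, {agent_scratchpad}
    verbose=True,
)
agent.invoke({"input": "List all the characters in Anna Karenina"})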

Vinoth-Ramadoss

@avisionh,
I am also trying to create a SQL agent with Ollama using few-shot learning and am facing the same issue.
Would you be able to share the full code?

FredrikGordh

Thank you! @avisionh @langonifelipe I sat with this for a day haha, but finally your template got it to work :)
