InvalidRequestError: Unrecognized request argument supplied: functions #12260

Closed
younes-io opened this issue Oct 25, 2023 · 2 comments
Labels
Ɑ: agent (Related to agents module) · 🤖:bug (Related to a bug, vulnerability, unexpected error with an existing feature) · Ɑ: models (Related to LLMs or chat model modules)

Comments

@younes-io

System Info

Python 3.11.4
LangChain 0.0.321

Platform info (WSL2):
DISTRIB_ID=Ubuntu
DISTRIB_RELEASE=20.04
DISTRIB_CODENAME=focal
DISTRIB_DESCRIPTION="Ubuntu 20.04.6 LTS"

Who can help?

No response

Information

  • The official example notebooks/scripts
  • My own modified scripts

Related Components

  • LLMs/Chat Models
  • Embedding Models
  • Prompts / Prompt Templates / Prompt Selectors
  • Output Parsers
  • Document Loaders
  • Vector Stores / Retrievers
  • Memory
  • Agents / Agent Executors
  • Tools / Toolkits
  • Chains
  • Callbacks/Tracing
  • Async

Reproduction

This is my code (based on https://python.langchain.com/docs/use_cases/question_answering/conversational_retrieval_agents and modified to use AzureOpenAI and OpenSearch):

import os
from langchain.embeddings.openai import OpenAIEmbeddings
from langchain.chat_models.azure_openai import AzureChatOpenAI
from langchain.vectorstores.opensearch_vector_search import OpenSearchVectorSearch
from langchain.memory import ConversationTokenBufferMemory

from langchain.prompts import (
    ChatPromptTemplate,
    MessagesPlaceholder,
    SystemMessagePromptTemplate,
    HumanMessagePromptTemplate,
)

from langchain.chains import ConversationalRetrievalChain


# Load environment variables
host = os.environ['HOST']
auth = os.environ['AUTH_PASS']
index_uk = os.environ['INDEX_NAME_UK']
opensearch_url = os.environ['OPENSEARCH_URL']
embedding_model = os.environ['EMBEDDING_MODEL']
model_name = os.environ['MODEL_NAME']

openai_api_base = os.environ['OPENAI_API_BASE']
openai_api_key = os.environ['OPENAI_API_KEY']
openai_api_type = os.environ['OPENAI_API_TYPE']
openai_api_version = os.environ['OPENAI_API_VERSION']

# Define Azure OpenAI component
llm = AzureChatOpenAI(
    openai_api_key=openai_api_key,
    openai_api_base=openai_api_base,
    openai_api_type=openai_api_type,
    openai_api_version=openai_api_version,
    deployment_name=model_name,
    temperature=0,
)

# Define Memory component for chat history
memory = ConversationTokenBufferMemory(
    llm=llm, memory_key="chat_history", return_messages=True, max_token_limit=1000
)

# Build a Retriever
embeddings = OpenAIEmbeddings(deployment=embedding_model, chunk_size=1)
docsearch = OpenSearchVectorSearch(
    index_name=index_uk,
    embedding_function=embeddings,
    opensearch_url=opensearch_url,
    http_auth=("admin", auth),
)
doc_retriever = docsearch.as_retriever()

# Build a retrieval tool
from langchain.agents.agent_toolkits import create_retriever_tool
tool = create_retriever_tool(
    doc_retriever, 
    "search_hr_documents",
    "Searches and returns documents regarding HR questions."
)
tools = [tool]

# Build an Agent Constructor
from langchain.agents.agent_toolkits import create_conversational_retrieval_agent
agent_executor = create_conversational_retrieval_agent(llm, tools, verbose=True)

result = agent_executor({"input": "hi, im bob"})

When I execute it, it raises the error below:

InvalidRequestError                       Traceback (most recent call last)
c:\k8s-developer\git\lambda-hr-docQA\conv_retrieval_tool.ipynb Cell 3 line 6
     58 from langchain.agents.agent_toolkits import create_conversational_retrieval_agent
     59 agent_executor = create_conversational_retrieval_agent(llm, tools, verbose=True)
---> 61 result = agent_executor({"input": "hi, im bob"})

File c:\Users\fezzef\AppData\Local\Programs\Python\Python311\Lib\site-packages\langchain\chains\base.py:310, in Chain.__call__(self, inputs, return_only_outputs, callbacks, tags, metadata, run_name, include_run_info)
    308 except BaseException as e:
    309     run_manager.on_chain_error(e)
--> 310     raise e
    311 run_manager.on_chain_end(outputs)
    312 final_outputs: Dict[str, Any] = self.prep_outputs(
    313     inputs, outputs, return_only_outputs
    314 )
File c:\Users\fezzef\AppData\Local\Programs\Python\Python311\Lib\site-packages\langchain\agents\openai_functions_agent\base.py:104, in OpenAIFunctionsAgent.plan(self, intermediate_steps, callbacks, with_functions, **kwargs)
    102 messages = prompt.to_messages()
    103 if with_functions:
--> 104     predicted_message = self.llm.predict_messages(
    105         messages,
    106         functions=self.functions,
    107         callbacks=callbacks,
    108     )
    109 else:
    110     predicted_message = self.llm.predict_messages(
    111         messages,
    112         callbacks=callbacks,
    113     )
InvalidRequestError: Unrecognized request argument supplied: functions

I suspect the OpenAIFunctionsAgent is not compatible with Azure OpenAI, but I'm not sure. Please check and let me know.
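
A quick way to check whether the Azure deployment itself accepts the functions argument, independent of LangChain (a minimal sketch, assuming the pre-1.0 openai SDK that LangChain 0.0.321 uses; the get_weather function schema is made up purely for this test):

import os
import openai

openai.api_type = "azure"
openai.api_base = os.environ["OPENAI_API_BASE"]
openai.api_key = os.environ["OPENAI_API_KEY"]
openai.api_version = os.environ["OPENAI_API_VERSION"]

# Call the deployment directly with a dummy function definition.
# If this also raises "Unrecognized request argument supplied: functions",
# the problem is on the Azure side (API/model version), not in LangChain.
response = openai.ChatCompletion.create(
    engine=os.environ["MODEL_NAME"],  # Azure deployment name
    messages=[{"role": "user", "content": "What's the weather in Paris?"}],
    functions=[
        {
            "name": "get_weather",  # hypothetical function, only for this check
            "description": "Get the current weather for a city",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        }
    ],
)
print(response["choices"][0]["message"])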

Expected behavior

The agent should return a response, but LangChain throws the error above instead.

@dosubot dosubot bot added the Ɑ: agent, Ɑ: models, and 🤖:bug labels on Oct 25, 2023
@dosubot

dosubot bot commented Oct 25, 2023

🤖

Hello,

Thank you for reaching out and providing a detailed description of your issue.

Based on the information you've provided and the context I've found, it seems that the OpenAIFunctionsAgent in LangChain is not directly compatible with Azure OpenAI. The OpenAIFunctionsAgent is designed to work with OpenAI's function-calling API, specifically with ChatOpenAI models. The error you're encountering suggests that the Azure OpenAI endpoint you're calling does not recognize or support the functions argument, most likely because your Azure OpenAI deployment does not expose the same function-calling API as OpenAI's.

In the LangChain code, there is a class AzureOpenAI which is designed to work with Azure's version of the OpenAI API. However, it does not appear to support the functions argument either, as there is no mention of it in the class definition or its methods.

Therefore, to resolve this issue, you would need to modify the LangChain code or use an adapter to make it compatible with AzureOpenAI.

Additionally, there is a similar solved issue in the LangChain repository that suggests updating the openai.api_version to the preview version 2023-07-01-preview when using Azure OpenAI. You can find more details about this in the following issue: Functions might not be supported through Azure OpenAI.
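
For reference, a minimal sketch of that change using the constructor from your reproduction above (assuming 2023-07-01-preview is available in your Azure region and your deployed model version supports function calling, e.g. gpt-35-turbo version 0613 or later):

llm = AzureChatOpenAI(
    openai_api_key=openai_api_key,
    openai_api_base=openai_api_base,
    openai_api_type=openai_api_type,
    openai_api_version="2023-07-01-preview",  # API version with function-calling support
    deployment_name=model_name,
    temperature=0,
)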

I hope this information helps. If you have any other questions or need further clarification, please let me know.

Best,
Dosu

@younes-io
Author

#6777 (comment)
