[Bug]: AzureOpenAI : Sub Question Query Engine is not working #8641
Comments
🤖 Hello @vipinap98, I'm Dosu, your friendly bot here to assist you while we wait for a human maintainer.

Based on the error message, the Azure OpenAI service you are calling does not appear to support the function calling API used by the `OpenAIQuestionGenerator` in LlamaIndex 0.8.57. This is likely why you are seeing the "InvalidRequestError: Unrecognized request argument supplied: functions" error.

To resolve this, you could use a question generator that does not rely on the function calling API, such as the `LLMQuestionGenerator`. You can specify it when creating the `SubQuestionQueryEngine`:

```python
question_gen = LLMQuestionGenerator.from_defaults(service_context=service_context)
sub_question_query_engine = SubQuestionQueryEngine.from_defaults(
    query_engine_tools=query_engine_tools,
    question_gen=question_gen,
    response_synthesizer=response_synthesizer,
    service_context=service_context,
    verbose=verbose,
    use_async=use_async,
)
```

This ensures that the question generator used by the `SubQuestionQueryEngine` is compatible with the Azure OpenAI service. You can find more details in the `SubQuestionQueryEngine` class in `llama_index/query_engine/sub_question_query_engine.py`.

As for LlamaIndex 0.8.58, I wasn't able to find specific changes that would resolve this issue. However, updating to the latest version is still worth trying, as it may include relevant bug fixes and improvements.
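To illustrate the difference, here is a hypothetical sketch (not actual llama_index internals) of the underlying chat-completion request shapes: the function-calling path sends a `functions` argument, which pre-function-calling Azure API versions reject, while a plain prompt-based generator does not.

```python
# Hypothetical request payloads for illustration only; these are not actual
# llama_index internals, just the shape of the underlying API calls.

# A function-calling generator includes a "functions" argument in the request.
# Older Azure OpenAI API versions reject this with "Unrecognized request
# argument supplied: functions".
function_calling_request = {
    "model": "gpt-35-turbo",
    "messages": [{"role": "user", "content": "Generate sub-questions."}],
    "functions": [{"name": "sub_questions", "parameters": {"type": "object"}}],
}

# A prompt-based generator (e.g. LLMQuestionGenerator) asks for the
# sub-questions in plain text and parses the response, so no "functions"
# argument is ever sent.
plain_prompt_request = {
    "model": "gpt-35-turbo",
    "messages": [{"role": "user", "content": "List sub-questions as JSON."}],
}
```

The workaround works because the second request shape is accepted by any chat-completions-capable API version.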
@vipinap98 you need to make sure you use the latest `OPENAI_API_VERSION`; I think the August preview added support for function calling.
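A minimal sketch of pinning the API version via the environment before constructing the client. The exact version string below is an assumption based on the comment above (function-calling support is commonly attributed to the mid-2023 previews); check Azure's documentation for the correct value for your deployment.

```python
import os

# Pin the Azure OpenAI API version before creating any clients.
# "2023-07-01-preview" is an assumed example value, not a confirmed
# recommendation; consult Azure's docs for the version that supports
# function calling in your region.
os.environ["OPENAI_API_VERSION"] = "2023-07-01-preview"
```

Setting this before any LlamaIndex or OpenAI client is instantiated ensures the newer API surface is used for every request.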
Bug Description
I was trying to use the Sub Question Query Engine and got this error:

InvalidRequestError: Unrecognized request argument supplied: functions

Is this an issue with Azure OpenAI itself? The deployment is the gpt-35-turbo-0301 model, not the 0613 model.
Version
0.8.57
Steps to Reproduce
Follow the Sub Question Query Engine guide with the Azure OpenAI service.
PS: it works with the OpenAI service.
Relevant Logs/Tracebacks
No response