[Question]: Getting openai.error.InvalidRequestError on SubQuestionQueryEngine #7133
Comments
Answer generated by a 🤖

Hello,

The error you're encountering, `openai.error.InvalidRequestError: Unrecognized request argument supplied: functions`, means the endpoint you are calling does not accept the `functions` argument in the chat-completion request. `SubQuestionQueryEngine` uses OpenAI function calling by default to generate its sub-questions, so the LLM it is given must be an OpenAI model that supports function calling.

Please ensure that you're correctly initializing the `SubQuestionQueryEngine` with such an LLM. A similar issue was reported in the LlamaIndex repository ([Bug]: SubQuestionQueryEngine error), and the suggested solution there was to set the LLM to use OpenAI.

I hope this helps! If you have any other questions or need further clarification, feel free to ask.

Best,

Sources

This response is meant to be useful, save you time, and share context. It is not meant to be a precise solution, but rather a starting point for your own research. Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant.
Was using gpt-3.5-turbo-16k, changed to gpt-3.5-turbo, but it still gives me the same error.
Moving to Bug: #7167 (comment)
@Yackadaisical did your issue get resolved? I am facing the same issue.
Question Validation
Question
My setup works for all other kinds of queries I've been working on, but today I tried to add `SubQuestionQueryEngine` to my code and keep getting this error:
openai.error.InvalidRequestError: Unrecognized request argument supplied: functions
This is a basic outline of my code:

Whenever I run `.query`, it raises `openai.error.InvalidRequestError: Unrecognized request argument supplied: functions`.
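For context on what the error refers to: `SubQuestionQueryEngine` has the client add a `functions` field to the underlying chat-completion request body, and endpoints or API versions that predate function calling reject that extra key with exactly this message. A stdlib-only sketch of the request shape (the schema contents below are illustrative, not LlamaIndex's actual schema):

```python
# Illustrative only: the shape of a Chat Completions request body when
# function calling is used. An endpoint that does not know the "functions"
# key (e.g. an Azure OpenAI deployment on an older api-version) responds
# with "Unrecognized request argument supplied: functions".
import json

payload = {
    "model": "gpt-3.5-turbo",
    "messages": [
        {"role": "user", "content": "Break this question into sub-questions."}
    ],
    # The question generator adds this field; hypothetical schema for
    # illustration only.
    "functions": [
        {
            "name": "sub_question_list",
            "description": "A list of sub-questions to answer",
            "parameters": {"type": "object", "properties": {}},
        }
    ],
}

print(json.dumps(payload, indent=2))
```

So the fix is on the endpoint/model side (use one that accepts `functions`), not in the query code itself.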