
[Question]: openai.Getting error.InvalidRequestError on SubQuestionQueryEngine #7133

Closed
1 task done
Yackadaisical opened this issue Aug 3, 2023 · 4 comments
Labels
question Further information is requested

Comments

@Yackadaisical

Question Validation

  • I have searched both the documentation and discord for an answer.

Question

My setup works for all other kinds of queries I've been working on, but today I was trying to add SubQuestionQueryEngine to my code, and I keep getting this error:

openai.error.InvalidRequestError: Unrecognized request argument supplied: functions

This is a basic outline of my code:

llm = AzureOpenAI(engine="try2", model="gpt-3.5-turbo-16k", temperature=0.2)

embedding_llm = LangchainEmbedding(
    OpenAIEmbeddings(
        model="text-embedding-ada-002",
        deployment="embedding_try1",
        openai_api_key=openai.api_key,
        openai_api_base=openai.api_base,
        openai_api_type=openai.api_type,
        openai_api_version=openai.api_version,
    ),
    embed_batch_size=1,
)

service_context = ServiceContext.from_defaults(
    llm=llm,
    embed_model=embedding_llm,
)

set_global_service_context(service_context)

storage_context = StorageContext.from_defaults(
    persist_dir="docs_index_storage"
)

vector_query_engine = load_index_from_storage(
    storage_context=storage_context,
    service_context=service_context,
).as_query_engine()

query_engine_tools = [
    QueryEngineTool(
        query_engine=vector_query_engine,
        metadata=ToolMetadata(
            name="tissue_research", description="Provide information on consumer tissue paper."
        ),
    ),
]

query_engine = SubQuestionQueryEngine.from_defaults(
    query_engine_tools=query_engine_tools,
    service_context=service_context
)

answer = query_engine.query("What is the latest trends on tissue paper")

print(answer.get_formatted_sources())

Whenever I run .query, it raises openai.error.InvalidRequestError: Unrecognized request argument supplied: functions

@Yackadaisical Yackadaisical added the question Further information is requested label Aug 3, 2023
@dosubot

dosubot bot commented Aug 3, 2023

Answer generated by a 🤖

Answer

Hello,

The error you're encountering, openai.error.InvalidRequestError: Unrecognized request argument supplied: functions, is likely due to an incorrect or unsupported argument being passed to the OpenAI API.

In your code, the SubQuestionQueryEngine class uses the OpenAIQuestionGenerator class to generate sub-questions. If the OpenAIQuestionGenerator is not properly initialized or if it's being passed an unsupported argument, it could potentially throw the error you're seeing.
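For context, the `functions` argument named in the error is the function-calling field of a chat-completion request, which some Azure OpenAI deployments and older api-version values do not recognize. A minimal sketch of the payload shape (field names follow the public OpenAI chat-completions API; the `SubQuestionList` schema here is purely illustrative, not LlamaIndex's actual one):

```python
# Sketch of the request body a function-calling question generator sends.
# Azure OpenAI endpoints that do not support function calling reject the
# "functions" field with "Unrecognized request argument supplied: functions".
payload = {
    "messages": [
        {"role": "user", "content": "What is the latest trends on tissue paper"}
    ],
    # The function-calling fields an unsupported endpoint will reject:
    "functions": [
        {
            "name": "SubQuestionList",  # illustrative name
            "description": "A list of sub-questions, one per tool.",
            "parameters": {
                "type": "object",
                "properties": {
                    "items": {"type": "array", "items": {"type": "string"}}
                },
            },
        }
    ],
    "function_call": {"name": "SubQuestionList"},
}

# A plain chat request (what non-function-calling engines send) omits them:
plain_payload = {k: v for k, v in payload.items() if k == "messages"}
assert "functions" not in plain_payload
```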

The SubQuestionQueryEngine class in the LlamaIndex codebase is initialized with the following arguments:

Required arguments:

  1. question_gen (BaseQuestionGenerator): A module for generating sub questions given a complex question and tools.
  2. response_synthesizer (BaseSynthesizer): A response synthesizer for generating the final response.
  3. query_engine_tools (Sequence[QueryEngineTool]): Tools to answer the sub questions.

Optional arguments:

  1. callback_manager (Optional[CallbackManager]): Defaults to None.
  2. verbose (bool): Whether to print intermediate questions and answers. Defaults to True.
  3. use_async (bool): Whether to execute the sub questions with asyncio. Defaults to False.

Please ensure that you're correctly initializing the SubQuestionQueryEngine and OpenAIQuestionGenerator with the appropriate arguments. If the error persists, please provide the specific code you're using to initialize and use these classes.
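If your endpoint cannot accept the `functions` argument at all, one possible workaround (assuming the installed LlamaIndex version exposes it) is to pass a prompt-based `LLMQuestionGenerator` as `question_gen` instead of the default function-calling `OpenAIQuestionGenerator`. The core idea, sketched in plain Python with a stubbed LLM call (`fake_llm_complete`, the prompt wording, and the JSON schema are illustrative, not LlamaIndex internals):

```python
import json

def generate_sub_questions(llm_complete, tools, question):
    """Prompt-based sub-question generation: ask for JSON in plain text
    instead of using the function-calling API, then parse the reply."""
    tool_lines = "\n".join(f"- {name}: {desc}" for name, desc in tools)
    prompt = (
        "Given these tools:\n"
        f"{tool_lines}\n"
        f"Break the question '{question}' into sub-questions.\n"
        'Reply with JSON: [{"sub_question": ..., "tool_name": ...}, ...]'
    )
    # The request carries only the prompt text, so no "functions"
    # argument ever reaches the API.
    return json.loads(llm_complete(prompt))

# Stub standing in for the real LLM call, so the sketch is self-contained:
def fake_llm_complete(prompt):
    return json.dumps(
        [{"sub_question": "What are current tissue paper trends?",
          "tool_name": "tissue_research"}]
    )

subq = generate_sub_questions(
    fake_llm_complete,
    [("tissue_research", "Provide information on consumer tissue paper.")],
    "What is the latest trends on tissue paper",
)
assert subq[0]["tool_name"] == "tissue_research"
```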

Also, there was a similar issue reported in the LlamaIndex repository: [Bug]: SubQuestionQueryEngine error. The suggested solution was to set the LLM to use OpenAI gpt-3.5-turbo. You might want to try this solution as well.

I hope this helps! If you have any other questions or need further clarification, feel free to ask.

Best,
Dosu

This response is meant to be useful, save you time, and share context. It is not meant to be a precise solution, but rather a starting point for your own research.

Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant.

@Yackadaisical

I was using gpt-3.5-turbo-16k and changed to gpt-3.5-turbo, but it still gives me the same error.
I'm also using the exact same initialization as the SubQuestionQueryEngine documentation; the only difference is that I'm using an Azure OpenAI LLM and embeddings.

@Yackadaisical

Moving to Bug: #7167 (comment)

@Shubham-Sahay

@Yackadaisical did your issue get resolved? I am facing the same issue.
