🤖 To generate related questions using the context from the RAG response in LlamaIndex, you can use the response text as the input query for a second retrieval pass. Here's a step-by-step guide on how to do this:
```python
from llama_index import QueryBundle
from llama_index.retrievers.you_retriever import YouRetriever

# Wrap the text of the RAG response as a new query
query_bundle = QueryBundle(query_str=str(response))

# Retrieve nodes related to the response text
retriever = YouRetriever(api_key="your_api_key")
nodes_with_scores = retriever.retrieve(query_bundle)
```

In this code, the text of the original response becomes the query string for a new retrieval, and `nodes_with_scores` holds the related context nodes together with their similarity scores.
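If the goal is to have the LLM itself write the related questions from the same context (which is what the question below asks), a minimal sketch using the `response.source_nodes` attribute of the query engine's response might look like the following; the prompt wording and the OpenAI model choice are illustrative assumptions, not part of the original answer:

```python
from llama_index.llms import OpenAI

# Gather the context that produced the original answer
context = "\n\n".join(n.node.get_content() for n in response.source_nodes)

llm = OpenAI(model="gpt-3.5-turbo")  # assumed model; use whatever LLM backs your index
prompt = (
    "Using only the context below, suggest three follow-up questions "
    "a reader might ask next.\n\n"
    f"Context:\n{context}\n\nQuestions:"
)
related = llm.complete(prompt)
print(related.text)
```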
Please note that you need to replace `"your_api_key"` with your actual You.com API key.
This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research. Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant. If you want to continue the conversation, start your reply with @dosu-bot.
How do I pass the RAG response to the LLM again to generate related questions using the same context?
```python
query_str = "what are all the protocol studies done by Bill Wright"
response = query_engine.query(query_str)
print(str(response))
```
Here I want to use the context from the response to generate related questions on the topic. Please help.