
Multi-topic best practice #1852

Closed
DanSan23 opened this issue May 2, 2023 · 2 comments
DanSan23 commented May 2, 2023

I'm working on building an index to efficiently query documents. With a single document everything works fairly well, but performance degrades as the number of documents increases. What are the best practices for dealing with this scenario? The biggest problems occur with ambiguous queries, where the answer could reside in several different documents.


Disiok (Collaborator) commented May 2, 2023

I think this might be a relevant example to take some inspiration from: https://github.com/jerryjliu/llama_index/blob/main/examples/composable_indices/city_analysis/City_Analysis-Unified-Query.ipynb

TL;DR: you can define a sub-index for each document (or collection of documents), then define an (LLM-powered) router that automatically chooses between the different sub-indices.
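
For concreteness, here is a minimal sketch of that pattern using LlamaIndex's RouterQueryEngine, LLMSingleSelector, and QueryEngineTool. This is not the exact code from the linked notebook; the data directories and tool descriptions are illustrative, and import paths and class names may differ across LlamaIndex versions.

```python
# A minimal sketch of the "sub-index per document collection + LLM router"
# pattern described above, assuming a LlamaIndex release from around the time
# of this issue; import paths and class names may differ in newer versions,
# and the data directories / descriptions below are illustrative.
from llama_index import SimpleDirectoryReader, VectorStoreIndex
from llama_index.query_engine import RouterQueryEngine
from llama_index.selectors.llm_selectors import LLMSingleSelector
from llama_index.tools import QueryEngineTool

# Build one sub-index per document (or per collection of related documents).
boston_docs = SimpleDirectoryReader("data/boston").load_data()
seattle_docs = SimpleDirectoryReader("data/seattle").load_data()
boston_index = VectorStoreIndex.from_documents(boston_docs)
seattle_index = VectorStoreIndex.from_documents(seattle_docs)

# Wrap each sub-index's query engine in a tool whose description tells the
# LLM-powered selector when that sub-index is the right place to look.
tools = [
    QueryEngineTool.from_defaults(
        query_engine=boston_index.as_query_engine(),
        description="Useful for questions about the Boston documents.",
    ),
    QueryEngineTool.from_defaults(
        query_engine=seattle_index.as_query_engine(),
        description="Useful for questions about the Seattle documents.",
    ),
]

# The router asks an LLM to pick the sub-index most likely to contain the
# answer, instead of searching one monolithic index.
router = RouterQueryEngine(
    selector=LLMSingleSelector.from_defaults(),
    query_engine_tools=tools,
)

print(router.query("What sports teams are based in Boston?"))
```

For the ambiguous-query case where the answer may span several documents, recent LlamaIndex versions also provide a multi-selector variant (LLMMultiSelector) that can route a query to more than one sub-index and combine the results, if it is available in your version.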

Disiok added the discord label on May 2, 2023

dosubot bot commented Aug 20, 2023

Hi, @DanSan23! I'm helping the LlamaIndex team manage their backlog and wanted to let you know that we are marking this issue as stale.

From what I understand, you were seeking advice on improving the performance of your document query index as the number of documents increases. Disiok suggested taking inspiration from an example in the llama_index repository, where sub-indices are defined for each document or collection of documents, and a router automatically chooses between them. It looks like you reacted positively to this suggestion with a thumbs up.

Before we close this issue, we wanted to check if it is still relevant to the latest version of the LlamaIndex repository. If it is, please let us know by commenting on the issue. Otherwise, feel free to close the issue yourself or it will be automatically closed in 7 days.

Thank you for your contribution to the LlamaIndex project!

dosubot bot added the stale label on Aug 20, 2023
dosubot bot closed this as not planned (won't fix, can't repro, duplicate, stale) on Sep 10, 2023
dosubot bot removed the stale label on Sep 10, 2023