Supporting large indexes - split the index into parts #128

@stjepangolemac

Description

Maybe it is possible to split the index into a configurable number of chunks. That, along with a good caching policy in the browser, could allow for much bigger indexes. I'll summarize the ideas below in order of decreasing complexity and impact.

  • Index chunking / sharding
  • Preserving as many unmodified chunks as possible when updating the index (helps with caching)
  • Abstracting away the number of chunks and allowing configuration to prioritize speed or bandwidth savings (or somewhere in between)
  • A "save data" mode where the search is triggered manually after the query is typed
