performance with large number of collections #43

Closed
gut4 opened this issue Aug 4, 2018 · 1 comment
gut4 commented Aug 4, 2018

For example, I have tens of thousands of collections with thousands of docs each.
How will this impact:
– indexing performance?
– search performance?
– memory usage?

How can I benchmark this?

@kishorenc
Member

@gut4 As far as the underlying design goes, thousands of collections are not a problem, as the in-memory overhead associated with each collection is minimal. All documents (across all collections) are stored in a single RocksDB store on disk, so disk performance is the same as storing all documents in a single collection. We also only do O(1) per-collection lookups during indexing and searching, so there is no impact there either.

As for benchmarking, you can create a few thousand collections and index them with the kind of data you would be using in your actual production application. That would give you a sense of how the indexing performs. To test the search performance, you can run a benchmark using a tool like siege.
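As a rough starting point, here is a minimal indexing-benchmark sketch in Python. It assumes a local Typesense server on the default port 8108 and the standard `/collections` and `/collections/{name}/documents` REST endpoints; the collection schema, document counts, and API key below are placeholders you would swap for your real production data.

```python
# Sketch: create many collections and index simple documents into each,
# timing the whole run. Adjust NUM_COLLECTIONS / DOCS_PER_COLLECTION to
# approximate your production scale.
import random
import time

import requests

TYPESENSE_URL = "http://localhost:8108"       # assumed local server
HEADERS = {"X-TYPESENSE-API-KEY": "abcd"}     # placeholder API key

NUM_COLLECTIONS = 1000        # scale towards your real collection count
DOCS_PER_COLLECTION = 100     # scale towards your real per-collection count

start = time.time()
for i in range(NUM_COLLECTIONS):
    name = f"bench_{i}"

    # Create the collection with a simple placeholder schema.
    schema = {
        "name": name,
        "fields": [
            {"name": "title", "type": "string"},
            {"name": "points", "type": "int32"},
        ],
        "default_sorting_field": "points",
    }
    requests.post(f"{TYPESENSE_URL}/collections", headers=HEADERS, json=schema)

    # Index documents one at a time; for bigger runs, batch via the import endpoint.
    for j in range(DOCS_PER_COLLECTION):
        doc = {"title": f"document {j} in {name}", "points": random.randint(0, 1000)}
        requests.post(f"{TYPESENSE_URL}/collections/{name}/documents",
                      headers=HEADERS, json=doc)

elapsed = time.time() - start
print(f"Indexed {NUM_COLLECTIONS * DOCS_PER_COLLECTION} docs "
      f"across {NUM_COLLECTIONS} collections in {elapsed:.1f}s")
```

For the search side, you can generate a file of search URLs spread across the collections (e.g. `/collections/bench_123/documents/search?q=document&query_by=title`) and feed it to siege, so the benchmark exercises the per-collection lookups under concurrent load.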
