This repository has been archived by the owner on Mar 9, 2019. It is now read-only.
Bulk loading more than 1000 items at a time is very slow. This is because nodes do not split before commit, which causes large memmove() operations during insertion.
This should be (relatively) easy to fix.
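To illustrate the cost being described: keeping keys sorted inside one node means each insertion shifts the entries after the insert position (the memmove). A rough cost model, not Bolt's actual code, comparing one never-splitting node against nodes capped at a hypothetical split threshold:

```go
package main

import "fmt"

// shiftsUnsplit models n inserts into a single sorted node that never
// splits: each insert shifts roughly half of the current entries.
func shiftsUnsplit(n int) int {
	total := 0
	for size := 0; size < n; size++ {
		total += size / 2
	}
	return total
}

// shiftsSplit models the same inserts when a node splits in half once it
// reaches maxNode entries, bounding the size of every memmove.
func shiftsSplit(n, maxNode int) int {
	total := 0
	size := 0
	for i := 0; i < n; i++ {
		if size == maxNode {
			size = maxNode / 2 // split: half the entries move to a new node
		}
		total += size / 2
		size++
	}
	return total
}

func main() {
	n := 10000
	// The never-splitting node does orders of magnitude more shifting,
	// which is why a single huge pre-commit bucket is so slow.
	fmt.Println("unsplit shifts:", shiftsUnsplit(n))
	fmt.Println("split-at-64 shifts:", shiftsSplit(n, 64))
}
```

The model is quadratic without splitting and roughly linear with it, matching the reported slowdown once a single transaction crosses a few thousand items.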
This only occurs with the initial batch of updates since they all go into the same bucket. After the first commit they spread out pretty evenly. Not worth the complexity to try to implement this at this time.
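Since the slowdown only affects the first large transaction, a practical workaround is to break the initial bulk load into smaller commits. A minimal sketch of that batching, where `commit` is a hypothetical stand-in for a transactional write-and-commit (e.g. wrapping a `db.Update` call):

```go
package main

import "fmt"

// loadInBatches splits items into commits of at most batchSize entries.
// commit is a hypothetical callback standing in for one transaction;
// in real code it would open a transaction, Put each item, and commit.
func loadInBatches(items []string, batchSize int, commit func(batch []string)) {
	for start := 0; start < len(items); start += batchSize {
		end := start + batchSize
		if end > len(items) {
			end = len(items)
		}
		commit(items[start:end])
	}
}

func main() {
	items := make([]string, 2500)
	var sizes []int
	loadInBatches(items, 1000, func(b []string) { sizes = append(sizes, len(b)) })
	fmt.Println(sizes) // [1000 1000 500]
}
```

Keeping each commit at or below the reported ~1000-item threshold keeps any single in-memory node from growing large enough to trigger the expensive memmove behavior.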