Hello, I also noticed a very large jump in insert-time-per-record when bumping the amount from 10M to 1B (i.e., far more than 100x the time for 100x the records). While the mass-loading pattern isn't exactly what a store built for concurrent DML is optimized for, I agree the 1B benchmark should still insert in a reasonable time - especially with all data being known beforehand.
I modified the million benchmark to billion and it took forever to insert.
Is it possible to expand the commit.Buffer to speed up the transaction?
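To illustrate why a larger commit buffer (or batching more records per transaction) could help: if each commit carries a fixed overhead, per-record cost shrinks as the batch grows. Here is a toy cost model sketching that intuition - the function name and the per-record/per-commit costs are made-up illustrative numbers, not measurements from this store:

```python
import math

def bulk_insert_time_us(records, batch_size, per_record_us=1.0, per_commit_us=500.0):
    """Toy model: total insert time = per-record work + fixed overhead per commit.

    per_record_us and per_commit_us are hypothetical constants chosen only
    to show the shape of the curve, not benchmarked values.
    """
    commits = math.ceil(records / batch_size)
    return records * per_record_us + commits * per_commit_us

# One record per commit: overhead dominates.
small_batches = bulk_insert_time_us(1_000_000, 1)
# 10k records per commit: overhead is amortized away.
large_batches = bulk_insert_time_us(1_000_000, 10_000)
print(small_batches / large_batches)  # roughly 500x slower with tiny batches
```

Under this (simplified) model, growing the batch size mostly helps until the fixed commit overhead is amortized; beyond that, per-record work dominates and larger buffers yield diminishing returns. Whether expanding commit.Buffer behaves this way here depends on what that buffer actually bounds, so the maintainers would need to confirm.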