
The performance seems low... #80

Open
didip opened this issue Dec 4, 2022 · 2 comments

Comments

@didip

didip commented Dec 4, 2022

I modified the million-record benchmark to a billion records, and the insert took forever.

Is it possible to expand the commit.Buffer to speed up the transaction?
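One common mitigation for slow bulk loads, independent of any buffer tuning, is to split the load into fixed-size chunks and commit each chunk in its own transaction, so no single commit buffer has to absorb the whole dataset. The sketch below illustrates just that chunking pattern; `commit` is a hypothetical stand-in for whatever per-transaction commit call the store exposes, not the actual kelindar/column API.

```go
package main

import "fmt"

// insertBatched splits n records into chunks of at most `batch` records
// and invokes commit once per chunk, returning the number of commits.
// `commit` is a placeholder for the store's transaction-commit call.
func insertBatched(n, batch int, commit func(records int)) int {
	commits := 0
	for start := 0; start < n; start += batch {
		end := start + batch
		if end > n {
			end = n // final chunk may be smaller than `batch`
		}
		commit(end - start)
		commits++
	}
	return commits
}

func main() {
	total := 0
	commits := insertBatched(1_000_000, 4096, func(records int) { total += records })
	fmt.Println(commits, total) // prints "245 1000000"
}
```

With a 4096-record chunk size, one million records take 245 commits; the trade-off is more commit overhead in exchange for bounded per-transaction buffer growth.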

@Dreeseaw
Collaborator

Dreeseaw commented Dec 7, 2022

Hello, I also noticed a sharp increase in insert time per record when bumping the amount from 10M to 1B (i.e., far more than 100x the time for 100x the records). While a mass-loading pattern isn't exactly what a store built for concurrent DML is optimized for, I agree the 1B benchmark should still insert in a reasonable time, especially with all the data known beforehand.

@kelindar
Owner

Indeed, insertion is a bit slow as it touches most columns. I have a few questions to clarify this.

  1. How did you perform the insertion (a single transaction or multiple transactions)?
  2. How long did inserting 1B records take?
  3. What insert time would you expect?
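Questions 1 and 2 can be answered with a small measurement harness: load the records in chunks and report wall-clock time per record, so the 10M and 1B runs are directly comparable and any super-linear slowdown shows up as a rising per-record cost. This is a generic sketch; `insert` is a hypothetical stand-in for the store's insert call, not the library's actual API.

```go
package main

import (
	"fmt"
	"time"
)

// loadAndTime inserts n synthetic records in chunks of `batch` and
// returns the number inserted plus the average wall-clock time per
// record. `insert` stands in for the store's per-record insert call.
func loadAndTime(n, batch int, insert func(id int)) (inserted int, perRecord time.Duration) {
	start := time.Now()
	for base := 0; base < n; base += batch {
		end := base + batch
		if end > n {
			end = n
		}
		for id := base; id < end; id++ {
			insert(id)
			inserted++
		}
	}
	return inserted, time.Since(start) / time.Duration(n)
}

func main() {
	sum := 0
	count, per := loadAndTime(100_000, 4096, func(id int) { sum += id })
	fmt.Printf("inserted %d records, %v/record\n", count, per)
}
```

Running this at 10M and again at 1B (with the real insert call substituted in) gives the per-record figures needed to tell whether the slowdown is roughly linear or something worse, such as a per-commit cost that grows with collection size.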
