This repository was archived by the owner on Aug 16, 2021. It is now read-only.

--db-pgbench and --workload-pgbench #109

Merged
8 commits merged into postgres-ai:master on Sep 19, 2018

Conversation

michelp
Contributor

@michelp michelp commented Sep 18, 2018

No description provided.

Collaborator

@NikolayS NikolayS left a comment


Awesome @michelp!

Checked, worked like a charm.

We could also capture pgbench's output (where it prints tps and so on) into a file, treating it as a new artifact and delivering it to the artifacts directory, but that could definitely be a separate PR.

This – having pgbench on board – can help with benchmarking AWS instances and checking various values of Postgres parameters; it's so simple now!
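A minimal sketch of what that follow-up PR might do: pipe pgbench's summary through `tee` into the artifacts directory. The directory and file names here are hypothetical, and `echo` stands in for the real pgbench run so the sketch is runnable anywhere:

```shell
# Hypothetical sketch: keep pgbench's summary (tps, latency, etc.) as an artifact.
# ARTIFACTS_DIR and the file name are invented for illustration; `echo` stands in
# for a real `pgbench ... | tee ...` pipeline.
ARTIFACTS_DIR="./nancy-artifacts"
mkdir -p "$ARTIFACTS_DIR"
echo "tps = 798.414235 (including connections establishing)" \
  | tee "$ARTIFACTS_DIR/pgbench.summary.txt"
```

With the real tool the pipeline would end in `pgbench ... | tee "$ARTIFACTS_DIR/pgbench.summary.txt"`, so the numbers still print to the console while being archived.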

@NikolayS NikolayS merged commit 9a8b387 into postgres-ai:master Sep 19, 2018
@NikolayS
Copy link
Collaborator

NikolayS commented Sep 19, 2018

Cool:

nancy run \
  --db-pgbench "-s 100" \
  --workload-pgbench "-t 60 -j 16 -c 16" \
  --run-on aws  \
  --aws-ec2-type "i3.large"  \
  --aws-keypair-name awskey \
  --aws-ssh-key-path file://$(echo ~)/.ssh/awskey.pem
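For reference, the two new options presumably pass their values straight through to pgbench, roughly as below. This is an assumption based on the flags shown, not on nancy's source, and `DB_NAME` is a placeholder; the commands are printed rather than executed, since running them needs a provisioned Postgres instance:

```shell
# Assumed expansion of --db-pgbench and --workload-pgbench into plain
# pgbench invocations (an illustration, not nancy's actual code path).
DB_NAME="${DB_NAME:-test}"
INIT_CMD="pgbench -i -s 100 $DB_NAME"          # --db-pgbench "-s 100": init at scale factor 100
RUN_CMD="pgbench -t 60 -j 16 -c 16 $DB_NAME"   # --workload-pgbench: 60 tx/client, 16 jobs, 16 clients
printf '%s\n%s\n' "$INIT_CMD" "$RUN_CMD"
```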

results:

transaction type: <builtin: TPC-B (sort of)>
scaling factor: 100
query mode: simple
number of clients: 16
number of threads: 16
number of transactions per client: 60
number of transactions actually processed: 960/960
latency average = 20.040 ms
tps = 798.414235 (including connections establishing)
tps = 817.377725 (excluding connections establishing)
...
------------------------------------------------------------------------------
Artifacts (collected in "./nancy-20180920-003245N-MSK/"):
  Postgres config:    postgresql.conf
  Postgres logs:      postgresql.prepare.log.gz (preparation),
                      postgresql.workload.log.gz (workload)
  pgBadger reports:   pgbadger.html (for humans),
                      pgbadger.json (for robots)
  Stat snapshots:     pg_stat_statements.csv,
                      pg_stat_***.csv
------------------------------------------------------------------------------
Total execution time: 0:05:28
------------------------------------------------------------------------------
Workload:
  Execution time:     0:00:03
  Total query time:   848.31000000001  ms
  Queries:            4801
  Query groups:       6
  Errors:             0
  Errors groups:      0
------------------------------------------------------------------------------
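The summary is internally consistent, which is a quick way to sanity-check a run: 16 clients times 60 transactions each gives the 960 processed transactions, and clients divided by average latency reproduces the reported tps.

```shell
# Sanity-check the report above with awk: transaction count and tps follow
# directly from the client count, per-client transactions, and latency.
awk 'BEGIN { print 16 * 60 }'                    # 960 transactions processed
awk 'BEGIN { printf "%.1f\n", 16 / 0.020040 }'   # ~798.4 tps (incl. connections)
```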

Answering questions like "will an instance with X CPU / Y GB RAM and Postgres 10 handle *** INSERTs per second?" is SO easy now.
