
pg performance v node-mysql-libmysqlclient #123

Closed
mgutz opened this issue May 1, 2012 · 2 comments

Comments

mgutz commented May 1, 2012

I benchmarked pg native vs node-mysql-libmysqlclient on my MBP with 16 GB RAM and an SSD. I created a simple test alternating between insert and select for 50,000 iterations. I'm using the default PostgreSQL and MySQL installs from Homebrew. The MySQL table uses the InnoDB engine. pg is more than 3x slower. Results are in milliseconds.

Am I doing something wrong?

ITERATIONS 50000
pg 17266
testLibMysql 5418
const ITERATIONS = 50000;

var async = require("async"); // not shown in the original listing

function testPg(cb) {
  var pg = require("pg").native;
  pg.connect(APP.config.postgresql.connectionString, function(err, client) {
    var start = new Date, elapsed;
    var count = 0; // was an implicit global
    async.whilst(
      function() { return count < ITERATIONS; },

      function(cb) {
        count++;
        if (count % 2 === 1) {
          client.query("insert into users(user_name) values('test')", function(err, result) {
            if (count === 1) console.log(result);
            cb(err);
          });
        } else {
          // First select happens at count === 2; the original checked
          // count === 0, which never fires after the increment above.
          client.query({text: "select * from users where id = $1", values: [1], name: 'foo'}, function(err, result) {
            if (count === 2) console.log(result);
            cb(err);
          });
        }
      },

      function(err) {
        elapsed = new Date - start;
        console.log("pg", elapsed);
        cb(err);
      }
    );
  });
}


function testLibMysql(cb) {
  var mysql = require('mysql-libmysqlclient');

  var client = mysql.createConnectionSync('localhost', 'userdev', 'dev', 'dbdev');
  var start = new Date, elapsed;

  var count = 0; // was an implicit global
  async.whilst(
    function() { return count < ITERATIONS; },

    function(cb) {
      count++;
      if (count % 2 === 1) {
        client.query("insert into users(user_name) values('test')", function(err, result) {
          if (count === 1) console.log(result);
          cb(err);
        });
      } else {
        client.query("select * from users where id = 1;", function(err, result) {
          var r = result.fetchAllSync(); // rows are pulled synchronously from the result handle
          if (count === 2) console.log(r);
          cb(err);
        });
      }
    },

    function(err) {
      elapsed = new Date - start;
      console.log("testLibMysql", elapsed);
      cb(err);
    }
  );
}
mgutz commented May 2, 2012

My create-table script had a typo, so the table was using the MyISAM storage engine. After fixing it, the difference is about 15%:

ITERATIONS 50000
pg 18236
testLibMysql 15409

@mgutz mgutz closed this as completed May 2, 2012
brianc commented May 2, 2012

Yeah, it's hard to tell with benchmarks where the problem is. It's exceptionally hard with native bindings, because the boundary between what happens in C and what happens in JavaScript has a (semi-)unpredictable impact on performance. Turning a C string (or whatever) into a JavaScript object is a non-trivial operation and can influence the V8 runtime's performance. The trickiest part is that tuning things for today's V8 implementation (which I can't do, since V8's internals are way over my head) could turn out to be all for naught when they change the VM in the future.

tl;dr: premature optimization is the root of all evil. It's good to see the difference between pg & mysql is low though.

brianc pushed a commit that referenced this issue Dec 27, 2019
* Prevent double release with callback

When using the callback instead of client.release, double releasing
a client was possible, causing clients to be re-added multiple times.

* Remove idleListener when client is in-use

When a client is in-use, the error handling should be done by the
consumer and not by the pool itself as this otherwise might cause
errors to be handled multiple times.

* Handle verify failures