
connection pointer is NULL #182

Closed
yawnt opened this issue Sep 4, 2012 · 19 comments
Comments

@yawnt

yawnt commented Sep 4, 2012

I have a web app that uses node-postgres, and after a while I notice this error in the logs:

Error: connection pointer is NULL
    at Connection.connection.on.q (....:146:17)
    at Connection.EventEmitter.emit (events.js:88:17)

and database calls are (obviously) impossible after that, so I need to restart the app. Is there a way to avoid this?

Thanks

@freewil

freewil commented Sep 4, 2012

Is there any more information you can provide? OS? Postgres version? Some example code of how you are connecting to the database?

I have several apps that connect to Postgres manually (by creating a pg.Client object without connection pooling) for long periods of time and I have never had any problems with losing connection as this error seems to indicate.

@yawnt
Author

yawnt commented Sep 4, 2012

My app is running on Nodejitsu, so it's on SmartOS, with Heroku's PostgreSQL (v9.1.3).

code:

this.client = new pg.Client(connStr); // connStr is a string like postgres://user:pwd@host/db
this.client.connect();

Thanks a lot :)

@freewil

freewil commented Sep 4, 2012

Each time you create a pg.Client object you are creating a new connection to Postgres. If you do this for each request, you will eventually exhaust your allowable connections to Postgres.

You can instead use pg.connect(), which takes advantage of connection pooling and reuses connections; by default it creates a maximum of 10 connections.
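The check-out/check-in pattern freewil describes can be sketched in plain JavaScript. This is only an illustration of the pooling idea, not pg's actual pool implementation; the Pool class and its methods here are hypothetical:

```javascript
// Minimal connection-pool sketch (hypothetical, not pg's real pool).
// `create` makes a new "connection"; the pool hands out at most `max`.
class Pool {
  constructor(create, max) {
    this.create = create;
    this.max = max;
    this.idle = [];     // connections waiting to be reused
    this.total = 0;     // connections created and still live
    this.waiting = [];  // callbacks queued when the pool is exhausted
  }
  acquire(cb) {
    if (this.idle.length > 0) return cb(this.idle.pop());  // reuse
    if (this.total < this.max) {                           // grow
      this.total++;
      return cb(this.create());
    }
    this.waiting.push(cb);                                 // wait
  }
  release(conn) {
    const next = this.waiting.shift();
    if (next) next(conn);        // hand directly to a waiter
    else this.idle.push(conn);   // otherwise park it for reuse
  }
}

// Usage: with max = 2, a third acquire waits until a release.
let made = 0;
const pool = new Pool(() => ({ id: ++made }), 2);
const got = [];
pool.acquire(c => got.push(c));
pool.acquire(c => got.push(c));
pool.acquire(c => got.push(c)); // queued: pool exhausted
pool.release(got[0]);           // frees a connection for the waiter
console.log(made, got.length);  // only 2 connections ever created
```

The point is that no matter how many requests come in, the number of real connections never exceeds `max`, which is what keeps a busy app from exhausting Postgres's connection limit.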

@yawnt
Author

yawnt commented Sep 4, 2012

Ooh, I didn't know that.. thanks a lot :)

I'll try it, and if that works I'll close the issue.

@freewil

freewil commented Sep 4, 2012

@yawnt please close this, as this is not an actual bug/problem.

EDIT: sorry, didn't notice the last part there 😄

@yawnt yawnt closed this as completed Sep 4, 2012
@yawnt
Author

yawnt commented Sep 4, 2012

Yeah, the fact is that I don't create a connection for every request, just once.. looking into pooling anyway @freewil

@yawnt
Author

yawnt commented Sep 4, 2012

Uhm, it looks like I create the client only once :( , so that's not the problem.

@yawnt yawnt reopened this Sep 4, 2012
@freewil

freewil commented Sep 4, 2012

Are you trying to connect over SSL? pg doesn't actually support this yet without the native bindings, which you can use via require('pg').native

See #170

@brianc
Owner

brianc commented Sep 4, 2012

I've seen this issue before a long time ago though I cannot remember what it was. Could you post a larger code sample? Fortunately, or unfortunately depending on how you look at it, I'm 99% sure it's due to your use of the library.

Just to be sure, it does execute some queries before this issue happens, correct?

@yawnt
Author

yawnt commented Sep 4, 2012

Yes, it does execute queries.. this is the whole code:

https://github.com/yawnt/hater/blob/master/lib/builders/postgresql.js

@yawnt
Author

yawnt commented Sep 8, 2012

Have you had time to look at the code perhaps, @brianc? Thanks

@brianc
Owner

brianc commented Sep 10, 2012

I haven't had a chance to look at the code in detail within the module you're using. I would guess there is some custom connection pooling within the hater module that's not correctly checking in / checking out clients? At least that's where I would check first. Is there any particular place where the queries stop working or does it just randomly stop within some block of executing the same query n times?

@yawnt
Author

yawnt commented Sep 10, 2012

What I do is create a single client and then use it for every query I make (via a Query instance).
The connection seems to drop after it has been inactive for a while, not after a certain number of queries.

thank you :)

@brianc
Owner

brianc commented Sep 10, 2012

Perhaps Heroku's using some add-on to tell Postgres to disconnect idle connections after a time? I think it could help to expose the underlying stream's end event on the client so 'scheduled' disconnects could be handled, though it's hard to properly patch this without a little code snippet to reproduce it.

Is there a way you can reproduce this locally on your machine without using the hater module? I really like to have a dependency free test for things if possible or at least a way to accurately reproduce them.

@aslakhellesoy

I'm seeing this as well. My env:

pg: 0.8.6
node: v0.8.11
PostgreSQL: 9.2.1
OS X: 10.7.5

Here is a small script to reproduce it:

var pg = require('pg').native

var pgUrl = process.env.DATABASE_URL;
pg.defaults.poolSize = 5;
pg.defaults.poolIdleTimeout = 30000;

var delay = 1000;
function onConnect(err, client) {
  if(err) throw err;
  function run() {
    console.log('%s Sleeping %sms', new Date(), delay);
    setTimeout(function() {
      try {
        client.query('select * from docs;', [], function(err, docs) {
          if(err) {
            console.log('%s ASYNC ERROR', new Date(), err);
            // Trying to connect to fix things causes a segfault.
            pg.connect(pgUrl, onConnect);
          } else {
            delay = delay * 2;
            run();
          }
        });
      } catch(err) {
        console.log('%s SYNC ERROR', new Date(), err);
      }
    }, delay);
  }

  run();
}

pg.connect(pgUrl, onConnect);

This works fine in the beginning, but when the timeout reaches 32s I get an error. The attempt to reconnect causes a segfault.

Mon Oct 29 2012 01:16:23 GMT+0000 (GMT) Sleeping 1000ms
Mon Oct 29 2012 01:16:24 GMT+0000 (GMT) Sleeping 2000ms
Mon Oct 29 2012 01:16:26 GMT+0000 (GMT) Sleeping 4000ms
Mon Oct 29 2012 01:16:30 GMT+0000 (GMT) Sleeping 8000ms
Mon Oct 29 2012 01:16:38 GMT+0000 (GMT) Sleeping 16000ms
Mon Oct 29 2012 01:16:54 GMT+0000 (GMT) Sleeping 32000ms
Mon Oct 29 2012 01:17:26 GMT+0000 (GMT) ASYNC ERROR [Error: connection pointer is NULL
]
Mon Oct 29 2012 01:17:26 GMT+0000 (GMT) SYNC ERROR [Error: Postgres returned non-1 result from query dispatch.]
Segmentation fault: 11

I know that #61 suggests the use of pauseDrain and resumeDrain, but I haven't figured out how to modify this example so that errors aren't raised.

@aslakhellesoy

By turning on logging I noticed that an attempt is made to release the connection twice. This might be related to #192. Updated code:

var pg = require('pg').native

var pgUrl = process.env.DATABASE_URL;
pg.defaults.poolSize = 5;
pg.defaults.poolIdleTimeout = 1000;
pg.defaults.poolLog = console.log;

var delay = 250;
function onConnect(err, client) {
  if(err) throw err;

  function run() {
    console.log('%s Sleeping %sms', new Date(), delay);
    setTimeout(function() {
      try {
        client.query('select * from docs;', [], function(err, docs) {
          if(err) {
            console.log('%s ASYNC ERROR', new Date(), err);
            // Trying to connect to fix things causes a segfault.
            pg.connect(pgUrl, onConnect);
          } else {
            delay = delay * 2;
            run();
          }
        });
      } catch(err) {
        console.log('%s SYNC ERROR', new Date(), err);
      }
    }, delay);
  }

  run();
}

pg.connect(pgUrl, onConnect);

Output:

dispense() clients=1 available=0 info
dispense() - creating obj - count=1 verbose
Mon Oct 29 2012 02:05:12 GMT+0000 (GMT) Sleeping 250ms
Mon Oct 29 2012 02:05:12 GMT+0000 (GMT) Sleeping 500ms
timeout: 1351476313607 verbose
dispense() clients=0 available=1 info
Mon Oct 29 2012 02:05:13 GMT+0000 (GMT) Sleeping 1000ms
release called twice for the same resource: Error
    at Object.exports.Pool.me.release (/Users/ahellesoy/github/demo/node_modules/pg/node_modules/generic-pool/lib/generic-pool.js:285:61)
    at Connection.PG.connect.pools.(anonymous function).genericPool.Pool.create (/Users/ahellesoy/github/demo/node_modules/pg/lib/index.js:64:14)
    at Connection.EventEmitter.emit (events.js:90:17)
    at Connection.p._pulseQueryQueue (/Users/ahellesoy/github/demo/node_modules/pg/lib/native/index.js:78:54)
    at Connection.clientBuilder (/Users/ahellesoy/github/demo/node_modules/pg/lib/native/index.js:167:18)
    at Connection.EventEmitter.emit (events.js:90:17) error
removeIdle() destroying obj - now:1351476313608 timeout:1351476313607 verbose
removeIdle() all objects removed verbose
Mon Oct 29 2012 02:05:14 GMT+0000 (GMT) ASYNC ERROR [Error: connection pointer is NULL
]
dispense() clients=1 available=0 info
dispense() - creating obj - count=1 verbose
Mon Oct 29 2012 02:05:14 GMT+0000 (GMT) SYNC ERROR [Error: Postgres returned non-1 result from query dispatch.]
Segmentation fault: 11

@brianc
Owner

brianc commented Oct 29, 2012

If you want to use a single open client to run all your queries, create the client manually via new pg.Client. The pg.connect method uses a pool of clients and is intended for scenarios where you check out a client, run some queries, and return the client rapidly, i.e. high-load website scenarios.
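The two usage patterns contrasted above can be sketched with stubs (hypothetical names throughout; pg's real signatures differ):

```javascript
// Two ways to hold connections, sketched with stubs (not pg's API).
let connections = 0;
function makeClient() {
  connections++;
  return { query(sql, cb) { cb(null, { rows: [] }); } };
}

// Pattern 1: long-lived client, created once at startup.
const client = makeClient();
function handleWithSingleClient(cb) {
  client.query('select 1', cb); // same connection for every request
}

// Pattern 2: pool check-out per request: acquire, query, release fast.
const idle = [];
function withPooledClient(work) {
  const c = idle.pop() || makeClient();  // check out
  work(c, () => idle.push(c));           // done() checks it back in
}

// Simulate a few requests under each pattern.
handleWithSingleClient(() => {});
handleWithSingleClient(() => {});
withPooledClient((c, done) => c.query('select 1', done));
withPooledClient((c, done) => c.query('select 1', done));
console.log(connections); // 2: one long-lived + one pooled client
```

Both patterns reuse connections; the difference is ownership. The single client belongs to the app for its whole lifetime, while pooled clients must be returned promptly so other requests can use them.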

@scottsd

scottsd commented Dec 30, 2012

Also encountering this randomly on a simple connect & insert (both a local Postgres 9.2 installation and remote Heroku; using require('pg').native).

For subqueries, and also for this simple insert, I'm using pauseDrain() as noted in #61.

Here are the logs, with a successful connect & insert, then a failed one. On retry it will work.
17:32:29 web.1 | App started on localhost:5000
17:32:29 web.1 | dispense() clients=1 available=0 info
17:32:29 web.1 | createResource() - creating obj - count=1 min=0 max=10 verbose
17:32:35 web.1 | dispense() clients=1 available=0 info
17:32:35 web.1 | createResource() - creating obj - count=2 min=0 max=10 verbose
17:32:35 web.1 | timeout: 1356795157246 verbose
17:32:35 web.1 | dispense() clients=0 available=1 info
17:32:36 web.1 | availableObjects.length=1 verbose
17:32:37 web.1 | removeIdle() destroying obj - now:1356795157254 timeout:1356795157246 verbose
17:32:37 web.1 | removeIdle() all objects removed verbose
17:33:00 web.1 | dispense() clients=1 available=0 info
17:33:00 web.1 | createResource() - creating obj - count=2 min=0 max=10 verbose
17:33:01 web.1 | dispense() clients=1 available=0 info
17:33:01 web.1 | createResource() - creating obj - count=3 min=0 max=10 verbose
17:33:01 web.1 | timeout: 1356795183343 verbose
17:33:01 web.1 | dispense() clients=0 available=1 info
17:33:01 web.1 | dispense() clients=1 available=1 info
17:33:01 web.1 | dispense() - reusing obj verbose
17:33:01 web.1 | Error: connection pointer is NULL
17:33:01 web.1 | at Connection.p._pulseQueryQueue (node_modules/pg/lib/native/index.js:97:10)
17:33:01 web.1 | at Connection.p.query (node_modules/pg/lib/native/index.js:55:8)
17:33:01 web.1 | at dispense
node_modules/pg/node_modules/generic-pool/lib/generic-pool.js:235:9)
17:33:01 web.1 | at Object.exports.Pool.me.acquire (node_modules/pg/node_modules/generic-pool/lib/generic-pool.js:304:5)
17:33:01 web.1 | at PG.connect (node_modules/pg/lib/index.js:45:24)

17:33:01 web.1 | at callbacks (node_modules/express/lib/router/index.js:160:37)
17:33:02 web.1 | removeIdle() all objects removed verbose
17:33:08 web.1 | dispense() clients=1 available=0 info
17:33:08 web.1 | createResource() - creating obj - count=4 min=0 max=10 verbose
17:33:08 web.1 | timeout: 1356795190200 verbose
17:33:08 web.1 | dispense() clients=0 available=1 info
17:33:09 web.1 | availableObjects.length=1 verbose
17:33:10 web.1 | removeIdle() destroying obj - now:1356795190206 timeout:1356795190200 verbose
17:33:10 web.1 | removeIdle() all objects removed verbose

As a separate note/request, would it be possible to have a real-life code example with node.js and express?
Some of our requests need to connect to the db and then issue various queries. The code started to get pretty big, with
client.resumeDrain()
client.end()
appearing multiple times on various branches (including error handling). It would be great if there were a way to:

  • open a connection when needed
  • leave connection open to be reused in various other modules
  • close the connection automatically on express.js request end

@brianc
Owner

brianc commented Feb 22, 2013

I've changed the pool substantially so it no longer relies on the drain event at all. No pauseDrain, no resumeDrain. Documentation is a work in progress.

https://github.com/brianc/node-postgres/wiki/pg
#274

@brianc brianc closed this as completed Feb 22, 2013