
Error Callback Not Initiating on PG Constraint Error #21

Closed

BrandonMcNary opened this issue Jan 20, 2015 · 6 comments

Comments

@BrandonMcNary commented Jan 20, 2015

Hi Brian,

Thank you so much for this tool. It easily streams 40,000 rows + into our Postgres database. I have one issue though, I'm unable to get it to throw errors when there is a problem. Certain tables within our database have unique constraints. When a user attempts to upload a duplicated file, the database throws an error, but I can't get this to bubble back to the user. I've tinkered with the error callback provided, but for whatever reason it doesn't trigger when a database error is thrown.

Do you have any recommendations? Please let me know if I'm doing something wrong here.

Thanks again!
Brandon

I'm using this code:

pg.connect(connectionString, function(err, client, done) {
  if (err) {
    return console.error('could not connect to postgres', err);
  }
  var stream = client.query(copyFrom(query));
  var s = new Readable();
  s.push(csv);
  s.push(null);
  s.pipe(stream).on('finish', function () {
    res.send(200);
    done();
  }).on('error', function () {
    res.send(500);
    done();
  });
});

@BrandonMcNary changed the title from "Error Tracking" to "Error Callback Not Initiating on PG Constraint Error" on Jan 21, 2015
@brianc (Owner) commented Jan 21, 2015

Oh, sorry that's happening! Hmm, would it be possible to submit a fully executable sample that reproduces the problem? I have tests to check that errors are emitted properly, but streams emit errors in confusing ways. A working snippet should help me pinpoint it.


@BrandonMcNary (Author)

Hi Brian,

Thanks for the quick reply. I was trying to figure out how to achieve this, as a front-to-back sample would be quite a bit of work. Is there something specific you'd like to see? The code I pasted was a portion of the uploadData function. I've pasted the rest of the server code surrounding uploadData below, along with the Angular service that sends the HTTP request to the server once a user has submitted the file. Sorry it's not the sample you wanted; if you have any suggestions for getting you what you're looking for, please let me know.

Kind Regards,
Brandon

'use strict';
var knex = require(__dirname + '/../../knexconfig.js').knex;
var connectionString = require(__dirname + '/../../knexconfig.js').connectionString;
var pg = require('pg');
var copyFrom = require('pg-copy-streams').from;
var fs = require('fs');
var Readable = require('stream').Readable;

var uploadData = function(req, res) {
  ConvertToCSV(req.body).then(function(csv) {
    var table,
        trendType = (req.param('trendType') === 'Energy') ? 'electric' : 'power';

    if (req.param('uploadType') === 'dataInterval') {
      table = trendType + 'datainterval';
    }
    if (req.param('uploadType') === 'waterInterval') {
      table = 'waterdata';
    }
    var query = 'COPY ' + table + ' FROM STDIN';
    pg.connect(connectionString, function(err, client, done) {
      if (err) {
        return console.error('could not connect to postgres', err);
      }
      var stream = client.query(copyFrom(query));
      var s = new Readable();
      s.push(csv);
      s.push(null);

      s.pipe(stream).on('finish', function () {
        res.send(200);
        done();
      }).on('error', function () {
        res.send(500);
        done();
      });
    });
  });
};

var ConvertToCSV = function(objArray) {
  return new Promise(function(resolve) {
    var array = typeof objArray !== 'object' ? JSON.parse(objArray) : objArray;
    var str = '';
    for (var i = 0; i < array.length; i++) {
      var line = '';
      for (var index in array[i]) {
        if (line !== '') line += '\t';
        line += array[i][index];
      }
      str += line + '\r\n';
    }
    resolve(str);
  });
};

--- End Server ---
--- Start Angular HTTP Service ---
// HTTP query launched when a user submits a CSV file (fileObject is the CSV)

return helpers = {
  saveEnergyUse: function(fileObject, trendType, uploadType, farmId, companyId) {
    var deferred = $q.defer();
    $http({
      method: 'POST',
      url: '/api/uploadData/' + uploadType + '/' + trendType,
      data: fileObject
    }).success(function(record) {
      deferred.resolve(record);
    }).error(function(info, status) {
      console.error(info);
      deferred.reject('Error', info, 'Status', status);
    });
    return deferred.promise;
  }
};

--- END Angular Service

@BrandonMcNary (Author)

Hi Brian,

Just following up on this. Please let me know if you require anything further!

Thanks,
Brandon

@dimfeld commented Mar 21, 2015

I'm seeing the same problem. Here's a self-contained test that reproduces the issue: https://gist.github.com/dimfeld/6bc7d921040f9b6eeaed

I've seen the problem on both Mac and Ubuntu 14.04, with Postgres 9.4.0 and 9.4.1. The Postgres console prints the following for the failing case:

ERROR:  invalid input syntax for integer: "bbb"
CONTEXT:  COPY test_table, line 2, column b: "bbb"
STATEMENT:  COPY test_table FROM STDIN(FORMAT csv, DELIMITER ',')

@dimfeld commented Mar 22, 2015

Hmm, this may have been my mistake. In both my test case here and my original code, I was handling the finish event. If I handle end instead, then I see the error as expected. This doesn't really fit with how I thought Node streams were supposed to work, though, so I'm a bit confused.

Brandon, if you change your code to .on('end'), does it start working for you?
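
For reference, here's a minimal sketch of the pattern that ends up working for me (the connection string, table shape, and CSV contents are placeholders along the lines of my gist):

var pg = require('pg');
var copyFrom = require('pg-copy-streams').from;
var Readable = require('stream').Readable;

pg.connect('postgres://localhost/test', function(err, client, done) {
  if (err) { return console.error('could not connect', err); }

  // test_table is assumed to have an integer column b, so "bbb" forces a COPY error
  var stream = client.query(copyFrom('COPY test_table FROM STDIN (FORMAT csv)'));
  var s = new Readable();
  s.push('1,2\n2,bbb\n');
  s.push(null);

  s.pipe(stream).on('end', function () {
    // fires only after the database acknowledges the COPY
    console.log('copy complete');
    done();
  }).on('error', function (err) {
    // the bad-input error now surfaces here instead of being swallowed
    console.error('copy failed', err);
    done(err);
  });
});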

@jeromew
Copy link
Collaborator

jeromew commented Aug 23, 2016

I just fixed the documentation regarding finish vs end.
The current implementation inherits from a Transform stream. 'finish' is triggered too early by the Transform stream, because we need to wait for an async ACK from the database even after all the data has been written to the socket.
As of now, you need to wait for 'end', which is triggered after we receive this ACK.

In both your cases you are sending a csv blob to the db, which triggers 'finish' once it has all gone out on the db socket, but probably before the db has had time to realise there is an error.

Please try with version 1.2.0 + listening for 'end' instead of 'finish'.

You can read more on #15 regarding the finish vs end debate.
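
Concretely, for the snippet at the top of this issue, the fix is just the event name (a sketch; res, done, and stream come from the surrounding handler):

s.pipe(stream).on('end', function () {
  // triggered only after the database has acknowledged the COPY
  res.send(200);
  done();
}).on('error', function (err) {
  // constraint violations and bad input surface here
  res.send(500);
  done(err);
});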

I am closing this issue. Please reopen if this does not fix your problem.
