I'm building an application that allows a user to upload a .CSV file. Once the file is uploaded, I parse the data into JSON, validate the JSON objects, and then send the valid entries in a POST request, inserting them with Knex into a Postgres DB.
At first, I ran into server-side issues with the sheer size of some bulk uploads when trying to 'insert' one large JSON array of thousands of entries. For instance, one of my CSV files has 9,000 rows. The server-side error was about the size of the upload, so I fixed that on my Express.js server...
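(The exact snippet isn't shown here, but the usual fix is raising Express's JSON body-size limit. Assuming the built-in express.json middleware, it would look something like this; the '50mb' value is illustrative, not taken from the original post:)

```javascript
const express = require('express');
const app = express();

// Raise the default 100kb JSON body limit so large bulk uploads
// are accepted. '50mb' is an illustrative value.
app.use(express.json({ limit: '50mb' }));
app.use(express.urlencoded({ limit: '50mb', extended: true }));
```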
This is my method (usable for either a single upload or a bulk-upload JSON array):
async function create(req, res) {
  // Create a new asset (or assets) in the system
  const chunkSize = 12000;
  const result = !Array.isArray(req.body.data)
    ? { ...req.body.data, history: JSON.stringify(req.body.data.history) } // stringify single asset's history into a JSON string
    : req.body.data.map(data => ({ ...data, history: JSON.stringify(data.history) })); // stringify each bulk asset's history
  const data = await knex
    // .insert(result)
    .batchInsert('assets', result, chunkSize)
    .returning('*')
    .then((results) => results[0]); // return result
  res.status(201).json({ data });
}
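(One thing worth noting about the method above: Postgres caps a single prepared statement at 65,535 bind parameters, so a safe chunk size depends on the number of columns per row. A hedged sketch; the column counts below are assumptions, not the original schema:)

```javascript
// Postgres allows at most 65535 bind parameters per statement, so the
// largest safe chunk is floor(65535 / columnsPerRow).
const PG_MAX_PARAMS = 65535;

function safeChunkSize(columnsPerRow, cap = 1000) {
  // `cap` keeps individual INSERTs modest even for narrow tables
  return Math.min(cap, Math.floor(PG_MAX_PARAMS / columnsPerRow));
}

safeChunkSize(10);  // → 1000 (hypothetical 10-column table, capped)
safeChunkSize(100); // → 655  (wide table: the parameter limit dominates)
```

With 9,000 rows of even a handful of columns, a chunkSize of 12,000 means everything goes in one statement, which is what runs into the parameter limit.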
With 'insert' I get this error:
'bind message has 3416 parameter formats but 0 parameters'
I've done some research on this, and it seems that COPY would help, though it normally copies straight from a CSV file (I've also read that you can copy from raw JSON data). I don't see support for COPY in Knex, so I tried a raw SQL query and got:
'select * from COPY result (data) FROM STDIN; - syntax error at or near "FROM"'
Some articles mentioned using Knex's 'batchInsert', but at first I got errors saying the function does not exist...
When I do get batchInsert to run, I get this error:
'Maximum call stack size exceeded at scheduleFibersWithFamiliesRecursively'
I'm fairly new to this, so I really appreciate any help.
A little extra info -- there should rarely (if ever) be more than 10,000 entries. I could set the threshold at 15,000 or so if possible, and that should suffice.
Thank you!