Performance Issue with the TokenStreamParser #781
Comments
@mihaistoie Hey there! 👋 Thank you for reporting this issue. Could you supply us with a bit more data to help us reproduce and fix this problem? 🙇 Could you provide the DDL statement to recreate the table structure, along with sample data that highlights the performance issue? Also, knowing which SQL Server version you're running this against would be very helpful! Thank you! ❤️
Hello @arthurschreiber, before, we used version 1.x of tedious and everything worked perfectly. We tried to update our application, but the execution times exploded (sometimes 10x, same server, same SQL query). I looked at the code of the stream parser, and I am sure the problem comes from the callbacks (even though the code is synchronous): in the debugger you can see that the call stack is huge (`parseValue`). This is what changed between 1.x and 2.x.

SQL Server Express (64-bit) 2014, `[refLot]` -> varchar(30).

Thank you.
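The huge `parseValue` call stack described above can be illustrated with a small sketch (hypothetical function names, not the actual tedious parser): a parser that hands each value to a nested synchronous callback adds one stack frame per value, while a loop doing the same work keeps the stack flat.

```javascript
// Illustrative sketch only (hypothetical functions, not the tedious
// parser itself): each value is consumed through a nested synchronous
// callback, so the stack grows by one frame per value. This is what
// shows up in the debugger as a deep pile of parseValue frames.
function parseAllWithCallbacks(values, out, done) {
  if (values.length === 0) return done(out);
  out.push(values[0]); // "parse" one value
  parseAllWithCallbacks(values.slice(1), out, done); // synchronous recursion
}

// The same work as a loop runs at constant stack depth.
function parseAllWithLoop(values) {
  const out = [];
  for (const v of values) out.push(v); // "parse" one value
  return out;
}
```

Both produce identical results; the difference is only in stack growth, which also costs time once the frames number in the thousands per packet.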
Whoa, that's crazy, and very unexpected. Which version of Node.js are you running? I've been working on a new parser that doesn't use callbacks at all to parse tokens, and it should be a lot faster than the parser currently used in tedious, but it's still very much a work in progress and not in a usable state. Nevertheless, the current parser should be faster than the one used in 1.x. 🤔 I'll see if I can find something over the next few days. Thank you so much for reporting this! 🙇
We use Node 8.x. I tried with 10, same results.

```js
const te = new Date();
var ret = this.sendDataToTokenStreamParser(_data7);
console.log(new Date() - te); // 48ms
```

For my query it reports 48 ms! The execution time itself is 4 ms:

```js
const ts = new Date();
this.messageIo.sendMessage(packetType, payload.data, this.resetConnectionOnNextRequest);
// ...
this.messageIo.on('data', (data) => {
  console.log(new Date() - ts); // 4ms
  this.dispatchEvent('data', data);
});
```

Now, we use knex.js to generate our SQL. On PostgreSQL the same query is very fast.
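One caveat about the numbers above: subtracting `Date` objects only has millisecond resolution, so short spans like the 4 ms send path are at the edge of what it can measure. A sketch of the same measurement using `process.hrtime()` (available on Node 8), with a stand-in workload rather than the real `sendDataToTokenStreamParser` call:

```javascript
// Sketch: time a function with sub-millisecond resolution using
// process.hrtime(), which reports [seconds, nanoseconds].
function timeMs(fn) {
  const start = process.hrtime();
  fn();
  const [s, ns] = process.hrtime(start);
  return s * 1e3 + ns / 1e6; // elapsed milliseconds as a float
}

// Stand-in workload; in the trace above this would wrap the
// sendDataToTokenStreamParser call instead.
const elapsed = timeMs(() => {
  let sum = 0;
  for (let i = 0; i < 1e6; i++) sum += i;
});
console.log(elapsed.toFixed(3) + ' ms');
```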
Hi @mihaistoie, are you still experiencing this same issue when using the latest tedious version? |
I am using tedious, latest version (git clone), and the function `sendDataToTokenStreamParser` is very slow. My query takes 52 ms: 4 ms execution + 48 ms in `sendDataToTokenStreamParser`!

SQL Server Express (64-bit) 2014

```sql
select [refLot], [prixHT], [prixTTC], [remiseHT], [remiseTTC], [tauxTVA], [dateLivraisonContractuelle], [dateConvocationLivraison], [refLotEcheancier], [parentId], [_dtm], [tenantId], [id] from [RVDossierLot] where [refLot] = 'CODE_1'
```

The query returns 3 rows. To measure, I added a trace in connection.js:
```js
data: function data(_data7) {
  this.clearRequestTimer(); // request timer is stopped on first data package
  const te = new Date();
  var ret = this.sendDataToTokenStreamParser(_data7);
  console.log(new Date() - te);
  if (ret === false) {
    // Bridge backpressure from the token stream parser transform to the
    // packet stream transform.
    this.messageIo.pause();
  }
},
```