During an evaluation of multiple loggers, I saw a slowdown when trying to quickly log more than 100,000 messages to a file:
My detailed test can be found here: https://gist.github.com/NicolasPelletier/4773843
The test demonstrates that writing 150,000 lines straight to a FileStream takes about 22 seconds until the file content stabilizes. When calling logger.debug() 150,000 times, the file stabilizes to its final content only after 229 s (almost 4 minutes!).
After investigation, it turns out that the problem is using an `Array` to accumulate the data. Pushing the data into the Array with `Array.push()` is quick, but the code flushing the buffer uses `Array.shift()`, which forces re-indexing of all 149,999 elements remaining in the Array. Each shift is linear in the buffer size, so draining the whole buffer is quadratic in the number of messages, and it gets dramatically slower as the buffer grows.

The solution is to use something other than an Array to accumulate the messages.
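To make the re-indexing concrete, here is a minimal illustration (the `msg*` strings are just placeholder values) of what `Array.shift()` does to the remaining elements:

```javascript
// Illustrative only: Array.shift() removes index 0 and slides every
// remaining element down one slot, rewriting all their indices on
// each call. With ~150,000 buffered messages, that re-indexing work
// is repeated on every single flush step.
const buffer = ['msg0', 'msg1', 'msg2'];

buffer.shift();             // drops 'msg0', re-indexes the rest
console.log(buffer[0]);     // 'msg1' (formerly at index 1)
console.log(buffer.length); // 2
```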
A quick search revealed a very simple node module called Dequeue. I installed this module alongside yours and, by replacing `this.buffer = []` in BufferedWriteStream.js with `this.buffer = new Dequeue()`, I brought the logging of 150,000 messages back down to 31 s, seven times faster than the previous 229 s.

There is a caveat: each log call is slightly slower, because an object must be created for each entry in the double-ended queue inside the `Dequeue` object. According to a quick test, it takes about 4% more time per call to logger.debug().

So, knowing how much faster it makes your module under heavy load, and given the caveat for small loads, would you like me to submit a pull request for this?
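For reference, the idea behind the fix can be sketched without any external module. This is not the actual Dequeue package; `makeQueue` and `timeDrain` are hypothetical names, and the absolute timings vary a lot by Node/V8 version (newer engines optimize `Array.shift()` considerably), so treat the harness as illustrative of the growth rate only:

```javascript
// A minimal linked-list FIFO with the same push()/shift() surface as
// an Array buffer: both operations are O(1), so draining never
// re-indexes the remaining entries the way Array.shift() does.
function makeQueue() {
  let head = null, tail = null, length = 0;
  return {
    get length() { return length; },
    push(v) {                       // enqueue at the tail
      const node = { v, next: null };
      if (tail) tail.next = node; else head = node;
      tail = node;
      length++;
    },
    shift() {                       // dequeue from the head in O(1)
      if (!head) return undefined;
      const { v } = head;
      head = head.next;
      if (!head) tail = null;
      length--;
      return v;
    },
  };
}

// Fill a buffer with n lines, then time how long it takes to drain it.
function timeDrain(buf, n) {
  for (let i = 0; i < n; i++) buf.push(`log line ${i}\n`);
  const start = process.hrtime.bigint();
  while (buf.length > 0) buf.shift();
  return Number(process.hrtime.bigint() - start) / 1e6; // milliseconds
}

console.log('Array buffer :', timeDrain([], 150000).toFixed(1), 'ms');
console.log('Queue buffer :', timeDrain(makeQueue(), 150000).toFixed(1), 'ms');
```

Because the queue exposes the same `push()`/`shift()` pair, swapping it in for the Array buffer leaves the flush loop unchanged, which is why the Dequeue substitution is a one-line patch.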