Problem Statement
ContextLinesIntegration reads the files referenced in error stack traces in order to show the surrounding source lines. When new errors occur, the SDK reads these files regardless of their size and stores them in an LRU cache indefinitely (or until they are evicted) so that subsequent contexts can be resolved faster.
This can have a large impact on the amount of memory an application uses (especially in the case of bundling, where individual files can be very large), which I suspect is also causing longer GC sweeps.
Solution Brainstorm
1. Leverage process.memoryUsage() to determine a smarter LRU cache size or the maximum file size that we are willing to read
2. Periodically clear the LRU cache and reclaim memory
3. Use readline to extract only the lines we are interested in
It is hard to say how effective 1 and 2 would be; made-up limits like these are brittle at best, and we risk degrading the product in its current form without being able to explain why.
3 could bring a substantial improvement. If we assume that a particular line of code is likely to crash again (as opposed to an arbitrary line elsewhere in that file), then only the surrounding context lines need to be cached, not the whole file.
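Caching only the slice around a known crash site might look like this (hypothetical shape, not the SDK's implementation):

```javascript
// Cache only the context lines for a frame, keyed by file and line.
// A whole-file read may still happen on the first miss, but what is
// retained per crash site is a small slice rather than the entire file.
const CONTEXT = 7; // hypothetical: lines of context on each side
const contextCache = new Map();

function contextFor(filePath, lineNo, readAllLines) {
  const key = `${filePath}:${lineNo}`;
  if (contextCache.has(key)) return contextCache.get(key);
  const all = readAllLines(filePath); // caller supplies the expensive read
  const start = Math.max(0, lineNo - 1 - CONTEXT); // lineNo is 1-based
  const slice = all.slice(start, lineNo + CONTEXT); // keep only ±CONTEXT lines
  contextCache.set(key, slice);
  return slice; // the full `all` array is now free to be garbage collected
}
```

A repeat crash on the same frame then resolves from a few hundred bytes of cached lines, and the file is re-read only for frames we have not seen before.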
A good next step would be to benchmark readline against readFile to see the performance characteristics and evaluate whether we could reasonably replace it. One benefit of readline is that we avoid copying the whole file contents just to split them by line, which I suspect is also the culprit behind the high memory usage right now.
Can you not get a reasonably reliable heuristic by just looking at the line lengths?
Is there any point including context lines if they're over ~500 characters?
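That heuristic could be as simple as capping each context line before caching or sending it (the ~500-character threshold and the `{snip}` marker are illustrative, taken from the comment above):

```javascript
// Sketch of the suggested length heuristic: minified bundles put entire
// files on one line, and such lines are not useful as human-readable
// context, so truncate (or drop) anything past a cap.
const MAX_LINE_LENGTH = 500; // hypothetical cap, per the suggestion above

function snipContextLine(line) {
  return line.length > MAX_LINE_LENGTH
    ? `${line.slice(0, MAX_LINE_LENGTH)} {snip}` // or return undefined to drop it
    : line;
}
```

This bounds the memory cost of any single cached line without needing to guess at file sizes or cache capacities.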