Memory leak in transactions with many queries #1186

Open
jroith opened this issue Mar 5, 2024 · 3 comments

jroith commented Mar 5, 2024

I have a situation where the driver can cause Node.js to run out of memory when a transaction with many queries is run (700k very simple ones in my case). While it may seem unreasonable to execute so many individual queries instead of one large query, the driver should consume more or less constant memory if possible.
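For context, this is roughly the kind of workload that triggers it (a minimal sketch using the standard driver API; the connection details and the query itself are placeholders):

```typescript
import neo4j from 'neo4j-driver'

async function main (): Promise<void> {
  // Connection details are placeholders.
  const driver = neo4j.driver('bolt://localhost:7687', neo4j.auth.basic('neo4j', 'password'))
  const session = driver.session()
  const tx = session.beginTransaction()
  try {
    for (let i = 0; i < 700_000; i++) {
      // Every run() call adds another Result to the transaction's internal
      // _results array, so memory grows with the number of queries even
      // though each result is awaited (i.e. consumed) right away.
      await tx.run('CREATE (:Node {i: $i})', { i })
    }
    await tx.commit()
  } finally {
    await session.close()
    await driver.close()
  }
}

main().catch(console.error)
```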

The source is an array _results that grows with every call to run():

this._results.push(result)

It seems the reason this array exists in the first place is to be able to check isOpen() in the _onErrorCallback and to await the summary() of each result in pendingResults.

While I don't know exactly why it has to be done like that, I would like to suggest a simple fix. In most cases the call to run() will either have been awaited or, if merely subscribed, its promise will have resolved soon afterwards (in practice often before the next query, and certainly long before a much later one). The results do not appear to need any ordering, so it should be possible to turn _results into a Set instead of an array and to drop each result from that set as soon as its summary is available.

That way the size of the set would, in my case, likely stay around 1, and in most other cases it would still be prevented from growing unboundedly and eventually crashing Node.js.
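To illustrate the idea, here is a rough sketch of the bookkeeping I have in mind. The class and method names are made up; only _results and summary() correspond to the driver code mentioned above, and this is not the driver's actual implementation:

```typescript
import type { Result } from 'neo4j-driver'

// Hypothetical sketch only: PendingResultTracker and track() are placeholders.
export class PendingResultTracker {
  // A Set instead of an array: ordering is not needed, and finished results can be dropped.
  private readonly _results = new Set<Result>()

  track (result: Result): Result {
    this._results.add(result)
    // Once the summary is available the result is complete and no longer needs
    // to be held for commit/rollback bookkeeping.
    result
      .summary()
      .catch(() => { /* errors reach the caller through the result itself */ })
      .finally(() => this._results.delete(result))
    return result
  }

  // Only in-flight results remain here, so the set stays small as long as
  // results are consumed reasonably promptly.
  get pending (): Result[] {
    return Array.from(this._results)
  }
}
```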

jroith added the bug label Mar 5, 2024
bigmontz (Contributor) commented Mar 6, 2024

Hey @jroith, the results are consumed before commit, because if you don't consume a result, the database might commit partial changes (ignoring the non-consumed parts).

Observing each result and removing it from the pending results list once it completes could solve this issue. However, a user would still be able to cause problems if they never try to consume the results at all.
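To make the consume-before-commit point concrete, something along these lines has to hold on the commit path. This is a rough sketch, not the driver's actual internals; commitAfterConsuming, pendingResults and sendCommit are placeholders:

```typescript
import type { Result } from 'neo4j-driver'

// Hypothetical sketch of the invariant: anything still pending is consumed
// (its summary awaited) before COMMIT goes out.
export async function commitAfterConsuming (
  pendingResults: Iterable<Result>,
  sendCommit: () => Promise<void>
): Promise<void> {
  // With the change proposed above, results that already finished would have
  // removed themselves, so only in-flight results are awaited here.
  await Promise.all(Array.from(pendingResults, result => result.summary()))
  await sendCommit()
}
```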

I will push this to the backlog, so it can be addressed in the future.

jroith (Author) commented Mar 6, 2024

@bigmontz The user would not really be able to mess things up, because any results that were not removed from the pending list would still be there and could still be resolved by the commit/rollback callback. If you mean that the user could still cause unbounded memory use in those cases: that's true, but it's kind of expected, and all other cases would be handled.

Would you possibly accept a PR?

bigmontz (Contributor) commented Mar 6, 2024

Sure, can you take a look at our contributing guides?
