# Crashes with 'Too many pending tasks in queue' #490
I see why you're hitting the limit right away.

```js
async function getValueIfExists(db, search_path, get_path = search_path) {
    return await db.exists(search_path) ? db.getData(get_path) : "-";
}
```

This doesn't await the `getData()`:

```js
async function getValueIfExists(db, search_path, get_path = search_path) {
    return await db.exists(search_path) ? await db.getData(get_path) : "-";
}
```

If you're looking for performance, there is an even faster way:

```js
import { DataError } from 'node-json-db';

async function getValueIfExists(db, search_path) {
    try {
        return await db.getData(search_path);
    } catch (e) {
        if (e instanceof DataError) {
            return '-';
        }
        throw e;
    }
}
```

This way you only have one database call (`getData`) instead of two.
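As a self-contained illustration, here is that single-call pattern exercised against an in-memory stand-in for the database (the `getData`/`DataError` names follow node-json-db, but `MockDb` and its contents are invented for this sketch):

```javascript
// Stand-in for node-json-db's DataError, invented for this sketch.
class DataError extends Error {}

// Minimal in-memory mock of the JsonDB read API.
class MockDb {
  constructor(data) { this.data = data; }
  async getData(path) {
    if (!(path in this.data)) throw new DataError(`Can't find dataPath: ${path}`);
    return this.data[path];
  }
}

// One database call: try getData and map "missing" to a placeholder.
async function getValueIfExists(db, search_path) {
  try {
    return await db.getData(search_path);
  } catch (e) {
    if (e instanceof DataError) return '-';
    throw e;
  }
}

const db = new MockDb({ '/images/foo': 'tag-1' });

async function main() {
  console.log(await getValueIfExists(db, '/images/foo')); // 'tag-1'
  console.log(await getValueIfExists(db, '/images/bar')); // '-'
}
main();
```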
I replaced the function with the one you suggested, so now I don't have to call `exists`:

```js
const results = await Promise.all(images.map((value) => {
    return getImageTag(value, search_term_split[0]);
}));
```

It works on an array of 4000 entries.
Instead of creating a promise for each image, you should chunk the image array and run a promise on each chunk of 500 images. You'll get the same speed increase without reaching any limit (I'm even surprised Node.js let you create 4000 promises in one go). Check Lodash chunk.
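A dependency-free sketch of that chunking idea (the `chunk` helper here is a stand-in for Lodash's `_.chunk`, and `getImageTag` is mocked since its real body isn't shown in the thread):

```javascript
// Stand-in for Lodash's _.chunk: split an array into size-n slices.
function chunk(arr, size) {
  const out = [];
  for (let i = 0; i < arr.length; i += size) out.push(arr.slice(i, i + size));
  return out;
}

// Hypothetical async work per image, standing in for getImageTag.
async function getImageTag(image) {
  return `tag:${image}`;
}

// Process the array 500 images at a time, so at most 500 promises
// are pending against the database at any moment.
async function processInChunks(images, size = 500) {
  const results = [];
  for (const slice of chunk(images, size)) {
    results.push(...await Promise.all(slice.map((img) => getImageTag(img))));
  }
  return results;
}
```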
As far as I know, the maximum number of promises created with …
## [2.1.3](v2.1.2...v2.1.3) (2022-09-21)

### Performance Improvements

* **Locking:** Use a proper read-write lock instead of one lock for all operations ([f3f422a](f3f422a)), closes [#490](#490)
🎉 This issue has been resolved in version 2.1.3 🎉 The release is available on: … Your semantic-release bot 📦🚀
I've rewritten the locking logic and changed a dependency. As of v2.1.3 you shouldn't need to set any maxPending etc. Concurrent reads won't be blocking anymore.
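For intuition, here is a toy many-readers/one-writer lock showing why concurrent reads stop blocking each other — this is an illustrative sketch, not node-json-db's actual implementation:

```javascript
// Toy read-write lock: many readers may hold it at once, writers are
// exclusive. Writers can starve under continuous readers; fine for a sketch.
class RWLock {
  constructor() {
    this.readers = 0;     // readers currently holding the lock
    this.writing = false; // whether a writer holds the lock
    this.waiting = [];    // queued acquire attempts
  }
  _drain() {
    // Retry queued acquires; keep only the ones that still can't proceed.
    this.waiting = this.waiting.filter((tryAcquire) => !tryAcquire());
  }
  acquireRead() {
    return new Promise((resolve) => {
      const tryAcquire = () => {
        if (this.writing) return false;
        this.readers += 1;
        resolve();
        return true;
      };
      if (!tryAcquire()) this.waiting.push(tryAcquire);
    });
  }
  releaseRead() {
    this.readers -= 1;
    this._drain();
  }
  acquireWrite() {
    return new Promise((resolve) => {
      const tryAcquire = () => {
        if (this.writing || this.readers > 0) return false;
        this.writing = true;
        resolve();
        return true;
      };
      if (!tryAcquire()) this.waiting.push(tryAcquire);
    });
  }
  releaseWrite() {
    this.writing = false;
    this._drain();
  }
}
```

With a single mutex, 4000 reads queue behind one another; with a read-write lock they all acquire immediately, so the pending queue never grows.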
Great |
In my app I need to search through all the entries in the database according to some search criteria (about 4000 entries), so I load all the entries using `Promise.all`. Under the hood, `getImageTag` tries to get a value from the database, and this crashes the app with `Too many pending tasks in queue`.

A possible solution would be to add an optional parameter to `JsonDBConfig` to set the maximum number of pending tasks, because if I modify the AsyncLock constructor in JsonDB.js to raise the pending-task limit, everything works and is lightning fast.
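The failure mode can be reproduced in miniature without the library. The sketch below is a dependency-free toy (`TinyLock` and `demo` are invented here), modelling a lock that serializes operations and rejects once its pending queue exceeds a cap — the error string matches the one async-lock raises, whose `maxPending` defaults to 1000:

```javascript
// Toy serializing lock with a pending-queue cap, invented for this sketch.
class TinyLock {
  constructor(maxPending = 1000) {
    this.maxPending = maxPending;
    this.tail = Promise.resolve(); // chain of queued operations
    this.pending = 0;
  }
  acquire(fn) {
    if (this.pending >= this.maxPending) {
      return Promise.reject(new Error('Too many pending tasks in queue'));
    }
    this.pending += 1;
    const run = this.tail.then(async () => {
      try { return await fn(); } finally { this.pending -= 1; }
    });
    this.tail = run.catch(() => {}); // keep the chain alive on errors
    return run;
  }
}

// 4000 simultaneous reads against a 1000-deep queue: the first 1000
// are queued synchronously, the remaining 3000 are rejected.
async function demo() {
  const lock = new TinyLock(1000);
  const reads = Array.from({ length: 4000 }, () => lock.acquire(async () => 'ok'));
  const settled = await Promise.allSettled(reads);
  return settled.filter((r) => r.status === 'rejected').length;
}
```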