
Crashes with 'Too many pending tasks in queue' #490

Closed
dgudim opened this issue Sep 18, 2022 · 7 comments
dgudim commented Sep 18, 2022

In my app I need to search through all the entries in the database (about 4000 of them) against some search criteria, so I load all the entries using Promise.all:

const results = await Promise.all(images.map((value) => {
      return getImageTag(value, search_term_split[0]);
}));

Under the hood, getImageTag reads a value from the database:

function getImageTag(file: string, arg: string): Promise<string> {
    return getValueIfExists(db, `^${file}^tags^${mapArgToXmp(arg)}`);
}

async function getValueIfExists(db, search_path, get_path = search_path) {
    return await db.exists(search_path) ? db.getData(get_path) : "-";
}

This crashes the app with 'Too many pending tasks in queue'.
A possible solution would be an optional parameter on JsonDBConfig to set the maximum number of pending tasks, because if I modify the AsyncLock constructor in JsonDB.js to

lock = new AsyncLock({maxPending: 10000});

everything works and is lightning fast.
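For context, the error comes from the async-lock package that node-json-db used to serialize operations: once the lock's backlog exceeds its maxPending limit, further acquisitions are rejected with that exact message. A minimal sketch of the mechanism, using an illustrative BoundedQueue class rather than the real library internals, and assuming the default limit of 1000:

```javascript
// Illustrative sketch (not node-json-db internals): a queue that
// rejects new work once its backlog exceeds maxPending, mirroring
// the behaviour of async-lock. Limit of 1000 is an assumption
// matching async-lock's documented default.
class BoundedQueue {
  constructor(maxPending) {
    this.maxPending = maxPending;
    this.pending = 0;
  }
  run(task) {
    if (this.pending >= this.maxPending) {
      return Promise.reject(new Error("Too many pending tasks in queue"));
    }
    this.pending++; // counted synchronously, before the task runs
    return Promise.resolve()
      .then(task)
      .finally(() => {
        this.pending--;
      });
  }
}

// Firing 4000 lookups in one Promise.all overflows the backlog:
// the first 1000 are queued, the remaining 3000 reject immediately.
const queue = new BoundedQueue(1000);
const lookups = Array.from({ length: 4000 }, (_, i) =>
  queue.run(() => `entry-${i}`).catch((e) => e.message)
);
```

This is why the crash appears only at scale: each mapped call enqueues one lock acquisition, and 4000 of them arrive before any can drain.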


Belphemur commented Sep 18, 2022

I see why you're hitting the limit right away.

async function getValueIfExists(db, search_path, get_path = search_path) {
    return await db.exists(search_path) ? db.getData(get_path) : "-";
}

It doesn't await the getData() call. It should be:

async function getValueIfExists(db, search_path, get_path = search_path) {
    return await db.exists(search_path) ? await db.getData(get_path) : "-";
}

If you're looking for performance, there is an even faster way:

import { DataError } from 'node-json-db';

async function getValueIfExists(db, search_path) {
    try {
        return await db.getData(search_path);
    } catch (e) {
        if (e instanceof DataError) {
            return '-';
        }
        throw e;
    }
}

This way there is only one call to getData instead of two (exists does a getData call under the hood).


dgudim commented Sep 18, 2022

I replaced the function with the one you suggested, so I no longer call exists twice, which is great, but the issue still persists because

const results = await Promise.all(images.map((value) => {
      return getImageTag(value, search_term_split[0]);
}));

still runs over an array of 4000 entries.
I still want to be able to change the maxPending value.

Belphemur commented

Instead of creating a promise for each image, you should chunk the image array and run Promise.all on each chunk of 500 images. You'll get the same speed increase without reaching any limit (I'm even surprised Node.js lets you create 4000 promises in one go).

Check Lodash Chunk:
https://lodash.com/docs#chunk
https://www.npmjs.com/package/lodash.chunk
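The suggestion above can be sketched as follows. The chunk helper is hand-rolled here (mirroring lodash.chunk) so the snippet stands alone, and getImageTag is the function from the issue, assumed to be in scope:

```javascript
// Sketch of the chunked approach: split the array into fixed-size
// slices and await each slice before starting the next, so at most
// chunkSize lookups are in flight at any moment.
function chunk(array, size) {
  const out = [];
  for (let i = 0; i < array.length; i += size) {
    out.push(array.slice(i, i + size));
  }
  return out;
}

async function getTagsChunked(images, searchTerm, chunkSize = 500) {
  const results = [];
  for (const slice of chunk(images, chunkSize)) {
    // getImageTag is assumed to exist, as defined in the issue.
    const tags = await Promise.all(
      slice.map((image) => getImageTag(image, searchTerm))
    );
    results.push(...tags);
  }
  return results;
}
```

The trade-off is that chunks run sequentially, so the total time is roughly the slowest lookup per chunk summed over chunks, rather than a single Promise.all round trip.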


dgudim commented Sep 20, 2022

As far as I know, the number of promises created with Promise.all can be as high as 2 million, so there is no need to chunk the data. Chunking actually gives slightly worse performance and less understandable code. What is the problem with letting the user set the maxPending value? I am confused.

github-actions bot pushed a commit that referenced this issue Sep 21, 2022
## [2.1.3](v2.1.2...v2.1.3) (2022-09-21)

### Performance Improvements

* **Locking:** Use proper read write lock instead of one lock for all operation ([f3f422a](f3f422a)), closes [#490](#490)
github-actions bot commented

🎉 This issue has been resolved in version 2.1.3 🎉

The release is available on:

Your semantic-release bot 📦🚀

Belphemur commented

I've rewritten the locking logic and changed the dependency.

With v2.1.3 you shouldn't need to set any maxPending.

Concurrent reads won't block anymore.
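The semantics of the fix (a read-write lock: many readers may hold the lock at once, while a writer needs exclusive access) can be sketched as below. This is an illustration of the idea only, not node-json-db's actual implementation:

```javascript
// Illustrative read-write lock: reads share the lock, writes are
// exclusive. Names and structure are assumptions, not real code
// from node-json-db v2.1.3.
class RWLock {
  constructor() {
    this.readers = 0;    // readers currently holding the lock
    this.writer = false; // whether a writer holds the lock
    this.waiting = [];   // parked acquisitions
  }
  async acquireRead() {
    while (this.writer) {
      await new Promise((resolve) => this.waiting.push(resolve));
    }
    this.readers++;
  }
  releaseRead() {
    this.readers--;
    if (this.readers === 0) this.wakeAll();
  }
  async acquireWrite() {
    while (this.writer || this.readers > 0) {
      await new Promise((resolve) => this.waiting.push(resolve));
    }
    this.writer = true;
  }
  releaseWrite() {
    this.writer = false;
    this.wakeAll();
  }
  wakeAll() {
    const parked = this.waiting;
    this.waiting = [];
    parked.forEach((resolve) => resolve());
  }
}
```

Under this shape, 4000 concurrent getData calls all take the read path and overlap freely, so no backlog builds up; only writes serialize, which is why the maxPending workaround becomes unnecessary.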


dgudim commented Sep 21, 2022

Great
