
Error running out of memory #25

Closed
frznlogic opened this issue May 29, 2021 · 1 comment

@frznlogic

Hi,

I've been running into out-of-memory errors. Unfortunately I can't retrieve the complete message right now, but it seems that some images cause the maxMemoryUsageInMB limit to be exceeded.

Running this command:
docker exec -it -u www-data docker.nextcloud.service ./occ recognize:classify

And then I get this error:

RuntimeError: abort(Error: maxMemoryUsageInMB limit exceeded by at least 3MB). Build with -s ASSERTIONS=1 for more info.
    at process.abort (/var/www/html/custom_apps/recognize/node_modules/@tensorflow/tfjs-backend-wasm/dist/tf-backend-wasm.node.js:4685:9342)
    at process.emit (events.js:314:20)
    at processPromiseRejections (internal/process/promises.js:245:33)
    at processTicksAndRejections (internal/process/task_queues.js:94:32)

Using Recognize 1.3.1 with NextCloud 21.0.2

Currently no memory limits on the docker container it's running in, and I've got 5269MiB free memory and plenty of swap if necessary.

Can I bump the maxMemoryUsageInMB parameter somewhere, or is there a workaround I can try for now? I'm not very used to working with PHP, Node.js, or TensorFlow for that matter, but I'm curious and really liking this so far ;).

@marcelklehr (Member) commented Jun 10, 2021

It appears you are trying to classify images that exceed 512 MiB when decoded.

You can change this line https://github.com/marcelklehr/recognize/blob/master/src/classifier.js#L32 to read:

const imageData = jpeg.decode(imageBuffer, {useTArray: true, formatAsRGBA: false, maxMemoryUsageInMB: 1024})

It should now be able to read images up to 1GiB.

I'm not sure how common images larger than 500MiB are in the wild... I'm happy to adjust this with the next release.
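For context, the maxMemoryUsageInMB limit in jpeg-js roughly corresponds to the decoded (uncompressed) pixel buffer, so it is the image's pixel dimensions, not the JPEG file size on disk, that trip the limit. A minimal sketch of the arithmetic (the helper name is mine, not part of recognize or jpeg-js):

```javascript
// Estimate the decoded (uncompressed) size of an image in MiB.
// With formatAsRGBA: true, jpeg-js allocates 4 bytes per pixel (RGBA);
// with formatAsRGBA: false, it is 3 bytes per pixel (RGB).
function decodedSizeMiB(width, height, bytesPerPixel = 4) {
  return (width * height * bytesPerPixel) / (1024 * 1024);
}

// e.g. a 12000 x 12000 px scan decoded as RGBA:
console.log(decodedSizeMiB(12000, 12000)); // ≈ 549 MiB, over the 512 MiB default
```

This is why even a modest few-megabyte JPEG of a very high-resolution scan can exceed the default limit.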
