Some images use too much memory when getting the thumbnail #226
I will try setting the limit to 1 gigapixel.
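A minimal sketch of what that limit could look like (the constant, function name, and the exact threshold are illustrative, not identifiers from this code base):

```python
# Illustrative pixel-budget check; GIGAPIXEL and within_thumbnail_budget
# are assumed names for illustration, not from this repository.
GIGAPIXEL = 10**9

def within_thumbnail_budget(width, height, limit=GIGAPIXEL):
    """True if a width x height image is small enough to decode in memory."""
    return width * height <= limit
```

Pillow's `Image.MAX_IMAGE_PIXELS` decompression-bomb check is an existing example of this pattern.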
This was linked to pull requests on Jun 20, 2024.
So this issue also affects the DSA in general, and causes very
difficult-to-deal-with behavior. Particularly when ingesting images,
sometimes files are weird (as in the SVS you mentioned), and the DSA will
keep trying large_image tile sources until it finds one that can read the
image. Making things worse, particularly when we used to use bioformats as
a fallback, it will try to generate a thumbnail using the full-res image,
and depending on what's going on with the computer it will either take a
weird amount of time and complete, or just kill the system when it runs
out of memory. Making things even worse, when you then re-ingest the same
file store, it will go right back to the file causing the goofy behavior.
Not sure if there's a way to wrap any of these calls with some
timer/watchdog that gives up if the thread takes more than some number of
seconds, unless a flag is set to "keep trying anyway".
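One possible shape for that watchdog, sketched with the standard library (`run_with_watchdog` and its parameters are hypothetical, not part of the DSA or large_image):

```python
# Hypothetical watchdog sketch: wait for func(*args) in a worker thread and
# give up after `timeout` seconds unless keep_trying is set. The worker
# thread itself cannot be killed, so this stops the wait but does not
# reclaim the worker's memory; a process pool would be needed for that.
from concurrent.futures import ThreadPoolExecutor, TimeoutError as FuturesTimeout

def run_with_watchdog(func, *args, timeout=30.0, keep_trying=False):
    pool = ThreadPoolExecutor(max_workers=1)
    future = pool.submit(func, *args)
    try:
        return future.result(timeout=None if keep_trying else timeout)
    except FuturesTimeout:
        return None  # caller treats None as "gave up"
    finally:
        pool.shutdown(wait=False)
```

Using a `ProcessPoolExecutor` instead would let the runaway decode be terminated outright, at the cost of pickling the arguments.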
--
David A Gutman, M.D. Ph.D.
Associate Professor of Pathology
Emory University School of Medicine
@dgutman This code base is different from the DSA; if you actually have any images that cause memory issues in the DSA, can you file a bug there and share the image? In general, we should gracefully (if slowly) handle badly encoded images in the DSA.
Aah, OK, my confusion. It's been a while since the DSA has choked on any
large images, but this comment just made me think of that issue.
For instance, TCGA-5P-A9KA-01Z-00-DX1.6F4914E0-AB5D-4D5F-8BF6-FB862AA63A87.svs has only one tiled layer, which is 373 GB. This ends up entirely in memory. We need a guard: if the number of pixels is greater than some large value, don't use the current technique to generate a thumbnail. For now, this should just gracefully fail, and we should make an issue to come back to handle these cases.
As a reference, the large_image library works around this, but it is very slow (since you still have to decode the entire image to generate a thumbnail).
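The proposed "gracefully fail" guard could look something like the sketch below; every name and the threshold here is an assumption for illustration, not from the code base:

```python
# Sketch of the proposed guard: raise a specific exception instead of
# decoding a huge single-layer image into memory. All names are assumed.
MAX_THUMBNAIL_PIXELS = 10**9  # "some large value", per the proposal

class ThumbnailTooLargeError(Exception):
    """Raised instead of attempting an in-memory decode of a huge layer."""

def guarded_thumbnail(width, height, make_thumbnail):
    if width * height > MAX_THUMBNAIL_PIXELS:
        raise ThumbnailTooLargeError(f'{width}x{height} exceeds pixel budget')
    return make_thumbnail(width, height)
```

A caller can catch the specific exception and skip the thumbnail rather than crash, which is the graceful failure the proposal asks for.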