Replies: 5 comments · 13 replies
-
Hi @aevitas, are you using the latest version, v3.1.6? That negative number looks to me like we are suffering from some sort of arithmetic overflow issue. Were there additional lines in the stack trace?
-
Yes, I'm using ImageSharp 3.1.6. I think it's overflowing somewhere in
If needed, I can provide a fairly minimal application to reproduce the issue with some of the actual TIFF data I'm using.
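For what it's worth, here is a minimal standalone sketch (hypothetical dimensions, not ImageSharp's actual decoder code) of how 32-bit arithmetic on large image dimensions can wrap around to the kind of negative size mentioned above:

```csharp
using System;

// Hypothetical orthomosaic dimensions; Rgba32 is 4 bytes per pixel.
int width = 30_000;
int height = 30_000;
int bytesPerPixel = 4;

// 30_000 * 30_000 * 4 = 3_600_000_000 bytes, which exceeds int.MaxValue
// (2_147_483_647) and wraps around when computed in 32-bit arithmetic.
int overflowed = width * height * bytesPerPixel;     // -694_967_296
long actual = (long)width * height * bytesPerPixel;  // 3_600_000_000

Console.WriteLine(overflowed);
Console.WriteLine(actual);
```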
-
I've been diving a little deeper into this, and I've found that I'm not that well-versed in the nitty-gritty of the TIFF file format. Would it perhaps be possible to determine tiles using specific regions of the image, so that I'd never have to allocate the full image data, only specific sections? Is it even possible to load images this large, considering the maximum length of an
-
With the changes proposed in #2874, these images now load properly. However, when loading the larger ones, I run into the allocation limit set by
Would it be problematic if I ran ImageSharp with such a high maximum allocation size?
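In case it helps others hitting the same wall, here is a rough sketch of how the ceiling can be raised, assuming the limit in question is the one controlled by `MemoryAllocatorOptions.AllocationLimitMegabytes` (the file name and the 8GB figure are placeholders):

```csharp
using SixLabors.ImageSharp;
using SixLabors.ImageSharp.Formats;
using SixLabors.ImageSharp.Memory;
using SixLabors.ImageSharp.PixelFormats;

// Clone the default configuration and swap in an allocator with a higher limit.
Configuration configuration = Configuration.Default.Clone();
configuration.MemoryAllocator = MemoryAllocator.Create(new MemoryAllocatorOptions
{
    // Placeholder value: allow allocations up to ~8GB instead of the default cap.
    AllocationLimitMegabytes = 8192
});

var decoderOptions = new DecoderOptions { Configuration = configuration };

using Image<Rgba32> image = Image.Load<Rgba32>(decoderOptions, "large-orthomosaic.tif");
```

Whether a limit that high is problematic presumably depends on how many images the process decodes at once and how long it stays alive.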
-
I am trying to decode this extremely large 14GB TIFF file: https://rubinobservatory.org/gallery/collections/first-look-gallery/mlis3sriah6pn6nfr5ecp46h3i and I get the same error. My machine has 64GB of RAM and I set the buffer sizes to 40,000 MB (40GB), but it still gives me the "Failed to allocate buffers for possibly degenerate dimensions: 0x0." error. My aim is to split the image into smaller tiles, but it appears that I must load the original image fully into memory before doing the split operations. So is there any way to fix the large TIFF problem? Or is there a way to load only a piece of the image at a time? Thanks.
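As noted above, the full image has to be loaded before it can be split, so assuming it can be made to fit in memory at all (e.g. after the #2874 fix and a raised allocation limit), one way to produce tiles is to crop each region into its own small image. A rough sketch with placeholder tile size and file names:

```csharp
using System;
using SixLabors.ImageSharp;
using SixLabors.ImageSharp.PixelFormats;
using SixLabors.ImageSharp.Processing;

const int tileSize = 4096; // placeholder tile edge length in pixels

using Image<Rgba32> source = Image.Load<Rgba32>("orthomosaic.tif");

for (int y = 0; y < source.Height; y += tileSize)
{
    for (int x = 0; x < source.Width; x += tileSize)
    {
        // Clamp the tile bounds at the right and bottom edges of the image.
        var bounds = new Rectangle(
            x, y,
            Math.Min(tileSize, source.Width - x),
            Math.Min(tileSize, source.Height - y));

        // Clone the cropped region into a separate image and save it.
        using Image<Rgba32> tile = source.Clone(ctx => ctx.Crop(bounds));
        tile.SaveAsPng($"tile_{x}_{y}.png");
    }
}
```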
-
Hi,
I've been a long-time user of ImageSharp for all my imaging needs, so when I started a recent project using satellite imagery and drone footage, I once again came back to this library.
The images I'm working with are large geographical TIFF images, also known as orthomosaics. They can be stitched together from numerous other images, and as a result can become quite large (some of our current images are north of 1.5GB).
I'd like to use ImageSharp to transform, transcode and "tile" these images into smaller images, but when I attempt to load one of these images, I get the following exception:
Debugging this, I can see the following number of bytes being allocated every time:
I assume these are four metadata allocations, followed by an allocation for the actual (overflowed-size) image data.
Given that the TIFF I'm trying to load is 740MB, I fail to see why it would need to allocate over 2.14GB of memory to process the image. Other libraries such as GDAL load these images in an instant, but they don't offer the image manipulation I need.
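The decoded pixel buffer scales with width × height × bytes per pixel rather than with the compressed file size, which may be part of why a 740MB TIFF needs several gigabytes once decoded. One way to check the numbers without decoding any pixels is to read just the metadata; a rough sketch (the file name is a placeholder, and the exact `ImageInfo` properties may differ slightly between versions):

```csharp
using System;
using SixLabors.ImageSharp;

// Identify() reads only the header/metadata, not the pixel data.
ImageInfo info = Image.Identify("orthomosaic.tif");

long bitsPerPixel = info.PixelType.BitsPerPixel;
long decodedBytes = (long)info.Width * info.Height * bitsPerPixel / 8;

Console.WriteLine($"{info.Width} x {info.Height} @ {bitsPerPixel} bpp");
Console.WriteLine($"Uncompressed pixel data: ~{decodedBytes / (1024.0 * 1024 * 1024):F1} GiB");
```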
Is this a limitation of ImageSharp specifically? Is there something I'm doing wrong, and is there a way to use these images with ImageSharp?
I've explored making my own fork of ImageSharp and replacing the pooled allocations with flat, direct allocations, since I'd be running this from a command line and the heap would never be long-lived. However, it seemed like a non-trivial modification, so I'd like to make sure there isn't a more sensible approach.
I can provide sample files if needed.