bigWigSummary produces needLargeMem errors on many bigwigs #25
Ooh, interesting bug! We'll look into this further (unless someone already knows the answer and I get to learn something new), but I suspect that the problem may have occurred during the generation of the bigWig files (so something wrong with the bigWig writer, not the reader). If I'm reading the bytes right, the first bigWig file internally claims to have a size of 0xffffffff8855ee2f bytes, or about 1.8e+19. As you suggest, that seems unlikely. Using only the low-order 32 bits of that value gives 2287332911, which seems much more reasonable.
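For illustration, here is that arithmetic in Python. The constant is the header value quoted above; the sign-extension reading is one interpretation of the bit pattern, not something confirmed in this thread:

```python
# Illustration only: reproducing the arithmetic from the comment above.
# 0xffffffff8855ee2f is the size the first bigWig file claims internally;
# the high 32 bits are all ones, which is what you'd see if a 32-bit value
# with its top bit set had been sign-extended to 64 bits by the writer.
claimed = 0xFFFFFFFF8855EE2F

print(f"{claimed} bytes (~{claimed:.2e})")          # ~1.84e+19 bytes
low32 = claimed & 0xFFFFFFFF                        # keep only the low-order 32 bits
print(f"{low32} bytes (~{low32 / 2**30:.1f} GiB)")  # 2287332911 bytes (~2.1 GiB)
```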
Hi Nezar, can you tell us a bit of context? What are you trying to do that requires bigWigSummary?
So, I have an easy workaround: skip the summary functionality and just do the binning and averaging in Python. It's more accurate anyway, since the summary route is really interpolating from the nearest zoom level. pyBigWig, which uses its own bigWig library, also seems able to execute the same queries. So this isn't an impediment for me, but I thought I'd report it.
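A minimal sketch of that workaround with pyBigWig might look like the following (this is not the poster's actual code; the file name and bin count are taken from later in this thread, and `exact=True` asks pyBigWig to compute stats from base-level intervals rather than zoom levels, per its documentation):

```python
# Sketch of the Python workaround described above. Requires: pip install pyBigWig
import pyBigWig

bw = pyBigWig.open("ENCFF856LYZ.bigWig")  # file name from this thread
chrom, nbins = "chr8", 120000
chrom_size = bw.chroms(chrom)             # chromosome length from the header

# exact=True computes bin means from base-level intervals instead of
# interpolating from the nearest zoom level, which is the accuracy point
# made above.
means = bw.stats(chrom, 0, chrom_size, nBins=nbins, exact=True)
bw.close()
```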
Thanks for reporting this @nvictus. It does look like this is a problem with how these files were created on the ENCODE portal. I've sent them mail in an attempt to track down how this problem was introduced.
I talked a bit with ENCODE and we've been unable to track down the source of this problem. I've encouraged them to use our most recent code to build new big files, since there have been several bug fixes and one may have resolved this problem. Do let us know if you run into this again.
I've recently run into this exact issue on additional files from ENCODE beyond those in @nvictus's list. It can really throw a wrench into a processing pipeline, so if you have any additional updates on a workaround or solution, either on your end or on ENCODE's, I'd appreciate it.
On certain files, it happens specifically when querying `chr8` with `end` close to the chromosome size and a large number of bins (> 100,000):

```
$ bigWigSummary ENCFF856LYZ.bigWig chr8 0 145138636 120000 > /dev/null
needLargeMem: trying to allocate 18446744069429595380 bytes (limit: 17179869184)
```
The attempted allocation is clearly not reasonable.
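As an aside (my own observation, not stated in the thread): the failed request shows the same bit pattern as the header value discussed earlier, with the high 32 bits all set and a plausible size in the low 32 bits.

```python
# Illustration only: the requested allocation from the error message above.
requested = 18446744069429595380

print(hex(requested))            # 0xffffffff00e50cf4 -- high 32 bits all ones
low32 = requested & 0xFFFFFFFF   # keep only the low-order 32 bits
print(f"{low32} bytes (~{low32 / 2**20:.0f} MiB)")  # 15011060 bytes (~14 MiB)
```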
I'm running into this same issue with the following files from ENCODE: