VCF hitting chunkSizeLimit #486

Closed
cmdcolin opened this Issue Jun 3, 2014 · 0 comments

cmdcolin commented Jun 3, 2014

Richard Hayes wrote recently about this problem on the mailing list:

Hi, 

I have encountered a strange bit of behavior. We have a VCF file that underwent a bit of post-processing to remove some data (SNPs from embargoed data from a larger joint call, essentially).

The unfiltered file contains ~6 million SNPs, and is about 449 MB when compressed by bgzip. This displays fine in JBrowse v1.11.4.

The filtered file contains ~3.2 million SNPs, and is about 126 MB when compressed by bgzip. This will not display in JBrowse. I only get the "Too much data. Chunk size 7,329,488 bytes exceeds chunkSizeLimit of 1,000,000. zoom in to see detail." error, and this message persists even when zoomed in to a 105 bp region.

Both files were compressed by the same version of bgzip and indexed by the same tabix.

Is this a bug? Why would the more feature-dense file render fine? I'm a bit flummoxed.
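
For reference, the compression and indexing workflow mentioned above normally looks like the following (the report doesn't include the exact commands, so the filename here is a placeholder):

```
# sketch of the standard bgzip/tabix workflow referred to above
# (filenames are placeholders, not the actual files from the report)
bgzip filtered.vcf              # coordinate-sorted VCF -> filtered.vcf.gz
tabix -p vcf filtered.vcf.gz    # builds the index -> filtered.vcf.gz.tbi
```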

We confirmed that this was a bug during the WebApollo hackathon.
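
A quick way to rule out the file and index themselves (not necessarily the check used at the hackathon; the region and filename are placeholders) is to query the tabix index directly from the command line:

```
# fetch the records overlapping a small window straight from the .tbi index;
# a prompt, small result here means the bgzip/tabix side is fine
tabix filtered.vcf.gz chr1:1000000-1000105
```

If that returns normally while JBrowse still reports a multi-megabyte chunk for the same window, the problem lies in how JBrowse reads the index, which is what was confirmed here.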

@rdhayes closed this in 93698c6 on Jun 3, 2014
