
$.getJSON on large compressed file #4835

Closed
babarlelephant opened this issue Feb 2, 2021 · 4 comments

Comments


babarlelephant commented Feb 2, 2021

Here, on Windows 7 32-bit with Chrome, $.getJSON('big.json.gz') fails once the uncompressed file grows larger than 256 MB, while splitting the file into two parts and merging the resulting objects in JavaScript works fine.

I'd like to know at which step the large file fails, whether there is any chance it can be fixed, and, if not, what technical approach and implementation (streamed parsing, etc.) has a chance of working.

The demo is at http://cov2.infinityfreeapp.com/bigjson.html. The code that works is:

    $.getJSON("./big.part1.json.gz").done(function (res) {
        $.getJSON("./big.part2.json.gz").done(function (res2) {
            result = res.concat(res2);
        });
    });
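For what it's worth, the two-part approach generalizes to N parts with Promise.all. This is only a sketch: loadPart is a hypothetical parameter standing in for $.getJSON (any function returning a promise of a JSON array will do), and it assumes each part file is an independent, self-contained array.

```javascript
// Sketch: fetch N part files in parallel and concatenate them in order.
// loadPart is a hypothetical stand-in for $.getJSON, not jQuery API.
function mergeParts(urls, loadPart) {
  return Promise.all(urls.map(loadPart))   // fetch all parts in parallel
    .then(function (parts) {
      return parts.flat();                 // concatenate arrays in order
    });
}

// Usage with jQuery might look like:
//   mergeParts(["./big.part1.json.gz", "./big.part2.json.gz"],
//              function (u) { return $.getJSON(u); })
//     .then(function (result) { /* use result */ });
```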


Here is the Python code I used to generate the uncompressed JSON files (the equivalent code failed on Node.js):

import json

# long filler string; each JSON entry then takes about 128 + 4 bytes
# (the string itself plus quotes and a comma)
s = "a" * 129

data = []
i = 0
filesize = 280 * 1024 * 1024   # ~280 MB uncompressed
while i * (128 + 4) < filesize:
    data.append(s)
    i += 1

with open('big.json', 'w') as outfile:
    json.dump(data, outfile)

data = []
i = 0
filesize = 140 * 1024 * 1024   # ~140 MB uncompressed
while i * (128 + 4) < filesize:
    data.append(s)
    i += 1

with open('big.part1.json', 'w') as outfile:
    json.dump(data, outfile)

# part2 reuses the same half-size data
with open('big.part2.json', 'w') as outfile:
    json.dump(data, outfile)
@ashwalk33r

What browser did you use?


babarlelephant commented Feb 21, 2021

@ashwalk33r Chrome 32-bit.

I hadn't tested correctly on Firefox (32-bit); I needed to adjust the .htaccess file, and it now shows mostly the same behavior as Chrome.


babarlelephant commented Feb 21, 2021

I must say that

var req = new XMLHttpRequest();
req.responseType = "arraybuffer";
req.onload = function (oEvent) {
  var arrayBuffer = req.response;
  // the server sends Content-Encoding: gzip, so the browser
  // has already decompressed the response body at this point
  var data = new Uint8Array(arrayBuffer);
};
req.open("GET", "big.json.gz", true);
req.send();

seems to give the correct uncompressed data. So what remains is to JSON-parse it without building one large string.
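One way to avoid the giant string, sketched here under the loud assumption that the payload is exactly the array-of-plain-strings produced by the generator above (no escaped quotes, no nesting; this is not a general JSON parser), is to scan the raw bytes for quote delimiters and decode each element separately. parseStringArray is a hypothetical name:

```javascript
// Sketch: extract each string of a flat JSON string array straight from
// the bytes, decoding one element at a time instead of the whole file.
// Safe only because 0x22 ('"') never occurs inside a UTF-8 multibyte
// sequence, and because we assume the strings contain no escaped quotes.
function parseStringArray(bytes) {
  const QUOTE = 0x22;
  const decoder = new TextDecoder("utf-8");
  const result = [];
  let start = -1;
  for (let i = 0; i < bytes.length; i++) {
    if (bytes[i] !== QUOTE) continue;
    if (start < 0) {
      start = i + 1;                                   // opening quote
    } else {
      result.push(decoder.decode(bytes.subarray(start, i))); // closing quote
      start = -1;
    }
  }
  return result;
}
```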

timmywil (member) commented:

Splitting large strings, etc., is outside the scope of $.ajax. In such exceptional cases it's better to download the raw data and merge it manually. The right strategy will differ depending on the payload; I don't think we can build a universal solution into $.getJSON that works for all cases.
