Error decoding content #55

Closed
apitts opened this issue Jun 4, 2015 · 5 comments

Comments

apitts commented Jun 4, 2015

I am trying to upload some gzipped JSON content to S3 and keep running into an ERR_CONTENT_DECODING_FAILED error, which suggests something is awry with the encoding. Hopefully there is something straightforward that I've missed.

Here is what I'm using on the client:

      $upload.http({
        method: 'PUT',
        headers: {'Content-Type': 'application/json', 'Content-Encoding': 'gzip'},
        url: response.s3URL,
        data: pako.gzip(JSON.stringify(networkForUpload), {to: 'string'})
      }).success(function(data) {
        //Do more stuff
      });

And on the server, to get the signed URL:

      s3.getSignedUrl('putObject', {
        Bucket: config.bucketNames.uploadedNetworks,
        Expires: 300,
        Key: network._id.toString(),
        ContentType: 'application/json',
        ContentEncoding: 'gzip',
        ACL: 'public-read'
      }, function(err, data) {
        //Do more stuff
      });


puzrin commented Jun 4, 2015

data: pako.gzip(JSON.stringify(networkForUpload), {to: 'string'}) — I'm not sure this will be converted in the request to a proper equivalent of a gzipped body.

It's better to ask such questions on SO, because this isn't related to the gzip algorithm itself. But you can be sure that pako's output is 100% identical to zlib's.

puzrin closed this as completed Jun 4, 2015

apitts commented Jun 4, 2015

Understood. Thanks @puzrin! I will ask on SO; I've tried everything I could think of to solve the issue today.


puzrin commented Jun 4, 2015

You probably need to pass an ArrayBuffer to the XHR, and make sure the server side can really decode gzipped data. But I've never done it myself, and can't say exactly how to implement a gzipped HTTP request from the client.


apitts commented Jun 4, 2015

Thanks! I'll give that a go and keep plugging away at it.


apitts commented Jul 21, 2015

Just in case anyone comes across this issue: a nice solution to this problem was to write a simple Lambda function that gzips an object on an S3 PUT event.
