
Are there s3 examples? #354

jmparsons opened this issue May 11, 2017 · 10 comments

@jmparsons

Is there a way to use this as a wrapper for s3?

I'm trying to get binary data and upload it to my bucket:

const s3 = new AWS.S3();
const params = {
  Bucket: 'mybucket',
  Key: 'myfile.ext',
  Body: file,
  ACL: 'public-read',
};
s3.upload(params, (err, data) => {
  console.log(err, data);
});

I've tried using streams with data chunks, base64 with content encoding, etc. I see that fetch will convert base64 to binary on upload, but how do I wire this into the AWS SDK for React Native?
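One possible route, sketched but not verified in React Native: the environment doesn't ship Node's Buffer, but the standalone "buffer" package on npm provides the same API, and s3.upload accepts a Buffer as Body. Bucket, key, and path below are placeholders:

// Sketch only: decode a base64 file read into binary before handing it
// to s3.upload. Assumes the npm "buffer" package and an s3 client as above.
import { Buffer } from 'buffer';
import RNFetchBlob from 'react-native-fetch-blob';

async function uploadBinary(s3, path) {
  // readFile with 'base64' returns the whole file as a base64 string
  const base64 = await RNFetchBlob.fs.readFile(path, 'base64');
  const params = {
    Bucket: 'mybucket',   // placeholder
    Key: 'myfile.ext',    // placeholder
    Body: Buffer.from(base64, 'base64'), // binary, not base64 text
    ACL: 'public-read',
  };
  return new Promise((resolve, reject) => {
    s3.upload(params, (err, data) => (err ? reject(err) : resolve(data)));
  });
}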

@jmparsons
Author

Basically I'm looking for the equivalent of this node script:

var fs = require('fs');
var AWS = require('aws-sdk');
var s3 = new AWS.S3();

fs.readFile('my-image.jpg', function (err, data) {
  if (err) { throw err; }
  var base64data = Buffer.from(data); // data is already a Buffer; Buffer.from avoids the deprecated constructor
  var params = {
    Bucket: 'my-bucket',
    Key: 'my-image.jpg',
    Body: base64data,
    ContentType: 'image/jpg',
    ACL: 'public-read'
  };
  s3.upload(params, function(err, data) {
    if (err) {
      console.log(err);
    } else {
      console.log(data);
    }
  });
});

I have this:

RNFetchBlob.fs.readStream('my-image.jpg', 'base64')
.then((stream) => {
  let data = '';
  stream.open();
  stream.onData((chunk) => { data += chunk; });
  stream.onError(error => console.log(error));
  stream.onEnd(() => {
    const params = {
      Bucket: 'my-bucket',
      Key: 'my-image.jpg',
      ContentType: 'image/jpg',
      Body: data,
    };
    s3.upload(params, (error, result) => {
      console.log(error, result);
    });
  });
})
.catch(error => console.log(error));

It works, but I can't set the stream to binary, so the uploaded image isn't readable. I know fetch processes the upload as binary, but is there a convenience method to make readFile or readStream return binary?
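If the stream itself is the sticking point, one hedged workaround is to keep reading in base64 and decode once at the end, again assuming the npm "buffer" package is available (the library suggests a bufferSize that is a multiple of 3 when streaming base64, so the chunks concatenate into valid base64):

// Sketch: accumulate base64 chunks, then decode to binary in onEnd.
import { Buffer } from 'buffer'; // npm "buffer" package, assumed available

RNFetchBlob.fs.readStream('my-image.jpg', 'base64', 4095) // 4095 is a multiple of 3
.then((stream) => {
  let data = '';
  stream.open();
  stream.onData((chunk) => { data += chunk; });
  stream.onError(error => console.log(error));
  stream.onEnd(() => {
    s3.upload({
      Bucket: 'my-bucket',
      Key: 'my-image.jpg',
      ContentType: 'image/jpg',
      Body: Buffer.from(data, 'base64'), // decode to binary before upload
    }, (error, result) => console.log(error, result));
  });
});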

@wkh237
Owner

wkh237 commented May 14, 2017

@jmparsons, I'm not familiar with S3, but I think you can try uploading the file via the REST API instead of using the SDK.

jmparsons changed the title from "Are there are s3 examples?" to "Are there s3 examples?" May 14, 2017
@jmparsons
Author

jmparsons commented May 14, 2017

@wkh237 Hey, thanks for getting back to me. I'm temporarily using the REST API:

const filePath = 'my-file-path.jpg';
const file = `${RNFetchBlob.fs.dirs.DocumentDir}/${filePath}`;
const params = {
  Bucket: 'my-bucket',
  Key: filePath,
  ContentType: 'image/jpg',
};
const url = `https://${params.Bucket}.s3.amazonaws.com/${params.Key}`;
const result = await RNFetchBlob.fetch('PUT', url, {
  'Content-Type': params.ContentType,
}, RNFetchBlob.wrap(file));

The S3 uploader basically ties into AWS and wraps the request with a lot of information - anything from AWS config keys (access key and secret) to headers such as authorization tokens and access control lists.

S3 can accept buffers, blobs, and streams. I got different errors depending on which method I tried. I think the stream only took base64, so the end result was base64, and with the blob there was an issue with content length.

My previous attempts with the stream would upload, but since the buffer was set to base64 rather than binary, the images ended up corrupted in S3.

@wkh237
Owner

wkh237 commented May 14, 2017

@jmparsons, thanks for the explanations.

From my understanding, React Native has no typed-array implementation (there are commits working on it, but it's still in progress), so we can't make something like Node.js's Buffer in React Native. I think your code is correct and the file will be sent as binary data, but I'm not sure whether the error comes from our implementation or because something is still missing - could you provide more detail about the error?

Also, have you tried using Blob and XMLHttpRequest polyfills?
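Roughly, the polyfill route would look like this (an untested sketch following the library's polyfill docs; the Blob.build call and the ;BASE64 type marker are the parts to double-check, and s3 is assumed configured as in the earlier snippets):

// Sketch: install the library's Blob / XMLHttpRequest polyfills so
// XHR-based SDKs can send binary bodies.
import RNFetchBlob from 'react-native-fetch-blob';

const { Blob } = RNFetchBlob.polyfill;
window.XMLHttpRequest = RNFetchBlob.polyfill.XMLHttpRequest;
window.Blob = Blob;

RNFetchBlob.fs.readFile('my-image.jpg', 'base64') // placeholder path
  // the ';BASE64' suffix tells the polyfill to decode the data to binary
  .then(data => Blob.build(data, { type: 'image/jpg;BASE64' }))
  .then((blob) => {
    // with the polyfills in place, an XHR-based uploader such as the
    // AWS SDK should accept this blob as the request body
    s3.upload({ Bucket: 'my-bucket', Key: 'my-image.jpg', Body: blob },
      (err, data) => console.log(err, data));
  });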

@developer239

developer239 commented May 30, 2017

I have this upload function:

function uploadFile(file, signedUrl, callback) {
  RNFetchBlob.fetch('PUT', signedUrl, {
    'Content-Type': 'application/octet-stream'
  }, file.data)
    .uploadProgress((written, total) => {
      // TODO: Add callback that will update state in profile page?
      console.log('[progress] uploaded:', written / total)
    })
    .then((response) => {
      console.log('[success] RNFetchBlob.fetch succeeded')
      callback(response)
    })
    .catch((error) => {
      // TODO: Add callback that will handle error?
      console.log('[error] RNFetchBlob.fetch failed : ', error)
    })
}
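An alternative worth noting, sketched but unverified: RNFetchBlob can stream the raw bytes straight from disk if you wrap a file path, which sidesteps base64 bodies entirely:

// Sketch: upload raw file bytes by wrapping a path instead of passing
// a base64 string as the body.
function uploadFileFromPath(path, signedUrl, callback) {
  RNFetchBlob.fetch('PUT', signedUrl, {
    'Content-Type': 'application/octet-stream'
  }, RNFetchBlob.wrap(path)) // wrap() makes the body stream from disk
    .then(callback)
    .catch(error => console.log('[error] upload failed:', error))
}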

I created my own backend server:

var express = require('express');
var cors = require('cors');
var fs = require('fs');

/**
 * Saves base64String to image with pseudo random name
 *
 * @param base64String
 */
function saveBase64StringToFile(base64String) {
  var imgName = 'img-' + Math.floor(Date.now() / 1000) + '.jpg'; // TODO: make .jpg suffix dynamic
  var cleanBase64String = base64String.replace(/^data:image\/jpg;base64,/, ''); // TODO: make .jpg suffix dynamic

  fs.writeFile(imgName, cleanBase64String, 'base64', function (err) {
    if (err) {
      console.log('[error] decode base64 image failed: ', err);
    } else {
      console.log('image ' + imgName + ' successfully written in ./' + imgName)
    }
  });
}

// Server PORT
var PORT = 5000;
// Http server
var app = express();
// Allow CORS
app.use(cors());

app.put('/stream', function (req, res, next) {
  var content = '';

  // Handle streaming
  req.on('data', function (data) {
    // Return 413 and stop reading if the content gets too big;
    // without the early return, the 'end' handler would respond twice
    if (content.length > 1e6) {
      res
        .status(413)
        .json({ error: 'Request content got too large.' });
      req.destroy();
      return;
    }
    content += data;
  });

  // Handle end of streaming
  req.on('end', function () {
    saveBase64StringToFile(content);
    res.status(200).json({ yay: content });
  });
});

app.listen(PORT, function () {
  console.log('Web server listening on port ' + PORT)
});

When I try to upload the file to the Amazon signed URL, I get a SignatureDoesNotMatch error. My own server works fine.

@jmparsons
Author

@developer239 Thanks for that! How are you generating your signed URL? I was trying to test it out and do pretty much the exact same thing, but my signed URLs weren't authorizing.

@developer239

developer239 commented May 30, 2017

@jmparsons My team resolved the problem. It was caused by bad signed URL generation. I believe this code should work:

    const params = {
      Key: `${body.objectType}/${uuid}.${fileSuffix[1]}`,
      ContentType: body.mimeType,
      ACL: 'public-read', // not sure about this line
    }
    if (body.base64) {
      params.ContentEncoding = 'base64'
    }

    let uploadUrl = null
    try {
      uploadUrl = await Promise.fromCallback(done =>
        s3Bucket.getSignedUrl('putObject', params, done))
    } catch (err) {
      log.info({ errMessage: err }, 'Generating aws signed url')
      throw new errors.InternalServerError('E_INTERNAL_SERVER_ERROR',
        'Cannot generate aws signed url')
    }

    log.info({ uploadUrl }, 'AWS signed upload URL obtained.')

    return { signedUrl: uploadUrl }

If not, I intentionally included the server file above. If you install its dependencies

var express = require('express');
var cors = require('cors');
var fs = require('fs');

then run node index in the root directory. Point the client at http://localhost:5000/stream instead of a signed URL; when you send an octet-stream to that URL as a base64 string, it should write the file into the server project directory.
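For example, a client call against that local server could look like this (paths are placeholders, and the shape matches the uploadFile function above):

// Sketch: exercise the local test server with the uploadFile function above.
const path = RNFetchBlob.fs.dirs.DocumentDir + '/my-image.jpg'; // placeholder

RNFetchBlob.fs.readFile(path, 'base64')
  .then(base64 => uploadFile(
    { data: base64 },                 // matches uploadFile's file.data
    'http://localhost:5000/stream',   // the local endpoint, not a signed URL
    response => console.log('server replied:', response.info().status)
  ));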

@jmparsons
Author

@developer239

My signedUrl method looked like this - using API Gateway via Serverless with CORS set:

getSignedUrl = async (event, context, callback) => {
  const s3 = new AWS.S3();
  const body = JSON.parse(event.body);
  const params = { Bucket: this.mediaBucket, Key: body.filename };
  s3.getSignedUrl('putObject', params, (err, url) => {
    if (err) {
      callback(null, failure({ status: false, error: 'Signed url failed.' }));
    } else {
      callback(null, success(url));
    }
  });
}

I believe I had tried ContentType and ACL, and was going back and forth on the encoding. I'll come back to it soon and see if I can get it to work. It wasn't authorizing for me before, but those values, combined with application/octet-stream, could be the ticket.
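Sketched out, that would mean signing with the same headers the client actually sends; one hedged adjustment to the handler above, where body.contentType is a hypothetical field:

// Sketch: sign with the same ContentType the client's PUT request uses,
// since every signed header has to match the request exactly.
getSignedUrl = async (event, context, callback) => {
  const s3 = new AWS.S3();
  const body = JSON.parse(event.body);
  const params = {
    Bucket: this.mediaBucket,
    Key: body.filename,
    ContentType: body.contentType, // hypothetical field; must equal the
                                   // Content-Type header sent by the client
  };
  s3.getSignedUrl('putObject', params, (err, url) => {
    if (err) {
      callback(null, failure({ status: false, error: 'Signed url failed.' }));
    } else {
      callback(null, success(url));
    }
  });
}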

@tomduncalf

For future reference, I figured out how to do this eventually! https://gist.github.com/tomduncalf/17f57adf5a1343d20b3b3eee11cc7893

@jmandel1027

@tomduncalf ty sir! 🌮
