Are there s3 examples? #354
Basically I'm looking for the equivalent of this Node script:

```js
fs.readFile('my-image.jpg', function (err, data) {
  if (err) { throw err; }
  var base64data = new Buffer(data, 'binary');
  var params = {
    Bucket: 'my-bucket',
    Key: 'my-image.jpg',
    Body: base64data,
    ContentType: 'image/jpg',
    ACL: 'public-read'
  };
  s3.upload(params, function (err, data) {
    if (err) {
      console.log(err);
    } else {
      console.log(data);
    }
  });
});
```

I have this:

```js
await RNFetchBlob.fs.readStream('my-image.jpg', 'base64')
  .then((stream) => {
    let data = '';
    stream.open();
    stream.onData((chunk) => { data += chunk; });
    stream.onError(error => console.log(error));
    stream.onEnd(() => {
      const params = {
        Bucket: 'my-bucket',
        Key: 'my-image.jpg',
        ContentType: 'image/jpg',
        Body: data,
      };
      s3.upload(params, (error, result) => {
        console.log(error, result);
      });
    });
  })
  .catch(error => console.log(error));
```

It works, but I can't set the stream to binary, so the image isn't readable. I know fetch processes the upload as binary, but is there a convenience method to set readFile or readStream to binary? |
@jmparsons, I'm not familiar with S3, but I think you can try uploading the file via the REST API instead of using the SDK. |
@wkh237 Hey, thanks for getting back to me. I'm temporarily using the REST API:

```js
const filePath = 'my-file-path.jpg';
const file = `${RNFetchBlob.fs.dirs.DocumentDir}/${filePath}`;
const params = {
  Bucket: 'my-bucket',
  Key: filePath,
  ContentType: 'image/jpg',
};
const url = `https://${params.Bucket}.s3.amazonaws.com/${params.Key}`;
const result = await RNFetchBlob.fetch('PUT', url, {
  [HeaderNames.ContentType]: params.ContentType,
}, RNFetchBlob.wrap(file));
```

The S3 uploader basically ties into AWS and wraps the request with a lot of information - it could be as simple as AWS config keys (access key and secret), or headers such as authorization tokens and access control lists. S3 can accept buffers, blobs, and streams. I got different errors depending on which method I tried, with variations. I think the stream only took base64, so the end result was base64, and with blob there was an issue with content length. My previous attempts with the stream would upload, but since the buffer was set to base64 and not binary, the images would be corrupted in S3. |
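One small aside on the snippet above: it interpolates the key into the URL verbatim, which breaks for keys containing spaces or other reserved characters. A hypothetical helper (not part of the thread's code; bucket and key names are placeholders) sketches per-segment encoding for the virtual-hosted-style S3 URL:

```js
// Builds a virtual-hosted-style S3 object URL, encoding each path
// segment so keys with spaces or special characters stay valid while
// the '/' separators between segments are preserved.
function s3ObjectUrl(bucket, key) {
  const encodedKey = key.split('/').map(encodeURIComponent).join('/');
  return `https://${bucket}.s3.amazonaws.com/${encodedKey}`;
}

console.log(s3ObjectUrl('my-bucket', 'photos/my image.jpg'));
// → https://my-bucket.s3.amazonaws.com/photos/my%20image.jpg
```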
@jmparsons, thanks for the explanation. From my understanding, React Native has no typed array implementation (there are commits working on it, but it's still in progress), so we can't make something like Node.js's Buffer in React Native. I think your code is correct and the file will be sent as binary data, but I'm not sure whether the error comes from our implementation or because something is still missing - could you provide more detail about the error? Also, have you tried using the Blob and XMLHttpRequest polyfills? |
I have this upload function
I created my own backend server
When I try to upload the file to the Amazon signer URL I get |
@developer239 Thanks for that! How are you generating your signed URL? I was trying to test it out and do pretty much the exact same thing, but my signed URLs weren't authorizing. |
@jmparsons My team resolved the problem. It was caused by bad signed-URL generation. I believe this code could work?
If not, I intentionally included the server file. If you install
then run |
My signedUrl method looked like this - using API Gateway via Serverless with CORS set:

```js
getSignedUrl = async (event, context, callback) => {
  const s3 = new AWS.S3();
  const body = JSON.parse(event.body);
  const params = { Bucket: this.mediaBucket, Key: body.filename };
  s3.getSignedUrl('putObject', params, (err, url) => {
    if (err) {
      callback(null, failure({ status: false, error: 'Signed url failed.' }));
    } else {
      callback(null, success(url));
    }
  });
}
```

I believe I had tried content type and ACL, and was going back and forth on the encoding. I'll come back to it soon and see if I can get it to work. It wasn't authorizing for me before, but those values could be the ticket, combined with octet-stream. |
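On the "wasn't authorizing" symptom: a presigned URL from the SDK's `getSignedUrl` carries its credentials in the query string, including an expiry, so one cheap client-side check before the PUT is whether the URL has already lapsed. A sketch, assuming the classic (signature v2) style query parameter `Expires` that `getSignedUrl` produced at the time (v4-style URLs use `X-Amz-Date` plus `X-Amz-Expires` instead); the URL below is a made-up placeholder:

```js
// Returns true when a v2-style presigned URL's Expires timestamp
// (seconds since the epoch) is at or before "now".
function isExpired(signedUrl, nowSeconds = Math.floor(Date.now() / 1000)) {
  const raw = new URL(signedUrl).searchParams.get('Expires');
  if (raw === null) return false; // no v2 Expires param; not checkable here
  return Number(raw) <= nowSeconds;
}

const u = 'https://my-bucket.s3.amazonaws.com/my-image.jpg?AWSAccessKeyId=placeholder&Expires=1500000000&Signature=placeholder';
console.log(isExpired(u, 1600000000)); // true: this signature has lapsed
```

This only catches expiry; a mismatched ContentType or ACL between signing and upload would still fail the signature check server-side, as discussed above.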
For future reference, I figured out how to do this eventually! https://gist.github.com/tomduncalf/17f57adf5a1343d20b3b3eee11cc7893 |
@tomduncalf ty sir! 🌮 |
Is there a way to use this as a wrapper for S3?
I'm trying to get binary data and upload it to my bucket:
I've tried using streams with data chunks, base64 with content encoding, etc. I see that fetch will convert base64 to binary on upload, but how do I hook this into the AWS SDK for React Native?