Implementing upload to S3 #33

Closed
lox opened this Issue Feb 19, 2013 · 63 comments

@lox

lox commented Feb 19, 2013

I'm trying to get Dropzone to upload directly to Amazon S3 using CORS:

http://www.ioncannon.net/programming/1539/direct-browser-uploading-amazon-s3-cors-fileapi-xhr2-and-signed-puts/

For each upload, I need to be able to customize the URL it uploads to and provide additional form data. Any tips on where the best place to do that would be?

@lox

lox commented Feb 19, 2013

Seems to me passing in the FormData object to the send event would cover it nicely?

@enyo

Owner

enyo commented Feb 20, 2013

The FormData object is already filled with the necessary information to upload the file. If you can extract and extend it, that would be a good workaround for now. I'm currently implementing the ability to add additional data for each file; it should be done in about 1 1/2 weeks.

@enyo

Owner

enyo commented Feb 20, 2013

That's #42 now

@enyo enyo closed this Feb 20, 2013

@lox

lox commented Feb 21, 2013

How would I extract and extend the FormData object?

@tienshiao

tienshiao commented Feb 23, 2013

I've made a couple of tiny updates to my local version and I have it uploading to S3. It should work with multiple files, though I'm only using it with a single file upload.

You need to set up your CORS policy on S3 to allow your origin, various headers, and your method (I'm using POST).

I'm using signed POSTs, and the policy I'm generating does a begins-with check on the S3 key and expires after a day, so it should allow for multiple files over a period of time.
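
A policy of the kind described here might look roughly like this (the bucket name, key prefix and size limit below are placeholders, not tienshiao's actual values):

var policy = {
  expiration: "2013-02-24T12:00:00.000Z",    // roughly one day out
  conditions: [
    { bucket: "my-bucket" },
    ["starts-with", "$key", "uploads/"],     // the begins-with check on the S3 key
    ["content-length-range", 0, 31457280]
  ]
};
// The JSON policy is base64-encoded and signed server-side; the encoded policy
// and the signature are then sent as form fields alongside the file.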

Changes I made to dropzone.js to support uploads to S3:

  1. S3 expects the file to be the last parameter and does not evaluate parameters after the file parameter. Right now Dropzone adds the file parameter first. My fix was to modify uploadFile to add the file after it loops through the form inputs.
  2. Dropzone checks the content-type of the response for JSON, but my S3 responses do not have a content-type. I just updated the condition to check that there is a "content-type" header before calling indexOf() on it (a minimal sketch follows below).
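
A minimal sketch of what that second change could look like (not the actual dropzone.js source, just the idea):

var contentType = xhr.getResponseHeader("content-type");
if (contentType && contentType.indexOf("application/json") !== -1) {
  response = JSON.parse(xhr.responseText);
} else {
  response = xhr.responseText;
}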

Hope that helps,

Tienshiao

@BenoitLefebvre

BenoitLefebvre commented Feb 24, 2013

@lox, the tutorial you linked doesn't seem to work when uploading files with spaces in the filename. Have you checked that? Any solution?

enyo added a commit that referenced this issue Feb 26, 2013

enyo added a commit that referenced this issue Feb 26, 2013

enyo added a commit that referenced this issue Feb 26, 2013

@enyo

Owner

enyo commented Feb 26, 2013

@lox I've now added the formData object to the sending event. Thanks for your input.
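
Usage looks roughly like this (myDropzone and the appended field are illustrative, not from this thread):

myDropzone.on("sending", function(file, xhr, formData) {
  formData.append("key", "uploads/" + file.name);   // any extra POST field you need
});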

@lox

lox commented Mar 18, 2013

I have this working nicely now, thanks.

@Maxwell2022

Maxwell2022 commented Apr 29, 2013

I'm just having a look at the moment and I'm reading some documentation on AWS site: http://aws.amazon.com/articles/1434

If I succeed I'll add some documentation on how to set up S3 CORS rules, set up the policy, and handle it all in Dropzone. I'll post it here first to make sure I'm not doing something wrong or something that could be done much more easily ;)

@Maxwell2022

Maxwell2022 commented Apr 29, 2013

All right, I'm facing a few issues at the moment.
First, I've updated Dropzone from 1.3.9 to 2.0.14.

This is my dropzone configuration:

var dropbox = $('#dropbox'),
        uploaded_images = '.uploaded_images',
        message = $('.message', dropbox);

    dropbox.dropzone({
        // The name of the $_FILES entry:
        paramname: 'pic',
        maxfiles: 5,
        parallelUploads: 2,
        maxFilesize: 30, // in mb
        maxThumbnailFilesize: 8, // in MB
        thumbnailWidth:250,
        thumbnailHeight:150,
        previewsContainer: uploaded_images,
        url: 'http://my-bucket.s3.amazonaws.com',

        addedfile: function(file) {

            var tpl = twig({href: '/bundles/acmecontent/js/template/image_upload.html.twig', async:false});
            file.template = $(tpl.render());
            $(this.previewsContainer).append(file.template);
            file.template.find(".filename span").text(file.name);
            file.template.find("#filename").html(file.name);
            return file.template.find("#filesize").html(this.filesize(file.size));
        },
        sending: function(file, xhr, formData){
            $.post('/prepare-upload', {filename: file.name}, function(response){
                $.each(response, function(k, v){
                    formData.append(k, v);
                });
            }, 'json');
        },
        thumbnail: function(file, dataUrl) {
            file.template.removeClass("file-preview").addClass("image-preview");
            return file.template.find(".details img").attr('alt', file.name).attr('src', dataUrl);
        },
        processingfile: function(file) {
            return file.template.addClass("processing");
        },
        uploadprogress: function(file, progress) {
            return file.template.find(".progress .upload").css({
                width: "" + progress + "%"
            });
        },
        success: function(file, serverResponse, event) {
            var _this = this;
            file.template.find('.status_message .inputImage').val(serverResponse.file_name);

            file.template.find('.btn-delete').on('click', function(){
                _this.removeFile(file);
            });

            return file.template.addClass("done");
        },
        removedfile: function(file) {
            var _this = this;
            return file.template.fadeOut('fast', function(){
                this.remove();
            });
        },
        error: function(file, response) {
            response = $.parseJSON(response);
            file.template.addClass("error");
            return file.template.find(".error-message span").html(response.message);
        }
    });

This is the json that the server is returning when calling the prepare upload script:

{
"access_key":"AKIAIYXXXXXXXXXXXXXX",
"key":"path/to/object/ce9551636287f76881ec0df392db99eaa31acb27c.jpg",
"policy":"eyJleHBpcmF0aW9uIjoiMjAxMy0wNC0zMFQwNTozMDoxMyswMDowMCIsImNvbmRpdGlvbnMiOnsiYnVja2V0IjoiaW1nLnN0YWdlLnp1bWXXXXXXXXXXXXXX....",
"signature":"4fe9b1486ad27662f47cXXXXXXXXXXXXXXXX",
"success_action_status":"201"
}

It is supposed to append those values to the POST data before the file key.

  1. The first problem I had was with the POST URL: using HTTPS instead of HTTP returned a 500 for the OPTIONS request to Amazon. After switching to HTTP it now passes the OPTIONS request but fails with a 400 on the POST.
  2. So the second problem is that I'm receiving a 400 response from Amazon. When I look at the POST request in Chrome I can't see any of the parameters that sending is supposed to set using formData.append(key, value). The response from Amazon is the following:
<?xml version="1.0" encoding="UTF-8"?>
<Error><Code>InvalidArgument</Code><Message>Bucket POST must contain a field named 'key'.  If it is specified, please check the order of the fields.</Message><ArgumentValue></ArgumentValue><ArgumentName>key</ArgumentName><RequestId>5A8B1AAXXXXXXXX</RequestId><HostId>9j9Qb/L69GIiyLriejb/xdJJ/ts7nzFC22XXXXXXXXXXXXXXXXXXXXXX</HostId></Error>

This is the Request payload from the network tab in Chrome:

------WebKitFormBoundaryc2oPmNW82hkO442Y
Content-Disposition: form-data; name="file"; filename="560866_10151578668261632_175677954_n.jpg"
Content-Type: image/jpeg

------WebKitFormBoundaryc2oPmNW82hkO442Y--

  • Does anyone know why my POST variables are not sent with the file?
  • Can I use success_action_redirect to redirect to a script on my server, perform some actions and return a JSON response to Dropzone's success method?
  • If not, how can I parse the XML response that Amazon returns?

I'll keep investigating in the meantime.
Thanks,

Maxime

@lox

lox commented Apr 29, 2013

I just kept it very simple and did the signing on the server side before I generated the form page:

<form action="https://swiftly-uploads.s3.amazonaws.com/" id="s3dropzone">
  <input name="key" type="hidden" value="#random sha1 hash goes here#/${filename}">
  <input name="acl" type="hidden" value="private">
  <input name="policy" type="hidden" value="#encoded policy goes here#">
  <input name="signature" type="hidden" value="<<signature goes here>>">
  <input name="content-type" type="hidden" value="application/octet-stream">
  <input name="AWSAccessKeyId" type="hidden" value="AXXXXXXXXXXXXXXX">
  <input name="success_action_status" type="hidden" value="200">
</form>

The policy is good for several hours and for all the files uploaded. Happy to share more config if that helps.

@Maxwell2022

Maxwell2022 commented Apr 29, 2013

In my case, I don't have any form because I'm using multi-file upload. I was previously posting the files to my server and then uploading them to S3 (I was also creating a record in my database pointing to the file on S3).

But this is just a waste of time and bandwidth, so I was looking into signed POSTs to upload the files directly from the client instead.

I think the only problem I have is that the POST variables are not being set by formData.append(), but I don't understand why, and it's hard to debug because you can't inspect what is inside the FormData object.

I reckon xhr.send(formData) runs before the sending event has finished setting the formData object, but I'm not sure yet. I'll try appending some values manually to see how it goes.

@Maxwell2022

Maxwell2022 commented Apr 29, 2013

Yeah, I think this is the problem. In the sending method (the one replacing the old preparingupload method), if I set a new key manually like this:

sending: function(file, xhr, formData){
            // Manual test
            formData.append('test', 'amazon');

            $.post('/prepare-upload', {filename: file.name}, function(response){
                console.log(response);
                $.each(response, function(k, v){
                    formData.append(k, v);
                });
            }, 'json');
        },

In the Request Payload I can see the test one but not the rest:

------WebKitFormBoundaryaC3Tyb90bBXLFgYp
Content-Disposition: form-data; name="test"

amazon

So xhr.send(formData) is not waiting for the $.post request to set the values before it executes. As a result, the POST variables are not set properly before the XHR request goes out.

Is there any workaround?

@Maxwell2022

Maxwell2022 commented Apr 29, 2013

I've tried using the .done() method but it doesn't change anything.

$.post('/prepare-upload', {filename: file.name}).done(function(response){
    $.each(response, function(k, v){
        formData.append(k, v);
    });
}, 'json');
@lox

lox commented Apr 29, 2013

Not sure what you mean by not having a form. The form I included above is used to upload multiple files directly to S3 with a signed post.

@lox

lox commented Apr 29, 2013

Oh, unless you meant that your form is static HTML and you aren't able to generate it uniquely each time.

@Maxwell2022

Maxwell2022 commented Apr 29, 2013

Hmm, no. I have no static HTML form and no form tags anywhere in my code. I just have the Dropzone plugin attached to a div, that's it.

I finally fixed it using the following:

...
sending: function(file, xhr, formData){
    $.ajax({
        url: '/prepare-upload',
        data: {filename: file.name},
        type: 'POST',
        async: false,
        success: function(response){
            $.each(response, function(k, v){
                formData.append(k, v);
            });
        }
    });
},
...

async: false fixed it.

Now I have a problem with the signature of the request and am currently investigating.

@Maxwell2022

Maxwell2022 commented Apr 29, 2013

OK, I've fixed the issue I had with the signature of the policy (using PHP; here is a code sample: gist).

What I'd like to do is use success_action_redirect to call a script on my server that creates a record in the database. I know that I could use the success event to do so, but I'd like to do it directly via the redirect URL.

The reason is that I could create the database record and, if anything goes wrong at that stage, return an error message right away. It also saves me another AJAX request to my server.

So I've tried to set this up, pointing success_action_redirect to http://localhost/callback.php where I have a script waiting for some parameters.

But it looks like this script is never called and the response of the xhr.send() is empty.

I think it's a cross-domain issue, and I'm wondering if it would be possible to use JSONP somehow to get around this.
Any ideas?

@Maxwell2022

Maxwell2022 commented Apr 30, 2013

I've abandoned the idea of using success_action_redirect and am handling it with a second AJAX call when I receive the XML response from S3.
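
A rough sketch of that approach (assuming success_action_status is set to 201 so S3 returns an XML body, a dropzone instance is in scope, and /uploaded is a hypothetical endpoint on the application server):

dropzone.on('success', function(file) {
  var xml = file.xhr.responseXML;
  var location = xml && xml.getElementsByTagName("Location")[0].textContent;
  // tell the backend about the new S3 object so it can create the database record
  $.post('/uploaded', {url: location, name: file.name});
});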

@matoho

matoho commented May 7, 2013

@lox, I would be very interested to find out more about your solution. So if I understand correctly, you are reading the files array server-side, generating multiple forms with the signed Amazon data, and then submitting them? How do you track the progress of the files, and is your solution also robust with large files? Would it be possible, by any chance, to see some of the server-side code for your process? It would really help me with the implementation.

@lox

lox commented May 7, 2013

Just realized I already posted the form further up :) @maxgoesup I'll dig up the code I'm using server-side.

@enyo

Owner

enyo commented May 7, 2013

If somebody would be willing to write a wiki entry for this I would be very thankful :)

Apparently many people try to use Dropzone in combination with Amazon and it seems to be a bit of a hassle.

@lox

lox commented May 7, 2013

The server-side of my code is here: https://gist.github.com/lox/5532281

Perhaps I'll just build a simple example app.

@lox

lox commented May 7, 2013

It might be fairly obvious, but it took me a while to realize that this hidden field in the form is key:

<input name="key" type="hidden" value="some_prefix_I_choose/${filename}">

The ${filename} is interpreted by S3, and used to set the name and path of the uploaded file. This lets you use the one form for multiple uploads.

http://aws.amazon.com/articles/1434

@matoho

matoho commented May 7, 2013

Thanks a lot for your code. I will study it and will try to implement a solution in node/express and post it here when I get done.

@core2kx

core2kx commented Dec 14, 2013

@philippfrank Hmm I have always used drag and drop for the Dropzone. I actually don't set the uploadMultiple setting at all. See if that helps. Which of course contradicts my earlier comment.

@Maxwell2022

Maxwell2022 commented Dec 14, 2013

@philippfrank I'm not using an HTML form element but the formData object, and multiple uploads work great with it. I'm storing the S3 data I need on the file object (this is done in the accept() method) and setting that data before sending the file (in the sending() method):

accept: function(file, done)
        {
            file.postData = [];
            $.ajax({
                url: '/webservice/content/prepare-upload',
                data: {name: file.name, type: file.type, size: file.size},
                type: 'POST',
                success: function(response)
                {
                    file.custom_status = 'ready';
                    file.postData = response.post;
                    file.guid = response.data.guid;
                    file.s3 = response.post.key;
                    $(file.previewTemplate).addClass('uploading');
                    updateForm();
                    done();
                },
                error: function(response)
                {
                    file.custom_status = 'rejected';
                    updateForm();

                    if (response.responseText) {
                        response = parseJsonMsg(response.responseText);
                    }
                    if (response.message) {
                        done(response.message);
                    } else {
                        done('error preparing the file');
                    }
                }
            });
        },

and

sending: function(file, xhr, formData)
        {
            $.each(file.postData, function(k, v){
                formData.append(k, v);
            });
        },

I hope this helps.

@philippfrank

philippfrank commented Dec 14, 2013

@Maxwell2022 Thx for sharing. What's updateForm() doing?

@Maxwell2022

Maxwell2022 commented Dec 14, 2013

it's just for me to update some stuff in the UI

@ChickenFur

ChickenFur commented Feb 10, 2014

@Maxwell2022 Thx for this. What does an example file.postData look like? I looked at your gist https://gist.github.com/Maxwell2022/5480701#file-s3_policy_signature-php

And I see you are returning an object, but I don't see where you set the response.post value you are assigning to file.postData.

Is it simply the URL to the bucket and key, like the string below which I got from http://www.ioncannon.net/programming/1539/direct-browser-uploading-amazon-s3-cors-fileapi-xhr2-and-signed-puts/?

$url = urlencode("$S3_URL$S3_BUCKET$objectName?AWSAccessKeyId=$S3_KEY&Expires=$expires&Signature=$sig");

@Maxwell2022

Maxwell2022 commented Feb 10, 2014

@ChickenFur here is the Dropzone code I'm using right now. I'm setting the S3 parameters in the accept method, because the upload can't start until done() is called there, so a file is never sent without its S3 parameters. I'm getting the signature information by calling a webservice that returns the information from the gist you are referring to.

$dropzone.dropzone({
        paramname: 'pic',
        autoProcessQueue: false,
        clickable: true,
        maxfiles: 5,
        parallelUploads: 2,
        maxFilesize: 7, // in mb
        maxThumbnailFilesize: 8, // in MB
        thumbnailWidth:250,
        thumbnailHeight:150,
        previewsContainer: uploaded_images,
        acceptedMimeTypes: "image/bmp,image/gif,image/jpg,image/jpeg,image/png",
        url: 'https://s3.amazonaws.com/'+bucket,

        init: function()
        {
        },
        addedfile: function(file)
        {
        ...
        },
        accept: function(file, done)
        {
            file.postData = [];
            $.ajax({
                url: '/dashboard/webservice/content/prepare-upload',
                data: {name: file.name, type: file.type, size: file.size},
                type: 'POST',
                success: function(response)
                {
                    file.custom_status = 'ready';
                    file.postData = response.post;
                    file.guid = response.data.guid;
                    file.s3 = response.post.key;
                    $(file.previewTemplate).addClass('uploading');
                    done();
                },
                error: function(response)
                {
                    file.custom_status = 'rejected';

                    if (response.responseText) {
                        response = parseJsonMsg(response.responseText);
                    }
                    if (response.message) {
                        done(response.message);
                    } else {
                        done('error preparing the upload');
                    }
                }
            });
        },
        sending: function(file, xhr, formData)
        {
            $.each(file.postData, function(k, v){
                formData.append(k, v);
            });
        },
        ...
@ChickenFur

ChickenFur commented Feb 11, 2014

@Maxwell2022 Works great. I also referenced this rails cast here: http://railscasts.com/episodes/383-uploading-to-amazon-s3?view=asciicast

Here is the Ruby code I used to create the signed policy and upload parameters

  def policy
    Base64.encode64(policy_data.to_json).gsub("\n", "")
  end

  def policy_data
    {
      expiration: EXPIRE_TIME,
      conditions: [
        ["starts-with", "$key", ""],
        ["content-length-range", 0, MAX_SIZE],
        {bucket: S3_BUCKET},
        {acl: ACL}
      ]
    }
  end

  def signature
      Base64.encode64(
        OpenSSL::HMAC.digest(
          OpenSSL::Digest::Digest.new('sha1'),
          S3_SECRET, policy
        )
      ).gsub("\n", "")
  end


  def create
    if session[:user_id]
      name = params['name']
      mime_type = params['type']
      uniqueNum = Time.new.nsec
      folder = "#{params[:courseName]}"
      folder.gsub! " ", "-"
      key = "userid-#{session[:user_id]}-#{uniqueNum}#{name}"
      key = "#{folder}/#{key}"
      responseData = {}
      responseData[:key] = key
      responseData[:acl] = ACL
      responseData[:policy] =  policy
      responseData[:signature] = signature
      responseData[:AWSAccessKeyId] = S3_KEY
      render :json => responseData
    else
      render text: "not signed in"
    end
  end
@ericchernuka

ericchernuka commented May 15, 2014

Does anyone have a blog post on this? I see you guys are using Ruby/Rails and I'm a little lost on how to get this going.

Thanks!

@Maxwell2022

Maxwell2022 commented May 15, 2014

@SdShadab

SdShadab commented Oct 7, 2014

Hello everyone,

I have recently been trying to upload via Dropzone to S3 as well, and here is where I am so far

Form code:

<form id="my-awesome-dropzone" class="dropzone" enctype="multipart/form-data" action="http://bwvids.s3-website-ap-southeast-1.amazonaws.com">
  <input type="hidden" name="key" value="uploads/${filename}">
  <input type="hidden" name="acl" value="private">
  <input type="hidden" id="fld_Policy" name="policy" value="YOUR_POLICY_DOCUMENT_BASE64_ENCODED">
  <input type="hidden" id="fld_Signature" name="signature" value="YOUR_CALCULATED_SIGNATURE">
  <input type="hidden" id="fld_AWSAccessKeyId" name="AWSAccessKeyId" value="YOUR_AWS_ACCESS_KEY">
</form>

Here is the AJAX call I am making to my policy and signature generator:

var dropzone = '<form id="my-awesome-dropzone" class="dropzone" enctype="multipart/form-data" action="http://bwvids.s3-website-ap-southeast-1.amazonaws.com"><input type="hidden" name="key" value="uploads/${filename}"><input type="hidden" name="acl" value="private"><input type="hidden" id="fld_Policy" name="policy" value="YOUR_POLICY_DOCUMENT_BASE64_ENCODED"><input type="hidden" id="fld_Signature" name="signature" value="YOUR_CALCULATED_SIGNATURE"><input type="hidden" id="fld_AWSAccessKeyId" name="AWSAccessKeyId" value="YOUR_AWS_ACCESS_KEY"></form>';
    Dropzone.options.myAwesomeDropzone = { maxFilesize: 1, autoProcessQueue: true, init: function(){
                        this.on("sending", function(file) {
                          _file = file.name;
                          $.ajax({
                              url: "/gets3credentials",
                              dataType: "JSONP",
                              data: {filename: file.name},
                              type: 'POST', 
                              success: processResponse,
                              error: function(res, status, error) {}
                            })
                          });

Here is my gets3credentials file:

var createS3Policy;
var s3Signature;
var s3Credentials;

createS3Policy = function( mimetype, callback ) {
  var s3PolicyBase64, _date, _s3Policy;
  _date = new Date();
  s3Policy = {
    "expiration": "" + (_date.getFullYear()) + "-" + (_date.getMonth() + 1) + "-" + (_date.getDate()) + "T" + (_date.getHours() + 1) + ":" + (_date.getMinutes()) + ":" + (_date.getSeconds()) + "Z",
    "conditions": [
      { "bucket": "bwvids" }, 
      ["starts-with", "$Content-Disposition", ""], 
      ["starts-with", "$key", "uploads"], 
      { "acl": "public-read" }, 
      ["content-length-range", 0, 2147483648], 
      ["eq", "$Content-Type", mimetype]
    ]
  };

s3Credentials = {
    s3PolicyBase64: new Buffer( JSON.stringify( s3Policy ) ).toString( 'base64' ),
    s3Signature: CryptoJS.createHmac( "sha1", "kLpxywU7LLCbzn0y7djpqJQQcyc5WqP3ZE+TggOl" ).update( s3Policy ).digest( "base64" ),
    s3Key: "AKIAJSTK75RZKM5OWRLQ",
    s3Policy: s3Policy
};

  callback( s3Credentials );
};
    Meteor.startup(function () {
      Router.map(function() {
        this.route('gets3credentials', {
          path: '/gets3credentials',
          template: getTemplate('uploadvideo')
        });
      });
    });
}
});

And here is how I am placing the return values back into my form:

  function processResponse( res ) {
        $("#fld_AWSAccessKeyId").val(res.s3Key);
        $("#fld_Policy").val(res.s3PolicyBase64);
        $("#fld_Signature").val(res.s3Signature);
        $("#my-awesome-dropzone").submit();
        console.log('Getting Uploaded')
        };
      }};
    return dropzone
    }
});

However, I get a 405 error and here is my request payload:

------WebKitFormBoundaryXFobZqaE9NglBQZ7
Content-Disposition: form-data; name="key"

uploads/${filename}
------WebKitFormBoundaryXFobZqaE9NglBQZ7
Content-Disposition: form-data; name="acl"

private
------WebKitFormBoundaryXFobZqaE9NglBQZ7
Content-Disposition: form-data; name="policy"

YOUR_POLICY_DOCUMENT_BASE64_ENCODED
------WebKitFormBoundaryXFobZqaE9NglBQZ7
Content-Disposition: form-data; name="signature"

YOUR_CALCULATED_SIGNATURE
------WebKitFormBoundaryXFobZqaE9NglBQZ7
Content-Disposition: form-data; name="AWSAccessKeyId"

YOUR_AWS_ACCESS_KEY
------WebKitFormBoundaryXFobZqaE9NglBQZ7
Content-Disposition: form-data; name="file"; filename="logo-large.png"
Content-Type: image/png


------WebKitFormBoundaryXFobZqaE9NglBQZ7--

I have set up CORS properly on my bucket, and have enabled static website hosting. I cannot figure out what's causing the error.

@oscar-g

oscar-g commented Mar 19, 2015

Thanks for this! I was able to implement direct uploads in a Python-based app.

Has anybody tried implementing chunked file uploads using Dropzone and the AWS REST API for multipart upload?

I was thinking it would be possible to create a Dropzone instance for the file and drop all the chunks in that instance. The REST API uses PUT requests for multipart uploads, so it may take some Dropzone modification.

Has anybody tried anything like this?

@billyshena

billyshena commented Mar 26, 2015

Hello guys! Has anyone tried to implement Dropzone.js with the Amazon V4 signature? My bucket is located in eu-west-1 "Frankfurt", and some changes have to be made for the new signature system.

It would be great to see some code examples of implementing Dropzone, because the params for S3 should now be sent in the URL as GET parameters, e.g. https://s3.amazonaws.com?key=....&policy=...&signature=.....

Thanks in advance!

@oscar-g

oscar-g commented Mar 26, 2015

I implemented the AWS V4 examples here http://docs.aws.amazon.com/general/latest/gr/signature-version-4.html using Python and the front-end as discussed in previous comments of the issue. I am using the us-west-2 "Oregon" region; are you not able to use POST in eu-west-1?
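
For reference, the main V4 difference is how the policy gets signed and which form fields accompany it. A rough Node-style sketch of the signing-key derivation (the date, region and function names here are placeholders, not code from this thread):

var crypto = require('crypto');

function hmac(key, data) {
  return crypto.createHmac('sha256', key).update(data).digest();
}

// dateStamp like '20150326', region like 'us-west-2', policyBase64 is the
// base64-encoded policy document
function signPolicyV4(secretKey, dateStamp, region, policyBase64) {
  var kDate = hmac('AWS4' + secretKey, dateStamp);
  var kRegion = hmac(kDate, region);
  var kService = hmac(kRegion, 's3');
  var kSigning = hmac(kService, 'aws4_request');
  return crypto.createHmac('sha256', kSigning).update(policyBase64).digest('hex');
}

// The form then carries policy, x-amz-algorithm (AWS4-HMAC-SHA256),
// x-amz-credential, x-amz-date and x-amz-signature instead of the V2
// AWSAccessKeyId/signature pair.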


@billyshena

billyshena commented Mar 26, 2015

Hi Oscar! Thanks for your quick reply 👍
Well, I'm trying to do a direct upload to Amazon S3 from my AngularJS client (using the Dropzone.js plugin) with the AWS V4 signature, but I have no idea how to do that at the moment.

@oscar-g

oscar-g commented Mar 28, 2015

I set up my JavaScript similar to #33 (comment) and a correction. I generated the signature and the request variables on the server by following the AWS examples.
What exactly are you having issues with?

@ajitpawar

ajitpawar commented Apr 20, 2015

Here is my working version of the AWS S3 policy signing for Node.js v0.12.2.
Note: the crypto module ships with Node.js v0.12.2 and its documentation can be found here. You can use other third-party libraries for SHA-1 if you want.

Controller:

module.exports = {
  index: function (req, res) {

    var crypto = require('crypto');     // pre-installed in v0.12.2

    var bucket = "xxx";
    var accessKeyId = 'xxxxxxxx';
    var secret = 'xxxxxxxxxxxxxxxxxxxx';
    var _date = new Date();

    var s3Policy = {
    "expiration": "" + (_date.getFullYear()) + "-" + (_date.getMonth() + 1) + "-" + (_date.getDate()) + "T" + (_date.getHours() + 1) + ":" + (_date.getMinutes()) + ":" + (_date.getSeconds()) + "Z",
    "conditions": [
          {"bucket": bucket},
          {"acl": "public-read"},
          ["starts-with", "$key", ""],
          ["starts-with", "$Content-Type", "image/"],
          ["starts-with", "$name", ""],
          ["starts-with", "$Filename", ""]
        ]
    };

    var base64 = new Buffer(JSON.stringify(s3Policy)).toString('base64');
    var sign = crypto.createHmac("sha1", secret).update(base64).digest("base64");

    var s3Credentials = {
        s3PolicyBase64: base64,
        s3Signature: sign,
        s3Key: accessKeyId,
        s3Bucket: bucket
    };

    // Pass it to the view
    res.view({credentials: s3Credentials });
  }
};

View:

<html>
   <%= credentials.s3Key %>
   <%= credentials.s3PolicyBase64 %>
   <%= credentials.s3Signature %>
</html>
@tombroomfield

tombroomfield commented May 18, 2015

Hi guys, thanks for all the good information in this thread.
With the new params option on Dropzone, you can make a request to the server and set all the information dynamically instead of through hidden fields, which is helpful if different files might have different policy requirements.
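
A small sketch of that idea, assuming the params option is given a plain object of extra fields fetched from a signing endpoint beforehand (all values below are placeholders):

new Dropzone("#uploads", {
  url: "https://my-bucket.s3.amazonaws.com/",
  params: {
    key: "uploads/${filename}",
    acl: "private",
    policy: "BASE64_ENCODED_POLICY",
    signature: "CALCULATED_SIGNATURE",
    AWSAccessKeyId: "YOUR_ACCESS_KEY",
    success_action_status: "201"
  }
});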

@deepwell

deepwell commented Oct 23, 2015

I had the same problem with uploading files directly to S3; this is one way to do it with Dropzone 4.

First, use the accept option (validation) to get a signed upload URL every time a file is dropped into Dropzone (this works because it uses a callback API).

  var dz = new Dropzone(element, {
    accept: this.getUploadUrl,
  });
  dz.on('processing', function(file) {
    // change url before sending
    this.options.url = file.uploadUrl;
  });

  function getUploadUrl(file, cb) {
    var params = {
      fileName: file.name,
      fileType: file.type,
    };
    $.getJSON('/api/signed_s3_url', params).done(function(data) {
      if (!data.signedRequest) {
        return cb('Failed to receive an upload url');
      }

      file.uploadUrl = data.signedRequest;
      cb();
    }).fail(function() {
      return cb('Failed to receive an upload url');
    });
  }

This adds the URL to the file object as uploadUrl and uses it in the processing event to set the upload url.

File uploads were then failing with:

<?xml version="1.0" encoding="UTF-8"?>
<Error><Code>SignatureDoesNotMatch</Code><Message>The request signature we calculated does not match the signature you provided. Check your key and signing method.</Message><AWSAccessKeyId>AKIA...

And this was because Dropzone was sending a multipart upload with WebKitFormBoundary (Chrome).

To upload just the file content, I switched Dropzone to use PUT and overrode the sending method.

  new Dropzone(element, {
    method: 'put',
    sending: function(file, xhr) {
      var _send = xhr.send;
      xhr.send = function() {
        _send.call(xhr, file);
      };
    },
  });
@rnjailamba

rnjailamba commented Jan 29, 2016

My code, which worked:

var myDropzone = new Dropzone(document.body, { // Make the whole body a dropzone
  url: "https://blogimages.s3-ap-southeast-1.amazonaws.com/testdropzone.txt?AWSAccessKeyId=AKIAJDTELPCXKBE3LBQ&Content-Type=text%2Fplain%3Bcharset%3DUTF-8&Expires=1460047721&Signature=oMYqfWg0Q%2FOi3kX%2BcDfaRdokvA8%3D",
  thumbnailWidth: 80,
  thumbnailHeight: 80,
  parallelUploads: 20,
  previewTemplate: previewTemplate,
  autoQueue: false, // Make sure the files aren't queued until manually added
  previewsContainer: "#previews", // Define the container to display the previews
  acceptedMimeTypes: "text/plain",
//  acceptedMimeTypes: "image/bmp,image/gif,image/jpg,image/jpeg,image/png",
  headers: {'Content-Type': 'text/plain;charset=UTF-8'},
  method: 'put',
  sending: function(file, xhr) {
    var _send = xhr.send;
    xhr.send = function() {
      _send.call(xhr, file);
    };
  },
  clickable: ".fileinput-button" // Define the element that should be used as click trigger to select files.
});
@monicao

monicao commented Mar 2, 2016

Thanks everyone for your posts. This was very helpful.

Here is a version that worked for me. It is using the sending callback to apply the s3 policy to the formData object instead of building a url. I thought this was a bit cleaner, although this solution still monkeypatches accept.

Also, the complete callback is a place where you can get access to the file URL returned by S3. This is important because S3 will URL-encode the key supplied in the policy.

    var dropzone = new Dropzone(".dropzone", {
      maxFilesize: 12, // MB
      maxFiles: 20,
      acceptedFiles: "image/png,image/jpg,image/jpeg",
      paramName: "file", // This line might not be necessary b/c "file" might be the default.
      clickable: true, // User can click on the dropzone to open file upload window
      url: `https://${config.s3_bucket}.s3.amazonaws.com/`,
      accept: (file, cb) => {
        // my app uses sockets to communicate with the server.
        // this socket message requests the s3 policy.  
        // this could just as well be an ajax request, the point is to have your backend return a policy object.
        socket.pushGetUploadPolicy({
          filename: file.name,
          mimetype: file.type,
          callback: (response) => {
            if(response.error) {
              cb(response.error); return
            }
            file.policy = response.success.policy // see below for an example of what this object looks like
            cb() // don't forget to call the callback, otherwise the upload will not start.
          }
        })
      }
    })
    // called just before the file is sent
    dropzone.on('sending', (file, xhr, formData) => {
      for(var key in file.policy) {
        formData.append(key, file.policy[key])
      }
    })
    // called for each file when the file upload is complete
    dropzone.on('complete', (file) => {
      var uploadSuccessful = file.xhr.status >= 200 && file.xhr.status < 300
      if(uploadSuccessful) {
        // S3 will url encode the key supplied with the policy.
        // ex: fakeslug/fatboy_sand.jpg -> fakeslug%2Ffatboy_sand.jpg
        var encodedFileUrl = file.xhr.responseXML.querySelector("PostResponse Location").innerHTML
        // TODO: save the encodedFileUrl on the server
      }
    })

The policy object returned by the server looks something like this:

{
  "success_action_status"=>"201",
  "signature"=>"djfh2938f/djhf29efkkf=",
  "policy"=>"asdfjhdf982flj0d9uefpgj303r9gjILF9fg3b25kaXRpb25zIjpbeyJidWNrZXQiOiJkZXNpZ25kcm9wLWRldiJ9LHsiYWNsIjoicHVibGljLXJlYWQifSxbInN0YXJ0cy13aXRoIiwiJENvbnRlbnQtVHlwZSIsleSIsImZha2VzbHVnL2ZhdGJveV9zYW5kLmpwZyJdLHsic3VjY2Vzc19hY3Rpb25fc3RhdHVzIjoiMjAxIn1dfQ==",
  "key"=>"fakeslug/fatboy_sand.jpg",
  "acl"=>"public-read",
  "Content-Type"=>"image/jpeg",
  "AWSAccessKeyId"=>"ADK384KDFLKDKFJDF"
}
@kfei

kfei commented Nov 27, 2016

Thanks everyone in this thread and I want to share my working code as well. 😄

My scenario:

  1. The client (browser) calls an AWS Lambda function to get a pre-signed upload URL for each file being added.
  2. When the pre-signed URL is returned in the response, the client triggers dropzone.processFile immediately.
  3. When the file is being processed, change dropzone.options.url for that file accordingly.

Hints:

  • As I'm signing a PUT-able upload URL, I hijack xhr.send as @deepwell already mentioned.

The final code:

// In the `accept` function we request a signed upload URL when a file being accepted
accept (file, done) {
  lambda.getSignedURL(file)
    .then((url) => {
      file.uploadURL = url
      done()
      // And process each file immediately
      setTimeout(() => dropzone.processFile(file))
    })
    .catch((err) => {
      done('Failed to get an S3 signed upload URL', err)
    })
}

// Set signed upload URL for each file being processing
dropzone.on('processing', (file) => {
  dropzone.options.url = file.uploadURL
})

A full example can be found in my Vue S3 dropzone component (the code related to Dropzone and S3 is actually framework agnostic).

@mmoehrlein

mmoehrlein commented Mar 8, 2017

I have a solution using only a single PHP file.
For authorization it uses Amz-Signature.

Here is the link to the PHP file, and here is the CORS policy:

<CORSConfiguration xmlns="http://s3.amazonaws.com/doc/2006-03-01/">
	<CORSRule>
	        <AllowedOrigin>*</AllowedOrigin>
	        <AllowedMethod>GET</AllowedMethod>
	        <AllowedMethod>POST</AllowedMethod>
	        <MaxAgeSeconds>3000</MaxAgeSeconds>
	        <AllowedHeader>*</AllowedHeader>
	   </CORSRule>
</CORSConfiguration>

It is a simple, ready-to-run solution which can be hosted on a PHP server and opened as a web page.
You just have to insert your credentials, bucket and region.

@haoxi911

haoxi911 commented Apr 26, 2017

I am doing the same in my website, and everything is working now.

What I would like to explore is this: when my customer adds 50 files to Dropzone, I currently have to send 50 requests to the backend to get signed URLs, and I would rather send a single request with all 50 file names.

Can we add an event in Dropzone.js that passes an array of newly added files (triggered when the user closes the file dialog, or when a drag & drop finishes)?

Thanks,
Kevin
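
For what it's worth, a rough sketch of batching those requests, assuming a Dropzone version that fires the plural addedfiles event and a hypothetical /signed-urls endpoint that maps file names to pre-signed URLs:

dropzone.on('addedfiles', function(files) {
  var names = Array.prototype.map.call(files, function(f) { return f.name; });
  $.post('/signed-urls', {filenames: names}, function(urlsByName) {
    Array.prototype.forEach.call(files, function(f) {
      f.uploadURL = urlsByName[f.name];   // picked up later, e.g. in the 'processing' event
    });
  }, 'json');
});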

@AnalogJ AnalogJ referenced this issue Aug 24, 2017

Closed

quietthyme storage enable: #94

@badosu

badosu commented Apr 24, 2018

I was having a weird error in which the images uploaded to S3 had the payload headers prepended to their content (i.e. the whole multipart payload was uploaded instead of just the image). It was very hard to diagnose, but this configuration snippet fixed it:

sending: function(file, xhr) {
  var _send = xhr.send;
  xhr.send = function() {
    _send.call(xhr, file);
  };
}
@paulwilton

paulwilton commented Jun 18, 2018

How to do fully managed, chunked uploads to S3 using AWS-SDK with Dropzone
https://datalanguage.com/news/s3-managed-uploads
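
The gist of that approach, sketched very roughly (bucket name, element selector and credential setup are placeholders; autoProcessQueue is disabled so Dropzone never sends its own request):

var s3 = new AWS.S3();   // assumes the AWS SDK for JavaScript is loaded and credentials are configured

var dz = new Dropzone("#uploads", { url: "/", autoProcessQueue: false });

dz.on("addedfile", function(file) {
  var upload = s3.upload({ Bucket: "my-bucket", Key: file.name, Body: file });
  upload.on("httpUploadProgress", function(evt) {
    dz.emit("uploadprogress", file, (evt.loaded / evt.total) * 100, evt.loaded);
  });
  upload.send(function(err) {
    if (err) { dz.emit("error", file, err.message); } else { dz.emit("success", file); }
    dz.emit("complete", file);
  });
});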

@fittyCent

fittyCent commented Sep 9, 2018

I was trying to upload images to S3 and it is frustrating for sure. I tried overriding the sending function, and while it did finally upload the images without the multipart form info, there were a few issues:

  • Dropzone's resizing routine didn't get fired prior to upload. All images in S3 were huge.
  • I haven't confirmed this, but it seemed that multi-file upload didn't work very well. I'd get an unpredictable number of files in S3 (always fewer than were actually uploaded).

I resorted to posting the request to my node server and using multer-s3 there. Here's my implementation:

  const aws = require('aws-sdk');
  const multer = require('multer');
  const multerS3 = require('multer-s3');
  let s3 = new aws.S3();
  var upload = multer({
    storage: multerS3({
        s3: s3,
        bucket: S3_BUCKET_NAME,
        acl: 'public-read',
        key: function (req, file, cb) {
            cb(null, Date.now()+file.originalname); 
        }
    })
  });

  //use by upload form
  app.post('/uploadHandler', upload.single('file'), function (req, res, next) {

    if (req.file && req.file.originalname) {
      console.log(`Received file ${req.file.originalname}`);
    }
    res.status(200).json({
      fileURL: req.file.location
    })
  });
