This repository has been archived by the owner on May 25, 2023. It is now read-only.

Upload multiple files directly to S3? #910

Closed
gangsteryepyedye opened this issue Dec 30, 2011 · 50 comments

Comments

@gangsteryepyedye

I have been using the jQuery-File-Upload plugin with Paperclip and S3 for quite a while now and I really like it. However, as the files I need to upload get bigger, uploading directly to S3 has become my only option.

This article demonstrates how to upload a single file to S3; unfortunately, it doesn't cover uploading multiple files.
https://github.com/blueimp/jQuery-File-Upload/wiki/Upload-directly-to-S3

So my question is: is it possible to iterate through a list of files and create a single POST form for each file on the fly?

Or is it possible to create an independent form and then repopulate its fields in each iteration?

Any tips are greatly appreciated.
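For what it's worth, the plugin can already do the "one POST per file" part: its `singleFileUploads` option (the default) submits each selected file in its own request. A hedged sketch of wiring it to an S3 bucket endpoint — the bucket URL, key prefix, ACL, and the `POLICY`/`SIGNATURE` placeholders are assumptions, and the policy and signature have to be generated server-side:

```javascript
// Hypothetical helper: the extra form fields S3's POST API expects with
// every single-file request. Key prefix, ACL and credentials are placeholders.
function s3FormFields(accessKey, policy, signature) {
  return [
    { name: 'key', value: 'uploads/${filename}' },
    { name: 'AWSAccessKeyId', value: accessKey },
    { name: 'acl', value: 'private' },
    { name: 'policy', value: policy },
    { name: 'signature', value: signature }
  ];
}

// Browser wiring: singleFileUploads (the plugin default) makes the plugin
// send one POST per file, which is exactly what S3 requires.
if (typeof jQuery !== 'undefined') {
  jQuery('#fileupload').fileupload({
    url: 'https://your-bucket.s3.amazonaws.com/',
    singleFileUploads: true,
    forceIframeTransport: true, // needed as long as S3 lacks CORS support
    formData: s3FormFields('YOUR_ACCESS_KEY', 'BASE64_POLICY', 'SIGNATURE')
  });
}
```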

@chrisjdavis

You would handle this in your processing script, which JFU hands off to. I have implemented this approach in two PHP-based projects I am working on, and it isn't too difficult.

What language are you working with?

@eltoro

eltoro commented Jan 16, 2012

I'm going to be doing this same thing - uploading multiple files directly to S3. I'll be using node.js/Express. Does anyone have experience with that?

@gorman

gorman commented Jan 17, 2012

I'm in the exact same boat.

The best I've been able to figure out is to loop through the .files of the $fileinput and then pass those into the plugin's "add" API. I can't seem to get the form POST to S3 quite right, though. I assign data.form to the actual form, since the plugin can't figure it out from what gets passed into add (just a file object). That ends up not including a file input (since it's just working with a file object instead of the entire file input), which I think is part of the problem. It looks like the plugin should be able to take just a file object and post it somehow, but that doesn't seem to work. Furthermore, I end up with a response from S3 that doesn't make a ton of sense: an "InvalidArgument" error with the message "Conflicting query string parameters: acl, policy".

Anyway, I'm not surprised it isn't working, since I've really had to hack my way just to get this far. I started working with the same guide, and before I tried to enable multiple file support, it worked great. Any hints on the right direction to head would be greatly appreciated.
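A minimal sketch of the loop described above (the `#s3form` id and the change-handler wiring are assumptions, not from this thread):

```javascript
// Split a FileList-like collection into one-file batches, the shape the
// plugin's programmatic "add" method expects ({files: [file]}).
function toSingleFileBatches(files) {
  var batches = [];
  for (var i = 0; i < files.length; i++) {
    batches.push({ files: [files[i]] });
  }
  return batches;
}

// Browser wiring: one add() call per file means one POST to S3 per file.
if (typeof jQuery !== 'undefined') {
  jQuery('input[type=file]').on('change', function () {
    toSingleFileBatches(this.files).forEach(function (batch) {
      jQuery('#s3form').fileupload('add', batch);
    });
  });
}
```

On the "Conflicting query string parameters: acl, policy" error: that message may indicate the acl/policy fields ended up in the form action URL's query string rather than in the POST body, so it's worth checking that the action is the bare bucket URL with no query parameters.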

@238357

238357 commented Jan 17, 2012

I was using JFU with Ruby before I switched to a Flash plugin. The Flash one works well, except it doesn't support drag and drop, but I'm willing to trade that for the speed the Flash plugin provides.

@gorman

gorman commented Jan 17, 2012

Thanks for the quick response and suggestion. I was really hoping to stick with a non-Flash solution, but I may look into it.


@eltoro

eltoro commented Jan 18, 2012

I'm just starting on it, but I'll let you know how it goes. I'm using Express, so I'll grab the uploads with req.files, and then I'm using Knox to read the files into memory and do a PUT.
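A rough sketch of that flow, assuming Knox's documented `putFile` API — the bucket name, the `/upload` route, and the `req.files.upload` field name are all placeholders:

```javascript
// Pure helper: destination key and resulting public URL for an upload.
function s3Destination(bucket, filename) {
  var key = '/uploads/' + filename;
  return { key: key, url: 'https://' + bucket + '.s3.amazonaws.com' + key };
}

// Express route using a Knox client (client = knox.createClient({...})).
// putFile copies the multipart temp file up to S3, then we answer with JSON.
function registerUploadRoute(app, client, bucket) {
  app.post('/upload', function (req, res) {
    var file = req.files.upload; // multipart field name is an assumption
    var dest = s3Destination(bucket, file.name);
    client.putFile(file.path, dest.key, function (err) {
      if (err) return res.send(500, err.message);
      res.json({ url: dest.url });
    });
  });
}
```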

@JonnyBGod

Why not a full browser-side solution?

I implemented JFU to upload files directly to S3 with no need to contact the server apart from the initial generation of the page. I just need it to upload each file separately instead of posting all files at once, which S3 doesn't support.

Is there any way to force iframe-based posting to make a single POST for each file, like the default behavior? Or do we need to hack a bit to get there?

@eltoro

eltoro commented Feb 5, 2012

I need to collect some info on each upload and store it in Mongo. So now I'm looping through req.files.uploads, uploading to S3 with Knox, inserting into Mongo, building the return JSON, and then sending the response. It all works fine, but I'm having to write the files to disk because I'm not sure how to grab the writeStream directly.
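On the stream question: Knox also has a documented `putStream` method, so the temp-file step can likely be skipped, as long as the content length is known up front (S3 requires it). A sketch, with the header values as assumptions:

```javascript
// Pipe an incoming upload straight to S3 instead of writing it to disk.
// S3 needs Content-Length up front, so the caller must supply the length.
function streamToS3(client, stream, length, mime, key, done) {
  client.putStream(stream, key, {
    'Content-Length': length,
    'Content-Type': mime,
    'x-amz-acl': 'private'
  }, done);
}
```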

@ncri

ncri commented Feb 18, 2012

I'm able to upload straight to S3 with multiple files in parallel and per-file progress using this plugin. The trick is to create the upload form on the fly and store it on S3 as well, to get around the same-origin policy. I will post some code soon.

@gorman

gorman commented Feb 18, 2012

Another option is to just have a static file on S3 and pass the necessary fields to it using something like http://benalman.com/projects/jquery-postmessage-plugin/ -- that way you're not regenerating the file each time, and you can pass other information to and from the iframe (upload progress, guid, etc.). With this iframe approach, I've managed to get direct S3 uploading with multiple files, dropzones and upload progress all working, without the user realizing there's an iframe/external page involved.
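A sketch of such a postMessage channel — the origins, the `guid` field, and the `updateProgressBar` helper are hypothetical:

```javascript
// Message format the S3-hosted iframe sends back to the parent page.
function makeProgressMessage(guid, loaded, total) {
  return JSON.stringify({
    guid: guid,
    loaded: loaded,
    total: total,
    done: loaded >= total
  });
}

// Inside the iframe (browser only), e.g. from the upload progress event:
//   parent.postMessage(makeProgressMessage(guid, e.loaded, e.total),
//                      'https://www.example.com'); // the parent's origin
//
// In the parent page:
//   window.addEventListener('message', function (e) {
//     if (e.origin !== 'https://my-bucket.s3.amazonaws.com') return;
//     var msg = JSON.parse(e.data);
//     updateProgressBar(msg.guid, msg.loaded / msg.total);
//   });
```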

@ncri

ncri commented Feb 18, 2012

Sounds cool! My solution works transparently too -- the user doesn't notice being on an external page/iframe. But using postMessage sounds more elegant...

@ncri

ncri commented Feb 18, 2012

Hm, from your link, do I understand it right that you can only communicate from the frame to its parent, not the other way round? To embed an upload form in an iframe with dynamic params, you need to communicate from parent to frame. Are you simply passing query URL parameters? If so, postMessage is not even necessary...

@ncri

ncri commented Feb 19, 2012

Actually, it works fine to simply pass the dynamic parameters to the iframe and then read them from within the form on S3. I just tested it. So there's no need for postMessage.
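That approach needs nothing more than a tiny query-string parser inside the static S3-hosted page. A sketch (the form field names are placeholders):

```javascript
// Parse window.location.search into a plain object of decoded params.
function parseQuery(search) {
  var params = {};
  search.replace(/^\?/, '').split('&').forEach(function (pair) {
    if (!pair) return;
    var kv = pair.split('=');
    params[decodeURIComponent(kv[0])] = decodeURIComponent(kv[1] || '');
  });
  return params;
}

// Inside the S3-hosted upload page: fill the hidden S3 fields from the URL.
// var p = parseQuery(window.location.search);
// document.forms[0].key.value = p.key;
// document.forms[0].policy.value = p.policy;
// document.forms[0].signature.value = p.signature;
```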

@gorman

gorman commented Feb 19, 2012

Yeah, that's true -- all of the necessary parameters can be passed directly when calling the iframe. postMessage is still useful for sending triggers back to the parent, though: when the file is uploading, what the progress is, and when it's done.

@ncri

ncri commented Feb 19, 2012

Well, you can also display the progress in the iframe, that's what I do.

@samnang

samnang commented Feb 20, 2012

Could you guys share a working example of this?

@ncri

ncri commented Feb 20, 2012

Hi @samnang, I will post an example Rails application as soon as it's ready. It's working, but I have some issues with after-upload callbacks. Cross-domain Ajax is not supported in some browsers (like Opera), so I will use postMessage as @gorman suggested, as it has wider browser support. I should have something ready later this week.

@gorman

gorman commented Feb 20, 2012

I should mention that I actually decided to switch to Plupload because I had better luck getting it to work cross-browser (unfortunately, I can't remember exactly why), although I do like jQuery File Upload better (I don't use any of its Flash stuff, though). I'm curious to see @ncri's sample in action, as I'd love to switch back.

@ncri

ncri commented Feb 20, 2012

Okay, stay tuned... ;-)

@ncri

ncri commented Feb 22, 2012

Okay, I got an example app up: https://github.com/ncri/s3_uploader_example
Please let me know what you think, or if you have troubles getting it working.

@calvincorreli

Hi Nico

I played with it. Looks pretty neat.

I don't see that there's a dropzone outside the file input field. Is there?

Ideally, I'd like to hide the file input field entirely, use a link button to open up the dialog, and have a large dropzone as an alternative. How would I go about doing that?

Is it only Opera that has the cross-domain problem? Because I have zero Opera users, so there might be a simpler solution.

I'm trying to get the base fileupload-direct-to-S3 example to behave with multiple file uploads and drag-and-drop, but it insists on trying to upload multiple files in one go, which doesn't work, and when dropping files in the dropzone, the file input field is left untouched, so either nothing is uploaded, or the last selected file is uploaded again :(

//Lars

@calvincorreli

Hm, I am running into some problems when uploading large files - 800 MB, for example, which isn't unusual for my needs.

The browser (FF 10.0.2 on OS X 10.7.3) stalls for a long time before it starts uploading (doing something with the file?).

Then it starts uploading, but after half a minute or so, Amazon gives a 400 Bad Request error with the following message: "Your socket connection to the server was not read from or written to within the timeout period. Idle connections will be closed.".

Any ideas about what would cause this?

//Lars

@ncri

ncri commented Mar 5, 2012

Hi Lars, thanks for the feedback.

To answer your questions: no, there is no dropzone outside the file input yet. I think you should be able to have a dropzone outside the iframe, but the built-in dropzone support of jQuery File Upload might not work across the iframe. So you will probably have to make your own dropzone outside the iframe and notify the uploader of dropped files via postMessage. I haven't used a link button to open the file dialog yet, but I think that should be possible. Maybe check the existing jQuery File Upload examples for that.

About the cross-domain issue: it's an Amazon S3 limitation, not a browser issue, so it affects all browsers. Basically, Amazon S3 only accepts requests from pages on the same origin - which is, well, S3. If you don't need progress bars and parallel uploads, the solution from the first message of this issue might be sufficient for you: https://github.com/blueimp/jQuery-File-Upload/wiki/Upload-directly-to-S3.

About the large files: I assume you have set the file size limit correctly, so that it is larger than 800 MB? I haven't tried large files myself yet, as my internet connection here is not very fast. I will do that as soon as I have a chance. I have read that S3 sometimes has issues with large files.

Good luck! If you'd like to extend/fix the example app, you're welcome to - I'm happy to accept pull requests. ;-)

Nico

@calvincorreli

Hi Nico

Thanks for your help with this.

I get it now. It's the XMLHttpRequest upload you can't do to S3, because of the missing cross-domain header - but you can use an iframe instead. Got it.

Still having the issue with uploading large files.

I will probably end up just sticking with the solution I have using nathancolgate/s3-swf-upload-plugin right now.

It's Flash, and it doesn't do drag-and-drop, but it's already in place, and it works reliably, including with files several gigabytes in size, which is pretty common in my app.

I looked into getting drag & drop working, but I can't figure out how to get the dropped files into the file input widget on the form that you upload - you can't set the value of a file input widget using scripting, for security reasons, right?

//Lars

@ncri

ncri commented Mar 5, 2012

Yeah, so far I also use a Flash uploader, which works (mostly) quite well. This is all pretty new and experimental, but I'm trying to make it production-ready. I will keep you updated if I can get large files and drag and drop working properly.

@ncri

ncri commented Mar 5, 2012

Okay, I just tried to upload an 800 MB file and it started just fine. A quick Google search for the error you get suggests it is potentially related to your ISP or other local network problems. My connection here is terribly slow, so I canceled the upload at about 20% (after one hour :), but it looked like it would have continued just fine. The speed seems similar to FTP: I just compared, and my FTP program estimates about 6 hours for the 800 MB file, so the uploader was actually even slightly faster.

@calvincorreli

Weird, I'm still having some issues here. Anyway, I'll let this rest until there's progress on the front. Not in the mood for breaking frontiers on this particular front just now :)

Thanks so much for your assistance and for contributing your solution.

//Lars


@ncri

ncri commented Mar 6, 2012

Yeah, sure! It's just good to know what kinds of issues there are. Are you still getting the 400 Bad Request? By the way, you can have the dropzone in the uploader iframe - that works for sure.

@jimlyndon

Excellent - thanks, folks, for the work on this. Just what I was looking for. Also, @ncri, thanks for your solution - I just ported your example app to ASP.NET MVC for any .NET folk out there: https://github.com/jimlyndon/S3Uploader

@tim-peterson

I'm trying to use JFU to directly upload multiple files to my S3 bucket using PHP - has anyone made progress with that?

@chrisjdavis

Yeah I have it working in 3 different projects. Not too difficult. I can post some code samples later if that would help.


@calvincorreli

With drag-and-drop also?


@tim-peterson

Hi Chris, if you'd be willing to share some code, that would be super helpful - thanks so much! I've got the JFU and S3 parts figured out separately, but connecting the two is giving me headaches.

@tim-peterson

Hi Chris, larspind, and I guess all those interested in AWS S3: does uploading directly to S3 negate the need for a queueing script like beanstalkd for large files (movies)? I'm thinking this because S3 would seem to have huge bandwidth relative to one's server.

It looks like you can just use the AWS SDK's S3 multipart_upload API calls. Am I right that multipart upload can take the place of a queueing script? I apologize if I'm comparing apples and oranges here.

@chrisjdavis

Of course - why else would you use this bit of JS?


@calvincorreli

I had trouble getting drag-and-drop to work with direct upload to S3. People are uploading files that are frequently 2-3 GB in size. But it seemed that drag-and-drop was incompatible with direct upload unless there was an iframe hosted on S3 or something.

In particular, I wanted the dropzone to be outside the iframe... that didn't seem to be possible.

@tim-peterson

Sorry Chris, I was just trying to give a sense of my level of expertise (not high).

@tim-peterson

larspind, good to know you can do 2-3 GB no problem - is that with breaking the file up into multiple parts, or keeping it intact?

@calvincorreli

Intact.

@stephentaylor-com

@chrisjdavis could you please post your php/s3 example for JFU? I've been looking for insight on this problem for a few days. Thanks!

@JonnyBGod

Would it be possible to use CloudFront's new dynamic content functionality to easily upload directly to S3 through XHR?

The idea would be to set up a CloudFront distribution for the S3 bucket, with a second origin pointing to the web server to serve the XHR headers.

I'm not an expert on the matter, so please forgive me if I just committed some kind of crime.

@tim-peterson

Thanks @JonnyBGod, I'm not sure either. I'd like to use CloudFront too, so I'll investigate...

@JonnyBGod

I believe it might be possible to upload directly to S3 without needing CORS, using CloudFront as middle routing:

Domain.com
|
CloudFront
|
(/s3/* >> S3 Bucket) (* >> Web Server)

This way you can have the same domain for the web server and the S3 bucket.

I haven't tested the solution yet. The only problem I think could come up is SSL support, as CloudFront only supports SSL connections through xxx.cloudfront.net URLs.

@jvdp

jvdp commented Aug 9, 2012

Sadly, the CloudFront idea doesn't work :-(
To begin with, CloudFront does not allow any POST requests: http://aws.amazon.com/cloudfront/faqs/#What_types_of_HTTP_requests_are_supported_by_Amazon_CloudFront

@tim-peterson

Thanks @jvdp. For anyone who cares: I decided to first upload the files to my web server (an EC2 instance), and then, once the user is happy with the upload, send them off to S3/CloudFront.

Since the S3 API is limited in what you can do to/with the files, for my project it made sense to give the files a temp home where I can play with them as needed.

@JonnyBGod

Amazon S3 announces Cross-Origin Resource Sharing (CORS) support:

http://aws.amazon.com/about-aws/whats-new/2012/08/31/amazon-s3-announces-cross-origin-resource-sharing-CORS-support/
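For anyone landing here later: enabling this is a per-bucket configuration. A minimal example following the CORS configuration schema from the S3 documentation (the allowed origin is a placeholder):

```xml
<CORSConfiguration>
  <CORSRule>
    <AllowedOrigin>https://www.example.com</AllowedOrigin>
    <AllowedMethod>POST</AllowedMethod>
    <AllowedMethod>PUT</AllowedMethod>
    <AllowedHeader>*</AllowedHeader>
    <MaxAgeSeconds>3000</MaxAgeSeconds>
  </CORSRule>
</CORSConfiguration>
```

With this in place, the browser can POST directly to the bucket over XHR, so the iframe transport workarounds discussed above should no longer be necessary.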

@tim-peterson

@JonnyBGod thanks for the link this is awesome news!

Has anyone adapted the jQuery-file-upload plugin to make use of CORS direct to S3?

@mvanleest

Not yet, but I hope somebody is hacking on the code right now to make it work :-)

@bamarni

bamarni commented Sep 4, 2012

This looks great! I'm also looking for a way to implement this feature: direct upload from the browser to S3 with CORS.

I don't fully understand how it works; currently I'm only using the API.

@blueimp
Owner

blueimp commented Sep 6, 2012

Please read:
https://github.com/blueimp/jQuery-File-Upload#support

I'm cleaning up the issues tracker so it only contains actual bugs or feature requests directly related to the plugin itself.
Meaning, it should only contain issues that I or another developer should be working on to improve the plugin itself.

Please continue this discussion on the support forum or another location of your choice.
Feel free to add links to your posts here.

@blueimp blueimp closed this as completed Sep 6, 2012