Option to upload zips to S3 to bypass size limit & multi-part uploads #248
Comments
Hmmm yeah maybe a PR. I'd like to avoid it if possible since the FunctionCode stuff is so simple, no need to clean up after old functions etc, but I'm not opposed to S3 either. I thinkkkkk they may be raising this limit soon, so maybe it's not worth adding quite yet
Actually, maybe S3-only would be good at some point, we could do the multi-part upload to speed things up. Out of curiosity, are you using Node? Maybe bundling will help there in the meantime.
@tj
@franciscocpg nice! Yeah I agree, long-term it would be sweet if we could chunk it and speed those up
I ran into a somewhat related issue, but not exactly the same. I'm trying to deploy a package that has a size of 38 MB, so it is still smaller than the 50 MB limit, but I'm on a bad internet connection (320 Kbps). During deployment it takes a long time and then fails with: So multi-part uploads would probably also help in cases where the file size is less than 50 MB but the internet connection is bad.
Hi @komuw. Also, the Serverless Framework, which is a mature and well battle-tested framework, always uses S3 uploads, IMO.
Yeah, as long as we "clean" the bucket so it's not littered with old deploys, things should be ok, just a bit more manual work. I believe the minimum chunk size is 5 MB, though on a bad connection maybe ~4-5 chunks of 5 MB won't really improve uploads much. CI is definitely a nicer option there if possible
+1, have seen this happen on crappy internet when running
Closed by #272
I'm trying to deploy a package that has a size of 109 MB: more than the 50 MB Lambda function deployment package size limit, but less than the 250 MB limit on the size of code/dependencies that you can zip into a deployment package (ref: http://docs.aws.amazon.com/lambda/latest/dg/limits.html).
To get rid of this 50 MB deployment package size limitation, we could do the following when the zip file is larger than 50 MB (and less than 250 MB, of course):

1. Upload the zip file to an S3 bucket.
2. Set the S3Bucket and S3Key fields of the FunctionCode/UpdateFunctionCodeInput structs (in replacement of the ZipFile field).

What do you think about this solution?
May I propose a PR?