
Uploading large files to S3 fails #173

Closed · levyitay opened this issue Oct 14, 2013 · 3 comments
Labels: duplicate (This issue is a duplicate.)

Comments

@levyitay

The code below fails to upload large files to S3.

I get a flood of the following warning (more than 200 lines at once):

(node) warning: Recursive process.nextTick detected. This will break in the next version of node. Please use setImmediate for recursive deferral.
...
RangeError: Maximum call stack size exceeded

This repeats until the process dies.

When running with the --throw-deprecation flag, I get the following stack trace:


node.js:375
        throw new Error(msg);
              ^
Error: (node) warning: Recursive process.nextTick detected. This will break in the next version of node. Please use setImmediate for recursive deferral.
    at maxTickWarn (node.js:375:15)
    at process.nextTick (node.js:480:9)
    at emitReadable (_stream_readable.js:400:13)
    at readableAddChunk (_stream_readable.js:165:9)
    at EncryptedStream.Readable.push (_stream_readable.js:127:10)
    at EncryptedStream.read [as _read] (tls.js:510:10)
    at EncryptedStream.Readable.read (_stream_readable.js:320:10)
    at flow (_stream_readable.js:579:52)
    at Socket.<anonymous> (_stream_readable.js:563:7)
    at Socket.EventEmitter.emit (events.js:117:20)

The file I uploaded is ~150 MB.
Node version: v0.10.15

See the attached code:

config.json:

{ "accessKeyId": "myAccesssKeyId", "secretAccessKey": "mySecret", "region": "us-east-1"   }

var fs = require('fs');
var AWS = require('aws-sdk');
AWS.config.loadFromPath('./config.json');
var s3 = new AWS.S3();
var localFile = '/var/log/myLog.log';

// Read the whole file into memory, then upload it with a single putObject call.
fs.readFile(localFile, function (err, data) {
    if (err) {
        console.log('failed to read file before uploading to s3 due to %s', err);
        return;
    }

    if (data.length > 0) {
        console.log('uploading %s to s3', localFile);

        s3.putObject({
            Body: data,
            Key: '/',
            Bucket: 'myBucket'
        }, function (err, data) {
            if (err) {
                console.log(err);
            }
            if (!data) {
                console.log('data returned null');
            }
        });
    }
});
@lsegal (Contributor) commented Oct 14, 2013

This is the same issue reported in #150 and #158. The SDK never actually calls process.nextTick in the code path you are using, so this is most likely an issue with a third-party library (possibly the async module, if you are using it) or with Node.js itself. I would recommend retrying with the latest stable release of Node.js, v0.10.20.
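For context, here is a minimal illustration (not SDK code) of the pattern the warning text describes, alongside the setImmediate deferral the message recommends; the function names and counters are invented for the sketch:

// Re-scheduling with process.nextTick never lets the event loop turn,
// which is what triggers the "Recursive process.nextTick detected"
// warning on Node 0.10 (counters added so the sketch terminates):
var badLeft = 10000;
function pumpWithNextTick() {
    if (--badLeft <= 0) return;
    process.nextTick(pumpWithNextTick); // runs before any I/O can fire
}
pumpWithNextTick();

// setImmediate yields back to the event loop between iterations,
// so I/O callbacks can run and the stack never piles up:
var goodLeft = 10000;
function pumpWithSetImmediate() {
    if (--goodLeft <= 0) return;
    setImmediate(pumpWithSetImmediate);
}
pumpWithSetImmediate();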

lsegal closed this as completed Oct 14, 2013
@lsegal (Contributor) commented Oct 14, 2013

I would also recommend using multipart uploads (createMultipartUpload, uploadPart, and completeMultipartUpload) for larger files, to reduce the time spent recovering from failed uploads.
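For reference, a minimal sketch of that multipart flow using the callback-style API of this SDK generation. The bucket, key, 5 MB part size, and the serial one-part-at-a-time loop are illustrative choices, not requirements; real code would stream parts from disk and retry failures rather than buffering the whole file:

var fs = require('fs');
var AWS = require('aws-sdk');
AWS.config.loadFromPath('./config.json');

var s3 = new AWS.S3();
var bucket = 'myBucket';
var key = 'myLog.log';
var partSize = 5 * 1024 * 1024; // S3's minimum part size is 5 MB

fs.readFile('/var/log/myLog.log', function (err, data) {
    if (err) return console.log('read failed: %s', err);

    s3.createMultipartUpload({ Bucket: bucket, Key: key }, function (err, mp) {
        if (err) return console.log('createMultipartUpload failed: %s', err);

        var parts = [];
        var partNumber = 1;
        var offset = 0;

        // Upload one part at a time, recording each returned ETag.
        (function uploadNext() {
            if (offset >= data.length) {
                // All parts sent; stitch them together server-side.
                return s3.completeMultipartUpload({
                    Bucket: bucket,
                    Key: key,
                    UploadId: mp.UploadId,
                    MultipartUpload: { Parts: parts }
                }, function (err) {
                    if (err) return console.log('complete failed: %s', err);
                    console.log('upload complete');
                });
            }
            s3.uploadPart({
                Bucket: bucket,
                Key: key,
                UploadId: mp.UploadId,
                PartNumber: partNumber,
                Body: data.slice(offset, offset + partSize)
            }, function (err, res) {
                if (err) return console.log('uploadPart failed: %s', err);
                parts.push({ ETag: res.ETag, PartNumber: partNumber });
                partNumber += 1;
                offset += partSize;
                uploadNext();
            });
        })();
    });
});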

lock bot commented Sep 30, 2019

This thread has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs and link to relevant comments in this thread.

lock bot locked as resolved and limited conversation to collaborators Sep 30, 2019