Is there a way to retrieve progress for a .addFile request? #28

Open
MeanwhileMedia opened this Issue · 6 comments

2 participants

@MeanwhileMedia

I thoroughly searched the object returned from a .addFile request, but cannot determine if it is possible to retrieve progress while a file is being sent. Any help would be much appreciated. Happy holidays!

@bmeck

What are you looking for exactly? The amount of data sent?

@MeanwhileMedia

Yes sir. I thought maybe I could track it by listening for 'data' events on 'readStream', the original read stream that I passed to the .addFile request (since it uses a .pipe). However, the readStream doesn't get paused; the system reads right through it immediately. That is when I started looking at the object returned by the .addFile request (reqStream), but I can't see how to use it to track the amount of data sent.

// Read stream for the local file; streamopts sets the buffer size
var readStream = fs.createReadStream(path + '.' + extension, streamopts);
var upOpts = {
    headers: {
        'content-type': 'video/' + extension,
        'content-length': totalBytes
    },
    remote: CDNfilename,
    stream: readStream
};

// addFile pipes readStream up to Cloud Files and returns the request stream
var reqStream = cloudClient.addFile(Container.name, upOpts, function (err, uploaded) {
    if (err) { console.log(err); }
});
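The 'data'-listener attempt mentioned above looked roughly like this (a minimal sketch reusing readStream and totalBytes from the snippet; it only measures bytes read from disk, not bytes that have actually reached cloudfiles):

var bytesRead = 0;
readStream.on('data', function (chunk) {
    // chunk.length is the number of bytes just read from disk
    bytesRead += chunk.length;
    console.log('read', bytesRead, 'of', totalBytes, 'bytes');
});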
@bmeck

OK, so exposing the stream on the result is one thing, so that we keep compatibility with raw file paths. However, I'm unsure exactly what you need. Right now you can do something similar to the following (this is just a simple example without cloudfiles, but the approach is the same):

var http = require('http'),
    fs = require('fs'),
    request = require('request');

var file = 'bigFile.tgz';

// Minimal server so the POST has somewhere to go
var server = http.createServer(function (req, res) {
    req.on('data', function () {});
    req.on('end', function () { res.end(); });
});

server.listen(8008, function () {
    var req = request({
        method: 'POST',
        url: 'http://127.0.0.1:8008'
    });
    var stream = fs.createReadStream(file);
    var total = fs.statSync(file).size;
    var sent = 0;
    // Count bytes as they are read from disk and report percent progress
    stream.on('data', function (data) {
        sent += data.length;
        console.log('sending', data.length, 'progress', Math.floor(sent / total * 100));
    });
    stream.pipe(req);
});

Is this not doable for you?

@MeanwhileMedia

So, if I'm understanding correctly, this would be the same as listening for 'data' events on my stream 'readStream', the stream that is being piped to the 'request' module. This is what I tried before, but for some reason all the data events for a 5 MB stream finish in about 15 ms (obviously not how long it actually takes to send to cloudfiles). However, if I listen for the 'end' event on the same stream, it does accurately reflect when the upload completes (although that doesn't really help me track progress).

I assumed that readStream would get paused, since it is part of a .pipe, but that doesn't seem to affect how quickly the system reads through the file. Could this have something to do with my buffer? I have it set at 64 KB.
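A rough sketch of an alternative not shown in this thread, using plain http.request rather than the request module: instead of watching the read stream, watch the underlying socket. The client request's 'drain' event fires once buffered data has been flushed, and socket.bytesWritten reflects what has actually gone out over the wire (opts and total below are assumed to be the request options and total file size):

// Sketch: report progress from the socket side instead of the read stream
var http = require('http');

var req = http.request(opts);
req.on('drain', function () {
    if (req.socket) {
        console.log('progress', Math.floor(req.socket.bytesWritten / total * 100));
    }
});
readStream.pipe(req);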

@bmeck
@MeanwhileMedia

Now that I can send the file in chunks (thanks to you), I won't even need to fetch progress reports in my application. Thanks!
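The chunked approach itself isn't shown above; as a rough sketch, core fs can read a file in fixed-size pieces using the start/end options of createReadStream, and each piece could then be sent as its own request (CHUNK_SIZE and uploadChunk below are hypothetical names, not part of cloudfiles):

var fs = require('fs');

var CHUNK_SIZE = 64 * 1024 * 1024; // hypothetical 64 MB segments
var size = fs.statSync(file).size;

for (var start = 0; start < size; start += CHUNK_SIZE) {
    var end = Math.min(start + CHUNK_SIZE, size) - 1; // 'end' is inclusive
    var chunk = fs.createReadStream(file, { start: start, end: end });
    // uploadChunk is a placeholder for however each piece is uploaded
    uploadChunk(chunk, start, end);
}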
