
Error after downloading lots of files #115

Open
alexcroox opened this issue May 1, 2015 · 15 comments

Comments

@alexcroox

I have a timer that runs every 100ms. It goes through a list of known remote file paths on the FTP server and attempts to download them to the local machine.

The timer has locks to ensure that only 3 requests are active at any one time (every time a download finishes it releases another slot to the pool).

This works well for about 20 seconds before get() errors out with

Unable to make data connection

Here is a reduced example of what is happening

var fs = require('fs');
var ftpClient = require('ftp');

function ftpController() {

    this.conn = new ftpClient();
}

ftpController.prototype.downloadFile = function(remotePath, cb) {

    cb = cb || function() {};

    // mods.folderPath is the configured local download directory
    var localPath = mods.folderPath + remotePath;

    // use this instance's connection (the original snippet referenced a
    // module-level `ftp` instance here)
    this.conn.get(remotePath, function(err, stream) {

        if (err) {

            debug('** Error downloading ' + remotePath, err);

            cb(false);
        } else {

            stream.pipe(fs.createWriteStream(localPath));
            cb(true);
        }
    });
};

// Stripped code example
ftp.downloadFile('/test/test1.txt', releaseAnotherDownload);
ftp.downloadFile('/test/test2.txt', releaseAnotherDownload);
ftp.downloadFile('/test/test3.txt', releaseAnotherDownload);
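The "3 active downloads" lock described above might look something like this minimal sketch. `Semaphore` is a hypothetical helper name, not part of the ftp module; only `downloadFile`/`releaseAnotherDownload` come from the example itself:

```javascript
// Minimal semaphore: run at most `limit` tasks at once, queue the rest.
function Semaphore(limit) {
    this.limit = limit;   // max tasks running at once
    this.active = 0;      // tasks currently running
    this.queue = [];      // tasks waiting for a free slot
}

// Run `task` now if a slot is free, otherwise queue it.
Semaphore.prototype.acquire = function(task) {
    if (this.active < this.limit) {
        this.active++;
        task();
    } else {
        this.queue.push(task);
    }
};

// Free a slot and start the next queued task, if any.
Semaphore.prototype.release = function() {
    this.active--;
    if (this.queue.length > 0) {
        this.active++;
        this.queue.shift()();
    }
};
```

The timer body would then call something like `slots.acquire(function() { ftp.downloadFile(path, function() { slots.release(); }); })`, which matches the "release a slot when a download finishes" behaviour described above.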

Should I be creating a new connection for each download rather than trying to reuse the same one? Or is my FTP server throttling me because I'm not making a new connection for each file?

@alexcroox
Author

OK, after creating a new connection for every single file I seem to be having more success! Is this the correct behaviour? It seems like it should be slower and less efficient, but it looks OK in my testing!

@alexcroox
Author

OK, the connection-per-file approach won't work: a tester of mine got a "too many connections from this IP" error after a while. Back to the drawing board...

@phillipgreenii

I was just bitten by this as well. It looks like the problem is in connection.js with self._pasvSocket = socket;. It replaces _pasvSocket each time. So for example, GET 1 sets _pasvSocket, GET 2 overrides _pasvSocket, then GET 1 completes, which clears _pasvSocket, so GET 2 fails with Unable to make data connection. In my testing, it was always the third GET that was causing the problem, but I would guess that it depends on the timing for each GET.
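If that diagnosis is right (a single _pasvSocket per control connection), one workaround is to never let two gets overlap on the same connection. Here is a minimal sketch of such a queue, assuming node-ftp's get(path, cb) callback shape and that the delivered stream emits 'close' when the data connection is done; `enqueueGet` is an illustrative name:

```javascript
// Chain each get() onto the previous one's completion so the single
// _pasvSocket is never overwritten by a concurrent transfer.
var tail = Promise.resolve();

function enqueueGet(conn, remotePath, handle) {
    var job = tail.then(function() {
        return new Promise(function(resolve, reject) {
            conn.get(remotePath, function(err, stream) {
                if (err) return reject(err);
                // wait for the data connection to drain before the next get
                stream.on('close', resolve);
                handle(stream);
            });
        });
    });
    tail = job.catch(function() {}); // a failed get must not stall the queue
    return job;
}
```

Callers still fire requests as fast as they like; the queue simply spaces the transfers out on the wire.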

@vintuwei

vintuwei commented Feb 5, 2016

@phillipgreenii Did you find a solution for this?

@phillipgreenii

@vintuwei I did not. I ended up enforcing one file at a time via async.series. You can see my code at https://github.com/phillipgreenii/node-ftp-stream/blob/master/src/index.js
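For reference, a dependency-free sketch of the same one-file-at-a-time idea: `series` below mimics just the slice of the async library's API that this needs (the real async.series also collects results and supports more call shapes); the task bodies are illustrative:

```javascript
// Run callback-style tasks strictly one after another, stopping on error.
function series(tasks, done) {
    var i = 0;
    (function next(err) {
        if (err || i === tasks.length) return done && done(err);
        tasks[i++](next);
    })();
}
```

With the downloadFile example from this thread, each task would look like `function(next) { ftp.downloadFile('/test/test1.txt', function() { next(); }); }`.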

@vintuwei

vintuwei commented Feb 7, 2016

Thank you @phillipgreenii

@mcmunder

Thank you as well @phillipgreenii

I also found that using ftp.destroy() instead of ftp.end() can resolve the issue. Although that is probably not the right way to do it...

@dustinbolton

dustinbolton commented Mar 25, 2017

@phillipgreenii @vintuwei I'm using https://github.com/coopernurse/node-pool to try to pool ftp instances and transfer multiple files in parallel. It seems to work okay, although the transfer processes intermittently hang indefinitely. Really frustrating...
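The pooling idea can be sketched without the library: cap live connections below the server's per-IP limit and give each transfer a whole connection (and therefore its own data socket). `ConnPool` and `createConn` below are illustrative stand-ins, not node-pool's API:

```javascript
// Tiny fixed-size pool: hand out idle connections first, create new ones up
// to `max`, and queue callers beyond that until a connection is released.
function ConnPool(createConn, max) {
    this.createConn = createConn; // e.g. function() { return new ftpClient(); }
    this.max = max;               // keep below the server's per-IP limit
    this.size = 0;                // connections created so far
    this.idle = [];               // connections not currently transferring
    this.waiting = [];            // callers waiting for a connection
}

ConnPool.prototype.acquire = function(cb) {
    if (this.idle.length > 0) return cb(this.idle.pop());
    if (this.size < this.max) {
        this.size++;
        return cb(this.createConn());
    }
    this.waiting.push(cb);
};

ConnPool.prototype.release = function(conn) {
    if (this.waiting.length > 0) return this.waiting.shift()(conn);
    this.idle.push(conn);
};
```

Because each pooled connection runs only one get at a time, this also sidesteps the _pasvSocket overwrite described earlier in the thread.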

@yonib05

yonib05 commented Jan 16, 2018

+1

@johnfdhartman

+1, this is still a thing

@rastalamm

+1

@rlueder

rlueder commented Aug 12, 2019

I'm having a similar issue, though the client seems to be able to recover from the error and still processes all the files in my directory.

I couldn't find a pattern to how many errors occur or how often. It seems somehow related to the quality of the internet connection (i.e. at the office on the local network I see fewer errors than when connected over wifi from home through a VPN).

@YiyuanYin

I'm having the same problem when downloading multiple files at the same time, but downloading the files one by one instead solves the problem.

@sudhir-pandey24

@YiyuanYin Can you please share your code? I am facing the same issue.

@steven-tey

steven-tey commented Jan 30, 2023

Ran into this issue today as well – I ended up wrapping a Promise around my c.get() and it worked like a charm!

Code:

c.on("ready", async () => {
   await new Promise((resolve) => {
      c.get(filename, (err, stream) => {
         ...
         resolve()
      })
   });
   await new Promise((resolve) => {
      c.get(filename2, (err, stream) => {
         ...
         resolve()
      })
   })
   ...
   c.end()
})
