Error after downloading lots of files #115
Comments
OK, after creating a new connection for every single file I seem to be having more success! Is this the correct behaviour? It seems like it should be slower and less efficient, but it works fine in my testing!
OK, the connection-per-file approach won't work: a tester of mine got the error `too many connections from this IP` after a while. Back to the drawing board...
I was just bitten by this as well. It looks like the problem is in
@phillipgreenii Did you find a solution for this?
@vintuwei I did not. I ended up enforcing one file at a time via
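The one-file-at-a-time workaround can be sketched without any FTP specifics. This is a hypothetical illustration, not code from the thread: `downloadSequentially` and `downloadOne` are invented names, with `downloadOne` standing in for a single `client.get()` call.

```javascript
// Run downloads strictly one at a time: the next download only
// starts after the previous one's callback has fired.
// `downloadOne(path, cb)` is an assumed stand-in for one FTP get.
function downloadSequentially(paths, downloadOne, done) {
  const results = [];
  (function next(i) {
    if (i >= paths.length) return done(null, results);
    downloadOne(paths[i], function (err, data) {
      if (err) return done(err);
      results.push(data);
      next(i + 1); // only recurse once this file is finished
    });
  })(0);
}
```

With only one transfer in flight, the single data connection is never contended, at the cost of throughput.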
Thank you @phillipgreenii
Thank you as well, @phillipgreenii. I also found that using
@phillipgreenii @vintuwei I'm using https://github.com/coopernurse/node-pool to try to pool ftp instances and transfer multiple files in parallel. It seems to work okay, although I'm having issues with the transfer processes intermittently hanging indefinitely. Really frustrating...
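The pooling idea above can be illustrated without pulling in node-pool itself. The sketch below is a minimal hand-rolled pool, not node-pool's actual API: `createPool`, `acquire`, and `release` are invented names, and `create` is an assumed async factory (e.g. a function that connects an ftp `Client` and calls back once it emits `ready`).

```javascript
// Minimal connection pool: at most `size` live resources; extra
// acquirers wait until a resource is released back to the pool.
function createPool(create, size) {
  const idle = [];    // connected resources waiting for reuse
  const waiters = []; // callbacks waiting for a free resource
  let total = 0;      // resources created so far

  function acquire(cb) {
    if (idle.length > 0) return cb(null, idle.pop());
    if (total < size) {
      total++;
      return create(cb); // lazily create up to the cap
    }
    waiters.push(cb);    // over the cap: queue until a release
  }

  function release(res) {
    if (waiters.length > 0) waiters.shift()(null, res);
    else idle.push(res);
  }

  return { acquire: acquire, release: release };
}
```

Capping the pool size keeps the server from seeing an unbounded number of control connections from one IP, while still allowing some parallelism.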
+1 |
+1, this is still a thing |
+1 |
I'm having a similar issue, though the client seems able to recover from the error and still process all the files in my directory. I couldn't find a pattern to how many or how often the errors occur; it seems related to the quality of the internet connection (i.e. at the office on the local network I see fewer errors than when connected from home over wifi through a VPN).
I'm having the same problem when downloading multiple files at the same time, but downloading the files one at a time solves the problem.
@YiyuanYin Can you please share your code? I am facing the same issue.
Ran into this issue today as well. I ended up wrapping a

Code:
I have a timer that runs every 100ms. It goes through a list of known remote file paths on the FTP server and attempts to download them to the local machine.
The timer uses locks to ensure that only 3 requests are active at any one time (every time a download finishes, it releases a slot back to the pool).
This works well for about 20 seconds before `get()` errors out with `Unable to make data connection`.
Here is a reduced example of what is happening
Should I be creating a new connection for each download rather than reusing the same one? Or is my FTP server throttling me because I'm not making a new connection for each file?
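The 3-slot setup described above (a fixed number of active downloads, with a slot freed on each completion) can be sketched generically. This is an illustrative stand-in, not the poster's reduced example: `makeLimiter` and `schedule` are invented names, and the actual `get()` call is stubbed out.

```javascript
// Concurrency limiter: at most `maxActive` tasks run at once.
// Each task receives a `release` callback to free its slot,
// which immediately starts the next queued task, if any.
function makeLimiter(maxActive) {
  let active = 0;
  const queue = [];

  function runNext() {
    while (active < maxActive && queue.length > 0) {
      active++;
      const task = queue.shift();
      task(function release() {
        active--;
        runNext(); // a finished download frees a slot for the next
      });
    }
  }

  return function schedule(task) {
    queue.push(task);
    runNext();
  };
}
```

In the real scenario each task would call `client.get(remotePath, ...)` and invoke `release` once the stream ends or errors; the limiter itself has no FTP dependency.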