Uncaught range error #56
After staring at the stack trace for a while I realised this might have been caused by the recursive calls. It might be worthwhile for the module to catch the exception. Feel free to close the issue.
Sounds like it's running out of memory. Are you transferring a lot of files in parallel?
@QwertyZW Can you try applying the following patch?

diff --git a/lib/sftp.js b/lib/sftp.js
index dc45e3f..5466627 100644
--- a/lib/sftp.js
+++ b/lib/sftp.js
@@ -971,6 +971,13 @@ SFTPStream.prototype.writeData = function(handle, buf, off, len, position, cb) {
this.debug('DEBUG[SFTP]: Outgoing: Writing WRITE');
return this.push(out);
};
+function tryCreateBuffer(size) {
+ try {
+ return new Buffer(size);
+ } catch (ex) {
+ return ex;
+ }
+}
function fastXfer(src, dst, srcPath, dstPath, opts, cb) {
var concurrency = 64;
var chunkSize = 32768;
@@ -1006,7 +1013,8 @@ function fastXfer(src, dst, srcPath, dstPath, opts, cb) {
var hadError = false;
var srcHandle;
var dstHandle;
- var readbuf = new Buffer(chunkSize * concurrency);
+ var readbuf;
+ var bufsize = chunkSize * concurrency;
function onerror(err) {
if (hadError)
@@ -1065,6 +1073,20 @@ function fastXfer(src, dst, srcPath, dstPath, opts, cb) {
if (fsize <= 0)
return onerror();
+ // Use less memory where possible
+ while (bufsize > fsize) {
+ if (concurrency === 1) {
+ bufsize = fsize;
+ break;
+ }
+ bufsize -= chunkSize;
+ --concurrency;
+ }
+
+ readbuf = tryCreateBuffer(bufsize);
+ if (readbuf instanceof Error)
+ return onerror(readbuf);
+
if (mode !== undefined) {
dst.fchmod(dstHandle, mode, function tryAgain(err) {
if (err) {
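The buffer-sizing logic in the patch above can be sketched as a standalone snippet. This is a hypothetical illustration, not the actual ssh2-streams code: `sizeForFile` is an invented helper name, and it uses the modern `Buffer.alloc` in place of the deprecated `new Buffer(size)` that the patch era used.

```javascript
// Hypothetical sketch of the patch's buffer-sizing idea, not ssh2-streams code.
// Shrinks the read buffer so transferring a small file does not allocate the
// full chunkSize * concurrency bytes up front.
function sizeForFile(fsize, chunkSize, concurrency) {
  var bufsize = chunkSize * concurrency;
  // Use less memory where possible (mirrors the patch's while loop)
  while (bufsize > fsize) {
    if (concurrency === 1) {
      bufsize = fsize;
      break;
    }
    bufsize -= chunkSize;
    --concurrency;
  }
  return { bufsize: bufsize, concurrency: concurrency };
}

function tryCreateBuffer(size) {
  try {
    // Buffer.alloc throws a RangeError for sizes beyond the engine's limit;
    // returning the exception lets the caller report it via a normal
    // error callback instead of crashing the process.
    return Buffer.alloc(size);
  } catch (ex) {
    return ex;
  }
}

// e.g. a 100-byte file needs one 100-byte buffer, not 64 * 32768 bytes
var sized = sizeForFile(100, 32768, 64);
console.log(sized.bufsize, sized.concurrency); // 100 1
```

The key design point is that allocation failure becomes an ordinary error value rather than an uncaught exception, which matches the `readbuf instanceof Error` check in the patch.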
Will get to it tonight hopefully. Thanks for the response. I'm definitely transferring a lot of files. Whether it's in parallel or not, I'd have to take a closer look at the remote-ftp wrapper.
Not sure for what ssh2-streams version this patch was meant, but I tried it on 2 versions by manually applying the patch: the one that the other package (remote-ftp 0.9.4) uses, and the current master branch. It definitely got a lot further on both versions, and there are no stack traces this time either. It reaches a point where it (remote-ftp 0.9.4) chokes, but this is probably not ssh2-streams' problem. One of the functions in the other package is hitting a recursion depth of 15718 (according to the console logs). I think that needs to be looked into first for now.
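For context on why deep recursion is a plausible suspect: an "Uncaught RangeError" like the one in the issue title is exactly what V8 throws when the call stack overflows, and a synchronous recursion depth in the tens of thousands can be enough to trigger it. A minimal reproduction (unrelated to ssh2-streams or remote-ftp code):

```javascript
// Deep synchronous recursion overflows V8's call stack and throws
// "RangeError: Maximum call stack size exceeded".
function recurse(depth) {
  return recurse(depth + 1);
}

try {
  recurse(0);
} catch (ex) {
  console.log(ex instanceof RangeError); // true
}
```

Note that the patch above addresses a different RangeError source (oversized Buffer allocation); whether remote-ftp's recursion actually overflows the stack would need to be confirmed in that package.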
@QwertyZW I used the current master branch. Either way, I will probably end up pushing these changes since they should help reduce memory usage in the case of small files. Thanks for the feedback!
@QwertyZW As far as stack traces go, you won't get any useful stack traces due to the asynchronous nature of everything (unless you use something like the …).
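The point about asynchronous code losing stack context can be illustrated with a small sketch. This is an illustrative example, not code from either package; `scheduledWork` is an invented name:

```javascript
// By default, an Error created inside an async callback has a stack that
// starts at the callback's invocation. Frames from the code that originally
// scheduled the work are not included.
function scheduledWork(cb) {
  setImmediate(function () {
    cb(new Error('boom'));
  });
}

scheduledWork(function (err) {
  // The trace shows timer internals, not scheduledWork's caller.
  console.log(err.stack.split('\n')[0]); // "Error: boom"
});
```

This is why a crash deep inside a transfer loop can surface with a trace that names almost none of the application code that started it.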
I'm pretty sure I applied the patch cleanly (manually), but if you like you can point me to a branch that includes this patch and I'll test it out. Not really sure what's going on when I'm doing …
Thank you for your help! Feel free to close this.
FWIW the buffer size change has been pushed in 32523f5. Let me know if there is anything else this module can do to help with this particular issue.
OS: Windows 10 Pro version 1607
Atom's Node version: 6.3.0
Atom's version (not sure if relevant): 1.12.1
Getting this stack trace by using the sync local <- remote feature of https://github.com/mgrenier/remote-ftp version 0.9.4 on a large directory.