
Uncaught range error #56

Closed
QwertyZW opened this issue Nov 11, 2016 · 9 comments

Comments


QwertyZW commented Nov 11, 2016

OS: Windows 10 Pro version 1607
Atom's Node version: 6.3.0
Atom's version (not sure if relevant): 1.12.1
I get this stack trace when using the sync local <- remote feature of https://github.com/mgrenier/remote-ftp version 0.9.4 on a large directory:

buffer.js:8 Uncaught RangeError: Array buffer allocation failed
FastBuffer @ buffer.js:8
createUnsafeBuffer @ buffer.js:33
allocate @ buffer.js:176
Buffer.allocUnsafe @ buffer.js:136
Buffer @ buffer.js:73
fastXfer @ C:\Users\zaid\.atom\packages\remote-ftp\node_modules\ssh2-streams\lib\sftp.js:1003
SFTPStream.fastGet @ C:\Users\zaid\.atom\packages\remote-ftp\node_modules\ssh2-streams\lib\sftp.js:1128
SFTPWrapper.fastGet @ C:\Users\zaid\.atom\packages\remote-ftp\node_modules\ssh2\lib\SFTPWrapper.js:51
(anonymous function) @ C:\Users\zaid\.atom\packages\remote-ftp\lib\connectors\sftp.js:228
SFTPStream._transform @ C:\Users\zaid\.atom\packages\remote-ftp\node_modules\ssh2-streams\lib\sftp.js:496
Transform._read @ _stream_transform.js:167
SFTPStream._read @ C:\Users\zaid\.atom\packages\remote-ftp\node_modules\ssh2-streams\lib\sftp.js:181
Transform._write @ _stream_transform.js:155
doWrite @ _stream_writable.js:307
writeOrBuffer @ _stream_writable.js:293
Writable.write @ _stream_writable.js:220
ondata @ _stream_readable.js:556
emitOne @ events.js:96
emit @ events.js:188
readableAddChunk @ _stream_readable.js:177
Readable.push @ _stream_readable.js:135
(anonymous function) @ C:\Users\zaid\.atom\packages\remote-ftp\node_modules\ssh2\lib\Channel.js:166
emitOne @ events.js:96
emit @ events.js:188
parsePacket @ C:\Users\zaid\.atom\packages\remote-ftp\node_modules\ssh2-streams\lib\ssh.js:3400
SSH2Stream._transform @ C:\Users\zaid\.atom\packages\remote-ftp\node_modules\ssh2-streams\lib\ssh.js:665
Transform._read @ _stream_transform.js:167
SSH2Stream._read @ C:\Users\zaid\.atom\packages\remote-ftp\node_modules\ssh2-streams\lib\ssh.js:249
Transform._write @ _stream_transform.js:155
doWrite @ _stream_writable.js:307
writeOrBuffer @ _stream_writable.js:293
Writable.write @ _stream_writable.js:220
ondata @ _stream_readable.js:556
emitOne @ events.js:96
emit @ events.js:188
readableAddChunk @ _stream_readable.js:177
Readable.push @ _stream_readable.js:135
onread @ net.js:542

QwertyZW commented Nov 11, 2016

After staring at the stack trace for a while, I realised this might have been caused by the recursive calls. It might be worthwhile for the module to catch the exception.

Feel free to close the issue
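The suggestion to catch the exception can be sketched as a small helper that returns the error instead of letting it propagate, so the caller can fail the transfer gracefully rather than crash the process. This is a hypothetical sketch (the helper name and use of `Buffer.alloc` are illustrative, not the module's actual code):

```javascript
// Hypothetical helper: return the error on allocation failure instead
// of throwing, so the transfer can be aborted with a normal callback.
function tryCreateBuffer(size) {
  try {
    // Buffer.alloc throws a RangeError when the allocation fails or the
    // size exceeds the platform limit (buffer.constants.MAX_LENGTH).
    return Buffer.alloc(size);
  } catch (ex) {
    return ex;
  }
}
```

A caller would then check `result instanceof Error` and invoke its error handler instead of continuing with the transfer.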

QwertyZW reopened this Nov 11, 2016

mscdex commented Nov 11, 2016

Sounds like it's running out of memory. Are you transferring a lot of files in parallel?
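The memory math behind this suspicion is simple to work through. Using the fastXfer defaults visible in ssh2-streams (chunk size 32768 bytes, concurrency 64), each transfer eagerly allocated one read buffer, so many parallel files multiply that cost; the 500-file figure below is a hypothetical illustration, not from the report:

```javascript
// fastXfer defaults in ssh2-streams at the time:
const chunkSize = 32768;   // 32 KiB per read request
const concurrency = 64;    // outstanding reads per transfer

// Before the fix, every fastGet call allocated chunkSize * concurrency
// bytes up front, regardless of how small the file was.
const perTransfer = chunkSize * concurrency;   // 2 MiB per file
const parallelFiles = 500;                     // hypothetical large directory
const total = perTransfer * parallelFiles;     // ~1 GB of read buffers

console.log(perTransfer, total);
```

At that rate, syncing a large directory with many concurrent transfers can plausibly exhaust the ArrayBuffer allocator.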


mscdex commented Nov 11, 2016

@QwertyZW Can you try applying the following patch to ssh2-streams and see if that helps any?

diff --git a/lib/sftp.js b/lib/sftp.js
index dc45e3f..5466627 100644
--- a/lib/sftp.js
+++ b/lib/sftp.js
@@ -971,6 +971,13 @@ SFTPStream.prototype.writeData = function(handle, buf, off, len, position, cb) {
   this.debug('DEBUG[SFTP]: Outgoing: Writing WRITE');
   return this.push(out);
 };
+function tryCreateBuffer(size) {
+  try {
+    return new Buffer(size);
+  } catch (ex) {
+    return ex;
+  }
+}
 function fastXfer(src, dst, srcPath, dstPath, opts, cb) {
   var concurrency = 64;
   var chunkSize = 32768;
@@ -1006,7 +1013,8 @@ function fastXfer(src, dst, srcPath, dstPath, opts, cb) {
   var hadError = false;
   var srcHandle;
   var dstHandle;
-  var readbuf = new Buffer(chunkSize * concurrency);
+  var readbuf;
+  var bufsize = chunkSize * concurrency;

   function onerror(err) {
     if (hadError)
@@ -1065,6 +1073,20 @@ function fastXfer(src, dst, srcPath, dstPath, opts, cb) {
         if (fsize <= 0)
           return onerror();

+        // Use less memory where possible
+        while (bufsize > fsize) {
+          if (concurrency === 1) {
+            bufsize = fsize;
+            break;
+          }
+          bufsize -= chunkSize;
+          --concurrency;
+        }
+
+        readbuf = tryCreateBuffer(bufsize);
+        if (readbuf instanceof Error)
+          return onerror(readbuf);
+
         if (mode !== undefined) {
           dst.fchmod(dstHandle, mode, function tryAgain(err) {
             if (err) {
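The "use less memory" loop from the patch can be exercised in isolation. A minimal standalone sketch (`shrinkBuffer` is a hypothetical name; the loop body mirrors the patch):

```javascript
// Drop whole chunks (and the matching concurrency) until the buffer no
// longer exceeds the file size; once only one chunk remains, clamp the
// buffer to the file size itself.
function shrinkBuffer(fsize, chunkSize, concurrency) {
  var bufsize = chunkSize * concurrency;
  while (bufsize > fsize) {
    if (concurrency === 1) {
      bufsize = fsize;
      break;
    }
    bufsize -= chunkSize;
    --concurrency;
  }
  return { bufsize: bufsize, concurrency: concurrency };
}

// A 100000-byte file needs only 3 chunks (98304 bytes), not 2 MiB.
console.log(shrinkBuffer(100000, 32768, 64));
```

For files smaller than one chunk, the buffer shrinks all the way to the file size, so tiny files no longer cost 2 MiB each.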


QwertyZW commented Nov 13, 2016

Will get to it tonight hopefully. Thanks for the response

I'm definitely transferring a lot of files. Whether it's in parallel or not, I'd have to take a closer look at the remote-ftp wrapper.


QwertyZW commented Nov 13, 2016

Not sure which ssh2-streams version this patch was meant for, but I tried it on two versions by applying the patch manually: the one that the other package (remote-ftp 0.9.4) uses, and the current master branch.

It definitely got a lot further on both versions, and there are no stack traces this time either. It reaches a point where it (remote-ftp 0.9.4) chokes, but this is probably not ssh2-streams' problem. One of the functions in the other package is hitting a recursion depth of 15718 (according to the console logs). I think that needs to be looked into first.


mscdex commented Nov 13, 2016

@QwertyZW I used the current master branch of ssh2-streams, but as long as the patch applies cleanly it should be fine.

Either way I will probably end up pushing these changes since it should help reduce memory usage in case of small files. Thanks for the feedback!


mscdex commented Nov 13, 2016

@QwertyZW As far as stack traces go, you won't get any useful stack traces due to the asynchronous nature of everything (unless you use something like the longjohn module, but that shouldn't be used in production).
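The point about asynchronous stack traces can be seen with a toy example. Once control returns to the event loop, the frames that scheduled the work are gone, so an error created in an I/O callback starts a fresh stack (`startTransfer` and `onReadable` are hypothetical names for illustration):

```javascript
// An error created inside an event-loop callback captures only the
// callback's own frames plus event-loop internals; the function that
// scheduled the work (startTransfer) never appears in err.stack.
function startTransfer(done) {
  setImmediate(function onReadable() {
    var err = new Error('Array buffer allocation failed');
    done(err.stack.indexOf('startTransfer') !== -1);
  });
}

startTransfer(function (hasCallerFrame) {
  console.log(hasCallerFrame); // false
});
```

Modules like longjohn work around this by capturing the scheduling-time stack and stitching it onto the callback-time one, which is why they carry a performance cost unsuitable for production.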


QwertyZW commented Nov 13, 2016

I'm pretty sure I applied the patch cleanly (manually), but if you like you can point me to a branch that includes this patch and I'll test it out.

Not really sure what's going on when I run git apply patch:

$ git apply patch
patch:9: trailing whitespace.
function tryCreateBuffer(size) {
patch:10: trailing whitespace.
  try {
patch:11: trailing whitespace.
    return new Buffer(size);
patch:12: trailing whitespace.
  } catch (ex) {
patch:13: trailing whitespace.
    return ex;
fatal: corrupt patch at line 26

Thank you for your help!

Feel free to close this


mscdex commented Nov 17, 2016

FWIW the buffer size change has been pushed in 32523f5. Let me know if there is anything else this module can do to help with this particular issue.
