Cannot read large files #1094

Closed
shinout opened this Issue · 0 comments

2 participants

@shinout

The code below doesn't work correctly.

const fs = require('fs');
var fd = fs.openSync('largefile.txt', 'r'); // largefile.txt is a very large file.
var data = fs.readSync(fd, 100, 5294756210); // legacy API: (fd, length, position)

Instead of the bytes at offset 5294756210, the variable 'data' contains data from the beginning of the file.
The same problem also occurs when using fs.createReadStream().
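One plausible mechanism (an assumption on my part, consistent with the "more than 2GB" limit mentioned in the fix commit below) is that the position argument is coerced through a 32-bit integer somewhere in the binding layer. JavaScript's `|0` operator performs the same ToInt32 coercion, so we can see what such a binding would actually receive:

```javascript
// If the position is coerced through a signed 32-bit integer,
// offsets at or above 2^32 silently wrap modulo 2^32.
var position = 5294756210;           // the offset from the report above
var truncated = position | 0;        // ToInt32 coercion
console.log(truncated);              // 999788914 -- nowhere near 5294756210
console.log(truncated === position); // false
```

This would explain why a read at a multi-gigabyte offset returns data from much earlier in the file than requested.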

I've also probed for the largest start position that works,
using code like the following.

var start = 529475621;  // known-good position
var limit = 5294756210; // known-bad position
var r = Math.floor((start + limit) / 2); // position to probe
var count = 0;
while (count < 100000) {
  var data = fs.readSync(fd, 100, r); // legacy API: returns [string, bytesRead]
  // 'start chars' stands for the first 10 characters of the file;
  // seeing them here means the read wrapped back to the beginning.
  var invalid = (data[0].substr(0, 10) == 'start chars');
  if (invalid) {
    r -= Math.floor((r - start) / 2); // back off toward the known-good position
  } else {
    r += Math.floor((limit - r) / 2); // push toward the known-bad position
  }
  count++;
  console.log(count, r, data[0].substr(0, 10));
}
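The probing loop above is a hand-rolled binary search for the boundary between working and broken offsets. As a sketch, the same idea can be factored into a reusable helper (`findLastGood` and `check` are names I'm introducing for illustration, not part of the issue):

```javascript
// Returns the largest integer in [lo, hi] for which check(n) is true,
// assuming check(lo) is true and check flips monotonically from true
// to false somewhere in the range.
function findLastGood(lo, hi, check) {
  while (lo < hi) {
    var mid = lo + Math.ceil((hi - lo) / 2); // bias upward so lo always advances
    if (check(mid)) lo = mid;
    else hi = mid - 1;
  }
  return lo;
}

// For the bug report, check(pos) would read 100 bytes at pos and compare
// them against the file's known leading characters. Here a synthetic
// predicate stands in: offsets up to 2^31 - 1 "work", larger ones do not.
var LIMIT = Math.pow(2, 31) - 1;
console.log(findLastGood(0, 5294756210, function (pos) {
  return pos <= LIMIT;
})); // 2147483647
```

With a predicate built on fs.readSync, this converges on the exact offset where reads start misbehaving in O(log n) probes instead of a fixed 100000-iteration loop.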

I also found that even when the variable [r] holds the same value,
the contents of [data] can change from one read to the next.

@koichik koichik closed this issue from a commit
@koichik koichik Fix fs can't handle large file on 64bit platform
fs.read() and fs.write() can't handle files larger than 2GB on 64-bit platforms.
Also, fs.truncate() can't handle files larger than 4GB.

Fixes #1199.
Fixes #1094.
a3e3ad4
@koichik koichik closed this in a3e3ad4