Cannot read large files #1094

shinout opened this Issue May 23, 2011 · 0 comments



The code below doesn't work correctly.

const fs = require('fs');
var fd = fs.openSync('largefile.txt', 'r'); // largefile.txt is a very large file.
var data = fs.readSync(fd, 100, 5294756210);

Instead of reading from the requested offset, the variable 'data' contains data from the beginning of the file.
The same problem occurs when using fs.createReadStream().

I also searched for the largest start position that still works,
using code like the following.

var start = 529475621;  // reading from this position works correctly
var limit = 5294756210; // reading from this position does not
var r = start;
var count = 0;
while (count < 100000) {
  var data = fs.readSync(fd, 100, r);
  // 'start chars' stands for the first bytes of the file; seeing them
  // here means the read wrongly started at the beginning.
  var invalid = (data[0].substr(0, 10) == 'start chars');
  if (invalid) {
    r -= Math.floor((r - start) / 2);
  } else {
    r += Math.floor((limit - r) / 2);
  }
  console.log(count, r, data[0].substr(0, 10));
  count++;
}

I then found that even when the variable [r] keeps the same value,
the contents of [data] can vary from run to run.

@koichik koichik added a commit that closed this issue Jul 13, 2011
@koichik koichik Fix: fs can't handle large files on 64-bit platforms, and fs.write() can't handle offsets beyond 2 GB on 64-bit platforms.
Also, fs.truncate() can't handle files larger than 4 GB.

Fixes #1199.
Fixes #1094.
@koichik koichik closed this in a3e3ad4 Jul 13, 2011