
Git.pm: Use stream-like writing in cat_blob()

This commit fixes an issue where handling large files caused an
'Out of memory' Perl exception. Instead of reading and writing the whole
blob at once, the blob is now read and written in small pieces.

The problem was raised and discussed in this mail to the msysGit mailing
list: http://thread.gmane.org/gmane.comp.version-control.msysgit/12080

Signed-off-by: Gregor Uhlenheuer <kongo2002@googlemail.com>
Signed-off-by: Johannes Schindelin <johannes.schindelin@gmx.de>
Commit b180746c78905157d0b58c3e8004ee3232d44b97 (1 parent: 9e8201c), authored by @kongo2002 and committed with dscho on Feb 18, 2011
Showing with 7 additions and 8 deletions.
  1. +7 −8 perl/Git.pm
@@ -888,22 +888,26 @@ sub cat_blob {
 	}
 
 	my $size = $1;
-
-	my $blob;
 	my $bytesRead = 0;
 
 	while (1) {
+		my $blob;
 		my $bytesLeft = $size - $bytesRead;
 		last unless $bytesLeft;
 
 		my $bytesToRead = $bytesLeft < 1024 ? $bytesLeft : 1024;
-		my $read = read($in, $blob, $bytesToRead, $bytesRead);
+		my $read = read($in, $blob, $bytesToRead);
 		unless (defined($read)) {
 			$self->_close_cat_blob();
 			throw Error::Simple("in pipe went bad");
 		}
 
 		$bytesRead += $read;
+
+		unless (print $fh $blob) {
+			$self->_close_cat_blob();
+			throw Error::Simple("couldn't write to passed in filehandle");
+		}
 	}
 
 	# Skip past the trailing newline.
@@ -918,11 +922,6 @@ sub cat_blob {
 		throw Error::Simple("didn't find newline after blob");
 	}
 
-	unless (print $fh $blob) {
-		$self->_close_cat_blob();
-		throw Error::Simple("couldn't write to passed in filehandle");
-	}
-
 	return $size;
 }
 
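
For reference, here is a minimal standalone Perl sketch of the same streaming pattern the patch adopts. It assumes $in is a readable filehandle positioned at the start of the blob data, $fh is the caller's writable filehandle, and $size is the blob size in bytes; the helper name copy_stream() is hypothetical and not part of Git.pm.

    use strict;
    use warnings;

    # Copy $size bytes from $in to $fh in 1024-byte chunks so memory
    # use stays constant regardless of how large the blob is.
    sub copy_stream {
        my ($in, $fh, $size) = @_;
        my $bytesRead = 0;
        while ($bytesRead < $size) {
            my $bytesLeft = $size - $bytesRead;
            my $bytesToRead = $bytesLeft < 1024 ? $bytesLeft : 1024;

            # Read into a fresh buffer each iteration instead of
            # appending to one ever-growing string.
            my $chunk;
            my $read = read($in, $chunk, $bytesToRead);
            die "read failed: $!" unless defined($read);
            last if $read == 0;    # unexpected EOF
            $bytesRead += $read;

            # Write the chunk out immediately rather than accumulating
            # the whole blob for a single print at the end.
            print $fh $chunk or die "write failed: $!";
        }
        return $bytesRead;
    }

This mirrors the patched cat_blob() loop: the per-iteration my $blob (here $chunk) and the print inside the loop are what keep peak memory bounded by the 1024-byte chunk size rather than the full blob size.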
