
Commit

* S3/Utils.py (hash_file_md5): Hash files in 32kB chunks
  instead of reading the whole file into memory first, to avoid
  OOM on large files.



git-svn-id: https://s3tools.svn.sourceforge.net/svnroot/s3tools/s3cmd/trunk@208 830e0280-6d2a-0410-9c65-932aecc39d9d
mludvig committed Jul 29, 2008
1 parent d9a3c84 commit 9a5cde4
Showing 2 changed files with 12 additions and 1 deletion.
6 changes: 6 additions & 0 deletions ChangeLog
@@ -1,3 +1,9 @@
+2008-07-29 Michal Ludvig <michal@logix.cz>
+
+* S3/Utils.py (hash_file_md5): Hash files in 32kB chunks
+  instead of reading the whole file into memory first, to avoid
+  OOM on large files.
+
 2008-07-07 Michal Ludvig <michal@logix.cz>

 * s3cmd.1: couple of syntax fixes from Mikhail Gusarov
7 changes: 6 additions & 1 deletion S3/Utils.py
@@ -139,7 +139,12 @@ def mktmpfile(prefix = "/tmp/tmpfile-", randchars = 20):
 def hash_file_md5(filename):
     h = md5.new()
     f = open(filename, "rb")
-    h.update(f.read())
+    while True:
+        # Hash 32kB chunks
+        data = f.read(32*1024)
+        if not data:
+            break
+        h.update(data)
     f.close()
     return h.hexdigest()

