Calculate the MD5 value and include it in the Content-MD5 header for each multipart chunk. This has the unfortunate side effect of calculating the MD5 for each chunk twice: once for the initial upload, and once after the upload completes. That will have to be fixed.
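For reference, the Content-MD5 header value is the base64-encoded binary MD5 digest (RFC 1864), not the hex form used in ETags. A minimal sketch of computing it for a chunk:

    import base64
    import hashlib

    def content_md5(chunk_data):
        # Content-MD5 wants the base64 of the raw 16-byte digest,
        # not hexdigest().
        return base64.b64encode(hashlib.md5(chunk_data).digest())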
This lets S3 verify on receipt that the data it got matches what we thought we were sending. Disable with --no-check-md5. Note that this does not cover multipart uploads.
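For example, to skip the check during a sync (the bucket and paths here are illustrative):

    s3cmd sync --no-check-md5 /local/dir/ s3://example-bucket/prefix/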
If these calls throw a TypeError (not sure how to cause one, but we had a report of it happening), we would die. So catch that too.
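A minimal, self-contained sketch of the broadened except clause; detect_mime() and the fallback type are illustrative stand-ins:

    def detect_mime(filename, magic_func):
        try:
            return magic_func(filename)
        except (UnicodeDecodeError, TypeError):
            # A TypeError here was reported in the wild; rather than
            # die, fall back to a generic type.
            return 'application/octet-stream'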
subcmd_batch_del() was sending the entire remote_list() as a single batch delete operation to S3. That fails for more than 1000 objects, though we were ignoring the failure. It can also time out while uploading a huge list (one report involved deleting 40k objects, a 7 MB deletion-list XML) and churning through it. The whole business of looking for a marker was poor. We had the remote_list; we just couldn't slice it up. The previous commit adds the getslice operator, so now we can. This greatly simplifies the delete operation, as we can iterate over slices of 1000 until the list is empty.
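A minimal sketch of the simplified loop, assuming remote_list supports slicing via the getslice operator from the previous commit; send_batch_delete() is an illustrative stand-in for the actual S3 multi-object delete request:

    def batch_delete_all(remote_list, send_batch_delete):
        # S3's multi-object delete rejects batches of more than
        # 1000 keys, so iterate over slices of 1000 until empty.
        while len(remote_list) > 0:
            send_batch_delete(remote_list[:1000])
            remote_list = remote_list[1000:]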
Look for the default .s3cfg location using os.path.expanduser(), rather than relying on $HOME being set.
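A minimal sketch; on POSIX, expanduser() falls back to the password database when $HOME is unset, which is the point of the change:

    import os

    # Expands '~' even when the $HOME environment variable is missing.
    config_file = os.path.expanduser('~/.s3cfg')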
The different mime magic libraries' from_file(), file(), and id_filename() functions each take either a filesystem-encoded (generally UTF-8) string or a unicode filename, but different versions of the libraries expect different inputs. This is annoying. Here, we call these functions first with a UTF-8-encoded string filename; if that fails with a UnicodeDecodeError, we try again with a unicode filename. Also, delete mime_magic_buffer() everywhere, along with the introspection of gzip files to see what type of object is inside. It doesn't matter to the S3 web server: the object needs to be type application/x-gzip, not type=application/tar with encoding=gzip (as mimetypes would tell us). We stopped using the encoding value as the HTTP Content-Encoding in commit 44e3589 anyhow.
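A minimal sketch of the two-pass call; mime_magic_file() is a hypothetical wrapper name, and magic_func stands in for whichever of from_file(), file(), or id_filename() the detected library provides:

    def mime_magic_file(filename, magic_func):
        try:
            # Most library versions want a filesystem-encoded
            # (UTF-8) byte string...
            return magic_func(filename.encode('utf-8'))
        except UnicodeDecodeError:
            # ...but some want the unicode filename; retry with that.
            return magic_func(filename)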
Catch the failure to import dateutil where it happens, not later in s3cmd's ImportError handler. As this is a new dependency, many people don't have it installed yet. Without this change, s3cmd's ImportError handler (invoked because the import of dateutil in S3/Utils.py fails) throws another uncaught exception when invoking s = u' '.join([unicodise(a) for a in sys.argv]), because unicodise() comes from S3/Utils.py, which, as just noted, failed to import.
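A minimal sketch of catching the failure at the import site (in S3/Utils.py); the exact error message is illustrative:

    import sys

    try:
        import dateutil.parser
    except ImportError:
        # Report the missing dependency directly instead of letting
        # s3cmd's generic ImportError handler blow up later.
        sys.stderr.write(u"ImportError: the python-dateutil module is "
                         u"required. Please install it and try again.\n")
        sys.exit(1)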