Our particular use case: a bucket containing a large number of compressed server log files. We want to process those logs by fetching each file, decompressing it, and feeding it into an import script.
Without stdout support for multiple remote files, we first have to list the matching files, then pipe them through cut and xargs to reach the gunzip stage:
$ s3cmd ls s3://log-bucket/2011-07* | cut -d' ' -f8 | xargs -I@ s3cmd get @ - | gunzip | ./importlogs
With stdout allowed as the destination for multiple remote files, this collapses to:
$ s3cmd -r get s3://log-bucket/2011-07 - | gunzip | ./importlogs
Hopefully this will help others attempting similar tasks.
Allow stdout as destination when receiving multiple remote files
- special case stdout when enforcing destination rules
- update parameter error output to indicate stdout is a valid destination specification
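To illustrate the first bullet, here is a minimal sketch of the destination-rule check the patch describes: when fetching multiple remote files, the local destination must be a directory, unless it is `-`, which now means "stream to stdout". The function name `check_destination` and its signature are illustrative assumptions, not s3cmd's actual code.

```python
import os

def check_destination(destination, n_remote_files):
    """Validate a local destination for a multi-file get.

    Hypothetical helper mirroring the patch's behavior: '-' (stdout)
    is special-cased as valid even when several remote files match;
    any other destination for multiple files must be a directory.
    """
    if destination == "-":
        # stdout: contents of all matched files are concatenated,
        # which suits pipelines like `... | gunzip | ./importlogs`
        return destination
    if n_remote_files > 1 and not os.path.isdir(destination):
        raise ValueError(
            "Destination must be a directory or '-' (stdout) "
            "when downloading multiple files.")
    return destination
```

The second bullet corresponds to the error message above mentioning `-` explicitly, so users discover the stdout option from the failure itself.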
Merged back to s3tools/s3cmd, thanks!