Allow stdout as destination when receiving multiple remote files #1

Merged
merged 1 commit into from Oct 12, 2011


Contributor
ohhorob commented Jul 9, 2011
  • special case stdout when enforcing destination rules
  • update parameter error output to indicate stdout is a valid destination specification
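The two bullet points above can be sketched roughly as follows. This is a hypothetical illustration of the rule being changed, not s3cmd's actual code; the function and error text are made up for clarity:

```python
def validate_destination(remote_sources, destination):
    """Hypothetical sketch of the destination rule this change describes.

    When downloading multiple remote files, the destination normally has
    to be a directory -- but "-" (stdout) is special-cased so every file
    can be streamed to standard output instead.
    """
    if destination == "-":
        return "stdout"  # valid regardless of how many sources there are
    if len(remote_sources) > 1 and not destination.endswith("/"):
        # Error output now mentions stdout as a valid destination
        raise ValueError(
            "Destination must be a directory or stdout ('-') when "
            "downloading multiple sources.")
    return "file"

# Multiple sources to stdout is now accepted:
print(validate_destination(
    ["s3://log-bucket/2011-07-01.gz", "s3://log-bucket/2011-07-02.gz"],
    "-"))
# → stdout
```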

Our particular use case is a bucket containing a bunch of compressed server log files. We want to process those log files by fetching them, decompressing them, and feeding them into an import script.

Without stdout support, we first have to list the matching keys, then cut and xargs them to get to the gunzip stage:

$ s3cmd ls s3://log-bucket/2011-07* | cut -d' ' -f8 | xargs -I@ s3cmd get @ - | gunzip | ./importlogs

Allowing stdout as the destination for multiple remote files reduces this to:

$ s3cmd -r get s3://log-bucket/2011-07 - | gunzip | ./importlogs
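This pipeline relies on the fact that a gzip stream may contain multiple concatenated members, so several `.gz` objects written back-to-back to stdout still decompress cleanly through a single `gunzip`. A minimal sketch of that behavior (file names are illustrative):

```python
import gzip

# Two independently gzipped "files", concatenated as one byte stream --
# analogous to several .gz objects streamed to stdout in sequence.
stream = gzip.compress(b"one\n") + gzip.compress(b"two\n")

# gzip.decompress handles multi-member streams, just as gunzip does:
print(gzip.decompress(stream).decode(), end="")
# → one
# → two
```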

Hopefully this will help others attempting similar tasks.

Rob.

@ohhorob ohhorob Allow stdout as destination when receiving multiple remote files
- special case stdout when enforcing destination rules
- update parameter error output to indicate stdout is a valid destination specification
2320b45
@mludvig mludvig merged commit 2320b45 into s3tools:master Oct 12, 2011

Merged back to s3tools/s3cmd, thanks!
