Scan for open AWS S3 buckets and dump the contents


License: MIT

A tool to find open S3 buckets and dump their contents 💧


If you've earned a bug bounty using this tool, please consider donating to support its development



#  s3scanner - Find S3 buckets and dump!
#  Author: Dan Salmon - @bltjetpack

positional arguments:
  buckets                Name of text file containing buckets to check

optional arguments:
  -h, --help              show this help message and exit
  -o, --out-file OUTFILE  Name of file to save the successfully checked buckets in (Default: buckets.txt)
  -d, --dump              Dump all found open buckets locally
  -l, --list              List all found open buckets locally

The tool takes in a list of bucket names to check. Found S3 buckets are output to a file, and the tool can also dump or list the contents of 'open' buckets locally.
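The read-candidates / check / record-results flow described above can be sketched as follows. This is only an illustration, not S3Scanner's actual code; the `is_open` callable and `check_bucket` name stand in for the real network check, which the sketch deliberately leaves out:

```python
def scan(bucket_names, is_open):
    """Return the bucket names found open.

    `is_open` is a stand-in for the real network check; any callable
    taking a bucket name and returning a bool will do.
    """
    found = []
    for name in bucket_names:
        name = name.strip()
        if name and is_open(name):
            found.append(name)
    return found

# Usage sketch: read candidates from names.txt, log open buckets
# to the default output file (check_bucket is hypothetical):
# with open("names.txt") as candidates, open("buckets.txt", "w") as out:
#     for bucket in scan(candidates, check_bucket):
#         out.write(bucket + "\n")
```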

Interpreting Results

This tool will attempt to get all available information about a bucket, but it's up to you to interpret the results.

Settings available for buckets:

  • Object Access (object in this case refers to files stored in the bucket)
    • List Objects
    • Write Objects
  • ACL Access
    • Read Permissions
    • Write Permissions

Any or all of these permissions can be set for the two main user groups:

  • Authenticated Users
  • Public Users (those without AWS credentials set)
  • (They can also be applied to specific users, but that's out of scope)

What this means: just because a bucket returns "AccessDenied" for its ACLs doesn't mean you can't read from or write to it. Conversely, you may be able to list a bucket's ACLs but not read from or write to the bucket.
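To make the distinction concrete: list access can be probed on its own with an anonymous HTTP request against the bucket's URL. Below is a minimal Python 3 standard-library sketch (not S3Scanner's implementation; it covers only the list-objects check — ACL and write access must be probed separately — and some-bucket-name is a placeholder):

```python
import urllib.error
import urllib.request

def classify_status(code):
    # Map the HTTP status of an unauthenticated GET on a bucket URL
    # to a rough verdict about anonymous list access.
    if code == 200:
        return "open: objects are listable anonymously"
    if code == 403:
        return "exists, but listing is denied to public users"
    if code == 404:
        return "bucket does not exist"
    return "indeterminate"

def probe_bucket(name):
    """Anonymously fetch the bucket's root URL and classify the result."""
    url = "https://%s.s3.amazonaws.com/" % name
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return classify_status(resp.status)
    except urllib.error.HTTPError as err:
        return classify_status(err.code)

# probe_bucket("some-bucket-name")  # placeholder name; performs a network call
```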


  1. (Optional) virtualenv venv && source ./venv/bin/activate
  2. pip install -r requirements.txt
  3. python ./s3scanner.py

(Compatibility has been tested with Python 2.7 and 3.6)

Using Docker

  1. Build the Docker image:
sudo docker build -t s3scanner .
  2. Run the Docker image:
sudo docker run -v /input-data-dir/:/data s3scanner --out-file /data/results.txt /data/names.txt

This command assumes that a names.txt file containing the domains to enumerate is in /input-data-dir/ on the host machine.


This tool accepts the following bucket formats to check:

  • bucket name - google-dev
  • domain name -
  • full s3 url - (to easily combine with other tools like bucket-stream)
  • bucket:region -
> cat names.txt
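The file contents shown by the command above did not survive extraction. As a purely hypothetical illustration (every name below is invented), a names.txt mixing the accepted formats might look like:

```text
google-dev
bucket.example.com
s3.amazonaws.com/some-bucket
some-bucket:us-west-2
```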
  1. Dump all open buckets, log both open and closed buckets to found.txt

    > python ./s3scanner.py --include-closed --out-file found.txt --dump names.txt
  2. Just log open buckets to the default output file (buckets.txt)

    > python ./s3scanner.py names.txt
  3. Save file listings of all open buckets to file

    > python ./s3scanner.py --list names.txt


Issues are welcome and Pull Requests are appreciated. All contributions should be compatible with both Python 2.7 and 3.6.



  • All tests are currently in
  • Run tests in both 2.7 and 3.6 virtual environments.
  • This project uses pytest-xdist to run tests. Use pytest -n NUM, where NUM is the number of parallel processes.
  • Run individual tests like this: pytest -q -s



