Crawler (Bot) searching for credential leaks on different paste sites.
Latest commit 4e7d9a3 (Jan 20, 2019) by rndinfosecguy: minor timing update



(Screenshot: the bot in action)


This is just the code of my OSINT bot searching for credentials on different paste sites.

Keep in mind:

  1. This bot is not beautiful. I wrote it quick and dirty and do not care about code conventions or other shit... I will never care about those things.

  2. The code is not complete so far. Some parts, like integrating the credentials into a database, are missing from this online repository.

  3. If you want to use this code, feel free to do so. Keep in mind you have to customize things to make it run on your system.

  4. I know that I have some false positives and I know that I miss some credentials. So if you think this is crap... ok, leave now. If you have ideas for better detection, just let me know!

  5. And again: QUICK AND DIRTY! Do not expect nice code.


I recently updated the bot because pastebin.com started to block my IP, as I scraped their website to gather the information without using their API. Now at least the pastes themselves are downloaded using their API. For this I implemented a switch inside the crawler file. To activate the usage of the API you need to edit the file by changing

os.system("tmux new -d -s pastebincomCrawler './ scrape'")

to

os.system("tmux new -d -s pastebincomCrawler './ api'")

If you do not enable the API usage you may get blocked by pastebin.com too. I did not make this the default configuration because you need a PRO account to use the API. If you have one, you need to whitelist your IP on their website and you will be fine using the API :-)
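Instead of editing the call by hand each time, the same switch could be driven by a single variable. A minimal sketch of that idea, keeping the tmux command from above (the USE_API flag and the helper function are my own, and the './ ' path is left as it appears in this README):

```python
# Sketch: pick scrape or API mode with one flag instead of editing the call.
USE_API = False  # set to True if you have a pastebin.com PRO account

def build_pastebin_command(use_api):
    # The session name and command shape match the README; the script path
    # after './ ' is elided there, so it is left as-is here.
    mode = "api" if use_api else "scrape"
    return "tmux new -d -s pastebincomCrawler './ %s'" % mode

# os.system(build_pastebin_command(USE_API)) would then launch the crawler.
```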


To learn how to use the software you just need to call the script with the -h/--help argument.

python -h


 /   _____/ ____ _____ ___  __ ____   ____    ____   ___________
 \_____  \_/ ___\\__  \\  \/ // __ \ /    \  / ___\_/ __ \_  __ \
 /        \  \___ / __ \\   /\  ___/|   |  \/ /_/  >  ___/|  | \/
/_______  /\___  >____  /\_/  \___  >___|  /\___  / \___  >__|
        \/     \/     \/          \/     \//_____/      \/

usage: [-h] [-0] [-1] [-ps]

Control software for the different modules of this paste crawler.

optional arguments:
  -h, --help         show this help message and exit
  -0, --pastebinCOM  Activate module
  -1, --pasteORG     Activate module
  -ps, --pStatistic  Show a simple statistic.
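The help output above maps onto a plain argparse setup. A sketch reconstructed from that output alone (flag names and the description are taken from the usage text; everything else is an assumption):

```python
import argparse

def build_parser():
    # Reconstructed from the --help output shown above.
    parser = argparse.ArgumentParser(
        description="Control software for the different modules of this paste crawler."
    )
    parser.add_argument("-0", "--pastebinCOM", action="store_true",
                        help="Activate module")
    parser.add_argument("-1", "--pasteORG", action="store_true",
                        help="Activate module")
    parser.add_argument("-ps", "--pStatistic", action="store_true",
                        help="Show a simple statistic.")
    return parser
```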

So far I have only implemented the first module; I am working on the next one and will add more modules and update this script over time.

Just start the module separately (first module I implemented)...


Pastes are stored in data/raw_pastes until there are more than 48000 of them. Then they get filtered, zipped and moved to the archive folder. All pastes which contain credentials are stored in data/files_with_passwords
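That housekeeping step (count, zip, move) can be sketched like this. The paths and the 48000 threshold come from the paragraph above; the function itself is my own approximation, and the filtering step the bot does before archiving is omitted:

```python
import os
import zipfile

RAW_DIR = "data/raw_pastes"  # paths as described above
ARCHIVE_DIR = "archive"
LIMIT = 48000                # threshold mentioned above

def archive_if_needed(raw_dir=RAW_DIR, archive_dir=ARCHIVE_DIR, limit=LIMIT):
    """Zip raw pastes into the archive folder once more than `limit` exist."""
    pastes = [f for f in os.listdir(raw_dir)
              if os.path.isfile(os.path.join(raw_dir, f))]
    if len(pastes) <= limit:
        return None
    os.makedirs(archive_dir, exist_ok=True)
    zip_path = os.path.join(archive_dir, "pastes_%d.zip" % len(pastes))
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for name in pastes:
            path = os.path.join(raw_dir, name)
            zf.write(path, arcname=name)
            os.remove(path)  # move semantics: archived pastes leave raw_pastes
    return zip_path
```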

Keep in mind that at the moment only simple combinations like USERNAME:PASSWORD are detected. However, there is a tool to search for proxy logs containing credentials.
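To make the "simple combinations" claim concrete, here is a minimal detector for email:password style lines. The bot's actual patterns are not published in this README, so this regex is purely illustrative and will have false positives and misses of its own:

```python
import re

# Illustrative only: matches lines shaped like email:password.
CRED_RE = re.compile(r"^[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}:\S+$")

def find_credentials(paste_text):
    """Return lines from a paste that look like email:password combos."""
    return [line.strip() for line in paste_text.splitlines()
            if CRED_RE.match(line.strip())]
```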

You can search for proxy logs (URLs with username and password combinations) by running the corresponding script on the raw pastes:

python data/raw_pastes
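Proxy-log entries of this kind embed credentials directly in the URL, as scheme://user:pass@host/path. A minimal matcher for that shape (the pattern is my own approximation, not the script's actual one):

```python
import re

# URLs with inline credentials: http(s)://USERNAME:PASSWORD@host/...
PROXY_RE = re.compile(r"\bhttps?://([^\s:/@]+):([^\s/@]+)@[^\s]+")

def find_proxy_credentials(text):
    """Return (username, password) pairs found in credential-bearing URLs."""
    return PROXY_RE.findall(text)
```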

If you want to search the raw data for specific strings, there is a script for that as well (really slow).


To see statistics of the bot just call the control script with the -ps/--pStatistic argument.


There is also a script that searches a folder (with pastes) for sensitive data like credit cards, RSA keys or mysqli_connect strings. Keep in mind that this script uses grep and is therefore really slow on a large number of paste files. If you want to analyze large amounts of pastes I recommend an ELK stack.
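The same folder scan can be done in pure Python instead of grep. A sketch covering the three categories named above (the regexes are my own rough guesses, not the script's actual patterns):

```python
import os
import re

# Rough stand-ins for the grep patterns described above.
PATTERNS = {
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "rsa_key": re.compile(r"-----BEGIN RSA PRIVATE KEY-----"),
    "mysqli_connect": re.compile(r"mysqli_connect\s*\("),
}

def scan_folder(folder):
    """Map each detected category to the list of files it was found in."""
    hits = {}
    for name in sorted(os.listdir(folder)):
        path = os.path.join(folder, name)
        if not os.path.isfile(path):
            continue
        with open(path, errors="ignore") as fh:
            text = fh.read()
        for label, pattern in PATTERNS.items():
            if pattern.search(text):
                hits.setdefault(label, []).append(name)
    return hits
```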

python data/raw_pastes 

There are two scripts which can be used to monitor a specific Twitter user: every tweet they post gets saved and every contained URL gets downloaded. To start the stalker just execute the wrapper script.
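The download half of that monitoring boils down to pulling URLs out of a tweet's text. The wrapper itself and its Twitter API usage are not shown in this README, so only the URL-extraction step is sketched here:

```python
import re

# Grab every http(s) URL from a tweet's text so it can be fetched later.
URL_RE = re.compile(r"https?://[^\s]+")

def extract_urls(tweet_text):
    """Return all URLs contained in a tweet, in order of appearance."""
    return URL_RE.findall(tweet_text)
```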


To Do

I discovered other sites like Pastebin which allow reading the latest pastes and crawling them. I need to integrate them into my bot. If you know additional sites which are worth a look, just let me know.