PasteHunter is a Python 3 application designed to query a collection of sites that host publicly pasted data. For each paste it finds, it scans the raw contents against a series of YARA rules, looking for information that can be used by an organisation or a researcher.
Pastehunter currently has support for the following sites:
Support for the following sites is listed as ToDo:
Pastehunter supports several output modules:
- dump to Elasticsearch (default)
- email sending over SMTP
- dump to JSON file
- dump to CSV file
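As a rough sketch of what a JSON output module can look like, the snippet below writes one JSON document per paste. The field names (`pasteid`, `pastesite`, `YaraRule`) are illustrative assumptions, not PasteHunter's actual schema; consult the real output module for the exact fields.

```python
import json
import tempfile
from pathlib import Path

def dump_paste_json(paste: dict, out_dir: str) -> Path:
    """Write a single paste record to its own JSON file.

    The field names used here are illustrative assumptions; check
    the real JSON output module for the exact schema.
    """
    Path(out_dir).mkdir(parents=True, exist_ok=True)
    out_file = Path(out_dir) / f"{paste['pasteid']}.json"
    out_file.write_text(json.dumps(paste, indent=4))
    return out_file

# Example usage with a hypothetical paste record
record = {"pasteid": "abc123", "pastesite": "pastebin.com", "YaraRule": ["email_list"]}
path = dump_paste_json(record, out_dir=f"{tempfile.gettempdir()}/pastehunter_json")
```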
Multiple recipients can be specified, each with a different ruleset. These rules can be combined using simple OR or AND logic (rule_list and mandatory_rule_list respectively). You need to set SMTP_SECURITY in the config file to one of the following options:
Refer to your email provider to determine which you require.
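A minimal sketch of how an SMTP output could build and send an alert. The message fields and the assumed SMTP_SECURITY values (`tls`, `starttls`, `none`) are illustrative assumptions; check the real email output module and config for the exact options your provider requires.

```python
import smtplib
from email.message import EmailMessage

def build_alert(sender: str, recipients: list, rule_names: list, paste_url: str) -> EmailMessage:
    """Build an alert email for pastes matching a recipient's ruleset.

    The subject/body layout is an illustrative assumption, not the
    exact format PasteHunter produces.
    """
    msg = EmailMessage()
    msg["From"] = sender
    msg["To"] = ", ".join(recipients)
    msg["Subject"] = f"PasteHunter alert: {', '.join(rule_names)}"
    msg.set_content(f"Rules {rule_names} matched paste {paste_url}")
    return msg

def send_alert(msg, host, port, username, password, smtp_security="tls"):
    """Send via SMTP, branching on an assumed SMTP_SECURITY value."""
    if smtp_security == "tls":
        server = smtplib.SMTP_SSL(host, port)   # implicit TLS from the start
    else:
        server = smtplib.SMTP(host, port)
        if smtp_security == "starttls":
            server.starttls()                   # upgrade plain connection
    if username:
        server.login(username, password)
    server.send_message(msg)
    server.quit()
```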
PasteHunter comes with a couple of post-process modules that extract useful data from pastes or pass them to other services. The following are default modules:
- Base64 Decoders
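Conceptually, a Base64 decoder post-process finds base64-looking blobs in the raw paste and decodes them so the decoded content can be inspected or rescanned. The regex and function name below are assumptions for illustration, not the module's actual implementation:

```python
import base64
import binascii
import re

# Heuristic: runs of 20+ base64 alphabet characters, optionally padded.
B64_RE = re.compile(r"[A-Za-z0-9+/]{20,}={0,2}")

def decode_base64_blobs(raw_paste: str) -> list:
    """Find long base64-looking strings in a paste and try to decode them.

    Invalid candidates are silently skipped.
    """
    decoded = []
    for candidate in B64_RE.findall(raw_paste):
        try:
            decoded.append(base64.b64decode(candidate, validate=True))
        except (binascii.Error, ValueError):
            continue
    return decoded
```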
You need a Pro account on Pastebin that has access to the scraping API: https://pastebin.com/api_scraping_faq
GitHub needs an OAuth token to stop it hitting the free rate limit. Create one at https://github.com/settings/tokens
YOU DO NOT NEED TO GIVE IT ANY ACCESS PERMISSIONS
If you get YARA errors, check that the installed version numbers for yara and yara-python match the latest versions.
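A quick way to check what is actually installed (the package names are the ones from requirements.txt; compare the output against the latest releases on PyPI):

```python
from importlib import metadata

def installed_version(package: str):
    """Return the installed version string for a package, or None
    if it is not installed in the current environment."""
    try:
        return metadata.version(package)
    except metadata.PackageNotFoundError:
        return None

# e.g. installed_version("yara-python") -> "4.x.y" or None
```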
Python / Deps
pip3 install -r requirements.txt
Install Docker & docker-compose
docker build . -t pastehunter
Running all the applications
docker-compose up -d
Kibana is running only on the localhost interface on default port (5601).
Kibana uses the default login and password:
Kibana uses the static IP address 172.16.10.12 in the
Elasticsearch is running only on the localhost interface on default port 9200.
The mount point is /usr/share/elasticsearch/data by default.
If Elasticsearch fails to start and you see "max virtual memory areas vm.max_map_count likely too low" in the logs, then try:
sudo sysctl -w vm.max_map_count=262144
https://elk-docker.readthedocs.io/#troubleshooting Paragraph starting As from version 5
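As a small sanity check, you can read the current value (Linux exposes it at /proc/sys/vm/max_map_count) and compare it against the 262144 minimum from the sysctl command above. This helper is a convenience sketch, not part of PasteHunter:

```python
def max_map_count_ok(value: int, minimum: int = 262144) -> bool:
    """Elasticsearch needs vm.max_map_count >= 262144."""
    return value >= minimum

def read_max_map_count(path: str = "/proc/sys/vm/max_map_count"):
    """Read the current kernel value on Linux; returns None elsewhere
    or if the file cannot be read."""
    try:
        with open(path) as fh:
            return int(fh.read().strip())
    except OSError:
        return None
```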
You can re-run the pastehunter script by doing
docker-compose up -d
docker-compose will reuse the already running instances of Elasticsearch and Kibana.
Copy settings.json.sample to settings.json and populate the details. For the scraping API you need to whitelist your IP on Pastebin; no API key is required. See the link above.
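The copy-and-load step above could be sketched as follows. The key names in the usage note are illustrative assumptions; see settings.json.sample for the real structure:

```python
import json
import shutil
from pathlib import Path

def load_settings(path: str = "settings.json",
                  sample: str = "settings.json.sample") -> dict:
    """Copy the sample config into place if settings.json does not
    exist yet, then load and return it as a dict."""
    cfg = Path(path)
    if not cfg.exists() and Path(sample).exists():
        shutil.copy(sample, cfg)
    return json.loads(cfg.read_text())

# Usage (key names are assumptions):
#   settings = load_settings()
#   settings["inputs"]["pastebin"]["enabled"]
```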
The logging level can be set to one of the following values.
The default is INFO (numeric value 20).
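These level names map directly onto Python's standard logging levels, which is where the numeric values come from. A minimal setup sketch (the config key handling is an assumption):

```python
import logging

# Standard logging levels and their numeric values; INFO is 20.
LEVELS = {
    "CRITICAL": logging.CRITICAL,  # 50
    "ERROR": logging.ERROR,        # 40
    "WARNING": logging.WARNING,    # 30
    "INFO": logging.INFO,          # 20
    "DEBUG": logging.DEBUG,        # 10
}

def configure_logging(level_name: str = "INFO") -> None:
    """Configure root logging from a level name, defaulting to INFO
    for unknown names."""
    logging.basicConfig(level=LEVELS.get(level_name.upper(), logging.INFO))
```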
Start the application with
It may be useful to run in a screen to keep it running in the background.
Service config is coming