Raptor is a subdomain enumeration tool that discovers valid subdomains for websites passively.
Raptor is designed to comply with the licenses and usage restrictions of all passive sources, while keeping speed in mind.
It currently uses 22 free and commercial services, and the list is updated whenever new resources appear.
For better results, I highly encourage you to get API keys.
An http-probe feature is also included, so you can easily identify dead and alive subdomains as well as open ports.
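As a rough illustration of how a passive source query can work, the sketch below pulls subdomains from crt.sh certificate transparency logs. This is a hypothetical, minimal example: crt.sh is only one possible passive provider, and this is not necessarily how Raptor implements any of its 22 sources.

```python
import requests

def crtsh_subdomains(domain: str) -> set[str]:
    """Query crt.sh certificate transparency logs for subdomains (illustrative only)."""
    url = f"https://crt.sh/?q=%25.{domain}&output=json"
    resp = requests.get(url, timeout=30)
    resp.raise_for_status()
    found = set()
    for entry in resp.json():
        # name_value may contain several newline-separated names
        for name in entry.get("name_value", "").splitlines():
            name = name.strip().lstrip("*.").lower()
            if name.endswith(domain):
                found.add(name)
    return found

if __name__ == "__main__":
    for sub in sorted(crtsh_subdomains("example.com")):
        print(sub)
```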
Think this is useful? ⭐ Star us on GitHub — it helps!
pip3 install -r requirements.txt
python3 main.py --domain example.com
Or
python3 main.py -d example.com
python3 main.py --domain example.com --output example.txt
Or
python3 main.py -d example.com -o example.txt
Use http-probe to identify dead and alive subdomains and various open ports. (This option will take a while to finish.)
python3 main.py --domain example.com --probe
Or
python3 main.py -d example.com -p
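To give a sense of what the probe step does, here is a minimal sketch that checks whether a subdomain answers over HTTP/HTTPS and which common ports are open. The function name, port list, and timeouts are illustrative assumptions, not Raptor's actual implementation.

```python
import socket
import requests

COMMON_PORTS = [80, 443, 8080, 8443]  # illustrative port list, not Raptor's actual set

def probe(subdomain: str) -> dict:
    """Check HTTP(S) reachability and a few common ports for one subdomain (sketch only)."""
    result = {"subdomain": subdomain, "alive": False, "open_ports": []}
    # A subdomain counts as alive if either HTTPS or HTTP answers at all.
    for scheme in ("https", "http"):
        try:
            requests.get(f"{scheme}://{subdomain}", timeout=5)
            result["alive"] = True
            break
        except requests.RequestException:
            continue
    # TCP connect scan of a few common ports.
    for port in COMMON_PORTS:
        try:
            with socket.create_connection((subdomain, port), timeout=3):
                result["open_ports"].append(port)
        except OSError:
            continue
    return result

print(probe("www.example.com"))
```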
python3 main.py --domain example.com --output example.txt --verbose
Or
python3 main.py -d example.com -o example.txt -v
docker build -t hj23/raptor .
docker run --rm -v $PWD/outputs:/outputs hj23/raptor -d example.com
These are the commercial services it uses:
- Bing
- BinaryEdge
- VirusTotal
- Shodan
- UrlScan
- Censys
All of these services provide a free, limited request quota that renews automatically:
- Bing: 1,000 requests per month
- BinaryEdge: 250 requests per month
- VirusTotal: 500 requests per day
- Shodan: 100 requests per month with an academic email (1 request = 100 results)
- UrlScan: 1,000 requests per day
- Censys: 250 requests per month
Check out our guide: How to get API keys for Raptor?
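The exact way Raptor reads its keys is covered in the guide above. Purely as a generic illustration, a tool like this often pulls keys from environment variables; the variable names below are hypothetical, not the ones Raptor actually expects.

```python
import os

# Hypothetical key names; see the guide for the names Raptor actually uses.
API_KEYS = {
    "shodan": os.getenv("SHODAN_API_KEY"),
    "virustotal": os.getenv("VT_API_KEY"),
    "censys_id": os.getenv("CENSYS_API_ID"),
    "censys_secret": os.getenv("CENSYS_API_SECRET"),
}

missing = [name for name, value in API_KEYS.items() if not value]
if missing:
    print(f"Missing API keys (results will be limited): {', '.join(missing)}")
```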
Faster does not always mean better: API calls can take a reasonable amount of time, and more importantly, the scripts are tuned not only for performance but also to avoid exceeding the limits stated above.
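As an illustration of staying within per-source quotas like those listed above, a simple sliding-window throttle might look like the sketch below. The limits and timing values are assumptions for the example, not Raptor's actual pacing logic.

```python
import time

class RateLimiter:
    """Allow at most `max_calls` calls per `period` seconds (simple sliding window)."""

    def __init__(self, max_calls: int, period: float):
        self.max_calls = max_calls
        self.period = period
        self.calls: list[float] = []

    def wait(self) -> None:
        now = time.monotonic()
        # Drop timestamps that have fallen out of the window.
        self.calls = [t for t in self.calls if now - t < self.period]
        if len(self.calls) >= self.max_calls:
            # Sleep until the oldest call leaves the window.
            time.sleep(self.period - (now - self.calls[0]))
        self.calls.append(time.monotonic())

# Example: keep one source to roughly 10 requests per minute (illustrative numbers).
limiter = RateLimiter(max_calls=10, period=60.0)
for query in ["a.example.com", "b.example.com", "c.example.com"]:
    limiter.wait()
    # ... issue the API call for `query` here ...
```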