- Puts all data for a specific subdomain into its own folder associated with that subdomain. Once you get comfortable with the directory structure, using this script becomes a lot easier
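A minimal sketch of that per-subdomain layout, assuming one folder per discovered subdomain under the target's directory (the target name, subdomain names, and base path here are illustrative, not the script's exact output):

```shell
# Each discovered subdomain gets its own folder under the target's directory,
# so all findings for that subdomain live in one place.
base="$(mktemp -d)/example.com"          # illustrative target directory

for sub in www.example.com api.example.com; do
    mkdir -p "$base/$sub"                # one folder per subdomain
done

ls "$base"
```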
- The concept of this script is to convert all findings into one Markdown report, which you can then import into Notion, share with others, and collaborate on more easily
- I want to make it run in parallel
- No cloud subscription required. I want these scripts to run locally in the background for $0
```json
{
    "username": "bbrf",
    "password": "penelope",
    "couchdb": "https://bbrf-server:6984/bbrf",
    "slack_token": "<a slack token to receive notifications>",
    "discord_webhook": "<your discord webhook if you want one>",
    "ignore_ssl_errors": true
}
```
- The easiest way is to use my docker container `bug-bounty-framework`: create the `~/Pentesting` directory on the host machine and run the container
- Then, inside the docker container, change directory to this `~/Pentesting` directory and execute `sudo full-web.sh -d ${domain} -u ${USER-EXEC}`, where `${domain}` is your target domain and `${USER-EXEC}` is the name of the user's home directory. This is important, because otherwise findings would be put in `/home/root/`, which is not intended (and I don't know how to remove the necessity of declaring this `-u` flag other than not executing as root)
- Performs the full scan; when you look at it you'll see exactly what the framework does and in which steps it executes the other scripts
- Detects the technologies the site uses; for now it only uses `wappalyzer cli`, but there'll be more!
- This performs all the subdomain enumeration passively with `amass`, without active scanning
- This script checks whether the addresses from `get-subdomains-passively.sh` are resolvable, and also performs subdomain brute-forcing, likewise with `amass`
- This script resolves every not-active subdomain to its public IP address
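The resolution step can be sketched roughly like this; `getent` stands in for whatever resolver the real script uses, and the input file name is an assumption:

```shell
# Resolve each subdomain in a list to an IP address and print "host ip" pairs.
resolve_ip() {
    getent hosts "$1" | awk '{ print $1; exit }'   # first resolved address
}

subs="$(mktemp)"
printf 'localhost\n' > "$subs"                     # illustrative input list

while read -r host; do
    ip="$(resolve_ip "$host")"
    [ -n "$ip" ] && echo "$host $ip"
done < "$subs"
```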
- This will perform brute-forcing on services that the not-alive subdomains run
- Still in progress
- This will extract JavaScript from the website, and also search for other sources
- Still in progress
- This will brute-force and test alive subdomains with `nuclei`
- Still in progress
- This converts the contents of the `tools-io` directory to Markdown
- This breaks a lot, because with every new functionality I need to change this markdown converter, so it doesn't cover much of the brute-forcing part
- The main concept is that it keeps all the findings dynamically updated (new subdomains etc.), while `notes.mdpp` is static and only you fill in its content!
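The conversion idea can be sketched as follows: every tool output file becomes its own section of the report. This is a toy version under assumed paths and file names, not the real converter:

```shell
# Turn each *.txt file in a tools-io-style directory into a Markdown section.
toolsio="$(mktemp -d)"                                # stand-in for tools-io/
printf 'a.example.com\nb.example.com\n' > "$toolsio/subdomains.txt"

report="$toolsio/report.md"
: > "$report"
for f in "$toolsio"/*.txt; do
    {
        echo "## $(basename "$f")"    # one section header per output file
        sed 's/^/- /' "$f"            # findings rendered as a bullet list
        echo
    } >> "$report"
done
cat "$report"
```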
- Definitely finding more root domains and subdomain enumeration, also discovering technologies and ways to use that data
- Also managing scope: detecting the program's scope and making a massive distinction between passive scanning and active scanning so as not to make bug-bounty programs angry! D:
- Being the most accurate, NOT following the "brute, spray and pray" philosophy
- I don't want to focus mainly on brute-forcing, because in the end everyone does that, but someday I'll get down to it
- Create a separate docker container for this script to run in, scheduled with `cron`
- Include `get-technologies.sh` output in the Markdown report
- Implement uploading to Imgur via their API
- Integrate `nuclei` scanning
- Record reports by date and check if there are any new findings worth checking out. Can be done by executing `sdiff` on each file in `tools-io/`, but also by comparing the Markdown reports
- Make a separate directory for Notes; after all, the files in there will be filled in the most frequently
- Also make it possible to include an `ignore.txt` file to ignore these new findings (if we want to prevent them from appearing)
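A sketch of diffing dated reports for new findings, combined with the `ignore.txt` idea. It uses `comm` rather than `sdiff` because `comm` prints only the lines unique to the newer file; the file names are illustrative:

```shell
# Compare two dated subdomain lists and drop anything listed in ignore.txt.
workdir="$(mktemp -d)"
printf 'a.example.com\nb.example.com\n' > "$workdir/2023-01-01.txt"
printf 'a.example.com\nb.example.com\nc.example.com\nd.example.com\n' \
    > "$workdir/2023-01-02.txt"
printf 'd.example.com\n' > "$workdir/ignore.txt"

# comm -13: lines only in the newer (sorted) file; grep -vxFf: remove
# exact whole-line matches against the ignore list.
new_findings="$(comm -13 "$workdir/2023-01-01.txt" "$workdir/2023-01-02.txt" \
    | grep -vxFf "$workdir/ignore.txt")"
echo "$new_findings"
```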
- Make a separate Shodan script with an API key
- Make basic documentation with `docsify`
- Jeez, just get `extracting-javascript.sh` working!!! With `scripthunter` and `jsmon`
- Redirect unnecessary output to `/dev/null` in favor of the `-o` flag whenever possible
- Define out-of-scope addresses with the help of regex expressions (and `grex` to generate them)
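Applying such out-of-scope regexes could look like this: `grep -Evf` consumes a file of patterns (the kind `grex` would generate) and drops every matching address. The patterns and hosts here are made-up examples:

```shell
# Filter a list of hosts against a file of out-of-scope regexes.
scope_re="$(mktemp)"
cat > "$scope_re" <<'EOF'
^dev\..*\.example\.com$
^.*\.internal\.example\.com$
EOF

filter_scope() {
    grep -Evf "$scope_re"      # -v drops lines matching any pattern
}

in_scope="$(printf 'www.example.com\ndev.api.example.com\napp.internal.example.com\n' \
    | filter_scope)"
echo "$in_scope"
```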
- Make it so that any setting performed in the docker container with `docker attach` is persistent across a reboot, or store and copy the configs between Host and Container
- Also have a way to manage `$SECRET_TOKENS` in a secure and simple manner, probably with env-files, while on the Host machine have a bash script/docs on how to assign them
- Notifications via Slack channel
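One simple env-file approach: keep the tokens in a file on the host and source it with `set -a` so everything defined there gets exported. The file name and variable names are assumptions for illustration, not values the framework defines:

```shell
# Source an env-file of secrets; set -a exports every variable defined
# while the file is being sourced, so child processes inherit them too.
envfile="$(mktemp)"
cat > "$envfile" <<'EOF'
SLACK_TOKEN=xoxb-not-a-real-token
DISCORD_WEBHOOK=https://discord.example/webhook
EOF

set -a
. "$envfile"
set +a

echo "$SLACK_TOKEN"
```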
- Separate things put in recon from things put on Slack. The Markdown report should be the source of information, not the source for 'Incident Response' (? xD) when `nuclei` or ZAP finds anything
- Implement ZAP with their Automation Framework
- Backups of the data (mainly reports)
- Use EyeWitness
- After some long time, let's replace Markdown with `rmarkdown`, add sweet charts, visualizations :)
- Integrate `bbrf`
- Make `bbrf-server` communicate with the `bug-bounty-framework-web` container
- Include `bbrf url add -` in getting subdomains at the end of the script
- Abandon the use of subdomain files and rely only on the `bbrf urls` command. Make use of bbrf tags.
- Experiment with Obsidian plugin creation, etc. :)