
Automated-BugHunting

Recon

  1. Pick a target scope, e.g. *.someurl.com
  2. Start enumerating with the tools below.
  • Subdomains

    • Sublist3r
    • Amass
    amass enum -active -brute -d web.site.com -o sitesout.txt
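    The two enumerators can be merged into one deduplicated list. A minimal sketch, assuming Sublist3r is installed as sublist3r and web.site.com is the target (filenames here are illustrative):

    sublist3r -d web.site.com -o sublister.txt                # passive sources
    amass enum -active -brute -d web.site.com -o amass.txt    # active enum + brute force
    cat sublister.txt amass.txt | sort -u > sitesout.txt      # merge and dedupe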
  • URLs (from the subdomain results)

    • Sort and probe for live hosts
    cat sitesout.txt | sort -u | httpx -silent > alive.txt
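    For quicker triage the same probe can annotate each live host. A sketch assuming projectdiscovery's httpx (the unrelated Python httpx CLI uses different flags):

    cat sitesout.txt | sort -u | httpx -silent -status-code -title > alive-annotated.txt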
  • Parameters

    • hakrawler
    cat alive.txt | hakrawler -depth 5 -plain -wayback | anew hakrawler.txt &> /dev/null
    • ParamSpider
    python3 ~/tools/ParamSpider/paramspider.py --domain "site.com" --exclude woff,css,js,png,svg,jpg --level high --quiet --output paramspider.txt &> /dev/null

    Merge and sort

    cat paramspider.txt hakrawler.txt | sort -u > params_outfinal.txt
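    Two URLs that differ only in their parameter values both survive a plain sort -u. A sketch that strips the values first so such duplicates collapse (same sed idea as the xss line in the gf step below):

    cat paramspider.txt hakrawler.txt | sed 's/=[^&]*/=/g' | sort -u > params_outfinal.txt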
  • gf

    • Match the merged list against gf patterns
    cat params_outfinal.txt | gf xss | sed 's/=.*/=/' | sed 's/URL: //' > xss.txt
    cat params_outfinal.txt | gf ssrf > ssrf.txt
    cat params_outfinal.txt | gf ssti > ssti.txt
    cat params_outfinal.txt | gf redirect > redirect.txt
    cat params_outfinal.txt | gf sqli > sqli.txt
    cat params_outfinal.txt | gf lfi > lfi.txt
    cat params_outfinal.txt | gf rce > rce.txt
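    The per-pattern commands above can be collapsed into a loop; a minimal sketch, assuming the listed patterns are installed under ~/.gf (append the sed cleanup from the xss line where needed):

    for pattern in xss ssrf ssti redirect sqli lfi rce; do
        gf "$pattern" < params_outfinal.txt > "$pattern.txt"
    done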
