Baú

Bash All Urls

Installation · Examples

Spidery Chest


Baú is a Bash shell script that uses curl to pull every URL recorded for a given domain from the Common Crawl and Web Archive (Wayback Machine) APIs.
Its purpose is to extract parameters that facilitate further exploitation, such as XSS, SQLi, Open Redirects, and so on.
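
Under the hood, URL collectors like this one query those two sources over plain HTTP. The following curl calls are an illustrative sketch only, not necessarily the exact requests bau sends; the domain is a placeholder and the Common Crawl index name changes with every crawl (current names are listed at index.commoncrawl.org/collinfo.json):

# Wayback Machine CDX API: every archived URL for a domain, one per line
curl -s "https://web.archive.org/cdx/search/cdx?url=*.example.com/*&output=text&fl=original&collapse=urlkey"

# Common Crawl index (replace CC-MAIN-YYYY-WW with a current crawl name)
curl -s "https://index.commoncrawl.org/CC-MAIN-YYYY-WW-index?url=*.example.com/*&output=json"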

Installation

Required: curl

git clone https://github.com/RodricBr/bau
cd bau/;chmod +x bau
sudo mv bau /usr/local/bin/
bau -h

Examples

# Normal use
bau vulnweb.com -ns "js|svg|png"
bau vulnweb.com -s "js|php|svg|png|jpeg|jpg"
bau vulnweb.com -ns
bau vulnweb.com -s
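
Since bau's purpose is surfacing parameters, a quick follow-up filter keeps only URLs that actually carry a query string. This is plain shell applied to bau's output (assumed to be one URL per line), not a bau flag:

# Keep only parameterized URLs and de-duplicate them
bau vulnweb.com -ns "js|svg|png" | grep '=' | sort -u > params.txt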

The pipelines below chain bau with these external helpers (hedged install pointers after the list):

AiriXSS - Checks for reflected parameters

HTTPx - URL probing

Uro - Removes unnecessary URLs (highly recommended)

qsReplace - Replaces query string values with a given value

urldedupe - Removes duplicated URLs
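
These helpers are separate projects, so check each repository's README for the current install instructions. As a rough pointer: httpx and qsreplace are Go tools, uro is a Python package, and urldedupe and airixss are built from their own repositories.

# Assumed install commands for some of the helpers (verify against each project's README)
go install github.com/projectdiscovery/httpx/cmd/httpx@latest
go install github.com/tomnomnom/qsreplace@latest
pip install uro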

# Practical use with XARGS (as an idea, not strictly needed)
echo "vulnweb.com" | xargs -I{} bash -c 'bau {} -ns' | nilo

# XSS Hunting w/ AiriXSS (faster)
bau vulnweb.com -s "php|js|svg|png" | urldedupe -qs | uro | qsreplace '"><svg onload=alert(1)>' | airixss -payload "alert(1)"

# XSS Hunting on multiple domains w/ HTTPx (probing & slower)
bau $(cat domains.txt) -s "php|js|svg|png" | httpx -silent -mc 200 | qsreplace '"><svg onload=alert(1)>' | airixss -payload "alert(1)"

# With XARGS
echo "vulnweb.com" | xargs -I{} bash -c 'bau {} -s "php|js|svg|png"' | ...

Inspired by gau