DirHound is a web crawler that also brute-forces paths in order to discover files and directories on the target website.
DirHound requires a base URL, which serves as the starting point of the crawling session:
./dirhound http://www.example.com
Brute-forcing requires a wordlist of relative paths; each path is appended to every directory discovered during the crawl.
The wordlist should be provided as a file containing one path per line, for example:
admin/
admin.php
phpmyadmin/
login.php
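The brute-forcing step can be pictured as pairing every discovered directory with every wordlist entry to produce candidate URLs. The shell sketch below is illustrative only (it is not DirHound's actual code, and the directory URLs are hypothetical examples):

```shell
# Illustrative sketch: combine discovered directories with wordlist entries.
printf '%s\n' 'admin/' 'admin.php' 'phpmyadmin/' 'login.php' > wordlist.txt

for dir in 'http://www.example.com/' 'http://www.example.com/blog/'; do
  while IFS= read -r path; do
    echo "${dir}${path}"   # candidate URL to probe
  done < wordlist.txt
done
```

Each candidate URL is then requested; paths that respond successfully end up in the output file.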
A default wordlist containing commonly used directory and file names is provided. To supply your own wordlist, use the -w parameter:
./dirhound -w /some/path/wordlist http://www.example.com
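For instance, a custom wordlist tailored to the target can be assembled like this (the file name and entries below are just examples):

```shell
# Create a custom wordlist, one relative path per line.
printf '%s\n' 'backup/' 'config.php' 'robots.txt' '.git/' > my_wordlist.txt
cat my_wordlist.txt

# Then point DirHound at it:
# ./dirhound -w my_wordlist.txt http://www.example.com
```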
DirHound generates an output file containing the links discovered while crawling and every successfully brute-forced path. By default, the output is stored in a file named dirhound.out, but you can choose a different path with the -o parameter:
./dirhound -o /tmp/my_crawl_status http://www.example.com
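Assuming the output file lists one discovered URL per line (an assumption about the format; inspect your own dirhound.out to confirm), it can be post-processed with standard tools, e.g. to pull out only the PHP files:

```shell
# Hypothetical output file for illustration; DirHound's real output
# may be formatted differently.
cat > dirhound.out <<'EOF'
http://www.example.com/index.php
http://www.example.com/admin/
http://www.example.com/login.php
EOF

# Extract only the .php hits:
grep '\.php$' dirhound.out
```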
Sometimes you might want to crawl a website without brute-forcing at all. You can disable brute-forcing with the -d switch:
./dirhound -d http://www.example.com
The easiest way to compile DirHound is to use cabal:
cabal configure
cabal build
The generated executable will be stored in ./dist/build/dirhound/dirhound.