Simple, fast web crawler designed for easy, quick discovery of endpoints and assets within a web application



What is it?

hakrawler is a Go web crawler designed for easy, quick discovery of endpoints and assets within a web application. It can be used to discover:

  • Forms
  • Endpoints
  • Subdomains
  • Related domains
  • JavaScript files

The goal is to make the tool easy to chain with other tools, such as subdomain enumeration tools and vulnerability scanners, for example:

assetfinder | hakrawler | some-xss-scanner


Features

  • Unlimited, fast web crawling for endpoint discovery
  • Fuzzy matching for domain discovery
  • robots.txt parsing
  • sitemap.xml parsing
  • Plain output for easy parsing into other tools
  • Accept domains from stdin for easier tool chaining
  • SQLMap-friendly output format
  • Link gathering from JavaScript files

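Because plain output is one URL per line, results compose with standard Unix filters. A minimal sketch of post-processing (the URLs are placeholders, the `-plain` flag name is taken from the plain-output feature above, and the hakrawler invocation itself is shown only as a comment):

```shell
# Typical chain (commented out; requires assetfinder and hakrawler installed):
#   assetfinder example.com | hakrawler -plain | sort -u

# One-URL-per-line output means ordinary filters apply, e.g. deduplicating
# and keeping only JavaScript files:
printf 'https://example.com/a.js\nhttps://example.com/a.js\nhttps://example.com/page\n' \
  | sort -u | grep '\.js$'
```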
Upcoming features


Contributors

  • hakluke wrote the tool
  • cablej cleaned up the code
  • Corben Leo added in functionality to pull links from JavaScript files
  • delic made the code much cleaner
  • hoenn made the code even cleanerer


Thanks

  • codingo and prodigysml/sml555, my favourite people to hack with. A constant source of ideas and inspiration. They also provided beta testing and a sounding board for this tool in development.
  • tomnomnom who wrote waybackurls, which powers the wayback part of this tool
  • s0md3v who wrote photon, which I took ideas from to create this tool
  • The folks from gocolly, the library which powers the crawler engine
  • oxffaa, who wrote a very efficient sitemap.xml parser which is used in this tool
  • The contributors of LinkFinder, from which some awesome regex was stolen to parse links from JavaScript files


Installation

  1. Install Golang
  2. Run the command below:

go get github.com/hakluke/hakrawler

  3. Run hakrawler from your Go bin directory. For Linux systems it will likely be:

~/go/bin/hakrawler

Note that if you need to do this, you probably want to add your Go bin directory to your $PATH to make things easier!
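A sketch of that PATH adjustment ($HOME/go/bin is the common default Go bin location, not guaranteed for every setup):

```shell
# Append Go's default bin directory to PATH if it isn't already there
case ":$PATH:" in
  *":$HOME/go/bin:"*) ;;                   # already present, nothing to do
  *) export PATH="$PATH:$HOME/go/bin" ;;   # append it
esac

# Confirm the directory is now on PATH
case ":$PATH:" in
  *":$HOME/go/bin:"*) echo "go bin on PATH" ;;
esac
```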


Usage

Note: multiple domains can be crawled by piping them into hakrawler from stdin. If only a single domain is being crawled, it can be supplied with the -url flag.
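As a sketch of the stdin behaviour (targets are placeholders; the hakrawler invocation is shown only as a comment):

```shell
# Real usage: cat domains.txt | hakrawler
# hakrawler reads one target domain per line from stdin, conceptually
# like this loop:
printf 'example.com\nexample.org\n' | while read -r domain; do
  echo "would crawl: $domain"
done
```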

$ hakrawler -h
Usage of hakrawler:
  -all
    	Include everything in output - this is the default, so this option is superfluous (default true)
  -auth string
    	The value of this will be included as an Authorization header
  -cookie string
    	The value of this will be included as a Cookie header
  -depth int
    	Maximum depth to crawl, the default is 1. Anything above 1 will include URLs from robots, sitemap, waybackurls and the initial crawler as a seed. Higher numbers take longer but yield more results. (default 1)
  -forms
    	Include form actions in output
  -js
    	Include links to utilised JavaScript files
  -linkfinder
    	Run linkfinder on JavaScript files
  -outdir string
    	Directory to save discovered raw HTTP requests
  -plain
    	Don't use colours or print the banners to allow for easier parsing
  -robots
    	Include robots.txt entries in output
  -scope string
    	Scope to include:
    	strict = specified domain only
    	subs = specified domain and subdomains
    	fuzzy = anything containing the supplied domain
    	yolo = everything (default "subs")
  -sitemap
    	Include sitemap.xml entries in output
  -subs
    	Include subdomains in output
  -url string
    	The url that you wish to crawl. Schema defaults to http
  -urls
    	Include URLs in output
  -usewayback
    	Query wayback machine for URLs and add them as seeds for the crawler
  -v	Display version and exit
  -wayback
    	Include wayback machine entries in output
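The scope modes reduce to different matching rules against the supplied domain. As a rough illustration, they behave like the following grep patterns (this mirrors the documented behaviour, not hakrawler's actual implementation; the hosts are placeholders):

```shell
hosts='example.com
sub.example.com
notexample.com
other.org'

# strict = the specified domain only (whole-line match)
echo "$hosts" | grep -x 'example\.com'
# subs = the domain plus its subdomains (anchored at a dot boundary)
echo "$hosts" | grep -E '(^|\.)example\.com$'
# fuzzy = anything containing the supplied domain as a substring
echo "$hosts" | grep 'example\.com'
```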

Basic Example

Command: hakrawler -url example.com -depth 1

(screenshot: sample hakrawler output)
