
Commit

Merge branch 'master' of github.com:einaros/vulnscrape
einaros committed Mar 22, 2012
2 parents dcf7bd4 + ddd3256 commit 44721ff
Showing 1 changed file (README.md) with 14 additions and 3 deletions.
@@ -8,7 +8,7 @@ This is a rather naïve link scraper-driven web vulnerability scanner. Use it re

## Usage

vulnscrape.rb [options]
Usage: vulnscrape.rb [options]
-u, --url URL The url to scan.
-m, --max count Max urls to scrape for.
  -i, --skip count      Number of scraped urls to skip.
@@ -17,12 +17,23 @@ This is a rather naïve link scraper-driven web vulnerability scanner. Use it re
-r, --restriction REGEX Url restriction
Only collect URLs matching REGEX.
Typically more restrictive than the scraper restriction.
-k, --[no-]keep Keep duplicate urls.
Enabling this will make the link collector keep urls with the same host and path.
Default: false
-h, --[no-]header Include header heuristics. Default: false
-p, --[no-]split Include response splitting heuristics. Default: false
-n, --[no-]mhtml Include MHTML heuristics. Default: false
-x, --[no-]hash Include hash heuristics. Default: false
-q, --[no-]query Include query heuristics. Default: true
-f, --[no-]fof Include 404 page. Default: true
-s, --[no-]single Single run. Default: false
--user USERNAME Basic auth username
--pass PASSWORD Basic auth password
--cookie COOKIE Cookie string
--load FILENAME Load urls from FILENAME
The scraper can save urls using --save.
--save FILENAME Save urls to FILENAME
Saved urls can be reloaded later with --load
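The --save/--load pair implies a simple persisted url list. A minimal round-trip sketch, assuming one url per line (the helper names and file format here are hypothetical, not vulnscrape's actual implementation):

```ruby
require 'tempfile'

# Hypothetical sketch of a --save / --load style url list round-trip.
# vulnscrape's real on-disk format is not documented here; this assumes
# a plain text file with one url per line.
def save_urls(urls, filename)
  File.open(filename, 'w') { |f| urls.each { |u| f.puts(u) } }
end

def load_urls(filename)
  File.readlines(filename).map(&:chomp).reject(&:empty?)
end

file = Tempfile.new('urls')
urls = ['http://mydomain.com/', 'http://mydomain.com/about']
save_urls(urls, file.path)
puts load_urls(file.path).inspect
```

Persisting the scrape between runs this way would let the collection phase run once while heuristics are re-run with different flags.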

## Examples

@@ -37,6 +48,6 @@ Will scrape http://mydomain.com for at least 50 urls, and start running various
Will start scraping at http://services.mydomain.com, and only follow (continue scraping) urls on that subdomain. All links from
all mydomain.com subdomains will eventually be run through the heuristics scanner.
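The split described above — a tight restriction on which urls the scraper follows versus a broader one on which urls are collected for scanning — can be sketched as follows (the regexes, url list, and variable names are illustrative only, not vulnscrape's internals):

```ruby
# Hypothetical sketch: a scrape restriction decides which urls get
# followed for further scraping, while a broader collection restriction
# decides which urls are handed to the heuristics scanner.
scrape_restriction  = %r{^http://services\.mydomain\.com}
collect_restriction = %r{mydomain\.com}

urls = [
  'http://services.mydomain.com/api',
  'http://blog.mydomain.com/post/1',
  'http://other.example.com/'
]

to_follow = urls.select { |u| u =~ scrape_restriction }
to_scan   = urls.select { |u| u =~ collect_restriction }

puts to_follow.inspect  # only the services subdomain is followed
puts to_scan.inspect    # every mydomain.com url is scanned
```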

./vulnscrape.rb -u http://xss.progphp.com -h
./vulnscrape.rb -u http://xss.progphp.com -h -p -m -x

Includes header heuristics, as demonstrated by a few of the XSS vectors on the progphp test site.
Includes query string, header, response splitting and hash heuristics, as demonstrated by a few of the XSS vectors on the progphp test site.
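As an illustration of what a query-string heuristic might do — a guess at the general approach, not vulnscrape's actual code — each query parameter value can be swapped for a probe marker, yielding one candidate url per parameter for fetching and inspection:

```ruby
require 'uri'
require 'cgi'

# Hypothetical sketch of a query-string heuristic. PROBE and
# candidate_urls are illustrative names, not part of vulnscrape.
PROBE = '"><probe>'

def candidate_urls(url)
  uri = URI.parse(url)
  return [] unless uri.query
  params = CGI.parse(uri.query)              # e.g. {"a"=>["1"], "b"=>["2"]}
  params.keys.map do |key|
    mutated = params.merge(key => [PROBE])
    query = mutated.map { |k, vs| "#{k}=#{CGI.escape(vs.first)}" }.join('&')
    "#{uri.scheme}://#{uri.host}#{uri.path}?#{query}"
  end
end

puts candidate_urls('http://xss.progphp.com/page?a=1&b=2')
```

A scanner built this way would fetch each mutated url and flag responses that echo the probe unescaped.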
