
Extended baserequest importer

Scan and import relevant requests directly to Burp!

This Python script is part of a larger toolset that allows importing a big list of URLs, together with all of their discovered parameters, via POST and GET through the Burp Suite proxy.

Why do you need this?

Often an attacker can find vulnerabilities in parameters that are obviously used within a page. But extracting this data manually is tedious work - wouldn't it be nice to have this process automated? That way you could send the discovered POST and GET parameters to Burp Suite's active scanner and let it do the rest of the work.


Plain and simple - the tool does not expect any arguments:
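
# assuming the script is e.g. named importer.py - use the actual file name from this repository
python3 importer.py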


Don't forget to start Burp Suite Pro!

How does this tool work?


This site is well known and contains several XSS vulnerabilities. But sending this site to your active scanner will result in... nothing! The reason is: Burp doesn't know about a, b1, b2, b3, b4, c1, c2, c3, c4, c5 and c6. Maybe there are even more vulnerabilities to test these parameters against. Tunneling the following requests through the proxy (default Burp settings) will make them accessible in Burp.
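
As a minimal Python sketch of the idea (not this tool's actual code; the target URL and parameter values are made up), anything sent through Burp's proxy listener - 127.0.0.1:8080 by default - shows up in Burp and can then be scanned:

import requests

# Burp's default proxy listener; requests routed through it appear in Burp's sitemap.
BURP_PROXY = {"http": "http://127.0.0.1:8080", "https": "http://127.0.0.1:8080"}

# Parameters Burp should learn about (names taken from the example above).
params = {"a": "test", "b1": "test", "c1": "test"}

# verify=False because Burp intercepts TLS with its own CA certificate.
response = requests.get("http://example.com/xss.php", params=params,
                        proxies=BURP_PROXY, verify=False)
print(response.status_code)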

Prepare your tool

You should rename the shipped template in config/ to app-settings.conf, then adjust the settings. Usually the defaults are pretty good, but there are targets where sending 10 parameters per request is "healthier"!
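
The exact option names come from the shipped template; purely as a hypothetical illustration, the relevant settings could look like this:

# Hypothetical app-settings.conf - the real option names may differ.
[requests]
# Number of parameters per request; lower this for fragile targets.
parameters_per_request = 10
# Burp proxy listener to tunnel all requests through.
proxy_host = 127.0.0.1
proxy_port = 8080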

Step 1: Crawl the website

Using an initial request, this tool extracts the HTML source code.

GET /xss.php HTTP/1.1
Accept-Encoding: gzip, deflate
Content-Type: text/html
Accept: */*
User-Agent: Mozilla/5.0 (X11; Linux i586; rv:63.0) Gecko/20100101 Firefox/63.0
Connection: close

Step 2: Extract potentially useful parameters

I am bad at regular expressions, but they work (more or less)... you can take a look at inc/ - using those regular expressions, this tool will extract the following parameters: b2, b3, b4, c1, c2, c3, c4, c5 and c6
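
A minimal sketch of this kind of extraction (assumed patterns, not the exact expressions from inc/):

import re

html = '<input type="text" name="b2"><input name="b3" id="c1">'

# Collect candidate parameter names from name="..." and id="..." attributes.
names = set(re.findall(r'name=["\']([\w-]+)["\']', html))
names |= set(re.findall(r'id=["\']([\w-]+)["\']', html))

print(sorted(names))  # ['b2', 'b3', 'c1']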

Step 3: Request the URL using GET/POST with those parameters

Now the tool just takes every parameter, appends a random string, and requests the URL again. When a lot of parameters were extracted, the parameter list is split into equally sized chunks - it's not good to send a GET request with 300 parameters plus values. Usually, though, you will have two requests per URL (POST and GET). They look like this (a sketch of the chunking logic follows the two examples):

The GET request:

GET /xss.php?0=393de39&1=e4390e4&12=7459b74&6=f9eb2f9&7=c3871c3&Find=46c5146&POST=dbfb5db&b1=cc50acc&b2=697b869&viewport=92bb392 HTTP/1.1
Accept-Encoding: gzip, deflate
Content-Type: text/html
Accept: */*
User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2227.0 Safari/537.36
Connection: close

The POST request:

POST /xss.php HTTP/1.1
Accept-Encoding: gzip, deflate
Content-Length: 326
Content-Type: application/x-www-form-urlencoded
Accept: */*
User-Agent: Mozilla/5.0 (X11; Linux i686; rv:64.0) Gecko/20100101 Firefox/64.0
Connection: close
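
For illustration, here is a minimal Python sketch of the chunking and random-token logic described above (assumed names and chunk size, not this tool's actual implementation):

import secrets
import requests

BURP_PROXY = {"http": "http://127.0.0.1:8080", "https": "http://127.0.0.1:8080"}
CHUNK_SIZE = 10  # parameters per request, e.g. taken from app-settings.conf

def chunked(seq, size):
    # Split the extracted parameter list into equally sized chunks.
    for i in range(0, len(seq), size):
        yield seq[i:i + size]

extracted = ["b2", "b3", "b4", "c1", "c2", "c3", "c4", "c5", "c6"]
url = "http://example.com/xss.php"  # hypothetical target

for chunk in chunked(extracted, CHUNK_SIZE):
    # Give every parameter a random 7-character marker, as in the requests above.
    params = {name: secrets.token_hex(4)[:7] for name in chunk}
    requests.get(url, params=params, proxies=BURP_PROXY, verify=False)
    requests.post(url, data=params, proxies=BURP_PROXY, verify=False)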


As the two requests show, not only the previously mentioned parameters were extracted; some additional ones appear here as well. A better regular expression might solve this "problem", but it works well enough for now.

Step 4: Scan using Burp

By now you have those requests in your sitemap:


You can now just start your scanner on those parameters and wait for something cool to happen ;=)



Do you like this tool? Did it help you get a bounty? Want to give something back / support me? Why not!

Donate via PayPal: CLICK
