A Python-based scanner to find potential SSRF parameters in a web application.
SSRF is one of the critical vulnerabilities on the web, yet I saw no tool that automated finding potentially vulnerable parameters. See-SURF can be added to your arsenal for recon while bug hunting/web security testing.
Matches any GET URL parameter containing the keyword web/url (more to be added).
Checks the parameter values for any URL or IP address passed.
Matches any POST request input param with a "name" attribute containing the keyword web/url (more to be added).
Also matches "value" and "placeholder" attributes containing a URL pattern, for example:
<input type="text" name="url" value="https://google.com" placeholder="https://msn.com">
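The matching logic described above can be sketched roughly as follows. This is a simplified illustration, not the tool's actual code; the keyword list and the regex are assumptions:

```python
import re

# Keywords a parameter name is checked against (illustrative subset)
PARAM_KEYWORDS = ("url", "web", "site", "uri")

# Loose pattern for a URL or bare IP address appearing in a parameter value
URL_OR_IP = re.compile(r"(https?://[^\s\"'<>]+|(?:\d{1,3}\.){3}\d{1,3})")

def is_potential_ssrf(name, value):
    """Flag a param whose name contains a keyword or whose value looks like a URL/IP."""
    name_hit = any(k in name.lower() for k in PARAM_KEYWORDS)
    value_hit = bool(URL_OR_IP.search(value or ""))
    return name_hit or value_hit

print(is_potential_ssrf("url", "https://google.com"))  # True
print(is_potential_ssrf("q", "hello world"))           # False
```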
Applies multiple conditions to cut down false positives, as crawling pulls up a lot of content. Only the same domain is crawled for now.
By default, normal mode is on; with the verbose switch you will see the same vulnerable param across different endpoints. The same parameter may not be sanitized everywhere, but verbose mode generates a lot of noise.
Supply cookies for authenticated scanning.
Comments on almost all logic so people who would like to contribute can understand it easily.
Makes an external request with the vulnerable parameter to confirm the possibility of SSRF.
How to use?
[-] This runs with default threads=10, no cookies/session, and no verbose mode
python3 see-surf.py -H https://www.google.com
[-] Space-separated cookies can be supplied for authenticated session crawling
python3 see-surf.py -H https://www.google.com -c cookie_name1=value1 cookie_name2=value2
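Internally, space-separated name=value pairs like these could be turned into a cookie dict for the HTTP client. This is a sketch of the idea; the actual argument handling in see-surf.py may differ:

```python
def parse_cookies(pairs):
    """Turn ["name1=value1", "name2=value2"] into a dict usable as request cookies."""
    cookies = {}
    for pair in pairs:
        name, _, value = pair.partition("=")
        if name:
            cookies[name.strip()] = value.strip()
    return cookies

# The resulting dict could then be passed to the crawler's HTTP calls,
# e.g. requests.get(url, cookies=cookies)
print(parse_cookies(["cookie_name1=value1", "cookie_name2=value2"]))
```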
Recently added feature
[-] Fire up Burp Suite Collaborator and pass the host with the -p parameter, or start a simple Python HTTP server and wait for the
vulnerable param to execute your request. (Highly recommended)
(This mainly helps in exploiting GET requests; for POST requests you would need to exploit them manually.)
The payload gets executed with the param name appended at the end of the string, so it is easy to identify which one is vulnerable. For example: http://220.127.116.11:8000/vulnerableparam
python3 see-surf.py -H https://www.google.com -c cookie_name1=value1 cookie_name2=value2 -p http://18.104.22.168:8000
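The callback behaviour described above amounts to appending each candidate parameter name to the listener URL, so a hit in your server or Collaborator logs tells you exactly which parameter fired. A minimal sketch (the helper name is made up for illustration):

```python
def build_callback(listener_base, param_name):
    """Append the param name to the callback host so a log entry identifies the culprit."""
    return listener_base.rstrip("/") + "/" + param_name

# The scanner would send this as the parameter's value, e.g.
# requests.get(target, params={param_name: payload})
payload = build_callback("http://18.104.22.168:8000", "vulnerableparam")
print(payload)  # http://18.104.22.168:8000/vulnerableparam
```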
[-] Supplying the number of threads and verbose mode (verbose mode is not recommended if you don't want to spend extra time, but it increases the possibility of finding bugs)
python3 see-surf.py -H https://www.google.com -c cookie_name1=value1 cookie_name2=value2 -t 20 -v
Version-2 (recommended)
Provide Burp sitemap files for better discovery of potential SSRF parameters. The script first parses the Burp file to identify potential params, then runs the built-in crawler on them.
Browse the target with Burp Suite running in the background and make some GET/POST requests; the more the better. Then go to Target, right click -> "Save selected items" and save the file. Provide it to the script as follows:
python3 see-surf.py -H https://www.google.com -c cookie_name1=value1 cookie_name2=value2 -b burp_file.xml -p http://22.214.171.124:8000
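A Burp "Save selected items" export is XML with one <item> element per request, each containing a <url> child (request/response bodies are base64-encoded). Extracting the URLs could look roughly like this, using the standard library rather than whatever see-surf.py actually does:

```python
import xml.etree.ElementTree as ET

def urls_from_burp(xml_text):
    """Extract every <url> from a Burp 'Save selected items' XML export."""
    root = ET.fromstring(xml_text)
    return [item.findtext("url") for item in root.iter("item")]

# Tiny hand-made sample in the same shape as a Burp export
sample = """<items>
  <item><url>https://www.google.com/search?url=https://msn.com</url></item>
</items>"""
print(urls_from_burp(sample))
```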
git clone https://github.com/In3tinct/See-SURF.git
pip3 install BeautifulSoup4
pip3 install requests
A basic framework has been created. More tests will be added to reduce false positives.
- Report bugs.
- Suggestions for improvement.
- Suggestions for future extensions.
Template - https://gist.github.com/akashnimare/7b065c12d9750578de8e705fb4771d2f
Some regexes from https://www.regextester.com/97040
Stack Overflow and the entire internet.
- Finding potential params during redirection.
- More conditions to avoid false positives.
GNUV3 © [In3tinct]
Twitter - https://twitter.com/_In3tinct