pyrobots

a tool that reads a site's "robots.txt" file and appends each path it finds to the domain/subdomain you enter.
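A minimal sketch of the idea (not the actual pyrobots source): fetch robots.txt from the given domain, pull out every Allow/Disallow path, and join each one onto the domain so it becomes a full URL you can check.

```python
# sketch only: fetch robots.txt, extract each Allow/Disallow path,
# and append it to the domain to build full URLs.
import sys
from urllib.parse import urljoin
from urllib.request import urlopen

def robots_paths(domain: str):
    """Yield every path listed in the domain's robots.txt, joined to the domain."""
    base = domain if domain.startswith("http") else "https://" + domain
    with urlopen(urljoin(base, "/robots.txt")) as resp:
        for raw in resp.read().decode("utf-8", errors="replace").splitlines():
            line = raw.split("#", 1)[0].strip()            # drop comments
            if line.lower().startswith(("allow:", "disallow:")):
                path = line.split(":", 1)[1].strip()
                if path:                                   # skip empty rules
                    yield urljoin(base, path)

if __name__ == "__main__":
    # usage: python sketch.py example.com
    for url in robots_paths(sys.argv[1]):
        print(url)
```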

what is robots.txt?

robots.txt is a text file that tells web robots what to crawl and what not to crawl.

robots.txt files can give attackers valuable information about potential targets by giving them clues about directories their owners are trying to protect. robots.txt files tell search engines which directories on a web server they can and cannot read. they offer clues about where system administrators store sensitive assets, because the mention of a directory in a robots.txt file screams out that the owner has something they want to hide. in the simplest cases, robots.txt will reveal restricted paths and the technology used by your servers.

source: https://www.theregister.co.uk/2015/05/19/robotstxt/

Demo:

(demo GIFs: without browser, with browser)
