a tool that gets all paths from robots.txt and opens them in the browser.

momenbasel/pyrobots

pyrobots

a tool that reads the "robots.txt" file and appends each path to the domain/subdomain you entered.

what is robots.txt?

robots.txt is a text file that tells web robots what to crawl and what not to crawl.

robots.txt files can give attackers valuable information on potential targets by giving them clues about directories their owners are trying to protect. robots.txt files tell search engines which directories on a web server they can and cannot read. they offer clues about where system administrators store sensitive assets, because the mention of a directory in a robots.txt file screams out that the owner has something they want to hide. In the simplest cases, robots.txt will reveal restricted paths and the technology used by your servers. (source: https://www.theregister.co.uk/2015/05/19/robotstxt/)
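the core idea described above (fetch robots.txt, pull out each Allow/Disallow path, and join it to the target domain) can be sketched in a few lines of Python. this is a minimal illustration of the technique, not the tool's actual implementation; the function names here are made up for the example:

```python
import urllib.request
from urllib.parse import urljoin

def parse_robots_paths(robots_txt):
    """Extract the path from every Allow/Disallow line in a robots.txt body."""
    paths = []
    for line in robots_txt.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments
        key, _, value = line.partition(":")
        path = value.strip()
        if key.strip().lower() in ("allow", "disallow") and path:
            paths.append(path)
    return paths

def robots_urls(base_url):
    """Fetch robots.txt from a domain and join each listed path onto it."""
    with urllib.request.urlopen(urljoin(base_url, "/robots.txt")) as resp:
        body = resp.read().decode("utf-8", errors="replace")
    return [urljoin(base_url, p) for p in parse_robots_paths(body)]
```

to mimic the tool's browser mode, each URL returned by `robots_urls` could then be passed to `webbrowser.open` from the standard library.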

Demo:

(demo screenshots: without browser / with browser)
