# Contributing

When contributing to this repository, please first discuss the change you wish to make with the owners of this repository, via an issue, email, or any other method, before making the change.


## Unwanted Bots

This repository contains two robots.txt file templates that help webmasters keep unwanted web robots (e.g. scraper bots, people search engines, SEO tools, marketing tools) away from their websites while still allowing legitimate robots (e.g. search engine crawlers).

## Legitimate Bots

To be considered legitimate and get listed, robots must fully obey the Robots Exclusion Standard. The robots.txt file templates contain a whitelist: unlisted robots (User-agents) are, by the conventions of the Robots Exclusion Standard, not allowed to access the site.
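
A minimal sketch of how such a whitelist-style robots.txt works (the user-agent shown is only an illustration, not an entry taken from the templates):

```
# Example of a whitelisted, legitimate crawler: an empty Disallow allows everything.
User-agent: Googlebot
Disallow:

# Default rule: any robot not explicitly listed above is denied access to the whole site.
User-agent: *
Disallow: /
```

A compliant crawler applies the most specific matching group, so whitelisted user-agents crawl freely while everything else falls through to the blanket `Disallow: /` rule.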

Only legitimate bots will be added to the templates. This makes it easier to identify unwanted bots, which can then be blocked by IP, DNS, or User-agent, or based on their behavior with regard to the robots.txt file.