Determine if a page may be crawled from robots.txt, robots meta tags and robot headers
Updated May 14, 2024 - PHP
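The first entry above checks whether a page may be crawled, combining robots.txt rules with robots meta tags and X-Robots-Tag headers. The robots.txt half of that check can be sketched with Python's standard-library parser (the rules below are illustrative, not taken from any real site):

```python
from urllib.robotparser import RobotFileParser

# Illustrative robots.txt rules: block everything under /private/
# for all user agents.
rules = [
    "User-agent: *",
    "Disallow: /private/",
]

parser = RobotFileParser()
parser.parse(rules)  # parse() accepts the file's lines directly

# can_fetch(user_agent, url) answers the crawlability question.
print(parser.can_fetch("MyBot", "https://example.com/private/page"))  # False
print(parser.can_fetch("MyBot", "https://example.com/index.html"))    # True
```

In practice you would call `parser.set_url(...)` and `parser.read()` to fetch a live robots.txt instead of supplying the lines inline; the meta-tag and header checks the tool also performs are separate steps.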
A Multisite Robots.txt Manager - Quickly and easily manage all robots.txt files on a WordPress Multisite Website Network.
Simple robots.txt generation module for Silverstripe (SS 4 and above)
An extensible robots.txt parser and client library, with full support for every directive and specification.
Behat extension for testing some On-Page SEO factors: meta title/description, canonical, hreflang, meta robots, robots.txt, redirects, sitemap validation, HTML validation, performance...
A robots.txt generator
Robots for Kirby CMS
User-Agent parser for robots.txt, X-Robots-tag and Robots-meta-tag rule sets
Robots Exclusion Standard/Protocol Parser for Web Crawling/Scraping
This Laravel Nova Tool gives your admins the ability to edit the robots.txt file from within the Nova control panel.
PSR-15 middleware to allow or block search-engine robots
🔧 Robots.txt generator component for Nette framework.
This is Pico's official robots plugin to add a robots.txt and sitemap.xml to your website. Pico is a stupidly simple, blazing fast, flat file CMS.
Declarative, scriptable web robot (crawler) and scraper
TYPO3 sitemap crawler
A ready-to-use template to quickly start selling a domain with minimal setup.
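Several tools in this list parse robots meta tags and X-Robots-Tag response headers rather than robots.txt. A minimal, hypothetical sketch of that directive check follows; the helper names are my own, and real headers can also carry a per-crawler prefix (e.g. `googlebot: noindex`), which this ignores:

```python
# Both the X-Robots-Tag header and the robots <meta> tag carry a
# comma-separated list of directives such as "noindex, nofollow".
def parse_robots_directives(value: str) -> set[str]:
    """Split a comma-separated directive list into normalized tokens."""
    return {token.strip().lower() for token in value.split(",") if token.strip()}

def may_index(value: str) -> bool:
    """True unless an indexing-blocking directive is present."""
    return not ({"noindex", "none"} & parse_robots_directives(value))

print(may_index("noindex, nofollow"))  # False
print(may_index("max-snippet:50"))     # True
```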