A tool for debugging robots.txt (JavaScript, updated Mar 23, 2018)
Fully native robots.txt parsing component without any dependencies.
Sharp SEO Tools is a collection of free web tools written entirely in JavaScript (19 tools available); feel free to use it.
A Webpack 3 plugin for generating a robots.txt file
Robots.js is a tool that generates robots.txt according to your rules. Adapted from FastGitORG/SpiderFucker & Kinetix-Lee/spiderfucker-python.
Typescript robots.txt parser with support for wildcard (*) matching.
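Several of these parsers advertise wildcard (*) matching in robots.txt path rules. As a rough illustration only (an assumption about the common convention of `*` matching any character run and a trailing `$` anchoring the end of the URL, not the actual API of any library listed here), a matcher might look like:

```javascript
// Hypothetical sketch: match a robots.txt path pattern against a URL path.
// Assumes the widely used convention: '*' matches any sequence of characters,
// a trailing '$' anchors the pattern to the end of the URL, and all other
// patterns are prefix matches.
function ruleMatches(pattern, path) {
  const anchored = pattern.endsWith('$');
  const body = anchored ? pattern.slice(0, -1) : pattern;
  // Escape regex metacharacters, deliberately leaving '*' for the next step.
  const escaped = body
    .replace(/[.+?^${}()|[\]\\]/g, '\\$&')
    .replace(/\*/g, '.*');
  const re = new RegExp('^' + escaped + (anchored ? '$' : ''));
  return re.test(path);
}
```

For example, `ruleMatches('/*.php$', '/index.php')` matches, while the same pattern rejects `/index.php?x=1` because the `$` anchor requires the URL to end in `.php`.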
A simple script to open all the pages in a website's robots.txt files
An Express middleware for generating robots.txt
🌐 Google Chrome extension that displays the contents of a website's robots.txt and sitemap.xml files
Front-end workflow to start a new project with Eleventy and Webpack.
An Express.js middleware for handling noisy requests for robots.txt
Chrome extension that blocks URLs based on robots.txt (compatible with Chrome 41)
🤖 Robots.txt generator done right.
🤖 Browser extension to check for and preview a site's robots.txt in a new tab (if it exists)
🤖 Handle and parse a site's robots.txt file and extract actionable information
A robots.txt script for AWS Lambda@Edge
Node.js web crawler
Generates a robots.txt
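Several entries above generate robots.txt from a rules description. As a minimal sketch of the idea (a hypothetical helper, not the API of any generator listed here), one might build the file from an object mapping user agents to disallowed path prefixes:

```javascript
// Hypothetical generator: builds robots.txt text from a rules object mapping
// user-agent names to arrays of disallowed path prefixes, plus optional
// sitemap URLs appended as Sitemap: lines.
function generateRobotsTxt(rules, sitemaps = []) {
  const groups = Object.entries(rules).map(([agent, paths]) =>
    [`User-agent: ${agent}`, ...paths.map((p) => `Disallow: ${p}`)].join('\n')
  );
  const body = groups.join('\n\n');
  const sitemapLines = sitemaps.map((u) => `Sitemap: ${u}`).join('\n');
  return sitemapLines ? `${body}\n\n${sitemapLines}\n` : `${body}\n`;
}

// Usage: disallow /admin for all crawlers and advertise a sitemap.
const txt = generateRobotsTxt(
  { '*': ['/admin'] },
  ['https://example.com/sitemap.xml']
);
```

A middleware or Webpack plugin wrapping this would simply emit the resulting string as `robots.txt` at the site root with the `text/plain` content type.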
Generates a sitemap and robots.txt for Next.js using a webhook from Strapi