NodeJS robots.txt parser with support for wildcard (*) matching.
Gatsby plugin that automatically creates robots.txt for your site
robots.txt parser for Node.js
robots.txt generator for Node.js
A webpack plugin to generate a robots.txt file
🧑🏻👩🏻 "We are people, not machines": a Nuxt module to statically integrate and generate a humans.txt author file crediting the people who built a website. Based on the HumansTxt Project.
A lightweight robots.txt parser for Node.js with support for wildcards, caching and promises.
Higher order Next.js config to generate sitemap.xml and robots.txt
⚠ Experimental ⚠ An Eleventy plugin to generate a robots.txt file for your static site
Generates sitemap.xml and robots.txt for Next.js using a webhook from Strapi
Generates a robots.txt
🤖 Robots.txt generator done right.
🤖 Handle and parse a site's robots.txt file and extract actionable information
🤖 Browser extension to check for and preview a site's robots.txt in a new tab (if it exists)
A robots.txt script for Lambda Edge
Node.js web crawler
Chrome extension that blocks URLs based on robots.txt (compatible with Chrome 41)
🌐 Chrome extension that displays the contents of a website's robots.txt and sitemap.xml files
Front-end workflow to start a new project with Eleventy and Webpack.
An Express.js middleware for handling noisy robots.txt requests
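Several of the parsers above advertise wildcard (*) matching. As a minimal sketch of how that matching is commonly done (the function names `ruleToRegExp` and `isDisallowed` are illustrative, not from any library listed here), a robots.txt rule can be translated into an anchored regular expression, with `*` matching any character sequence and a trailing `$` marking end-of-URL:

```javascript
// Illustrative sketch: match a URL path against robots.txt Disallow rules
// that may contain wildcards (*) and an end anchor ($).

function ruleToRegExp(pattern) {
  // A trailing "$" in robots.txt means "end of URL".
  const anchored = pattern.endsWith("$");
  const body = (anchored ? pattern.slice(0, -1) : pattern)
    // Escape regex metacharacters, except "*" which we translate below.
    .replace(/[.+?^${}()|[\]\\]/g, "\\$&")
    // robots.txt "*" matches any sequence of characters.
    .replace(/\*/g, ".*");
  // Rules always match from the start of the path.
  return new RegExp("^" + body + (anchored ? "$" : ""));
}

function isDisallowed(path, disallowRules) {
  return disallowRules.some((rule) => ruleToRegExp(rule).test(path));
}

// Example rule: Disallow: /private/*.html$
console.log(isDisallowed("/private/a.html", ["/private/*.html$"]));  // true
console.log(isDisallowed("/private/a.htmlx", ["/private/*.html$"])); // false
console.log(isDisallowed("/public/a.html", ["/private/*.html$"]));   // false
```

Real parsers (and RFC 9309) additionally apply longest-match precedence between Allow and Disallow rules; this sketch only shows the wildcard translation step.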