NodeJS robots.txt parser with support for wildcard (*) matching.
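Wildcard support in parsers like the ones listed here generally means treating `*` in a Disallow/Allow rule as "any run of characters" and a trailing `$` as an end-of-path anchor, with plain rules acting as prefix matches. A minimal sketch of that matching logic (hypothetical helper names, not the API of any listed package):

```javascript
// Sketch of robots.txt wildcard rule matching (hypothetical helpers,
// not the API of any parser listed above).
// '*' matches any sequence of characters; a trailing '$' anchors the
// rule to the end of the URL path; otherwise rules are prefix matches.
function ruleToRegExp(rule) {
  // Escape RegExp metacharacters except '*', which we translate below.
  let pattern = rule.replace(/[.+?^${}()|[\]\\]/g, "\\$&");
  // A '$' at the end of the rule anchors it to the end of the path.
  const anchored = pattern.endsWith("\\$");
  if (anchored) pattern = pattern.slice(0, -2);
  // '*' matches any run of characters.
  pattern = pattern.replace(/\*/g, ".*");
  return new RegExp("^" + pattern + (anchored ? "$" : ""));
}

function isDisallowed(path, disallowRules) {
  return disallowRules.some((rule) => ruleToRegExp(rule).test(path));
}
```

For example, `isDisallowed("/private/data.json", ["/private/*"])` matches because `*` expands to any suffix, while a rule like `/*.json$` only matches paths that actually end in `.json`.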
⚠ Experimental ⚠ An Eleventy plugin to generate a robots.txt file for your static site
Gatsby plugin that automatically creates robots.txt for your site
🌐 A Google Chrome extension that displays the contents of a website's robots.txt and sitemap.xml files
A lightweight robots.txt parser for Node.js with support for wildcards, caching and promises.
A webpack plugin to generate a robots.txt file
robots.txt generator for Node.js
An express.js middleware for handling noisy robots.txt
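An Express middleware for robots.txt typically intercepts requests to `/robots.txt` and serves a plain-text body, passing everything else through. A minimal sketch of that pattern (hypothetical implementation, not the actual API of the listed package):

```javascript
// Sketch of an Express-style robots.txt middleware (hypothetical,
// not the actual API of the listed package). It follows the standard
// (req, res, next) connect/express middleware signature.
function robotsTxt(body) {
  return function (req, res, next) {
    // Only handle the robots.txt path; hand everything else on.
    if (req.path !== "/robots.txt") return next();
    // Serve the rules as plain text.
    res.type("text/plain").send(body);
  };
}

// With Express this would be mounted as:
//   app.use(robotsTxt("User-agent: *\nDisallow: /admin/\n"));
```

Because it is just a `(req, res, next)` function, the same sketch works with any Connect-compatible framework.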
Sharp SEO Tools is a collection of 19 free web tools written entirely in JavaScript; feel free to use them
Fully native robots.txt parsing component without any dependencies.
Higher order Next.js config to generate sitemap.xml and robots.txt
🧑🏻👩🏻 "We are people, not machines" - An initiative to get to know the creators of a website. A Nuxt module to statically integrate and generate a humans.txt author file, containing information about the humans behind the site. Based on the HumansTxt Project.
robots.txt parser for Node.js
Generates sitemap.xml and robots.txt for Next.js using a webhook from Strapi
Front-end workflow to start a new project with Eleventy and Webpack.
Robots.js is a tool that generates robots.txt according to your rules. Adapted from FastGitORG/SpiderFucker and Kinetix-Lee/spiderfucker-python.
TypeScript robots.txt parser with support for wildcard (*) matching.
Generates a robots.txt
Chrome extension that blocks URLs based on robots.txt (compatible with Chrome 41)