Tame the robots crawling and indexing your Nuxt site.
Updated Jul 15, 2024 - TypeScript
An extension for Visual Studio Code that enables support for robots.txt files. 🤖
A collection of SEO utilities, such as sitemap and robots.txt helpers, for a Remix application. Forked from https://github.com/balavishnuvj/remix-seo
The right robots.txt file for your project
RFC 5234 (ABNF) spec-compliant robots.txt builder and parser. 🦾
Website of Alp Ozkan: Product Manager living in Istanbul, passionate about UX, Software Development and Web3.
Makes it easy to add robots.txt, sitemap and web app manifest during build to your Astro app.
A lightweight and simple robots.txt parser for Node.js
"Page Auditor for Technical SEO" is an open-source Google Chrome extension created by Franco Folini. Once you have added Page Auditor to your browser, it lets you explore and analyze Structured Data, JavaScript scripts, meta tags, and the robots.txt and sitemap.xml files of any webpage. All of these elements are critical to improving on-page SEO.
A lightweight crawler frontier implementation in TypeScript using Redis.
Super lightweight plain TypeScript parser for robots.txt with 0 dependencies.
Robots.txt parser / generator
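Most of the parsers listed above share the same basic model: group rules by `User-agent`, then answer "may this agent fetch this path?" by the longest matching rule, with `Allow` winning ties. A minimal TypeScript sketch of that model (hypothetical code, not taken from any of these libraries; it does plain prefix matching and omits the `*`/`$` wildcards defined in RFC 9309):

```typescript
// One group of rules, shared by one or more User-agent lines.
interface RuleGroup {
  agents: string[];
  allow: string[];
  disallow: string[];
}

function parseRobots(text: string): RuleGroup[] {
  const groups: RuleGroup[] = [];
  let current: RuleGroup | null = null;
  for (const rawLine of text.split(/\r?\n/)) {
    const line = rawLine.split("#")[0].trim(); // strip comments
    const colon = line.indexOf(":");
    if (colon === -1) continue;
    const field = line.slice(0, colon).trim().toLowerCase();
    const value = line.slice(colon + 1).trim();
    if (field === "user-agent") {
      // Consecutive User-agent lines share one group; a User-agent
      // line appearing after rules starts a new group.
      if (!current || current.allow.length || current.disallow.length) {
        current = { agents: [], allow: [], disallow: [] };
        groups.push(current);
      }
      current.agents.push(value.toLowerCase());
    } else if (current && field === "allow") {
      current.allow.push(value);
    } else if (current && field === "disallow") {
      current.disallow.push(value);
    }
  }
  return groups;
}

function isAllowed(groups: RuleGroup[], agent: string, path: string): boolean {
  const ua = agent.toLowerCase();
  // Prefer a group naming this agent specifically; fall back to "*".
  const group =
    groups.find((g) => g.agents.some((a) => a !== "*" && ua.includes(a))) ??
    groups.find((g) => g.agents.includes("*"));
  if (!group) return true; // no applicable group: everything is allowed
  // Longest matching rule wins; Allow beats Disallow on ties because
  // it is scanned first and Disallow must be strictly longer to win.
  let best = { len: -1, allowed: true };
  for (const [rules, allowed] of [
    [group.allow, true],
    [group.disallow, false],
  ] as const) {
    for (const rule of rules) {
      if (rule && path.startsWith(rule) && rule.length > best.len) {
        best = { len: rule.length, allowed };
      }
    }
  }
  return best.allowed;
}
```

For example, with `User-agent: *`, `Disallow: /admin`, `Allow: /admin/public`, the path `/admin/secret` is blocked while `/admin/public/page` is allowed, since the longer `Allow` rule takes precedence. The libraries above add the pieces this sketch leaves out: wildcard patterns, `Sitemap` and `Crawl-delay` directives, and percent-encoding of paths.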