
kerouac-robotstxt

Kerouac middleware that gives instructions to web crawlers using the Robots Exclusion Protocol.

Install

$ npm install kerouac-robotstxt

Usage

Declare a robots.txt route using this middleware.

var robots = require('kerouac-robotstxt');

site.page('/robots.txt', robots());

Then map the robots.txt file when generating the site.

site.generate([
  robots.createMapper()
]);
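
Putting the two steps together, a minimal site sketch might look like the following. This assumes a site created with the kerouac package; aside from that require, it only combines the snippets above.

var kerouac = require('kerouac');
var robots = require('kerouac-robotstxt');

var site = kerouac();

// Declare the robots.txt route using this middleware.
site.page('/robots.txt', robots());

// Map the robots.txt file when generating the site.
site.generate([
  robots.createMapper()
]);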

The generated output will include a /robots.txt file. If your site contains any sitemaps, which can be generated using kerouac-sitemap, the locations of those sitemaps will be included so that search engines can automatically discover all pages of your site.
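
For reference, a generated robots.txt for a site with a single sitemap might look roughly like this; the sitemap URL below is a placeholder, and the actual entries depend on your site's URL and sitemaps.

User-Agent: *
Sitemap: http://www.example.com/sitemap.xml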

License

The MIT License

Copyright (c) 2012-2022 Jared Hanson <https://www.jaredhanson.me/>
