robotize

Generates a robots.txt

This module generates a robots.txt file that conforms to the standards set by Google. Use it to programmatically generate a robots.txt for your site.

Installation

$ npm install robotize

Example

const robotize = require("robotize");
const opts = {
  useragent: "googlebot",
  allow: ["index.html", "about.html"],
  disallow: ["404.html"],
  sitemap: "https://www.site.com/sitemap.xml"
};

robotize(opts, (err, robots) => {
  if (err) {
    throw err;
  } else {
    console.log(robots);
  }
});

Will log:

User-agent: googlebot
Allow: index.html
Allow: about.html
Disallow: 404.html
Sitemap: https://www.site.com/sitemap.xml
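
Since robotize hands the generated rules back as a string, writing them to disk is up to the caller. A minimal sketch using Node's fs module (the output path robots.txt is just an example):

const fs = require("fs");
const robotize = require("robotize");

const opts = {
  useragent: "googlebot",
  allow: ["index.html", "about.html"],
  disallow: ["404.html"],
  sitemap: "https://www.site.com/sitemap.xml"
};

robotize(opts, (err, robots) => {
  if (err) {
    throw err;
  }
  // Write the generated rules to a robots.txt file in the current directory.
  fs.writeFile("robots.txt", robots, (writeErr) => {
    if (writeErr) {
      throw writeErr;
    }
    console.log("robots.txt written");
  });
});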

Options

Robotize accepts an object with options. The options are:

  • useragent: the user agent the rules apply to - String, default: *
  • allow: an array of URLs to allow - Array of Strings
  • disallow: an array of URLs to disallow - Array of Strings
  • sitemap: the sitemap URL - String

Robotize requires at least one of the last three options, so at least one of allow, disallow, or sitemap must be passed.
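
For example, a call with only sitemap set satisfies this requirement, and useragent falls back to its default. A minimal sketch, assuming the same callback API as the example above:

const robotize = require("robotize");

// Only `sitemap` is set, which satisfies the requirement that at least one
// of allow, disallow or sitemap is passed; `useragent` defaults to "*".
robotize({ sitemap: "https://www.site.com/sitemap.xml" }, (err, robots) => {
  if (err) {
    throw err;
  }
  console.log(robots);
});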

Credits

Forked from robots-generator.

License

MIT