robotsjs

简体中文 (Simplified Chinese)

Robots.js is a tool that generates a `robots.txt` file according to your rules (features are limited so far). Adapted from FastGitORG/SpiderFucker and Kinetix-Lee/spiderfucker-python.
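The core idea can be sketched in a few lines of plain JavaScript. This is an illustration of the concept only, not robotsjs's actual code, and the rule shape `{ userAgent, disallow }` is an assumed schema for demonstration:

```javascript
// Illustrative sketch only -- not robotsjs's real implementation.
// Assumed rule shape: { userAgent: string, disallow: string[] }
function buildRobotsTxt(rules) {
  return (
    rules
      .map(({ userAgent, disallow }) =>
        [
          `User-agent: ${userAgent}`,
          ...disallow.map((path) => `Disallow: ${path}`),
        ].join('\n')
      )
      .join('\n\n') + '\n'
  );
}

// Example: block one crawler entirely, keep /private/ off-limits for the rest.
console.log(
  buildRobotsTxt([
    { userAgent: 'BadBot', disallow: ['/'] },
    { userAgent: '*', disallow: ['/private/'] },
  ])
);
// User-agent: BadBot
// Disallow: /
//
// User-agent: *
// Disallow: /private/
```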

Usage

Default (Recommended)

The `-i` and `-o` options have defaults, so you can simply run

```shell
npm run create
```

which loads `config/rules.json` and writes `config/robots.txt`.
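The schema of `config/rules.json` is defined by the project itself; the generated output, however, is a standard `robots.txt`. For reference, a file that blocks one crawler entirely while restricting a single path for everyone else looks like this:

```
User-agent: BadBot
Disallow: /

User-agent: *
Disallow: /private/
```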

With Options

Options:

- `-i, --in [file]`: specify the input file (default: `config/rules.json`)
- `-o, --out [file]`: specify the output file (default: `config/robots.txt`)
- `-h, --help`: display help for the command

```shell
npm run create -- -i [Input] -o [Output]
```

Note the `--` separator, which tells npm to forward the options to the script rather than interpret them itself.

Testing

```shell
npm run test
```

Help

Check out the help text:

```shell
npm run rjs-help
```
