
Support deniedDomains like allowedDomains #107

Closed
yujiosaka opened this issue Feb 21, 2018 · 0 comments
@yujiosaka
Owner

What is the current behavior?

Only the allowedDomains option is supported; there is no way to exclude specific domains from a crawl.

If the current behavior is a bug, please provide the steps to reproduce

What is the expected behavior?

Support a deniedDomains option that excludes matching domains, mirroring how allowedDomains includes them.

What is the motivation / use case for changing the behavior?

I noticed that Amazon permits bots and crawlers in its robots.txt.
However, its conditions of use explicitly prohibit such bots.

I believe it would be useful to provide a feature that lets users politely avoid crawling such sites.

Please tell us about your environment:

  • Version:
  • Platform / OS version:
  • Node.js version: