What is the current behavior?
Only the `allowedDomains` option is supported.
If the current behavior is a bug, please provide the steps to reproduce
What is the expected behavior?
Support a `deniedDomains` option.
What is the motivation / use case for changing the behavior?
I noticed that Amazon accepts bots and crawlers in its robots.txt. However, its conditions of use explicitly say that such bots are not allowed. I believe it would be useful to provide a feature to politely avoid crawling sites like these.
Please tell us about your environment: