Unless your website is written in Russian or Chinese, you probably don't get any traffic from those search engines. Their crawlers mostly just waste bandwidth and consume resources.
I'm surprised they decided to block Yandex and Baidu. These two search engines have quite a significant market share. Also, yandex.com is in English!
Do you think we could add a bad user agent / bad IP list to this, to make sure we're properly covering all the bases?
@harvest316 I don't think that would make a good default. IMHO, the decision should be left to the developers, as they can better decide what to block based on the requests their server is actually getting and/or their own preferences.
Also, the UA strings / IPs of malicious bots change quite often, so unless the list is updated consistently (perhaps through some automated process, or by the developers themselves), it becomes useless very quickly.
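To make the automated-update idea concrete, here is a minimal sketch (not part of this project) of a job that could take a fetched list of bad user-agent names, dedupe them, and regenerate an Apache mod_rewrite fragment. The function name and the UA names are illustrative, not a vetted blocklist:

```python
import re


def build_blocklist(agents):
    """Dedupe and escape UA names, then emit a mod_rewrite blocklist fragment.

    `agents` is an iterable of raw user-agent substrings, e.g. lines pulled
    from an upstream bad-bot list. Matching is case-insensitive ([NC]),
    and matched requests get a 403 Forbidden ([F]).
    """
    seen = []
    for ua in agents:
        ua = ua.strip()
        if ua and ua not in seen:
            seen.append(ua)
    pattern = "|".join(re.escape(ua) for ua in seen)
    return (
        "RewriteEngine On\n"
        f"RewriteCond %{{HTTP_USER_AGENT}} ({pattern}) [NC]\n"
        "RewriteRule .* - [F,L]\n"
    )


# Illustrative input; a real job would fetch the list from an upstream repo.
print(build_blocklist(["BadBot", "EvilScraper", "BadBot"]))
```

Running something like this on a schedule (cron, CI) would keep the generated fragment in sync with the upstream list, which addresses the staleness problem above.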
Hi guys.
Do you think we could add a bad user agent / bad IP list to this, to make sure we're properly covering all the bases?
Perhaps we can pull from one of these:
https://github.com/bluedragonz/bad-bot-blocker/
http://perishablepress.com/2013-user-agent-blacklist/
http://perishablepress.com/6g-beta/
Paul