Another bot problem #7514
This issue looks more like a support question than an issue. We strive to answer these reasonably fast, but purchasing the support subscription is not only more responsible and faster for your business but also makes Weblate stronger. In case your question is already answered, making a donation is the right way to say thank you!
Referenced: weblate/weblate/templates/robots.txt, line 13 in 28bf9ca
This is not a malicious bot and it should respect your robots.txt. How can I edit the robots.txt file in Docker myself?
Create custom
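As a sketch of what such a custom file could contain: the following robots.txt disallows everything for every crawler, which matches the "no indexing needed" goal stated in this issue. Note that this writes to the current directory only for illustration; where the file actually has to live so the container serves it depends on your Weblate Docker setup and is not specified here.

```shell
# Create a minimal robots.txt that blocks all well-behaved crawlers.
# The destination path is a placeholder; consult your deployment's docs
# for where a custom robots.txt must be placed to be served.
cat > robots.txt <<'EOF'
User-agent: *
Disallow: /
EOF

# Show the result.
cat robots.txt
```

Keep in mind this only affects bots that honor robots.txt; misbehaving bots, as discussed below, need blocking at the firewall or proxy level.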
I temporarily blocked all bots.
It would be nice to see the real IP so I could block it on the firewall.
It is odd that robots obey the "block User-agent" entries but not the "Disallow: /changes/" rule. I also noticed that robots read /static/ and /matrix/ as well; should it be like that?
That should be addressed since WeblateOrg/docker#1306.
Static files are okay; /matrix/ has been disallowed since b255385.
For me, this topic is resolved. I created my own robots.txt file and blocked all user agents (no search-engine indexing). Once real IPs appear in the future, I will restore the default robots.txt; since the logs will then show real IPs, I can check whether an IP belongs to a legitimate bot or a malicious one. At the moment, some bots ignore robots.txt and keep reading disallowed folders. After blocking them entirely, the user agent stops visiting those folders.
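The log check described above could be sketched as follows. This assumes an nginx-style access log where the user agent is the last quoted field; the sample log is fabricated for illustration, and /changes/ and /matrix/ are the disallowed paths mentioned in this thread.

```shell
# Fabricated sample of an nginx-style access log (for illustration only).
cat > access.log <<'EOF'
203.0.113.10 - - [01/Jan/2024:00:00:00 +0000] "GET /changes/ HTTP/1.1" 200 512 "-" "BadBot/1.0"
203.0.113.10 - - [01/Jan/2024:00:00:01 +0000] "GET /matrix/ HTTP/1.1" 200 512 "-" "BadBot/1.0"
198.51.100.7 - - [01/Jan/2024:00:00:02 +0000] "GET /static/app.css HTTP/1.1" 200 512 "-" "Mozilla/5.0"
EOF

# Count requests to disallowed paths per user agent: filter hits on
# /changes/ or /matrix/, extract the user-agent field (6th quoted field),
# then tally. Any agent listed here is ignoring robots.txt.
grep -E '"GET /(changes|matrix)/' access.log \
  | awk -F'"' '{print $6}' \
  | sort | uniq -c
```

Agents that show up in this tally despite the Disallow rules are candidates for an IP-level block.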
The issue you have reported is now resolved. If you don’t feel it’s right, please follow its labels to get a clue for further steps.
All bots that do not respect the robots.txt file are malicious. Now that you can see the IP, they can easily be tracked and blocked.
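As one hedged example of such a block: if a reverse proxy such as nginx sits in front of Weblate, a misbehaving address can be dropped there. The IP below is a documentation-range placeholder, and the upstream address is an assumption about your setup, not taken from this thread.

```nginx
# Hypothetical nginx fragment: deny a single abusive client IP.
# 203.0.113.10 is a placeholder from the documentation address range.
location / {
    deny 203.0.113.10;
    allow all;
    proxy_pass http://weblate:8080;  # assumes Weblate listens here
}
```

A host firewall (iptables, nftables, or a cloud security group) is an equivalent place for the same rule if no proxy is in use.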
Describe the issue
Bot queries are consuming two CPU cores (200% CPU). I would like to completely block robots from accessing my projects (no indexing is needed). How can I edit the robots.txt file?
I already tried
Steps to reproduce the behavior
No response
Expected behavior
No response
Screenshots
No response
Exception traceback
No response
Additional context
Docker