
Low performance if many Virtual Hosts #64 (Next) #130

Closed

ZerooCool opened this issue Jun 28, 2019 · 5 comments

ZerooCool commented Jun 28, 2019

I have read this issue: #64

My configuration:
Apache 2.4
CMS Joomla

I added the ultimate bad bot blocker to my VirtualHost:
https://wiki.visionduweb.fr/index.php?title=Configurer_le_fichier_.htaccess#Une_liste_de_plus_de_7000_bots_bloqu.C3.A9s_avec_.htaccess

My VirtualHost (for visionduweb.fr):
https://wiki.visionduweb.fr/index.php?title=VirtualHosts_des_domaines_enregistr%C3%A9s#visionduweb.fr_.C3.A9coute_du_port_SSL_443

At the bottom of the VirtualHost for visionduweb.fr, I added:

# Include the blacklist:
<Location "/">
AuthMerging And
Include /etc/apache2/custom.d/globalblacklist.conf
</Location>

</VirtualHost>

In /etc/apache2/custom.d/globalblacklist.conf I added everything from https://github.com/mitchellkrogza/apache-ultimate-bad-bot-blocker/blob/master/_htaccess_versions/htaccess-mod_rewrite.txt and removed the code for Apache 2.2.

I restarted apache2.

I ran a test to check whether the bad bots are blocked:
curl -A "IRLbot" -I https://www.visionduweb.fr/robots.txt

HTTP/1.1 302 Found
...
Location: https://www.visionduweb.fr/403-forbidden.php

My website works with the domain name visionduweb.fr.
But if I try to use visionduweb.fr/forum ..... visionduweb.fr/blog ...... visionduweb.fr/annuaire .... they don't work now.

->

Not Found

The requested URL /blog was not found on this server.

If I comment out this code my website works fine, but then I don't have the ultimate bad bot blocker.

<Location "/">
AuthMerging And
Include /etc/apache2/custom.d/globalblacklist.conf
</Location>
@ZerooCool (Author)

I integrated the list directly into the VirtualHost, without going through a secondary file:
On the other hand, it is not at all practical for maintenance, and it really adds a lot of lines to the configuration.

I really wish I could include an external file.

<IfModule mod_rewrite.c>
RewriteEngine on
...

# Block the bad bots
RewriteCond %{HTTP_USER_AGENT} \b360Spider\b [NC,OR]
RewriteCond %{HTTP_USER_AGENT} \b404checker\b [NC,OR]
...
RewriteCond %{HTTP_USER_AGENT} \b404enemy\b [NC,OR]
RewriteCond %{HTTP_REFERER} ^http(s)?://(www.)?.*zzlgxh\.com.*$ [NC]
RewriteRule ^(.*)$ - [F,L]
</IfModule>

@mitchellkrogza (Owner)

Why are you modifying globalblacklist.conf? Have you read the comments in that file saying not to edit anything? Adding rewrite rules into the globalblacklist WILL break things. Please follow all instructions. Just use the blocker; don't try to modify it.

@mitchellkrogza (Owner)

Maybe @ZerooCool we have a language barrier because I'm not understanding what it is you are trying to do or change 🤔

@mitchellkrogza (Owner)

Why are you not using

	<Directory "/var/www/html">
	Options +Includes
	Options +FollowSymLinks -Indexes
	Include custom.d/globalblacklist.conf
	</Directory>

You are using the mod_rewrite method, which will always have performance issues. That's not a recommended method; it's meant for people who do not have access to the backend of Apache. Is that your case?
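
For context, a minimal sketch of how that Directory block might sit inside a complete VirtualHost (the port, ServerName, and paths below are illustrative placeholders, not taken from this issue):

	<VirtualHost *:80>
	# Placeholder server name and document root.
	ServerName example.com
	DocumentRoot /var/www/html

	<Directory "/var/www/html">
	# Same options as in the snippet above: allow SSI, follow symlinks, hide listings.
	Options +Includes
	Options +FollowSymLinks -Indexes
	# Pull in the generated blacklist with a single Include line;
	# globalblacklist.conf itself stays unmodified and no mod_rewrite
	# rules are copied into the VirtualHost.
	Include custom.d/globalblacklist.conf
	</Directory>
	</VirtualHost>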

ZerooCool (Author) commented Jun 30, 2019

Okay! I discovered the README, which is really well written:
https://github.com/mitchellkrogza/apache-ultimate-bad-bot-blocker/blob/master/Apache_2.4/README.md

I modified my Directory block:

<Directory /var/www/visionduweb.fr>
# Prevent following symbolic links.
# I allow following symbolic links:
# this permits inclusion of the anti-bad-bot blocker blacklist
# from the additional configuration file.
Options +FollowSymLinks
# Protect access to directories.
Options -Indexes
# Disable Server Side Includes (SSI).
Options -Includes
# Allow the .htaccess file to be taken into account.
AllowOverride All
# Define the default index file.
DirectoryIndex index.php index.html

## Apache 2.4 access control:
# All requests are allowed.
# Works since I set DirectoryIndex.
## Commented out to take the anti-bad-bot blocker script into account.
## Require all granted

# Include the anti-bad-bot blacklist script:
# https://github.com/mitchellkrogza/apache-ultimate-bad-bot-blocker/blob/master/Apache_2.4/README.md
Include custom.d/globalblacklist.conf
</Directory>
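
One detail about the Include line above: a relative path given to Include is resolved against ServerRoot, so with the standard Debian/Ubuntu layout (ServerRoot /etc/apache2) it reads the same file as the absolute path used earlier. A small sketch, assuming that layout:

	# Relative form, resolved against ServerRoot (/etc/apache2 here):
	Include custom.d/globalblacklist.conf
	# Equivalent absolute form:
	# Include /etc/apache2/custom.d/globalblacklist.conf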

After commenting out Require: # Require all granted
I allow following symbolic links: Options +FollowSymLinks
Server Side Includes are disabled: Options -Includes

It works!
The site is accessible!
The bots are denied and sent to the 403 page.

I am surprised, however, that although I disabled Includes, the file is still taken into account. According to my tests, is it the FollowSymLinks option that takes priority? So even if I enable +FollowSymLinks and disable SSI with -Includes, the Include custom.d/globalblacklist.conf line still works.
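
For what it's worth, Options -Includes and the Include directive are independent features, so no priority between options is involved here: Options -Includes only disables mod_include Server Side Includes in served pages, while Include is a core configuration directive that merges another configuration file when Apache reads its config. A small sketch of the distinction:

	# Disables mod_include (SSI) processing for content in this directory,
	# e.g. <!--#include virtual="header.html" --> inside .shtml pages:
	Options -Includes

	# Core configuration directive: merges another configuration file when
	# the server starts or reloads; it is not affected by Options at all:
	Include custom.d/globalblacklist.conf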

Bye-bye mod_rewrite, hello bad bot blocker.

