Known Issues
Googlebot and bingbot are known to send requests to a server’s IP address once in a while. In such cases, httpd either returns a 403 or (if on HTTP) redirects the request to HTTPS. The bots won’t take offence.
CSS/JS files produced by these plugins are named using a hash digest. See Conflicts with hash digest.
Prevents webp-express/htaccess-capability-tests/pass-info-from-rewrite-to-script-through-env/test.php from running.
This wouldn’t matter if bulk conversion and Alter HTML are used to generate static pages.
The redirect rules, if used, can also be set up manually in httpd.conf with mod_alias.
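As a minimal sketch of that alternative (the hostname example.com and the choice of a permanent redirect are illustrative assumptions, not part of the shipped rules), an HTTP-to-HTTPS redirect with mod_alias could look like:

```apache
<VirtualHost *:80>
ServerName example.com
# mod_alias: send every plain-HTTP request to the HTTPS site,
# preserving the requested path
Redirect permanent / https://example.com/
</VirtualHost>
```

Unlike mod_rewrite, Redirect needs no conditions or per-directory context, which keeps httpd.conf easier to audit.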
Filenames containing a hash digest may cause false positives. Test with mod_rewrite and adjust any affected rules.
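One way such an adjustment might look, as a hedged sketch (the digest pattern and the blocked extensions below are assumptions for illustration, not the rules actually shipped): exempt hashed asset names before the blocking rule fires.

```apache
# Assumed naming scheme: assets carry a hex digest, e.g. style.0a1b2c3d4e.css
<IfModule mod_rewrite.c>
RewriteEngine On
# Skip the block for CSS/JS files named with a hash digest
RewriteCond %{REQUEST_URI} !\.[0-9a-f]{8,}\.(css|js)$ [NC]
# Hypothetical rule that would otherwise produce the false positive
RewriteRule \.(bak|old|orig)$ - [F]
</IfModule>
```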
Rules under [HTTP REFERRER] are known to block navigation when a site's URLs contain one of the outlawed expressions, particularly in a multilingual context: libido, for example, is a common Latin word. Check all the URLs before deploying.
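A minimal sketch of how such a rule misfires, assuming libido is among the blocked expressions (the expression list here is illustrative): a link clicked on a page whose URL contains the word sends that URL as the Referer, and the next request is denied.

```apache
# Illustrative referrer block; a page such as /la-libido-freudiana/
# on the site itself would match, breaking navigation from that page
RewriteCond %{HTTP_REFERER} (libido|casino) [NC]
RewriteRule .* - [F,L]
```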
When virtual hosts are used to redirect to HTTPS, HTTP requests with a Host: header containing the server's IP address, or with no Host: header at all, will get redirected before they can be blocked. In such cases, one could set up an additional vhost:
# This can be placed before other vhosts
# to double as a ‘catch-all’ and block
# requests without Host: header.
<VirtualHost *:80>
# replace 203.0.113.1 with your server's IP address
ServerName 203.0.113.1
<Location />
Require all denied
</Location>
</VirtualHost>
Services that scan for vulnerabilities are known to get blocked when trying to access this file, which is a sign the firewall works as intended.
mod_rewrite should be considered a last resort, when other alternatives are found wanting. Using it when there are simpler alternatives leads to configurations which are confusing, fragile, and hard to maintain. Understanding what other alternatives are available is a very important step towards mod_rewrite mastery.
— Rich Bowen