So, first, forgive me if this question isn't very informed.
I had a problem tonight where I was receiving empty js/css files for my site, and finally tracked it down to the blacklisted referers part of my configuration.
I have aliased URLs, and a forum topic title contained the word "love", causing requests for its css/js to be blocked by the bad-referer check.
My question is: with aliased URLs, the current implementation of the bad-referer check seems really prone to failure. Wouldn't it make sense to always allow requests coming from the current host?
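To illustrate the failure mode, here is a minimal sketch (the pattern, URL, and variable names are hypothetical, chosen only to mirror the situation described above): a blacklist regex containing a term like "love" will happily match the site's own aliased URLs when a topic title contains that word, so the Referer sent with the page's css/js requests trips the filter.

```python
import re

# Hypothetical bad-referer pattern; "love" stands in for a blacklisted term.
BAD_REFERER_RE = re.compile(r"love", re.IGNORECASE)

# Referer the browser sends when an aliased forum page loads its assets.
referer = "http://example.com/forum/i-love-this-css-trick"

# The naive check matches the topic title inside the site's own URL,
# so a legitimate internal css/js request gets blocked.
blocked = bool(BAD_REFERER_RE.search(referer))
print(blocked)  # True
```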
OK. The whole process is kind of "rough". It's just a thin layer to protect you from the most obvious abusers, those that don't bother to disguise who they are. I've just committed a "fix" (I hope) in line with what you're suggesting: if the request comes from a trusted host, the referer is cleared even if the Referer header matches the regex. Try it out.
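The committed fix might look roughly like the sketch below (this is an assumption-laden illustration, not the actual patch; the host list, pattern, and function names are invented): when the Referer's host is trusted, the referer is treated as empty, so the blacklist regex only ever applies to external referers.

```python
import re
from urllib.parse import urlparse

# Hypothetical values mirroring the scenario in the thread.
BAD_REFERER_RE = re.compile(r"love", re.IGNORECASE)
TRUSTED_HOSTS = {"example.com", "www.example.com"}  # the site's own hosts

def effective_referer(referer: str) -> str:
    """Clear the Referer when it comes from a trusted host, so the
    blacklist is only applied to external referers."""
    if urlparse(referer).hostname in TRUSTED_HOSTS:
        return ""  # behave as if no Referer header was sent
    return referer

def is_blocked(referer: str) -> bool:
    ref = effective_referer(referer)
    return bool(ref) and bool(BAD_REFERER_RE.search(ref))

# Internal request that previously tripped the filter now passes:
print(is_blocked("http://example.com/forum/i-love-this-css-trick"))  # False
# An external referer matching the pattern is still blocked:
print(is_blocked("http://spam.example.net/love-pills"))  # True
```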
I can confirm from the wild, via an implementation and subsequent operational test I ran today on a vanilla Debian stable official dist, that this works as advertised and as requested by the original poster.
(Read: I took an enterprise production server into a custom 503 during scheduled maintenance. Internal LAN IPs as well as my dynamic IP from a public provider were used, both to see the real errors being shown and to bypass the static HTML served as the 503 at root.)