
Dynamic filtering UI doesn't always allow selecting the SLD or some subdomains #1263
Oh, I see, it's using the Public Suffix List to determine what can't be directly whitelisted. I'm not sure if there's a good way to let the user select something from that without cluttering the list and without making them manually edit the rules.
That still doesn't explain why dpaste.com.s3.amazonaws.com isn't listed, though. Not that I need to whitelist that (although it might make sense to, in case they ever add other subdomains under it).
> That still doesn't explain why dpaste.com.s3.amazonaws.com isn't listed
Because there was no request to dpaste.com.s3.amazonaws.com itself. uBlock will only list subdomains for which there was actually a network request, and the base domain (request or not).
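In pseudo-code terms, the rule looks roughly like this. This is just a sketch to illustrate the behavior, not the actual implementation; the suffix set below is a tiny hard-coded stand-in for the real Public Suffix List (which, as far as I can tell, does include s3.amazonaws.com and cloudfront.net):

```typescript
// Rough sketch of the listing rule described above — NOT uBlock's actual code.
// PUBLIC_SUFFIXES is a tiny hard-coded stand-in for the real Public Suffix List.
const PUBLIC_SUFFIXES = new Set(['com', 'net', 's3.amazonaws.com', 'cloudfront.net']);

// Base (registrable) domain: the longest matching public suffix plus one label.
function baseDomain(hostname: string): string {
  const labels = hostname.split('.');
  for (let i = 1; i < labels.length; i++) {
    if (PUBLIC_SUFFIXES.has(labels.slice(i).join('.'))) {
      return labels.slice(i - 1).join('.');
    }
  }
  return hostname;
}

// The popup lists every hostname that actually made a request, plus the base
// domain of each one (whether or not the base domain itself made a request).
function listedHostnames(requested: string[]): Set<string> {
  const listed = new Set<string>();
  for (const hostname of requested) {
    listed.add(hostname);
    listed.add(baseDomain(hostname));
  }
  return listed;
}

// Only static.dpaste.com.s3.amazonaws.com made a request, so the intermediate
// dpaste.com.s3.amazonaws.com is never listed, while the computed base domain
// com.s3.amazonaws.com is.
console.log(listedHostnames(['static.dpaste.com.s3.amazonaws.com']));
// Set { 'static.dpaste.com.s3.amazonaws.com', 'com.s3.amazonaws.com' }
```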
I see, that makes sense.
I thought I remembered a uMatrix issue asking why rules couldn't be made for cloudfront.net itself. The answer was that it's considered a public suffix, and that although the subdomains look random, they are actually statically assigned to the organizations that pay for them (rather than being chosen randomly per client request, as in some CDN setups).
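To make the public-suffix point concrete, here is a minimal sketch (the d1234abcd.cloudfront.net hostname is made up for illustration): a base domain is a public suffix plus exactly one label, so cloudfront.net itself can never be offered as a base domain, and each assigned subdomain ends up being its own base domain.

```typescript
// Sketch of what "cloudfront.net is a public suffix" implies for rule targets.
// The hostname below is made up; real CloudFront subdomains look similar.
function baseUnderSuffix(hostname: string, publicSuffix: string): string | null {
  if (hostname === publicSuffix) return null;   // the suffix alone is never a base domain
  const tail = '.' + publicSuffix;
  if (!hostname.endsWith(tail)) return null;    // not governed by this suffix
  const left = hostname.slice(0, -tail.length).split('.');
  return left[left.length - 1] + tail;          // public suffix + one label
}

console.log(baseUnderSuffix('cloudfront.net', 'cloudfront.net'));           // null
console.log(baseUnderSuffix('d1234abcd.cloudfront.net', 'cloudfront.net')); // 'd1234abcd.cloudfront.net'
```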
Here are some examples:
googleapis.com on various sites
Available: ajax.googleapis.com, fonts.googleapis.com
Not available: googleapis.com

On github.com
Available: avatars3.githubusercontent.com
Not available: githubusercontent.com

On dpaste.com
Available: com.s3.amazonaws.com (primary), static.dpaste.com.s3.amazonaws.com
Not available: dpaste.com.s3.amazonaws.com, s3.amazonaws.com, amazonaws.com

If I manually change the rule to something like `github.com githubusercontent.com * noop`, that works fine (a couple of example rules are sketched below), but the current behavior doesn't seem to make any sense.

It may be worth mentioning that in the past, when I used NoScript, I ran into similar problems, except that manually whitelisting the unavailable options didn't actually work there.
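For anyone who wants the manual workaround spelled out, something like the following should work when pasted into the My rules pane (dynamic filtering rules take the form source destination type action; the hostnames are just the ones from the examples above, so adjust to taste):

```
* googleapis.com * noop
github.com githubusercontent.com * noop
dpaste.com amazonaws.com * noop
```

A rule's destination hostname also matches its subdomains, which is why the manually added githubusercontent.com rule works even though the popup never offered that entry.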