URL match patterns #373
Comments
With this implementation I think that Dark Reader could inherit the capabilities of extensions such as Stylus, in the sense that it could inject users' own CSS code through "dev tools" for every site (*), adding exceptions, etc.
Eager to be able to exclude sub-domains 👍
Awesome! Definitely can't wait for this to be implemented; it will make the great experience even greater. Not sure if this is different, but wanted to point out just in case that this also applies to IP URLs. For example, if you invert
Negative Patterns #2327
This is EXACTLY what I need. My local torrent server requires that I pass my creds in the URL, and Dark Reader refuses to recognize the domain. 😤
Anyone know if it is currently possible to disable dark reader on
It would be a great feature to be able to force dark mode on some pages of a site that already has dark mode, except on certain parts.
@Fred-Vatin I get that behavior today by using "Invert listed only" and adding something like
If you use the "Not invert listed" mode though (AKA opt-out vs. opt-in to DR theming pages), I don't think there's currently a reasonable way to enable it on only a subset of a site's pages.
@Fred-Vatin You should be able to do this by setting Dark Reader into the "Invert listed only" option and then adding this line into the list:
you may also have to add:
This should only turn on Dark Reader on
@Fred-Vatin - Personally, I don't like "Inverted list" because I want the default behaviour for new websites to be dark mode, so my workaround is to add these to the normal (non-inverted) list -
It's ugly, but it is the only way to achieve it until the feature requested in this thread is implemented.
Same. For me it’s not an option.
Can we please get an option to use regex strings, particularly with negative lookaheads? It would be so much more flexible.
I just discovered Dark Reader and I love its dark theme. Very convenient for reading. This feature would be great to implement, as I need to darken only certain sections of a website. This pattern:
Likewise, this would be very helpful to apply to Google Docs but not Google Slides.
@ziroau The problem is that the PR by @Gusted has been Dusted. Rhyme intended. Pick it up and bring it up to date so that it can be merged without any conflicts. I think there are also challenges around not breaking current rules. Everybody wants it to land, but nobody wants to get their hands dirty. A typical GitHub issue in the landscape: it had its opportunity, but you are free to change it to a positive outcome.
Hello! You can now try using Regular Expressions in the Site List. Just start and end the pattern with
Please let me know if there are any issues. |
Seems the URL pattern syntax in the Site List changed recently. I used to use
but now it's not working anymore, and I have to change it to // which is not normal regex syntax anymore.
Hi @yurenchen000! We did not support RegExps before; they partially worked because the implementation was based on RegExps. Now you can use simple patterns like in your example (
FWIW, in Firefox 120.0.1, the pattern seems to match against the entire URL. For example:
works, but
does not. Also, it would be good to mention, just for completeness, the specific flavour of regular expression supported. I guessed it was JavaScript, since this is a browser extension.
The URL matching has been simplified since version 4.9.69. There are now two ways to match a website:
- Simple patterns
- Regular expressions
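For illustration (these exact entries are mine, not from the thread), a Site List mixing the two forms might look like this, assuming `/…/` delimiters mark a regular expression as the comments above suggest:

```
mail.google.com
/^https:\/\/docs\.google\.com\/document\//
```

The first line is a simple pattern; the second is matched as a JavaScript regular expression against the full URL.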
Implement user-friendly URL glob patterns with negation ability and some special behavior. Similar to globby, extension match patterns etc.
How it should work:
- `*` matches everything
- `*/*.pdf` matches the PDF file extension
- `google.com` matches www.google.com, mail.google.com, google.com/search etc.
- `google.*` matches google.com, google.by etc.
- `*.google.com` matches mail.google.com, inbox.google.com etc.
- `google.com/mail/*` matches google.com/mail/inbox etc.
- `google.com/*.pdf` matches google.com PDF files
- `localhost:*` should match localhost and any port
- `ftp://*` should match the FTP protocol only
- `/^.google\.com\/mail/` should behave as a regular expression (but should not be implemented yet; maybe there is no need for it)

`*` can only be surrounded by dots and the first `/` in the host part, and by the last `/` and the file extension dot in the path part. `*` corresponds to one host or path part, or to many parts if placed at the start or end. Queries should not be allowed.
- `*.google.com` OK
- `goo*.com` error
- `google.com/blog/*` OK
- `google.com/blog*` error
- `google.com/*/blog` OK
- `google.com/*.pdf` OK
- `google.com/2018-07-*.jpg` error
- `google.com/search?q=cat&p=dog` error (we won't investigate whether it should match `search?p=dog&p=cat`; use regular expressions for that)

URL lists
URL lists are used in the user's Site List config and in fixes configurations (each record can have multiple URLs).
- `!` should reverse the pattern result.
- `google.com, !*.google.com` should match google.com except its subdomains.
- `google.com, !mail.google.com` should match everything that matches `google.com` except everything that matches `mail.google.com`.
- `google.com, !mail.google.com, mail.google.com/compose` should not match everything that matches `mail.google.com`, except everything that matches `mail.google.com/compose`.
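A minimal sketch of how this matching and negation could work. The function names and the wildcard-to-RegExp translation are my own assumptions (and deliberately skip the wildcard placement validation described above); this is not Dark Reader's actual implementation:

```typescript
// Sketch only: translate a simplified URL pattern into a RegExp and
// evaluate an ordered list with `!` negation.

function patternToRegExp(pattern: string): RegExp {
  // Escape RegExp metacharacters except `*`, then expand `*` wildcards.
  // (Placement rules for `*` are not validated in this sketch.)
  const escaped = pattern
    .replace(/[.+?^${}()|[\]\\]/g, '\\$&')
    .replace(/\*/g, '.*');
  // A bare host pattern may be preceded by a scheme and subdomains,
  // and followed by a path.
  return new RegExp(`^([a-z]+:\\/\\/)?([^/]*\\.)?${escaped}(\\/.*)?$`, 'i');
}

// Later entries override earlier ones; a leading `!` reverses the result.
function matchesList(url: string, list: string[]): boolean {
  let matched = false;
  for (const entry of list) {
    const negated = entry.startsWith('!');
    const pattern = negated ? entry.slice(1) : entry;
    if (patternToRegExp(pattern).test(url)) {
      matched = !negated;
    }
  }
  return matched;
}
```

Because later entries override earlier ones, the re-inclusion example above (`mail.google.com/compose` listed after `!mail.google.com`) falls out naturally from the evaluation order.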
Pattern specificity
Pattern specificity is used to determine which exact match from the config file to use. It should behave similarly to CSS specificity.
| Pattern | Specificity |
| --- | --- |
| `*` | 0.1.0.0 |
| `google.*` | 1.1.0.0 |
| `google.com` | 2.0.0.0 |
| `*.google.com` | 2.1.0.0 |
| `mail.google.com` | 3.0.0.0 |
| `mail.google.com/*` | 3.0.0.1 |
| `mail.google.com/mail` | 3.0.1.0 |
| `mail.google.com/mail/*` | 3.0.1.1 |
| `mail.google.com/mail/compose` | 3.0.2.0 |

The result is an array of [host parts count, host `*` matches (plus `*` protocol and port matches), path parts count, path `*` matches (plus `*` file extension)].

URL list sorting
Alphabetical order needs to be validated in configuration files so that they stay maintainable. Configuration records are compared by their first URL. The prefix `*.` should be skipped; resulting repeated records should be sorted by specificity. Regular expressions should not be used in the comparison.
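The specificity and sorting rules above can be sketched as follows. The `[host parts, host wildcards, path parts, path wildcards]` layout is inferred from the table, and all names are hypothetical, not Dark Reader's actual code:

```typescript
// Sketch only: assumed specificity layout
// [host parts, host wildcards, path parts, path wildcards].

function specificity(pattern: string): [number, number, number, number] {
  const slash = pattern.indexOf('/');
  const host = slash < 0 ? pattern : pattern.slice(0, slash);
  const path = slash < 0 ? '' : pattern.slice(slash + 1);
  const hostParts = host.split('.').filter((p) => p.length > 0);
  const pathParts = path.split('/').filter((p) => p.length > 0);
  const count = (parts: string[], wildcard: boolean) =>
    parts.filter((p) => (p === '*') === wildcard).length;
  return [
    count(hostParts, false), // named host parts
    count(hostParts, true),  // host wildcards
    count(pathParts, false), // named path parts
    count(pathParts, true),  // path wildcards
  ];
}

// Lexicographic comparison, like CSS specificity: positive means `a` wins.
function compareSpecificity(a: string, b: string): number {
  const sa = specificity(a);
  const sb = specificity(b);
  for (let i = 0; i < 4; i++) {
    if (sa[i] !== sb[i]) return sa[i] - sb[i];
  }
  return 0;
}

// List-sorting comparator: alphabetical with the `*.` prefix skipped,
// resulting duplicates ordered by specificity.
function compareListEntries(a: string, b: string): number {
  const key = (p: string) => (p.startsWith('*.') ? p.slice(2) : p);
  if (key(a) !== key(b)) return key(a) < key(b) ? -1 : 1;
  return compareSpecificity(a, b);
}
```

With this layout, `mail.google.com` ([3,0,0,0]) outranks `*.google.com` ([2,1,0,0]), matching the table's values.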
UI behavior
`google.com`, `google.com/maps`.