
Duplicates Caused by third-party list #1650

Closed
6 tasks done
KnightmareVIIVIIXC opened this issue Mar 11, 2024 · 1 comment
Labels
A: Bug The issue is caused by a program bug

Comments

KnightmareVIIVIIXC commented Mar 11, 2024

Prerequisites

  • This site DOES NOT contain sexually explicit material;
  • The problem occurs only when using AdGuard DNS or DNS filtering with AdGuard DNS filter, it is not caused by other ad blockers;
  • You're using an up-to-date version of AdGuard DNS filter;
  • Browser version is up-to-date;
  • If a website or an app is broken, disabling AdGuard DNS filter resolves the issue.

What DNS server do you use?

AdGuard private DNS

Version

NA

What DNS upstream(s) do you use in AdGuard apps or AdGuard Home?

NA

What DNS filters do you have enabled?

NA

What browser or app do you use?

Microsoft Edge

Which device type do you use?

Desktop

What type of problem have you encountered?

Missed analytics or tracker

Where did you encounter the problem?

In the list

Add your comment and screenshots

So far I've only noticed it with two domains, but both duplicates came from the same list:

  • allmt.com
  • afdads.com

Because these entries are also in another list without the $third-party modifier, the entries show up like this:

||allmt.com
||allmt.com^
||afdads.com
||afdads.com^

This is the third-party list whose modifiers are causing the issue for the entries I found: https://raw.githubusercontent.com/AdguardTeam/AdguardFilters/master/CyrillicFilters/common-sections/adservers.txt
I'm assuming that most of the entries in there will be duplicated this way throughout the list.
Maybe adding a compress transformation would fix it?
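As a rough sketch of what a compress-style pass could do here (a hypothetical simplification, not the actual hostlist-compiler Compress transformation, which does considerably more), one can drop a `||domain^` rule whenever the broader `||domain` rule is already present:

```python
def compress(rules):
    """Drop ||domain^ rules subsumed by a broader ||domain rule.

    Hypothetical sketch: the real compress transformation in
    @adguard/hostlist-compiler performs additional normalization.
    """
    rule_set = set(rules)
    result = []
    for rule in rules:
        if rule.startswith("||") and rule.endswith("^"):
            broader = rule[:-1]   # "||allmt.com^" -> "||allmt.com"
            if broader in rule_set:
                continue          # subsumed by the broader rule, skip it
        result.append(rule)
    return result
```

For the entries above, `compress(["||allmt.com", "||allmt.com^", "||afdads.com", "||afdads.com^"])` would keep only `||allmt.com` and `||afdads.com`.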

Privacy

  • I agree to follow this condition
KnightmareVIIVIIXC added the A: Bug label Mar 11, 2024
Alex-302 (Member) commented Mar 11, 2024

||example.com is not a duplicate of ||example.com^.
||example.com^ is redundant, because ||example.com is broader: it will also block example.com.tr.
But it is impossible to say for certain why the ^ separator was omitted. It can be either a typo or an intentionally broader rule.
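To illustrate the distinction, here is a minimal hostname-matching sketch (a hypothetical helper, not AdGuard's actual matching engine, which also handles paths, wildcards and modifiers): `||` anchors at the start of the hostname or after a dot, and a trailing `^` requires the match to end at a separator or at the end of the address.

```python
def matches(rule, hostname):
    """Check a hostname against a simplified ||domain / ||domain^ rule.

    Hypothetical sketch of adblock-style hostname matching.
    """
    anchored = rule.endswith("^")
    pattern = rule[2:-1] if anchored else rule[2:]
    # "||" matches at the start of the hostname or right after a dot
    starts = [0] + [i + 1 for i, ch in enumerate(hostname) if ch == "."]
    for i in starts:
        if hostname.startswith(pattern, i):
            rest = hostname[i + len(pattern):]
            if not anchored:
                return True   # ||example.com also matches example.com.tr
            if rest == "":    # "^" requires a separator or end of address;
                return True   # "." does not count as a separator
    return False
```

Under these rules, `||example.com` matches example.com.tr while `||example.com^` does not, which is why the two forms are not true duplicates.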
