
Time for an update build? #56

Closed
Amoeba00 opened this issue Apr 8, 2023 · 3 comments

Comments


Amoeba00 commented Apr 8, 2023

Lots of folks are posting problems with the ERX units, and it's clearly a result of those two primary lists, OISD and Steven Black, growing so large.

Here are the default results from package installs:
July 2022: Total entries extracted 136104
Today: Total entries extracted 234118

I think it makes sense to remove one of them and then update the FAQ with a special ERX/ERX-SFP section with the commands that users can copy/paste if they want to remove/add the other.

Removing Steven Black / Keeping OISD
Total entries extracted 59691

Removing OISD / Keeping Steven Black
Total entries extracted 194322
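If the FAQ gets an ERX/ERX-SFP section, the copy/paste commands might look like the sketch below. The `service dns forwarding blacklist` node path and the source name `stevenblack` are assumptions about this install's config tree, not taken from the package docs, so users should confirm the real names with `show service dns forwarding blacklist` first:

```
configure
# Source name below is an assumption; verify it first with:
#   show service dns forwarding blacklist
delete service dns forwarding blacklist hosts source stevenblack
commit ; save ; exit
```

Re-adding a source would be the matching `set ... source <name> url <url>` commands.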

Also, as an aside, I was curious about the simple_tracking list with its 34 domains. It appears that while the disconnect.me site and browser extension haven't been updated in quite a while, they do have a JSON file that gets updated more frequently.
What's even stranger is that when I remove it from the default package, the total goes up to "Total entries extracted 234360," and putting it back, the number drops to 234118. Just thought I'd pass that one along.
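One plausible explanation for the count going up when a list is removed (an assumption, not confirmed from the package source): if the package collapses entries under any parent domain that also appears in a list, then removing a small list of parent domains lets previously-pruned subdomain entries survive. A minimal sketch of that effect:

```python
def compress(domains):
    """Drop any entry whose parent domain is also in the set.

    E.g. with "tracker.example" present, "a.tracker.example" is redundant,
    since blocking the parent already covers the subdomain.
    """
    kept = set(domains)
    result = set()
    for d in kept:
        parts = d.split(".")
        # If any proper parent of d (a suffix of its labels) is in the set,
        # d is redundant and gets pruned.
        if any(".".join(parts[i:]) in kept for i in range(1, len(parts) - 1)):
            continue
        result.add(d)
    return result

# Hypothetical data: one parent domain prunes three subdomain entries.
hosts = {"a.tracker.example", "b.tracker.example",
         "c.tracker.example", "ads.example"}
parents = {"tracker.example"}

print(len(compress(hosts | parents)))  # with the parent list: 2 entries
print(len(compress(hosts)))            # without it: 4 entries
```

So removing a 34-domain list could plausibly raise the extracted total by a couple hundred entries, exactly as observed, if those 34 domains were parents of entries in the larger lists.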

Thanks again for all your work on this! Cheers!

britannic (Owner) commented

@Amoeba00, the package would need to have JSON parsing added to get the disconnect.me lists. You're welcome to create a pull request and add the functionality. Otherwise, I can just remove disconnect.me from the initial install.
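For anyone considering that pull request, the extraction step might look like the sketch below. The nested schema assumed here (categories mapping to lists of per-company dicts whose values hold domain lists) is an assumption from inspecting the disconnect.me services JSON, not something this package defines:

```python
import json

def extract_domains(data):
    """Walk a disconnect.me-style services dict and collect every domain.

    Assumed schema: {"categories": {name: [{company: {info_url: [domains]}}]}}.
    Non-list values (flags, strings) inside a company entry are skipped.
    """
    domains = set()
    for entries in data.get("categories", {}).values():
        for entry in entries:                # one dict per company
            for company in entry.values():   # {info_url: [domains], ...}
                for value in company.values():
                    if isinstance(value, list):
                        domains.update(value)
    return domains

# Small inline sample in the assumed shape:
sample = json.loads("""
{"categories": {"Advertising": [
    {"ExampleCo": {"http://example.com/": ["ads.example.com", "track.example.net"]}}
]}}
""")
print(sorted(extract_domains(sample)))
```

In practice the package would fetch the live JSON over HTTP and feed the parsed result to a function like this.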

britannic (Owner) commented

@Amoeba00 v1.2.4.8 has been released.

Amoeba00 (Author) commented

Thanks for the update. I didn't figure the package would be able to parse JSON; in any case, it's only a few domains, so it won't make a dent either way. Keep up the great work and have a great week!
