Fixes #22: Inactive Blog Filter #143
Conversation
Thank you for your contribution, looks good to me!
Did a first look over this, left some comments.
We really need to get the Redis database integration started so that people don't have to read/write files for everything. cc @mskuybeda
Made the changes to env.example so the filter module now calls process.env to retrieve variables. Non-global requires go against the Airbnb style guide and fail eslint, so I've reverted to the traditional readFile for now.
OK, did another pass.
Pushed new changes to PR:
Potential features to address as separate issues:
Looks like some changes have been made since I rebased, so now there's a conflict with env.example. It will be fixed with the next rebase before the merge, if everything else checks out.
Passed all the checks, and I found nothing at first glance other than the conflicts, which you said you'll rebase. I think this is good to merge after the rebase! Amazing job Jerry!
Looks good. We might need to get @humphd to review or dismiss his old reviews as stale for the merge.
Force-pushed from 3d69332 to 040c676
Changes will be addressed in an independent issue
Sorry, I had to rebase this PR again, as there were several merges while I was addressing the last fix. If I could get another set of reviews, that would be great.
I took a glance at the code and it looks good to me. Good documentation; I can also create another issue for testing your code. Good job @jerryshueh
Addresses #22. Had to perform some workarounds on this one, as we do not yet have an official feed-worker implemented. That work has been assigned to @robertbegna in issue #108, and I didn't want to take away from any work he might want to do. In order to test my implementation, I had to create a new, altered feed-worker in /test.
A feeds-redlist.json file was created in root, which contains the filtered feeds. Since they are not permanently banned or anything, I didn't want to call it a "blacklist". Each item contains a `url` attribute for the feed and a `lastUpdate` attribute (an ISO 8601 datetime string) for the blog's last known activity, according to the feed data. A value of 0 means the blog has a dead feed.

A new module was created in /src/inactive-blog-filter.js, which performs most of the logic:
- `check()` takes a feed URL and a callback function. It checks the provided feed URL against the redlist file and returns the result (true/false) to the callback, which can then perform whatever task is needed.
- `update()` performs a complete sweep of all feeds on the feedlist to check whether they are inactive, then rewrites the redlist file to reflect the latest changes. This may become an issue if the feedlist grows, so for now we'll say we can run this periodically rather than live with each feed update. The reason I chose to do this is that, again, we do not have a proper feed-worker to accumulate data, nor a proper database to search. Once we move forward, this module can be updated to perform specific searches on stored blogs.
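The behaviour described above can be sketched as a small self-contained example. The redlist entries, the cutoff date, and the in-memory return value are illustrative assumptions for the sketch; the real module reads from and rewrites feeds-redlist.json.

```javascript
// Hypothetical redlist entries matching the shape described in this PR:
// a url, plus lastUpdate as an ISO 8601 string (0 means a dead feed).
const redlist = [
  { url: 'https://dead.example.com/feed', lastUpdate: 0 },
  { url: 'https://stale.example.com/feed', lastUpdate: '2018-01-01T00:00:00Z' },
];

// check(): is this feed URL on the redlist? The result (true/false)
// goes to the callback, which can then perform whatever task is needed.
function check(feedUrl, callback) {
  callback(redlist.some((item) => item.url === feedUrl));
}

// update(): sweep a feed list and collect every feed whose last known
// activity is older than a cutoff (the cutoff here is an assumption).
// The real module writes the result back to feeds-redlist.json; this
// sketch just returns it.
function update(feeds, cutoff) {
  return feeds.filter(
    (feed) => feed.lastUpdate === 0 || new Date(feed.lastUpdate) < cutoff,
  );
}

check('https://dead.example.com/feed', (isInactive) => {
  console.log(isInactive); // prints true: this feed is on the redlist
});
```

Running `update()` periodically, as proposed, would simply regenerate the `redlist` array and persist it, leaving `check()` as a cheap lookup for each incoming feed.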