Considering adding a post.author lookup during the web scrape and dumping those names into a filter. PRAW can grab the user info and test whether the account has been active for longer than a certain time, which might weed out the bots. We can keep the user info in a cache. Since we're talking about thousands of comments to search through, we can pass these off to a separate thread that builds a blacklist of commenters to compare against in the main thread. A commenter would get through on the first pass, but once checked and rejected, future checks would hit the blacklist. So a single bot that spams a random pump-and-dump would have low volume and hopefully not show up. Let me know if it's worth the effort.
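A minimal sketch of the cache-plus-blacklist idea, assuming account age is the bot signal. `AuthorFilter` and `fetch_created_utc` are hypothetical names: the lookup callable stands in for a PRAW call like `reddit.redditor(name).created_utc` and is injected so the filtering logic can be tested without hitting Reddit.

```python
import time

class AuthorFilter:
    """Hypothetical sketch: allow authors whose accounts are older than a
    threshold, caching results so each author is looked up only once."""

    def __init__(self, fetch_created_utc, min_age_days=30):
        # fetch_created_utc stands in for a PRAW lookup such as
        # reddit.redditor(name).created_utc (assumption, not the real scraper).
        self.fetch_created_utc = fetch_created_utc
        self.min_age_secs = min_age_days * 86400
        self.cache = set()       # authors already checked and allowed
        self.blacklist = set()   # authors rejected once; future checks hit here

    def is_allowed(self, author):
        if author in self.blacklist:
            return False          # cheap rejection on every later pass
        if author in self.cache:
            return True           # cheap acceptance, no repeat API call
        created = self.fetch_created_utc(author)  # one slow lookup per author
        if time.time() - created < self.min_age_secs:
            self.blacklist.add(author)
            return False
        self.cache.add(author)
        return True
```

In the real scraper the slow lookup would run on the separate thread you describe, filling the blacklist in the background while comments get their first pass through; the main thread then only does the cheap set-membership checks.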