
Expandable Scam-baiting Mail Server

This is an open-source scam-baiting mail server intended for scam-baiting research. You can customize the repliers (called responders in the source code), add your own crawlers for fetching initial scam emails, and configure directories and API keys.

Configurable Directories & Keys

All directory and API configuration lives in the file "secret.py". We highly recommend adding this file to .gitignore so that API keys and other secrets are never committed.
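
For example, a single entry in .gitignore is enough to keep the file out of version control:

# .gitignore
secret.py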

Here is an example secret.py config:

# MAIL
API_KEY = "YOUR_MAILGUN_API_KEY"
API_BASE_URL = "YOUR_MAILGUN_API_URL"
DOMAIN_NAME = "YOUR_SERVER_DOMAIN_NAME"
MAIL_SAVE_DIR = "./emails/queued"
MAIL_ARCHIVE_DIR = "./emails/archive"
MAIL_HANDLED_DIR = "./emails/handled"
ADDR_SOL_PATH = "./emails/record.json"
# MODEL PATH
MODEL_HISTORY_PATH = "./models/history.json"
NEO_ENRON_PATH = "./models/neo_enron"
NEO_RAW_PATH = "./models/neo_raw"
CLASSIFIER_PATH = "./models/classifier/final-model.pt"
TEMPLATES_DIR = "responder/templates"
# CRAWLER CONF
CRAWLER_PROG_DIR = "./cache"
MAX_PAGE = 20
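
The mail-sending code itself is not covered here, but as a rough illustration of how the Mailgun values fit together, a send call typically looks like the sketch below. The "send_reply" helper and the exact URL layout are assumptions for illustration, not part of this repository:

import requests

from secret import API_KEY, API_BASE_URL, DOMAIN_NAME

def send_reply(to_addr: str, subject: str, text: str):
    # Hypothetical helper: posts a message through the Mailgun HTTP API
    # using the values configured in secret.py. Depending on your setup,
    # API_BASE_URL may already include the domain segment.
    return requests.post(
        f"{API_BASE_URL}/messages",
        auth=("api", API_KEY),
        data={
            "from": f"baiter@{DOMAIN_NAME}",
            "to": to_addr,
            "subject": subject,
            "text": text,
        },
    )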

Adding Crawlers

All crawlers live in the "crawler" package. Two example crawlers are provided: "scamletterinfo.py" and "scamsurvivors.py".

To implement your own crawler, add a Python file to this package and implement a "fetch" function. Your crawler should save its results as JSON in the MAIL_SAVE_DIR directory. Here is an example of crawler output:

{
    "title": "App/Share your project",
    "url": "http://scamletters.info/2022/04/app-share-your-project/",
    "date": "14.04.2022",
    "from": "j***************0@gmail.com",
    "content": "Sent from: j***************0@gmail.com\nHi,\nHappy to connect.\nI would like to give you a brief backdrop about our company as well\ncore-competency areas in App developments.\n*Spa & Massage App, Shopping App, Wedding App, Food & Drink App, Shopping\nApp, E-Commerce App, IPhone and iPad Apps, Mobile App.*\nPlease tell me what type of App development you need, please share your\ncontact details for more discussion.\nThank you\n\n[image: beacon]"
}
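
A minimal crawler might look like the sketch below. The module name, the example record, and the one-file-per-email layout are illustrative assumptions; only the "fetch" entry point and the JSON fields shown above come from the existing crawlers:

# crawler/mycrawler.py (illustrative sketch)
import json
import os
import uuid

from secret import MAIL_SAVE_DIR

def fetch():
    # Fetch scam emails from your source and queue them for the repliers.
    results = [
        {
            "title": "Example scam post",
            "url": "http://example.com/post/1",
            "date": "01.01.2024",
            "from": "scammer@example.com",
            "content": "Dear friend, ...",
        }
    ]
    os.makedirs(MAIL_SAVE_DIR, exist_ok=True)
    for item in results:
        # Here each result is written to its own JSON file; adjust this
        # to match how the existing crawlers store their output.
        path = os.path.join(MAIL_SAVE_DIR, f"{uuid.uuid4().hex}.json")
        with open(path, "w", encoding="utf-8") as f:
            json.dump(item, f, ensure_ascii=False, indent=4)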

After that, you must import your crawler and call its "fetch" function from the "fetch_all" function in the package's "__init__.py".
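
Assuming "fetch_all" simply calls each crawler in turn, the registration could look like this sketch (the real body of "fetch_all" in the repository may differ):

# crawler/__init__.py (sketch)
from . import scamletterinfo, scamsurvivors, mycrawler

def fetch_all():
    scamletterinfo.fetch()
    scamsurvivors.fetch()
    mycrawler.fetch()  # your new crawler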

Customized Responder

All responders live in the "responder.replier" module, which contains four example repliers.

To implement your own responder, inherit from the "Replier" class, override the "name" property, and override the "_gen_text" function. The return value of "_gen_text" must be a str that always ends with "[baiter_end]".

Here is an example of a responder:

class MyReplier(Replier):
    name = "MyReplier"

    def _gen_text(self, prompt) -> str:
        # Generate the reply text for the given prompt; the returned
        # string must always end with the "[baiter_end]" marker.
        return "test[baiter_end]"

To enable your responder, add an instance of your replier class to the "replier_list" list in "responder/__init__.py", line 8:

from .replier import NeoEnronReplier, NeoRawReplier, Replier, ClassifierReplier, MyReplier
replier_list = [ClassifierReplier(), NeoEnronReplier(), NeoRawReplier(), MyReplier()]
