emailifdown-scrapy

Scrapy spider that sends a notification email if a URL is down. Can be deployed to Scrapy Cloud using its free plan.

NOTE: I just learned about https://uptimerobot.com/, which has a free plan for 50 monitors, so this project is pretty much redundant in that regard. Other tools are documented here and here

Usage

  1. Install pew
  2. Initialize:

     pew new emailifdown-scrapy
     pip install scrapy

  3. Configure SMTP email settings by copying emailifdown_scrapy/settings-sample.py to emailifdown_scrapy/settings.py and editing it (section marked EDIT THIS SECTION)
  • These MAIL_... settings are mandatory unless -a emailTo=... below is skipped
  • These settings are the Scrapy mail settings
  • Alternatively, pass the settings directly to scrapy as parameters using -s (see the run example below)
  4. Run:

     scrapy runspider emailIfDownSpider.py \
       -L WARNING \
       -a url="https://duckduckgo.com" \
       -a emailTo="my@email.com"

  • Add the settings below if they are not set in settings.py (see the note above):

     ...
     -s MAIL_USER="another@email.com" \
     -s MAIL_HOST="smtp.email.com" \
     -s MAIL_PORT=123 \
     -s MAIL_PASS="password"

  • Skip -a emailTo=... for no email to be sent
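emailIfDownSpider.py itself is not reproduced here. Stripped of the Scrapy machinery, the check it performs amounts to something like the following stdlib-only sketch; the function names are illustrative, not from the repository, and the MAIL_* parameters mirror the settings described above:

    import smtplib
    import urllib.request
    import urllib.error
    from email.message import EmailMessage

    def fetch_status(url, timeout=10):
        """Return the HTTP status code for url, or None if unreachable."""
        try:
            with urllib.request.urlopen(url, timeout=timeout) as resp:
                return resp.status
        except urllib.error.HTTPError as err:
            return err.code  # server answered, but with an error status
        except (urllib.error.URLError, OSError):
            return None      # DNS failure, refused connection, timeout, ...

    def is_down(status):
        """Treat no response or any 4xx/5xx status as 'down'."""
        return status is None or status >= 400

    def build_alert(url, status, mail_from, mail_to):
        """Compose the notification email."""
        msg = EmailMessage()
        msg["From"] = mail_from
        msg["To"] = mail_to
        msg["Subject"] = f"URL down: {url}"
        msg.set_content(f"Checked {url}; got status {status}.")
        return msg

    def notify_if_down(url, mail_from, mail_to, host, port, user, password):
        """Check the URL and send an alert over SMTP if it looks down."""
        status = fetch_status(url)
        if not is_down(status):
            return False
        msg = build_alert(url, status, mail_from, mail_to)
        with smtplib.SMTP(host, port) as smtp:
            smtp.starttls()
            smtp.login(user, password)
            smtp.send_message(msg)
        return True

The actual spider presumably sends mail through Scrapy's MailSender driven by the MAIL_* settings; this sketch only illustrates the down-detection and alert logic.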

Deploy to scrapinghub.com

  1. Follow the instructions in Usage above
  2. Register on Scrapy Cloud
  • Start a new project
  • Copy the project's "target number" and your API key
  3. Install shub: pip install shub
  4. Deploy: shub deploy (will prompt for the target number copied above)
  5. Schedule: shub schedule 12345/emailifdown, where 12345 is your target number
  • shub may support -a and -s parameters similar to scrapy, but it didn't work for me
  6. Navigate to the Scrapy Cloud / project / periodic jobs dashboard
  7. Add a new job (emailifdown should appear automatically in the dropdown)
  8. Add the "url" and "emailTo" arguments and select the times at which to run
  9. Check the logs in the dashboard
