
scrapelib


scrapelib is a library for making requests to less-than-reliable websites. As of version 0.7 it is implemented as a wrapper around requests.

scrapelib originated as part of the Open States project, where it was used to scrape the websites of all 50 state legislatures, and was therefore designed with features that are useful when dealing with sites that have intermittent errors or require rate-limiting.

Advantages of using scrapelib over alternatives like httplib2 or simply using requests as-is:

  • All of the power of the superb requests library.
  • HTTP, HTTPS, and FTP requests via an identical API.
  • Support for simple caching with pluggable cache backends (see the caching sketch under Example Usage).
  • Request throttling.
  • Configurable retries for non-permanent site failures (see the sketch after this list).
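
A minimal sketch of how the throttling and retry knobs above might be configured. requests_per_minute also appears in the usage example further down; retry_attempts and retry_wait_seconds follow the names used in the scrapelib documentation and may differ between versions, so treat this as a sketch rather than a guaranteed API.

import scrapelib

# requests_per_minute throttles outgoing requests; retry_attempts and
# retry_wait_seconds are the retry knobs as named in the scrapelib docs --
# check your installed version if these differ.
s = scrapelib.Scraper(
    requests_per_minute=60,   # throttle to at most 60 requests per minute
    retry_attempts=3,         # retry transient failures up to 3 times
    retry_wait_seconds=10,    # base wait between retry attempts
)

# A 500 or a timeout from a flaky site is retried automatically before raising.
response = s.get('http://example.com')
print(response.status_code)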

Written by James Turk <james.p.turk@gmail.com>; thanks to Michael Stephens for the initial urllib2/httplib2 version.

See https://github.com/jamesturk/scrapelib/graphs/contributors for contributors.

Requirements

  • Python 2.7, 3.3, or 3.4
  • requests >= 2.0 (earlier versions may work but aren't tested)

Example Usage

Documentation: http://scrapelib.readthedocs.org/en/latest/

import scrapelib
s = scrapelib.Scraper(requests_per_minute=10)

# Grab Google front page
s.get('http://google.com')

# Will be throttled to 10 HTTP requests per minute
while True:
    s.get('http://example.com')
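
The pluggable caching mentioned above might look roughly like this. FileCache and the cache_storage / cache_write_only attributes follow the names in the scrapelib documentation at the time of writing; verify them against the version you have installed.

import scrapelib
from scrapelib.cache import FileCache  # name per the scrapelib docs; may vary by version

s = scrapelib.Scraper(requests_per_minute=10)

# Store responses on disk so repeated runs don't re-fetch unchanged pages.
s.cache_storage = FileCache('demo-cache')
s.cache_write_only = False  # also read from the cache, not just populate it

# The first call hits the network; subsequent identical GETs are served from disk.
page = s.get('http://example.com')
print(page.text[:100])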