Scrape with a Chrome extension and push to a couch


Lazydriver simplifies building a Chrome extension to scrape a site.

How it works

Different sites will have different logic, so you need to write a script to navigate the pages.
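For example, on a site that paginates with a query parameter, the navigation logic can be as small as computing the next URL. The helper below is our illustration, not part of Lazydriver, and the `?page=N` scheme is an assumption about the target site:

```javascript
// Hypothetical helper: derive the next page's URL for a site that
// paginates with a ?page=N query parameter (assumption for this sketch).
function nextPageUrl(current) {
  const url = new URL(current);
  const page = parseInt(url.searchParams.get("page") || "1", 10);
  url.searchParams.set("page", String(page + 1));
  return url.toString();
}
```

In the extension, your script would then set `window.location` to the returned URL, or simply click the site's "next" link with jQuery.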

It is possible to parse the pages as you are navigating them, but I recommend simply saving the raw html and parsing it later. This gives you a wider choice of approaches, libraries, and languages for parsing, and lets you re-parse later without scraping the site again.

Writing the script

Write your script to navigate the pages. Lazydriver includes jQuery automatically, so feel free to use it. Somewhere in that script, you need to run save_page to push the text to a couch. You may specify up to three options: pageId, text and couch. For any that you don't specify, these defaults are used:

save_page({
  pageId : document.URL
, text   : document.documentElement.innerHTML
, couch  : "http://localhost:5984/lazydriver"
})
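Under the hood, saving a page amounts to a CouchDB document write: a PUT to {couch}/{doc id}. The sketch below is our illustration of how the options might translate into such a request, not Lazydriver's actual code; the helper name and body shape are assumptions:

```javascript
// Sketch: merge user options with defaults, then build a CouchDB PUT.
// In the extension, pageId and text would default to document.URL and
// document.documentElement.innerHTML; placeholders are used here.
function buildSaveRequest(options) {
  const defaults = {
    pageId: "about:blank",
    text:   "",
    couch:  "http://localhost:5984/lazydriver"
  };
  const opts = Object.assign({}, defaults, options || {});
  return {
    method: "PUT",
    url: opts.couch + "/" + encodeURIComponent(opts.pageId),
    body: JSON.stringify({ text: opts.text })
  };
}
```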

In addition to writing a script to navigate the page, you need to write a manifest.json.
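A minimal manifest for a content-script extension might look like the following. The name, version, match pattern, and script filename are placeholders, and since Lazydriver injects jQuery for you, only your own script is listed here; check Lazydriver's expectations for any additional fields it needs:

```json
{
  "manifest_version": 2,
  "name": "my-scraper",
  "version": "0.1",
  "content_scripts": [
    {
      "matches": ["http://example.com/*"],
      "js": ["scrape.js"]
    }
  ]
}
```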


Once you've finished the script and manifest.json, run Lazydriver against <chrome extension dir>, and an unpacked Chrome extension will appear in a deploy directory inside the Chrome extension directory.

Then you can go to the Chrome extensions panel to load the unpacked extension or pack it.