# get-images

scrape a page and get an array of all the `<a>` links to pngs or jpegs

## Installation

```
npm install get-images
```

## Usage

```js
var getImages = require('get-images')

getImages('http://substack.net/images', function (err, images) {
  // => images is an array of image urls like
  // ["http://substack.net/images/1up.png"]
})
```
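
For example, keeping only the PNG links from the result is a short filter over the returned array (a minimal sketch; the page URL is just the one from the example above):

```js
var getImages = require('get-images')

getImages('http://substack.net/images', function (err, images) {
  if (err) return console.error(err)
  // keep only the links that end in .png
  var pngs = images.filter(function (url) {
    return /\.png$/i.test(url)
  })
  console.log(pngs)
})
```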

## Bonus feature

There is a built-in proxy server (`server.js`). Run it with `sudo node server.js`, then open http://localhost/?url=http://substack.net/images/ in a browser.
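
The contents of `server.js` aren't reproduced here, but a proxy of this shape (a hypothetical sketch, assuming it simply scrapes the page named in the `?url=` query parameter and responds with the image links as JSON) could look like:

```js
// sketch-server.js -- a hypothetical stand-in, not the bundled server.js
var http = require('http')
var parse = require('url').parse
var getImages = require('get-images')

http.createServer(function (req, res) {
  var target = parse(req.url, true).query.url
  if (!target) {
    res.statusCode = 400
    return res.end('pass a ?url= query parameter\n')
  }
  getImages(target, function (err, images) {
    if (err) {
      res.statusCode = 500
      return res.end(err.message + '\n')
    }
    res.setHeader('content-type', 'application/json')
    res.end(JSON.stringify(images) + '\n')
  })
}).listen(80) // binding port 80 is why the readme says to use sudo
```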

## License

BSD