FUCKING MAGNETS HOW DO THEY WORK?

This project is designed to be a modular image grabber/crawler written in node.js.

Its main purpose is derived from this XKCD quote:

With the collapse of the dollar the government has endorsed an alternate currency. Your monetary worth is now determined by the number of funny pictures saved to your hard drive.

Quote from XKCD

The name was borrowed from the 'fucking magnets' meme.

USAGE

npm install
# when running together with mediengewitter, image_folder should be the same directory as its public/content!
image_folder=/data/images node magnets.js
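
The crawler picks up its target directory from the image_folder environment variable shown above. As a minimal sketch of consuming that variable in node.js (not the actual magnets.js logic, which may differ):

    // minimal sketch -- not the actual magnets.js logic
    var fs = require('fs');

    // read the download target from the environment, with a fallback
    var imageFolder = process.env.image_folder || __dirname + '/images';

    // create the folder on first run so downloads don't fail
    if (!fs.existsSync(imageFolder)) {
        fs.mkdirSync(imageFolder);
    }

    console.log('saving images to ' + imageFolder);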

FEATURES

  • download all the funny pictures on the internet directly to your hard disk
  • extensible plugin facility with multi-module support (see the plugin sketch below)
  • modules for high-quality funny pics
    • the icanhascheezburger network, e.g. failblog and lolcats
    • soup.io
    • kqe
    • bildschirmarbeiter
    • ... more
  • FUNNY PICTURES ALL THE WAY
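
To illustrate the plugin facility: each module under plugins/ is presumably loaded by the crawler core and asked for image URLs. The following is only a hypothetical skeleton; the real contract is sketched in PROPOSED_API and may differ:

    // hypothetical plugin skeleton -- the real interface is sketched
    // in PROPOSED_API and lib/ and may differ from this

    // human-readable name for log output
    exports.name = 'example_site';

    // called by the crawler core; hands image URLs back one at a
    // time via callback(err, url)
    exports.crawl = function (callback) {
        // a real plugin would fetch and parse a site here
        var urls = [
            'http://example.com/funny1.jpg',
            'http://example.com/funny2.jpg'
        ];
        urls.forEach(function (url) {
            callback(null, url);
        });
    };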

TODO:

  • Live Ticker (scheduler) - partly implemented; see the scheduler sketch after this list
  • Other plugins (recently added: cheezburger_network)
  • Add a debugging and inspection howto
  • Support command-line options like loglevel
  • Use node-htmlparser instead of our own regexes (used in cheezburger as the first plugin)
  • Metadata for every picture
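
The Live Ticker item refers to a recurring crawl loop. A minimal sketch of such a scheduler, using a hypothetical crawlAll() entry point that stands in for "run every plugin once":

    // minimal recurring-crawl scheduler; crawlAll() is a hypothetical
    // name standing in for "run every plugin once"
    var INTERVAL_MS = 15 * 60 * 1000; // re-crawl every 15 minutes

    function crawlAll() {
        console.log('crawl run started at ' + new Date().toISOString());
        // ... invoke each plugin here ...
    }

    crawlAll();                         // one run at startup
    setInterval(crawlAll, INTERVAL_MS); // then repeat on a fixed schedule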

Used node.js libraries:

in short: npm install trollop htmlparser soupselect http://github.com/pfleidi/node-wwwdude/tarball/master
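
As a taste of how htmlparser and soupselect work together (a generic usage sketch, not code from this repo): parse raw HTML into a DOM, then pull out img tags with a CSS selector instead of regexes:

    // generic htmlparser + soupselect usage -- not code from this repo
    var htmlparser = require('htmlparser');
    var select = require('soupselect').select;

    var rawHtml = '<div><img src="http://example.com/cat.jpg"></div>';

    var handler = new htmlparser.DefaultHandler(function (err, dom) {
        if (err) {
            console.error('parse error: ' + err);
            return;
        }
        // grab all image sources via a CSS selector
        select(dom, 'img').forEach(function (img) {
            console.log(img.attribs.src);
        });
    });

    new htmlparser.Parser(handler).parseComplete(rawHtml);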

Workaround to delete accidentally downloaded HTML files:

This is not necessary anymore, since wwwdude supports redirects. Kept for historic reasons:

$ for i in *; do file $i; done | grep HTML | awk '{print $1}' | sed -e 's/://' | xargs rm -v
