A port of Simon Willison's soupselect for use with node.js and node-htmlparser.

$ npm install soupselect

Minimal example...

var select = require('soupselect').select;
// dom provided by htmlparser...
select(dom, "#main a.article").forEach(function(element) {//...});
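The nodes that select traverses and returns are plain htmlparser DOM objects with fields like name, attribs and children (the complete example below reads title.attribs.href and title.children[0].raw). As an illustration of that data shape, here is a dependency-free sketch: a hand-built htmlparser-style tree and a hypothetical matcher for a simple tag.class pattern. This is not soupselect's actual implementation, just a sketch of the structures involved.

```javascript
// Hypothetical helpers over htmlparser-style nodes; for illustration only.
function matchesClass(node, tag, cls) {
    if (node.name !== tag || !node.attribs || !node.attribs['class']) return false;
    // class attributes may hold several space-separated class names
    return node.attribs['class'].split(/\s+/).indexOf(cls) !== -1;
}

function collect(nodes, tag, cls, out) {
    out = out || [];
    nodes.forEach(function (node) {
        if (matchesClass(node, tag, cls)) out.push(node);
        if (node.children) collect(node.children, tag, cls, out);
    });
    return out;
}

// A tiny htmlparser-style DOM for: <div id="main"><a class="article">Hi</a></div>
var dom = [{
    name: 'div',
    attribs: { id: 'main' },
    children: [{
        name: 'a',
        attribs: { 'class': 'article' },
        children: [{ raw: 'Hi' }]
    }]
}];

console.log(collect(dom, 'a', 'article').length); // 1
```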

Wanted a friendly way to scrape HTML using node.js. Tried jsdom, prompted by this article, but unfortunately jsdom takes a strict view of lax HTML, making it unusable for scraping the kind of tag soup found in real-world web pages. Luckily, htmlparser is more forgiving. More details on this can be found here.

A complete example, including fetching the HTML:

var select = require('soupselect').select,
    htmlparser = require("htmlparser"),
    http = require('http'),
    sys = require('sys');

// fetch some HTML...
var host = ''; // set to the host you want to scrape
var client = http.createClient(80, host);
var request = client.request('GET', '/', {'host': host});

request.on('response', function (response) {

    var body = "";
    response.on('data', function (chunk) {
        body = body + chunk;
    });

    response.on('end', function () {

        // now we have the whole body, parse it and select the nodes we want...
        var handler = new htmlparser.DefaultHandler(function (err, dom) {
            if (err) {
                sys.debug("Error: " + err);
            } else {

                // soupselect happening here...
                var titles = select(dom, 'a.title');

                sys.puts("Top stories from reddit");
                titles.forEach(function (title) {
                    sys.puts("- " + title.children[0].raw + " [" + title.attribs.href + "]\n");
                });
            }
        });

        var parser = new htmlparser.Parser(handler);
        parser.parseComplete(body);
    });
});

request.end();


  • Requires node-htmlparser > 1.6.2 & node.js 2+
  • Calls to select are synchronous; given the use case, it's not worth trying to make them asynchronous IMO