Initial commit

commit e5e2028a66a9f6b3ab7a893c5b698ca360c38e8c 0 parents
@akaspin authored
2  .gitignore
@@ -0,0 +1,2 @@
+/.project
+/.settings
10 Makefile
@@ -0,0 +1,10 @@
+NODE=`which node`
+THIS_DIR=$(shell pwd)
+
+default: test
+
+test:
+ @$(NODE) run.js test/pit
+ @$(NODE) run.js test/simple
+
+.PHONY: test
241 readme.md
@@ -0,0 +1,241 @@
+# pit
+
+Simple drop-in test runner for [node.js](http://nodejs.org/).
+
+Testing matters. But using a big testing framework can make a developer lose
+control. *node.js* is asynchronous, and many errors can disappear in the bowels
+of the next big test system.
+
+*pit* is a single small (about 10k) `js` file. No installation, no dependencies,
+no loss of control.
+
+## Basic usage
+
+As mentioned earlier, each test is a simple node.js application. *pit* just
+collects tests and runs them. If the app throws an exception, the test fails.
+That's all.
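+
+For instance, a minimal passing test is just an ordinary script that exits
+without throwing (this mirrors `test/simple/test-pass.js` from this
+repository):
+
+ var assert = require("assert");
+
+ assert.ok(true); // throwing here would mark the test as failed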
+
+Simply put `run.js` wherever you want and pit it against your tests. For
+example, suppose we have the following directory structure:
+
+ + app
+   - test-feature.js  <- Prints something to STDOUT
+   - test-report.js   <- This test must fail; also prints something to STDOUT
+   - test-simple.js   <- Simple. Must pass
+   - test-unstable.js <- This test must fail
+   - run.js           <- Yes, it's me!
+   - index.js
+
+... and, from inside the `app` directory, we run ...
+
+ $node test/run.js
+
+... and ...
+
+ PASS feature
+ * output:
+ Just bring something
+ FAIL report
+ * output:
+ Just some output
+ * errors:
+ node.js:50
+ throw e; // process.nextTick error, or 'error' ...
+ ^
+ AssertionError: true == false
+ ...
+ FAIL unstable
+ * errors:
+ node.js:50
+ throw e; // process.nextTick error, or 'error' ...
+ ^
+ AssertionError: true == false
+ ...
+
+ 2/3
+
+Fast, clean and effective. *pit* collects all `test-*.js` files from the current
+directory and runs each one as a separate process. If a test throws an
+exception, it fails. Passed tests that print nothing do not appear in the log by
+default, but are still counted in the total result.
+
+## Advanced usage
+
+You can control many things with parameters. *pit* takes two types of
+parameters: options and directories.
+
+ node run.js [--option ...] [directory ...]
+
+All parameters are separated by spaces. Options are preceded by a double dash
+(`--`). Directories... just write them. The order of options and directories is
+not important.
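+
+For example, an invocation (the directory names here are just illustrative)
+might mix options and directories like this:
+
+ node run.js --conc=2 --times tests tests/advanced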
+
+### Directories, prefixes, extensions and host interpreter
+
+The current dir is fine. But *pit* can also collect tests from the directories
+that you specify. Note that *pit* does not look into directories recursively.
+
+ node run.js ../../other-dir tests tests/basic tests/advanced
+
+By default *pit* searches for files named `test-*.js` and runs them with
+`node`. This behavior can be changed with the `prefix`, `ext` and `host`
+options.
+
+ node run.js --prefix=bench- --ext=.pl --host=perl
+
+This may seem redundant, but it can prove useful.
+
+### Concurrency
+
+As stated above, *pit* runs each test in a separate process, one after another.
+You can increase the number of concurrent tests with the `conc` option.
+
+ node run.js --conc=4
+
+### Customising output
+
+By default, *pit* follows these rules:
+
+* If a test *passed* and printed nothing, it will not appear in the log but
+  will be counted in the total result.
+* If a test *passed* but printed something to `STDOUT`,
+  it will appear in the log along with its output.
+* If a test *failed*, it will appear in the log with
+  its `STDERR` and `STDOUT` (if any).
+* Regardless of any options, all tests are counted in the total result.
+
+You can change this with the following secret weapons: `noout`, `noerr`,
+`passed` and `nofailed`.
+
+`noout` and `noerr` disable the tests' `STDOUT` and `STDERR` respectively. With
+the `noout` option *pit* logs only failed tests. With the `noerr` option failed
+tests still appear in the log, but their `STDERR` is not printed.
+
+ $node test/run.js --noout --noerr
+
+ FAIL report
+ FAIL unstable
+
+ 2/3
+
+To disable the log completely and get only the total result, add the `nofailed`
+option to `noerr` and `noout`.
+
+ $node test/run.js --noout --noerr --nofailed
+ 2/3
+
+The `passed` option forces *pit* to log passed tests regardless of their
+`STDOUT`.
+
+ $node test/run.js --noout --noerr --passed
+
+ PASS feature
+ FAIL report
+ PASS simple
+ FAIL unstable
+
+ 2/3
+
+### Timing
+
+*Pit* also has a very basic benchmarking ability, enabled with the `times`
+option.
+
+ $node test/run.js --noout --noerr --passed --times
+
+ PASS feature (0.069s)
+ FAIL report (0.088s)
+ PASS simple (0.072s)
+ FAIL unstable (0.087s)
+
+ 2/3 (0.178s)
+
+*Pit* measures the duration of each test in seconds, from start to end, as well
+as the total duration. Of course, the total duration depends on the `conc`
+option.
+
+## Helpers
+
+*Pit* was written to run tests. But it also provides three convenient tools for
+creating them: `expect`, `mark` and `hook`.
+
+### expect and mark
+
+The `expect` function lets you declare what is expected from the test. `mark`
+helps meet those expectations.
+
+For example, running this test...
+
+ var pit = require('./run.js');
+
+ pit.expect(5); // Set five expected "common" marks
+ pit.expect("be-exp", 1); // Set one expected "be-exp" mark
+ pit.mark("not-exp"); // Put "not-exp" mark
+ pit.mark("be-exp"); // Put "be-exp" mark
+
+... will give the following result
+
+ FAIL marks
+ * errors
+ assert.js:80
+ throw new assert.AssertionError({
+ ^
+ AssertionError: Marks do not meet expectations:
+ - common: 0/5 <- Expected but not satisfied
+ + be-exp: 1/1 <- Satisfied
+ - not-exp: 1/undefined <- Unexpected mark
+ ...
+
+`expect` sets the expected number of marks. It takes two arguments: the mark
+label and the expected number. Invoking `expect` with a single argument sets
+the expected number for the "common" mark.
+
+Marks are incremented with the `mark` function. Called with a single parameter,
+the mark's label, it increments that mark's value. Without any parameters,
+`mark` increments the "common" mark.
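+
+Because *node.js* code is asynchronous, a common use is verifying that callbacks
+actually run before the process exits. A minimal sketch (the timeout and
+callback are made up for illustration):
+
+ var pit = require('./run.js');
+
+ pit.expect(1); // one "common" mark is expected
+
+ setTimeout(function() {
+     pit.mark(); // satisfies the expectation; if this never runs, the test fails
+ }, 10);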
+
+### hook
+
+When invoked at least once, `expect` and `mark` attach a handler to the process
+`exit` event. If you want to add your own handlers, use `hook`. A small example:
+
+ var pit = require('./run.js');
+ var simpleMessage = "Hook";
+
+ pit.hook(function(marks, expects) {
+ console.log("%s three. Will not displayed", simpleMessage);
+ });
+ pit.hook(function(marks, expects) {
+ console.log("%s two", simpleMessage);
+ throw "ARRR!";
+ });
+ pit.hook(function(marks, expects) {
+ console.log("%s one", simpleMessage);
+ });
+
+... and output:
+
+ FAIL chain
+ * output
+ Hook one <- hook one
+ Hook two <- hook two
+ * errors
+ /home/dev/pit/test-chain.js:9 <- Ohh!
+ throw "ARRR!";
+ ^
+ ARRR!
+
+*Hooks* are executed in stack (FILO) order. The default handler runs last.
+An exception interrupts execution of the chain.
+
+`hook` takes one parameter: the hook function. It in turn takes two parameters,
+the collected marks and expects, both of which have the following structure:
+
+ {
+     <mark label>: <{int} number of marks>,
+     ...
+ }
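+
+As a small illustrative sketch (the "db-call" label is hypothetical), a custom
+hook could inspect those structures itself before the default assertion runs:
+
+ var pit = require('./run.js');
+
+ pit.expect("db-call", 2);
+ pit.mark("db-call");
+
+ pit.hook(function(marks, expects) {
+     // Print each expected label next to the number of marks actually set
+     Object.keys(expects).forEach(function(label) {
+         console.log("%s: %d/%d", label, marks[label] || 0, expects[label]);
+     });
+ });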
+
+## Need help?
+
+Call *pit* with the `help` option:
+
+ node run.js --help
+
+You can also look at the examples in the `test` directory and run them all with
+`make`.
369 run.js
@@ -0,0 +1,369 @@
+/**
+ * Simple and lightweight drop-in test runner.
+ */
+
+var assert = require('assert');
+
+var exec = require('child_process').exec;
+var fs = require('fs');
+var path = require('path');
+var util = require('util');
+var events = require('events');
+
+// marks set so far
+var marks = {
+ "common": 0
+};
+// expected marks
+var expects = {
+ "common": 0
+};
+var pitted = false;
+var pitchain = [
+ function(marks, expects) {
+ var message = "Marks do not meet expectations:";
+ var bundle = {};
+ // Filling with initial values
+ for (var ek in expects) {
+ bundle[ek] = {marks: undefined, expect: expects[ek]};
+ }
+ // extra keys from marks
+ for (var mk in marks) {
+ if (bundle[mk]) {
+ bundle[mk].marks = marks[mk];
+ } else {
+ bundle[mk] = {marks: marks[mk], expect: undefined};
+ }
+ }
+ for (var bk in bundle) {
+ var satisfied = bundle[bk].marks === bundle[bk].expect ?
+ "+" : "-";
+ message += "\n" + satisfied + " " + bk + ": " +
+ bundle[bk].marks + "/" + bundle[bk].expect;
+ }
+ assert.deepEqual(marks, expects, message);
+ }
+];
+
+/**
+ * Checks the `pitted` flag and, if not set, hooks the pit chain to the process `exit` event.
+ */
+function checkPitted() {
+ if (!pitted) {
+ process.on('exit', function() {
+ pitchain.forEach(function(fn) {
+ fn(marks, expects);
+ });
+ });
+ pitted = true;
+ }
+}
+
+/**
+ * Increments the mark for a label ("common" if omitted).
+ * @param label Mark label
+ */
+function mark(label) {
+ checkPitted(); // check pit
+ label = label || "common";
+ if (!marks[label]) {
+ marks[label] = 0;
+ }
+ marks[label]++;
+}
+exports.mark = mark;
+
+/**
+ * Sets one expectation. Receives one or two arguments.
+ *
+ * expect(5); // for common label
+ * expect("MyLabel", 5); // for MyLabel
+ *
+ * @param label Label for the expectation (or the expected count when num is omitted)
+ * @param num Expected number of marks for label
+ */
+function expect(label, num) {
+ checkPitted(); // check pit
+ // Parse parameters
+ if (!num) {
+ // num is absent. Just set common
+ // FIXME: convert label to int
+ expects.common = label;
+ } else {
+ expects[label] = num;
+ }
+}
+exports.expect = expect;
+
+/**
+ * Hooks a function onto the top of the pit chain.
+ * @param fn Function that takes two arguments: marks and expects
+ */
+function hook(fn) {
+ checkPitted(); // check pit
+ pitchain.unshift(fn);
+}
+exports.hook = hook;
+
+/**
+ * Collects test files from the target directories (non-recursive).
+ * @param root Root folder
+ * @param targets Target directories relative to root
+ * @param prefix Test file name prefix
+ * @param ext Test file extension
+ */
+function collect(root, targets, prefix, ext) {
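+ // Build a filename pattern such as /^test-.+\.js$/ from the prefix and extension.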
+ var pattern = new RegExp("^" + prefix + ".+" +
+ ext.replace(/\./, "\\\.") + "$");
+ var targets = targets.reduce(function(acc, t) {
+ return acc.concat(path.normalize(root + "/" + t));
+ }, []);
+ return targets.reduce(function(acc, dir) {
+ try {
+ return acc.concat(
+ fs.readdirSync(dir).reduce(function(acc, file) {
+ var fullName = path.normalize(dir + "/" + file);
+ var ok = pattern.exec(file) &&
+ fs.statSync(fullName).isFile();
+ return ok ? acc.concat(fullName) : acc;
+ }, [])
+ );
+ } catch (e) {
+ console.warn("WARN: %s\n%s", e.message, e.stack);
+ return acc;
+ }
+ } ,[]);
+}
+
+/**
+ * Test runner event emitter.
+ * Runs tests. Emits the following events:
+ *
+ * * `data` - test done
+ * * `end` - all tests have run
+ *
+ * @param targets Array of file names
+ * @param conc Number of concurrent processes
+ * @param host Host interpreter command (defaults to "node")
+ */
+function Runner(targets, conc, host) {
+ events.EventEmitter.call(this);
+ this.conc = conc || 1;
+ this.host = host || "node";
+ this.pending = targets;
+ this.running = 0;
+ this.start = Date.now();
+}
+util.inherits(Runner, events.EventEmitter);
+
+Runner.prototype.run = function() {
+ if (this.pending.length) {
+ var forLaunch = this.conc > this.pending.length ?
+ this.pending.length : this.conc;
+
+ for (var i = 0; i < forLaunch; i++) {
+ this.next();
+ }
+ } else {
+ this.end();
+ }
+};
+
+/**
+ * Run next test
+ */
+ Runner.prototype.next = function() {
+ this.running++;
+ var file = this.pending.shift();
+ var test = {
+ file: file,
+ time: Date.now()
+ };
+ var self = this;
+ exec(this.host + " " + file, function(error, stdout, stderr) {
+ test.time = (Date.now() - test.time) / 1000 ;
+ test.fail = error;
+ test.out = stdout ? stdout : undefined;
+ test.err = stderr ? stderr : undefined;
+ self.done(test);
+ });
+ };
+ Runner.prototype.done = function(test) {
+ this.running--;
+ this.emit("data", test);
+ if (this.pending.length) {
+ this.next();
+ } else if (this.running < 1) {
+ this.end();
+ }
+ };
+ Runner.prototype.end = function() {
+ this.emit("end", (Date.now() - this.start) / 1000);
+ };
+
+ /**
+ * Reporter. Catches Runner events and prints the report
+ * in accordance with the given options.
+ *
+ * @param runner Runner
+ * @param opts Options
+ * @returns {Reporter}
+ */
+ function Reporter(runner, opts) {
+ this.runner = runner;
+ this.opts = opts;
+
+ this.total = 0;
+ this.passed = 0;
+
+ this.anyWritten = false;
+
+ var self = this;
+ this.runner.on("data", function(test) {
+ self.test(test);
+ });
+ this.runner.on("end", function(time) {
+ self.done(time);
+ });
+ this.runner.run();
+ }
+ Reporter.prototype.test = function(test) {
+ // Just format output
+ function formatOut(pre, out) {
+ return "* " + pre + "\n " + out
+ .replace(/\n$/, "")
+ .replace(/^\n/, "")
+ .replace(/\n\n/, "\n")
+ .replace(/\n/g, "\n ");
+
+ }
+
+ this.total++;
+ if (test.fail != null) {
+ // Oops. Test failed
+ var mark = "FAIL";
+ var need = (!this.opts.get("noout") && test.out)
+ || !this.opts.get("noerr")
+ || !this.opts.get("nofailed");
+ var outer = console.error;
+ } else {
+ this.passed++;
+ var mark = "PASS";
+ var need = (!this.opts.get("noout") && test.out)
+ || this.opts.get("passed");
+ var outer = console.log;
+ }
+
+ if (need) {
+ var name = path.basename(test.file, this.opts.get("ext"))
+ .substring(this.opts.get("prefix").length);
+ var time = this.opts.get("times") ? " (" + test.time +"s)" : "";
+
+ if (!this.anyWritten) {
+ console.log();
+ this.anyWritten = true;
+ }
+
+ outer("%s %s%s", mark, name, time);
+ if (test.out && !this.opts.get("noout")) {
+ outer(formatOut("output", test.out));
+ }
+ if (test.err && !this.opts.get("noerr")) {
+ outer(formatOut("errors", test.err));
+ }
+ }
+ };
+ Reporter.prototype.done = function(time) {
+ var time = this.opts.get("times") ? " (" + time +"s)" : "";
+ var anyWritten = this.anyWritten ? "\n" : "";
+ var outer = this.passed == this.total ?
+ console.log : console.error;
+
+ outer("%s%d/%d%s%s", anyWritten,
+ this.passed, this.total, time, anyWritten);
+ };
+
+ /**
+ * Command-line options parser. Quick and dirty.
+ * @param synopsis Usage synopsis shown at the top of the help message
+ * @returns {Opts}
+ */
+ function Opts(synopsis) {
+ this._synopsis = synopsis;
+ this._opts = {};
+ this._params = [""];
+ }
+ Opts.prototype.set = function(name, value, message) {
+ this._opts[name] = {
+ value: value,
+ message: message};
+ return this;
+ };
+ Opts.prototype.parse = function(args) {
+ var self = this;
+ args.forEach(function(arg) {
+ if (arg.substring(0,2) === "--") {
+ // it's option
+ var tokens = arg.split("=");
+ var name = tokens[0].substring(2);
+ var value = tokens.length == 1 ? true : tokens[1];
+
+ if (self._opts[name]) {
+ self._opts[name].value = value;
+ }
+ } else {
+ // it's param
+ self._params.push(arg);
+ }
+ });
+ return this;
+ };
+ Opts.prototype.get = function(name) {
+ return this._opts[name].value;
+ };
+ Opts.prototype.__defineGetter__("params", function() {
+ return this._params;
+ });
+ Opts.prototype.__defineGetter__("usage", function() {
+ function trail(str, max) {
+ for (var i = 0, t = " "; i < max - str.length; i++) {
+ t += " ";
+ }
+ return str + t;
+ }
+ var usage = this._synopsis + "\n\n";
+ var maxlength = Object.keys(this._opts).reduce(function(max, name) {
+ return name.length > max ? name.length : max;
+ }, 0);
+ for (var k in this._opts) {
+ usage += " --" + trail(k, maxlength) + this._opts[k].message + "\n";
+ }
+ return usage;
+
+ });
+
+ if (!module.parent) {
+ // if we are in running module
+ var opts = new Opts(
+ "Pit, Simple drop-in test runner.\n" +
+ "http://github.com/akaspin/pit\n\n" +
+ "usage: node run.js [options] [targets]")
+ .set("prefix", "test-", "Prefix for collect tests.")
+ .set("ext", ".js", "Extension for collect tests.")
+ .set("times", false, "Show tests durations.")
+ .set("passed", false, "Show passed tests.")
+ .set("nofailed", false, "Disable failed tests.")
+ .set("noout", false, "Disable tests STDOUT.")
+ .set("noerr", false, "Disable tests STDERR.")
+ .set("conc", "1", "Concurrent tests.")
+ .set("host", "node", "Host interpreter.")
+ .set("help", false, "Help. This message.")
+ .parse(process.argv.slice(2));
+
+ if (opts.get("help")) {
+ console.log(opts.usage);
+ } else {
+ // run tests
+ var tests = collect(process.cwd(), opts.params,
+ opts.get("prefix"), opts.get("ext"));
+ var runner = new Runner(tests, parseInt(opts.get("conc"), 10),
+ opts.get("host"));
+ new Reporter(runner, opts);
+ }
+}
14 test/pit/test-chain.js
@@ -0,0 +1,14 @@
+var pit = require('../../run.js');
+
+var simpleMessage = "Hook";
+
+pit.hook(function(marks, expects) {
+ console.log("%s three. Will not displayed", simpleMessage);
+});
+pit.hook(function(marks, expects) {
+ console.log("%s two", simpleMessage);
+ throw "ARRR!";
+});
+pit.hook(function(marks, expects) {
+ console.log("%s one", simpleMessage);
+});
6 test/pit/test-marks.js
@@ -0,0 +1,6 @@
+var pit = require('../../run.js');
+
+pit.expect(5);
+pit.expect("be-expected", 1);
+pit.mark("not-expected");
+pit.mark("be-expected");
7 test/simple/test-fail-out.js
@@ -0,0 +1,7 @@
+/**
+ * Simple test that must fail. Also prints something to STDOUT.
+ */
+
+var assert = require("assert");
+console.log("Just simple output");
+assert.ok(false);
6 test/simple/test-fail.js
@@ -0,0 +1,6 @@
+/**
+ * Simple test that must fail.
+ */
+
+var assert = require("assert");
+assert.ok(false);
7 test/simple/test-pass-out.js
@@ -0,0 +1,7 @@
+/**
+ * Simple test that must pass. Also prints something to STDOUT.
+ */
+
+var assert = require("assert");
+console.log("Just simple output");
+assert.ok(true);
6 test/simple/test-pass.js
@@ -0,0 +1,6 @@
+/**
+ * Simple test that must pass.
+ */
+
+var assert = require("assert");
+assert.ok(true);