# coffeedoctest

Copyright (c) Lawrence S. Maccherone, Jr., 2012

It's less about testing your code with your documentation and more the other way around: making sure the examples in your documentation stay current with your code.


## Credits

  * coffeedoc by Omar Khan, the starting point for coffeedoctest
  * showdown.js, for extracting code blocks from markdown

If you've spent any time working in Python, then you are probably familiar with doctest. The examples you add to document your project are like a map to the secret treasure that your users will find when they are able to easily use your library/API/tool/etc. But if the examples are wrong, it's like labeling the map with "promised land" right over the spot where it should say, "there be dragons".
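If you haven't worked with it before, here is a minimal, generic sketch of Python's doctest (standard library, not part of coffeedoctest): examples embedded in a docstring are executed and checked against the output they claim.

```python
def square(n):
    """Return n squared.

    The interactive examples below are run by doctest; if the
    documented output ever drifts from real behavior, the test fails.

    >>> square(5)
    25
    >>> square(4)
    16
    """
    return n * n

if __name__ == "__main__":
    import doctest
    doctest.testmod()  # reports any example whose output doesn't match
```

coffeedoctest applies the same idea to the markdown inside CoffeeScript comments.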

coffeedoctest is a way to test your documentation with your code... to make sure the map matches the terrain.

I'm building upon Omar Khan's awesome coffeedoc tool and using the same conventions. The text within multiline comments is interpreted as markdown. Any code blocks (lines indented by 4 or more spaces) within this markdown are pulled out as "test" code. Any single-line comments within these code blocks are treated as your expected output. When the example code runs, it should generate the results shown in those single-line comments.


Let's say you have this awesome little library, `square.coffee`:

    ###
    Super square

        square = require('square').square
        console.log(square(5))
        # 36

    Not only will it square 5 but it will square other numbers.

        console.log(square(4))
        # 16
    ###
    exports.square = (n) -> n * n

and you run coffeedoctest

    coffeedoctest square

you should see the following output:


    Actual does not match expected when running coffeedoctest_temp/
    Expected: 36
    Actual  : 25
        square = require('square').square
        console.log(square(5))
        # 36

Notice how you are able to sprinkle non-test narrative into your markdown and it is ignored. Markdown code blocks in all of the properly-positioned multiline comments found in a module (file) are concatenated into one test, as if there were no intervening narrative or production code. Following the coffeedoc convention, the proper place for these comments is either at the top of the module or between the declaration and body of a class, function, etc. Each module (file) is tested independently.

Note that coffeedoctest will not attempt to test code blocks that are found within ordered or unordered lists. If you want to include examples that you don't want tested, you can take advantage of this behavior.
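For instance, in a doc comment like this hypothetical one, the indented example sits inside a list item, so coffeedoctest skips it:

```coffee
###
Features:

  * Squares numbers. Because this example is indented under a list
    item, coffeedoctest will not execute it or check its output:

        square(12345)
###
```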


## Usage

If you type `coffeedoctest` with no options or `coffeedoctest -h`, you'll get the help.

Usage: coffeedoctest [options] [targets]
   or: coffeedoctest . (scans all .coffee files from the current directory and down)

    --commonjs    : Use if target scripts use CommonJS for module loading (default)
    --requirejs   : Use if target scripts use RequireJS for module loading
    --readme      : Use if you want it to run the tests in your README file
    --clean       : Deletes temporary files even if there is an error
    --requirepath : Specifies "require" search root (default "./")

coffeedoctest will create a modified version of your package.json file in the coffeedoctest temporary working directory. This makes it possible for your example code to do simple requires so you don't clutter your example code with relative paths that may not apply to your users' usage.

A typical usage might look like this:

    coffeedoctest --readme src


## Install

    npm install coffeedoctest --save-dev


## Changelog

  * 0.5.0 - 2013-03-06 - Upgraded to CoffeeScript 1.6.x
  * 0.4.3 - 2012-12-07 - Errors output to console.error so exec-sync fails
  * 0.4.2 - 2012-12-07 - Updated dependencies
  * 0.4.1 - 2012-12-07 - No longer prefers global install
  * 0.4.0 - 2012-10-10 - Now creates a package.json so the tests don't need to worry about the location of require() calls

