Any-separated values.


For all your separated value needs.


npm install sv

The optimist dependency is only required for command line use.

API usage

All tabular input must have column names on the first row, and all output will. For example, given this tab-separated sprints.csv:



index   name    time
1   chris   1:18
2   daniel  1:17
3   lewis   1:30
4   stephen 1:16
5   larry   1:32

And in Node:

var sv = require('sv');
var parser = new sv.Parser();
parser.on('data', function(obj) {
  console.log('sprinter ->', obj);
});

var fs = require('fs');
var sprints = fs.createReadStream('sprints.csv', {encoding: 'utf8'});
sprints.pipe(parser);


var expenses = [
  {name: 'Tip'                },
  {name: 'Lunch', amount: 5.90},
  {name: 'Latte', amount: 3.15},
  {name: 'Paper', amount: 2.10},
  {name: 'Pens' , amount: 4.59},
  {               amount: 9.16}
];

var sv = require('sv');
var stringifier = new sv.Stringifier({peek: 2, missing: 'n/a'});
stringifier.pipe(process.stdout);
expenses.forEach(function(expense) {
  stringifier.write(expense);
});
// if you set 'peek' to more rows than you have in your data,
// you'll need to call stringifier.end() so that they get flushed.
stringifier.end();
  • N.b.: If you pipe a buffer (e.g., the output of a stringifier) into a parser, the parser will not receive any encoding; you must set the encoding on the parser in those cases.

Stringifier features:

  1. Infer column names from a list of objects.
  2. Convert from objects to csv / tsv plaintext.
    • Also allows writing arrays / strings directly.
  3. Write header automatically.
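The column-inference idea behind `peek` can be sketched in a few lines — an illustration only, not sv's actual implementation:

```javascript
// Sketch: collect the union of keys seen in the first `peek` objects.
// Illustrative only; sv's internal logic may differ.
function inferColumns(rows, peek) {
  var columns = [];
  rows.slice(0, peek).forEach(function(row) {
    Object.keys(row).forEach(function(key) {
      if (columns.indexOf(key) === -1) columns.push(key);
    });
  });
  return columns;
}

var expenses = [
  {name: 'Tip'},
  {name: 'Lunch', amount: 5.90}
];
console.log(inferColumns(expenses, 2)); // [ 'name', 'amount' ]
```

With `peek: 1`, the `amount` column would be missed — which is why the earlier example peeks at two rows before writing the header.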

Parser features:

  1. Infer delimiter from input.
  2. Infer column names from first line of input.
  3. Handle universal newlines (\r, \r\n, or \n).
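Both inference steps can be illustrated in plain JavaScript; the delimiter heuristic here (most frequent candidate in the header line) is an assumption for illustration, not necessarily sv's rule:

```javascript
// 1. Universal newlines: split on \r\n, \r, or \n alike.
function splitLines(text) {
  return text.split(/\r\n|\r|\n/);
}

// 2. Delimiter inference: pick the candidate that occurs most often in
//    the header line (a common heuristic; sv's own rule may differ).
function inferDelimiter(header) {
  var candidates = [',', '\t', ';', '|'];
  var best = candidates[0], bestCount = -1;
  candidates.forEach(function(d) {
    var count = header.split(d).length - 1;
    if (count > bestCount) { best = d; bestCount = count; }
  });
  return best;
}

console.log(splitLines('a\r\nb\rc\nd'));                       // [ 'a', 'b', 'c', 'd' ]
console.log(JSON.stringify(inferDelimiter('index\tname\ttime'))); // "\t"
```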

CLI usage

shopt -s globstar
for csv in ~/corpora/testsheets/**/*.csv; do
  file "$csv"
  echo "Tunneling through multiple 'sv' calls should be transparent."
  cat "$csv" | sv -j | wc -l
  cat "$csv" | sv | sv -j | wc -l
done


TODO:

  • Decide how to encode a field like {id: 1, name: '"chris'} when the delimiter is , and the quotechar is ".
    • This case is awkward: the field doesn't need quoting, but left unquoted, the leading quotechar triggers the in-quote parsing state with no closing quote to end it.
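For comparison, an RFC 4180-style encoder avoids the ambiguity by quoting any field that contains the quotechar and doubling embedded quotes — a sketch, not sv's current behavior:

```javascript
// Quote a field whenever it contains the delimiter, the quotechar, or a
// newline; double any embedded quotechars. RFC 4180-style; sv may differ.
function encodeField(value, delimiter, quotechar) {
  var needsQuoting = value.indexOf(delimiter) !== -1 ||
                     value.indexOf(quotechar) !== -1 ||
                     /[\r\n]/.test(value);
  if (!needsQuoting) return value;
  return quotechar +
         value.split(quotechar).join(quotechar + quotechar) +
         quotechar;
}

console.log(encodeField('"chris', ',', '"')); // prints """chris" (parses back to "chris)
```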

Also see the notes for more development comments.


Copyright © 2013 Christopher Brown. MIT Licensed.
