# simple-doc

creating a simple readme from inline documentation

- test/function-proto.js
- test/functions.js
- test/proto.js

- Parser - A class for parsing sococo log files and emitting structured data

## Parser

Location: test/function-proto.js

A class for parsing sococo log files and emitting structured data.

This conforms to the node `Stream` base class, and can be chained together with other stream filters using the `.pipe()` method. The class expects data to arrive as whole lines rather than as raw chunks read directly from the file. You can use `readFile` to pass in a file directly, and it will buffer the contents into lines for you.

```js
var p = new Parser();
p.on('data', function (data) {
    console.log(data);
});
p.on('end', function () {
    console.log('parsing complete');
});
p.readFile('test-log.txt');
```

### Parser#readFile(file)

Helper method for reading a file into the log parser directly.

Assumes the first line of the file is a header, so it is skipped. Input is buffered until a newline, then each complete line is fed into the parser. When it's finished, it emits an `end` event. Because parsing is synchronous, you can depend on parsing being complete once the method returns.

### Parser#write(line)

Write log file lines to be parsed.

Conforms to the stream API, mostly so you can pipe together data-transform streams. This stream expects lines in string format.
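Since the parser expects whole lines rather than raw chunks, upstream data has to be re-buffered on newlines before calling `write`. A minimal sketch of that re-buffering (a hypothetical helper, not part of simple-doc):

```js
// Hypothetical sketch: accumulate raw chunks and emit only complete lines,
// holding back any trailing partial line until more data arrives.
function lineBuffer(write) {
  var pending = '';
  return function (chunk) {
    pending += chunk;
    var lines = pending.split('\n');
    pending = lines.pop();            // keep the trailing partial line
    lines.forEach(write);
  };
}

var seen = [];
var feed = lineBuffer(function (line) { seen.push(line); });
feed('first li');
feed('ne\nsecond line\npart');
// seen => ['first line', 'second line']; 'part' is still buffered
```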

### Parser#parseTransportData(line)

Parse sococo transport data out of a log line.

Log messages carry a header that conforms roughly to a `{{name:value;name:value}}` format. This method parses that data off the raw log message and returns a JavaScript object.
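As a rough illustration of that header format, parsing it might look like the following. This is a hypothetical sketch, not the library's actual parser; `parseHeader` and the sample field names are invented for the example:

```js
// Hypothetical sketch: pull the {{name:value;name:value}} header off a raw
// log line and return its fields as a plain object.
function parseHeader(line) {
  var match = line.match(/\{\{([^}]*)\}\}/);
  if (!match) return null;
  var data = {};
  match[1].split(';').forEach(function (pair) {
    var idx = pair.indexOf(':');
    if (idx !== -1) data[pair.slice(0, idx)] = pair.slice(idx + 1);
  });
  return data;
}

parseHeader('{{ts:1024;level:info}} user connected');
// => { ts: '1024', level: 'info' }
```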

## Parser

Location: test/proto.js

A class for parsing sococo log files and emitting structured data.

This conforms to the node `Stream` base class, and can be chained together with other stream filters using the `.pipe()` method. The class expects data to arrive as whole lines rather than as raw chunks read directly from the file. You can use `readFile` to pass in a file directly, and it will buffer the contents into lines for you.

```js
var p = new Parser();
p.on('data', function (data) {
    console.log(data);
});
p.on('end', function () {
    console.log('parsing complete');
});
p.readFile('test-log.txt');
```

### Parser#readFile(file)

Helper method for reading a file into the log parser directly.

Assumes the first line of the file is a header, so it is skipped. Input is buffered until a newline, then each complete line is fed into the parser. When it's finished, it emits an `end` event. Because parsing is synchronous, you can depend on parsing being complete once the method returns.

### Parser#write(line)

Write log file lines to be parsed.

Conforms to the stream API, mostly so you can pipe together data-transform streams. This stream expects lines in string format.

### Parser#parseTransportData(line)

Parse sococo transport data out of a log line.

Log messages carry a header that conforms roughly to a `{{name:value;name:value}}` format. This method parses that data off the raw log message and returns a JavaScript object.

## ATestFunction2(stuff, things)

Location: test/function-proto.js

A test function summary for a second function

With a full summary

```js
ATestFunction2(stuff, things);
```

## ATestFunction(stuff, things)

Location: test/functions.js

A test function summary

With a full summary

```js
ATestFunction(stuff, things);
```

## ATestFunction2(stuff, things)

Location: test/functions.js

A test function summary for a second function

With a full summary

```js
ATestFunction2(stuff, things);
```