Should eventually be a plugin for Small Federated Wiki

Tentatively called zbrew. Note: having no users yet, I play fast and loose. No git branches when I refactor. I will probably tag some "interesting" versions. This will soon change for the better. This file attempts to describe things from a bird's-eye view. The evolution of the TBD file shows the micro objectives.

Goal: literate everything

zbrew will make it possible to create material about programming code. It relies on a viewport called lite to highlight code. We can now have many lite viewports on one page.

The main focus now is to highlight content in a lite from textual material.

A lite does classic static highlighting and dynamic highlighting. With dynamic highlighting one can understand how code is parsed. Static highlighting is usually, and wrongly, dubbed syntax highlighting; in fact it is just lexical highlighting.

A most important point is that zbrew is web based. People must be able to understand the power of zbrew without installing it. Projects like Padre, which depend on non-web libraries, are non-starters. To get the most out of zbrew one needs to interact with a server, even if a disconnected mode is a goal (TBD).

Achievements so far

Static and dynamic hiliting do work. Panning in the parse path pane does not. Integration in SFW is not done. Touch interface devices should be supported. Run make, serve the directory with some web server, and access i.html.

Codeview needs both Rakudo and Node.js.

Rakudo is used to parse files to generate JSON parse tree files. You can run npm install -g . in the codeview folder to install CoffeeScript. CoffeeScript is used to convert .coffee files into .js ones.

Note: to install SFW, I created a gist

User interface

We call the code viewport lite. Lite stands for literate editor. Lite has three modes: simple, rule, and full mode. So we say a "simple lite" or a "full lite" to denote such a viewport depending on its mode. In a simple lite, only the code pane is visible. The code pane may be split in two à la docco, with comments on the left and code proper on the right (this worked with static data; it needs to be checked with true code). One should be able to show the code pane with or without docco. In the second case, one should be able to hide the comments.

So far, only the full lite mode is implemented.

In a full lite, from top to bottom, the viewport consists of three panes: the parse path pane (P3), the rule pane, and the code pane. At any given time, some text chunk in the code pane is current. When the code was parsed, according to the grammar for that code, some sequence of rules was used to reduce that current text chunk. The last rule of that sequence is the leaf rule. The rule pane includes a clickable label that displays the grammar language (say pegjs, jison, Perl 6). Clicking it opens the relevant doc. TBD.

When moving the mouse over the code pane, or clicking in it, the current text chunk changes, so the parse path and rule panes are updated accordingly.


The loading of a lite is done through an ajax call that brings a JSON object consisting of three subobjects. We call it a lite object. The parseTree subobject is the parse tree, the rules subobject is the grammar broken into rules, and the docco subobject is the partial mapping of rule names to docco CSS names for highlighting.
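A minimal sketch of such a lite object (the three subobject names come from the description above; the shapes of their contents here are illustrative assumptions):

```javascript
// Hypothetical lite object: the field names parseTree, rules and docco
// are the real ones; their contents are assumed for illustration.
var lite = {
  // Parse tree: nested nodes, each labeled with the rule that reduced it.
  parseTree: {
    rule: 'string', from: 0, to: 6,
    children: [{ rule: 'str', from: 1, to: 5, children: [] }]
  },
  // Grammar broken into rules, keyed by rule name (bodies elided here).
  rules: {
    string: 'token string { ... }',
    str: 'token str { ... }'
  },
  // Partial mapping of rule names to docco CSS names, used by SH.
  docco: { string: 's' }
};

// Once the ajax response is parsed, the client works with a plain object:
console.log(lite.parseTree.children[0].rule); // str
console.log(lite.docco.string);               // s
```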

Note that the rules subobject may eventually be a lite object as well, so that the rule pane will be a folded lite viewport with a partial view of a grammar, the view being the current rule. Clicking on that pane will bring the grammar as code in the enclosing viewport. Clicking again will bring the grammar of the grammar. A history mechanism should be implemented so that we can bring back the original code.

Note that, for highlighting purposes, the relevant rule may be a parent of the leaf rule. Say the code pane contains JSON code and the current text chunk has been reduced by the str rule. The static highlighting will likely be associated with the string rule, not str. Too much granularity for SH would be detrimental.

  token string { \" ~ \" ( <str> | \\ <str_escape> )* }
  token str { <-["\\\t\n]>+ }

We eventually want to support the ST2 API because it is simple and clean. Apparently ST2 does not support chord sequences à la Emacs. As a user I (may) regret it. As an implementer, it will make the binding between chords and actions easier. Plugins would be written in JS or run server-side.

We may revert to textareas for editing because the editing class attribute is supported only in WebKit and behaves wildly.


HTML5 allows storing a string in custom HTML element attributes whose names start with the "data" string. Like SFW, we use that feature. Note that we need to serialize a JS object into JSON before storing it. Serialization is done through JSON.stringify; deserialization through JSON.parse. Such an object cannot have cycles. If so we will need YAML. See
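A sketch of that round trip, runnable outside the browser (the makeElement stub and the data-parse attribute name are assumptions; in a page this would be a real element with its native setAttribute/getAttribute):

```javascript
// Stub standing in for an HTML element, so the round trip runs anywhere.
function makeElement() {
  var attrs = {};
  return {
    setAttribute: function (name, value) { attrs[name] = String(value); },
    getAttribute: function (name) { return attrs[name]; }
  };
}

var span = makeElement();
var node = { rule: 'string', from: 3, to: 9 }; // hypothetical parse node

// Serialize before storing: data-* attributes can only hold strings.
span.setAttribute('data-parse', JSON.stringify(node));

// Deserialize on the way out. JSON.stringify throws on cyclic objects,
// which is why cycles would force a move to something like YAML.
var back = JSON.parse(span.getAttribute('data-parse'));
console.log(back.rule); // string
```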


We will support two kinds of highlighting. The traditional one is called SH (Static Hiliting). With SH you can readily identify tokens by some CSS attribute, usually a color. DH (Dynamic Hiliting) is specific to brews and shows how a file is parsed. DH comes in addition to SH and changes with user interaction, hence its name.

Highlighting internals

HC is short for the highlighted code in a given viewport. HV denotes a viewport that shows HC.

A file is parsed and a JSON tree is generated that represents the corresponding parse tree. Depending on options, one can get either a full parse tree or just a sequence of tokens. The full parse tree is necessary for dynamic highlighting, but a full file displayed this way would result in gigantic web pages.

In the client src/, the JSON file is pulled using ajax. When the rule name belongs to the passed docco object, the HTML element representing the text parsed by that rule gets the associated class attribute.

So with the following docco object

docco = string: 's'

An element generated for a reduction caused by the string rule will look like

<span id="666" class="code rule-string docco-s">"toto"</span>
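The construction of such a class attribute can be sketched as follows (the classFor helper is hypothetical; only the rule-* / docco-* naming convention comes from the text):

```javascript
// Hypothetical helper: build the class attribute for an element that
// represents a reduction by `rule`, given a docco mapping as above.
function classFor(rule, docco) {
  var classes = ['code', 'rule-' + rule];   // rule-* classes drive DH
  if (Object.prototype.hasOwnProperty.call(docco, rule)) {
    classes.push('docco-' + docco[rule]);   // docco-* classes drive SH
  }
  return classes.join(' ');
}

var docco = { string: 's' };
console.log(classFor('string', docco)); // code rule-string docco-s
console.log(classFor('str', docco));    // code rule-str (no docco entry)
```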

The html class names of the form rule-* are used by DH while those of the form docco-* are used by SH.

DH will be very costly in terms of browser resources. So at a given time only part of the code will be displayed in DH; the rest will be in SH. But we want to be able to convert between the two instantly.

We will divide the HC into chunks. Each chunk will store the corresponding deep JSON parse in its data attribute. When mousing out of a DH chunk, it will be converted back to SH. When mousing into an SH chunk, it will be converted to DH.

This can be done in the client. The server will just say what sub parse trees should be chunks.
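A sketch of that chunk toggle (the function names and the chunk shape are assumptions; in the client the chunk would be a DOM element carrying its parse in a data attribute, as described above):

```javascript
// A chunk starts out in cheap static highlighting (SH), carrying its
// deep parse as a JSON string.
function makeChunk(text, parseJson) {
  return { text: text, parseJson: parseJson, mode: 'SH', tree: null };
}

// Mousing into an SH chunk: deserialize the stored parse so the deep
// (DH) markup can be built from it.
function toDH(chunk) {
  chunk.tree = JSON.parse(chunk.parseJson);
  chunk.mode = 'DH';
}

// Mousing out of a DH chunk: drop the deep markup, fall back to SH.
function toSH(chunk) {
  chunk.tree = null;
  chunk.mode = 'SH';
}

var chunk = makeChunk('"toto"',
  '{"rule":"string","children":[{"rule":"str","children":[]}]}');
toDH(chunk);                    // mouse in
console.log(chunk.mode);        // DH
console.log(chunk.tree.rule);   // string
toSH(chunk);                    // mouse out
console.log(chunk.mode);        // SH
```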

Multiple goals

The main goal is to integrate code-viewer (which is currently a mess) into SFW. SFW will probably be renamed zbrew to emphasize that the (upcoming) fork has a change of focus. Some material which is in Emacs org-mode format will be integrated as soon as possible.

I document here the ST2 packages I use

Many of them must be downloaded via "install package"

  • Fetch : conveniently download stuff to set up files to webserve
  • Git : to work with the eponymous version control system
  • SublimeServer : we need to serve the files with a web server (versus accessing them through the file system) so as to serve some of them using ajax

Stuff to present

How SublimeServer serves all the folders of the current project. Linters that run in the background.


Dual licensed under the MIT or GPL Version 2 licenses.