Music JSON proposal

A proposal for a format for representing music in JSON, with the aim of making emerging web apps using the new Web Audio and Web MIDI APIs interoperable.

This document is intended as a discussion starter. Please comment, propose ideas and make pull requests.

Example JSON

Here are the first two bars of Dolphin Dance represented in Music JSON:

    {
        "label": "Dolphin Dance",
        "events": [
            [0,   "meter", 4, 1],
            [0,   "rate", 1, "step"],
            [2,   "note", 76, 0.8, 0.5],
            [2.5, "note", 77, 0.6, 0.5],
            [3,   "note", 79, 1, 0.5],
            [3.5, "note", 74, 1, 3.5],
            [10,  "note", 76, 1, 0.5],
            [0,   "mode", "C", "∆", 4],
            [4,   "mode", "G", "-", 4]
        ],
        "interpretation": {
            "meter": [4, 4],
            "key": "C",
            "transpose": 0
        }
    }

sequence (object)

A sequence is an object with the properties id, label and events.

    {
        "id": "0",
        "label": "My Sequence",
        "events": [event1, event2, ...]
    }

The property id is a string, and in any array of sequences it must be unique. The property label is an arbitrary string. The property events is an array of event objects.
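The uniqueness rule for id is easy to check mechanically. A minimal sketch (the helper name is illustrative, not part of the proposal):

```javascript
// Throw if any sequence in the array lacks a string id or repeats one.
// Illustrative helper – not part of the Music JSON proposal itself.
function validateSequenceIds(sequences) {
    const seen = new Set();
    for (const sequence of sequences) {
        if (typeof sequence.id !== 'string') {
            throw new TypeError('Sequence id must be a string');
        }
        if (seen.has(sequence.id)) {
            throw new Error('Duplicate sequence id: ' + sequence.id);
        }
        seen.add(sequence.id);
    }
    return true;
}
```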

A sequence may also optionally have the properties sequences and interpretation.

    {
        "id": "0",
        "label": "My Sequence",
        "events": [event1, event2, ...],
        "sequences": [sequence1, sequence2, ...],
        "interpretation": {...}
    }

The property sequences is an array of sequence objects. Sequences may be nested to any arbitrary depth. The property interpretation is an object containing data that might help a music renderer display a score.


event (array)

An event is an array with a start beat and an event type as its first two members, followed by extra data whose shape depends on the type.

[beat, type, data ...]

beat – FLOAT, describes a point in time from the start of the sequence
type – STRING, the event type

Beat values are arbitrary – they describe time in beats, rather than in absolute time. In a performance context, the absolute time of a beat is dependent upon the rate and the start time of its parent sequence.
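As a sketch of that dependence, the conversion for a sequence playing at a constant rate might look like this (assuming rate is expressed in beats per second; the helper name is illustrative):

```javascript
// Absolute time of a beat, assuming the parent sequence starts at
// startTime (seconds) and plays at a constant rate (beats per second).
// Illustrative only – rate events can change the rate mid-sequence.
function beatToTime(beat, startTime, rate) {
    return startTime + beat / rate;
}
```

For example, a sequence started at 10 seconds and playing at 2 beats per second reaches beat 4 at 12 seconds.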

The type determines the structure of the rest of the data in the event array. The possible types and their data are as follows:


note

Renders a note.

[beat, "note", name, velocity, duration]

name – FLOAT [0-127], represents the pitch of a note as a MIDI number
velocity – FLOAT [0-1], represents the force of the note's attack
duration – FLOAT [0-n], represents the duration of the note in beats

The name parameter is a note pitch represented by a MIDI note number (where 0 represents note "C-1" and 127 represents note "G9"). However, unlike MIDI, it may be a float, allowing all pitches – tones and microtones – to be represented.
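For example, a renderer targeting equal temperament might convert a (possibly fractional) note number to a frequency like this (the A4 = 440Hz reference is an assumption, not part of the proposal):

```javascript
// MIDI note number (float allowed) to frequency in Hz, using 12-tone
// equal temperament with A4 = note 69 = 440Hz.
function numberToFrequency(number) {
    return 440 * Math.pow(2, (number - 69) / 12);
}

// numberToFrequency(69)   → 440 (A4)
// numberToFrequency(69.5) → a quarter-tone above A4
```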

TBD. It may be useful to allow `name` to be a string OR a number. This would allow for notes in any scale, western or not, to be represented by arbitrary names.


param

Adjusts an instrument parameter.

[beat, "param", name, value, curve]

name – STRING, the name of the param to control
value – FLOAT, the destination value of the param
curve – STRING ["step"|"linear"|"exponential"|"target"], represents the type of ramp to use to transition to value
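A sketch of how the four curve names could map onto Web Audio's AudioParam automation methods. This mapping is an assumption – the proposal does not tie param events to Web Audio, and the timeConstant default for "target" is made up:

```javascript
// Apply a "param" event to a Web Audio AudioParam-like object.
// time is in seconds; timeConstant only applies to the "target" curve.
function applyParamEvent(param, time, value, curve, timeConstant = 0.1) {
    switch (curve) {
        case 'step':        param.setValueAtTime(value, time); break;
        case 'linear':      param.linearRampToValueAtTime(value, time); break;
        case 'exponential': param.exponentialRampToValueAtTime(value, time); break;
        case 'target':      param.setTargetAtTime(value, time, timeConstant); break;
        default: throw new Error('Unknown curve: ' + curve);
    }
}
```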


rate

Changes the rate (tempo) at which the current sequence plays.

[beat, "rate", rate, curve]

rate – FLOAT, rate of playback of the parent sequence
curve – STRING ["step"|"linear"|"exponential"|"target"], represents the type of ramp to use to transition to the new rate
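Because rate events change how beats map to seconds, a scheduler has to integrate over them. A minimal sketch for "step" curves only, assuming rate is in beats per second and the events are sorted, starting with one at beat 0:

```javascript
// Convert a beat to elapsed seconds, honouring "rate" events with "step"
// curves only. Linear/exponential/target ramps would need real integration.
// rateEvents: [[beat, "rate", rate, "step"], ...] sorted by beat.
function beatToSeconds(rateEvents, beat) {
    let seconds = 0;
    let lastBeat = 0;
    let rate = rateEvents[0][2];
    for (const [eventBeat, , eventRate] of rateEvents.slice(1)) {
        if (eventBeat >= beat) break;
        seconds += (eventBeat - lastBeat) / rate;
        lastBeat = eventBeat;
        rate = eventRate;
    }
    return seconds + (beat - lastBeat) / rate;
}
```

With events `[[0, "rate", 1, "step"], [4, "rate", 2, "step"]]`, beat 6 falls at 5 seconds: 4 beats at rate 1, then 2 beats at rate 2.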


meter

Changes the displayed meter of the sequence.

[beat, "meter", numerator, denominator]

numerator – INT, the number of meter divisions per bar
denominator – INT, the duration in beats of a meter division
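Under this definition a bar lasts numerator × denominator beats, so (assuming a single meter event at beat 0) a renderer can find which bar a beat falls in:

```javascript
// Beats per bar is numerator × denominator; the bar index is a division.
// Assumes one meter event at beat 0 – meter changes need more care.
function beatToBar(beat, numerator, denominator) {
    return Math.floor(beat / (numerator * denominator));
}

// With the example's [0, "meter", 4, 1]: beat 3.5 is in bar 0, beat 10 in bar 2.
```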


mode

A mode event provides information about the current key centre and mode of the music. It could be used by a music renderer to display chord symbols, or interpreted by a music generator to improvise music.

[beat, "mode", root, mode]

root – STRING ["A"|"Bb"|"B" ... "F#"|"G"|"G#"], represents the root of the chord
mode – STRING ["∆"|"-" ... TBD], represents the mode of the chord


sequence

Renders a sequence from the sequences array.

[beat, "sequence", sequenceId, targetId]

sequenceId – STRING, the id of a sequence found in the sequences array
targetId – STRING, the id of an instrument to play the sequence through

    // Make the sequence "groove" play at beat 0.5 through instrument "3"
    [0.5, "sequence", "groove", "3"]

TBD. It is not yet clear how to specify targetId so that it selects a target instrument in an interoperable manner. In Soundstage it refers to the id of a node in the `nodes` array, where nodes are Web Audio nodes in the Soundstage graph.
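Resolving the sequenceId half is straightforward; a sketch (targetId resolution is deliberately omitted, as it is still undecided):

```javascript
// Find the sequence a "sequence" event refers to in the parent's
// sequences array. How targetId maps to an instrument is still TBD.
function findSequence(sequences, sequenceId) {
    const sequence = sequences.find((s) => s.id === sequenceId);
    if (!sequence) {
        throw new Error('No sequence with id "' + sequenceId + '"');
    }
    return sequence;
}
```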

interpretation (object)

The optional interpretation object contains meta information not necessarily needed to render music as sound, but required to render music as notation. A good renderer should be capable of making intelligent guesses as to how to interpret Music JSON as notation, and none of these properties are required.

    {
        "meter": [4, 4],
        "key": "C",
        "transpose": 0
    }


Implementations

  • sound.io creates and exports Music JSON.
  • Soundstage, the JS library that powers sound.io, can be used to edit and play Music JSON in any web page.
  • MIDI, Soundio's MIDI library, converts MIDI events to Music JSON events with its normalise method.
  • Scribe is a music notation interpreter and SVG renderer that consumes (an old version of) Music JSON.


