Payloads

Overview

The MQTT specification does not define any required data payload format. From an MQTT infrastructure standpoint, the payload is treated as an agnostic binary array of bytes that can be anything from no payload at all to a maximum of 256MB. But for applications within a known solution space to work together using MQTT, the payload representation does need to be defined.

This section of the Eclipse Sparkplug specification defines how MQTT Sparkplug payloads are encoded and the data that is required. Sparkplug supports multiple payload encoding definitions: there is an 'A' payload format as well as a 'B' payload format. As described in the Introduction Section, Sparkplug A is deprecated and is not included in this document. This section covers only the details of the Sparkplug 'B' payload format.

The majority of devices connecting into next generation IIoT infrastructures are legacy equipment using poll/response protocols. This means we must take into account register-based data from devices that speak protocols such as Modbus. The existing legacy equipment needs to work in concert with emerging IIoT equipment that is able to leverage message transports like MQTT natively.

Google Protocol Buffers

"Protocol Buffers are a way of encoding structured data in an efficient yet extensible format."

Google Protocol Buffers, sometimes referred to as "Google Protobufs", provide the efficiency of a packed binary data encoding along with the structure required to easily create, transmit, and parse register-based process variables using a standard set of tools, while also enabling emerging IIoT requirements around richer metadata. Google Protocol Buffers development tools are available for:

  • C

  • C++

  • C#

  • Java

  • Python

  • Go

  • JavaScript

Additional information on Google Protocol Buffers can be found at: https://developers.google.com/protocol-buffers/

Sparkplug A MQTT Payload Definition

As described in the Introduction Section, Sparkplug A is deprecated and is not included in this document.

Sparkplug B MQTT Payload Definition

The goal of Sparkplug is to provide a specification that both OEM device manufacturers and application developers can use to create rich and interoperable SCADA/IIoT solutions using MQTT as a base messaging technology. In the Sparkplug B message payload definition, the goal was to create a simple and straightforward binary message encoding that could be used primarily for legacy register-based process variables (a Modbus register value, for example).

The Sparkplug B MQTT payload format came about based on feedback from many system integrators and end users who wanted to be able to natively support a much richer data model within the MQTT infrastructures that they were designing and deploying. Using the feedback from the user community, Sparkplug B provides support for:

  • Complex data types using templates

  • Datasets

  • Richer metrics with the ability to add property metadata for each metric

  • Metric alias support to maintain rich metric naming while keeping bandwidth usage to a minimum

  • Historical data

  • File data

The Sparkplug B payload definition creates a bandwidth efficient data transport for real time device data. For WAN based SCADA/IIoT infrastructures this equates to lower latency data updates while minimizing the amount of traffic, and therefore the cellular and/or VSAT bandwidth, required. In situations where bandwidth savings are not the primary concern, the efficient encoding enables higher throughput of more data, capturing sensor data that may previously have been left stranded in the field. It is also ideal for LAN based SCADA infrastructures, providing higher throughput of real time data to consuming applications without requiring extreme networking topologies and/or equipment.

There are many data encoding technologies available that can all be used in conjunction with MQTT. Sparkplug B selected an existing, open, and highly available encoding scheme that efficiently encodes register based process variables. The encoding technology selected for Sparkplug B is Google Protocol Buffers.

Google Protocol Buffer Schema

Using lessons learned from the feedback on the Sparkplug A implementation, a new Google Protocol Buffers schema was developed that could be used to represent and encode the more complex data models being requested. The entire Google Protocol Buffers definition is below.

// * Copyright (c) 2015-2021 Cirrus Link Solutions and others
// *
// * This program and the accompanying materials are made available under the
// * terms of the Eclipse Public License 2.0 which is available at
// * http://www.eclipse.org/legal/epl-2.0.
// *
// * SPDX-License-Identifier: EPL-2.0
// *
// * Contributors:
// *   Cirrus Link Solutions - initial implementation

//
// To compile:
// cd client_libraries/java
// protoc --proto_path=../../ --java_out=src/main/java ../../sparkplug_b.proto
//

syntax = "proto2";

package org.eclipse.tahu.protobuf;

option java_package         = "org.eclipse.tahu.protobuf";
option java_outer_classname = "SparkplugBProto";

enum DataType {
    // Indexes of Data Types

    // Unknown placeholder for future expansion.
    Unknown         = 0;

    // Basic Types
    Int8            = 1;
    Int16           = 2;
    Int32           = 3;
    Int64           = 4;
    UInt8           = 5;
    UInt16          = 6;
    UInt32          = 7;
    UInt64          = 8;
    Float           = 9;
    Double          = 10;
    Boolean         = 11;
    String          = 12;
    DateTime        = 13;
    Text            = 14;

    // Additional Metric Types
    UUID            = 15;
    DataSet         = 16;
    Bytes           = 17;
    File            = 18;
    Template        = 19;

    // Additional PropertyValue Types
    PropertySet     = 20;
    PropertySetList = 21;

    // Array Types
    Int8Array = 22;
    Int16Array = 23;
    Int32Array = 24;
    Int64Array = 25;
    UInt8Array = 26;
    UInt16Array = 27;
    UInt32Array = 28;
    UInt64Array = 29;
    FloatArray = 30;
    DoubleArray = 31;
    BooleanArray = 32;
    StringArray = 33;
    DateTimeArray = 34;
}

message Payload {

    message Template {

        message Parameter {
            optional string name        = 1;
            optional uint32 type        = 2;

            oneof value {
                uint32 int_value        = 3;
                uint64 long_value       = 4;
                float  float_value      = 5;
                double double_value     = 6;
                bool   boolean_value    = 7;
                string string_value     = 8;
                ParameterValueExtension extension_value = 9;
            }

            message ParameterValueExtension {
                extensions              1 to max;
            }
        }

        optional string version         = 1;          // The version of the Template to prevent mismatches
        repeated Metric metrics         = 2;          // Each metric includes a name, datatype, and optionally a value
        repeated Parameter parameters   = 3;
        optional string template_ref    = 4;          // MUST be a reference to a template definition if this is an instance (i.e. the name of the template definition) - MUST be omitted for template definitions
        optional bool is_definition     = 5;
        extensions                      6 to max;
    }

    message DataSet {

        message DataSetValue {

            oneof value {
                uint32 int_value                        = 1;
                uint64 long_value                       = 2;
                float  float_value                      = 3;
                double double_value                     = 4;
                bool   boolean_value                    = 5;
                string string_value                     = 6;
                DataSetValueExtension extension_value   = 7;
            }

            message DataSetValueExtension {
                extensions  1 to max;
            }
        }

        message Row {
            repeated DataSetValue elements  = 1;
            extensions                      2 to max;   // For third party extensions
        }

        optional uint64   num_of_columns    = 1;
        repeated string   columns           = 2;
        repeated uint32   types             = 3;
        repeated Row      rows              = 4;
        extensions                          5 to max;   // For third party extensions
    }

    message PropertyValue {

        optional uint32     type                    = 1;
        optional bool       is_null                 = 2;

        oneof value {
            uint32          int_value               = 3;
            uint64          long_value              = 4;
            float           float_value             = 5;
            double          double_value            = 6;
            bool            boolean_value           = 7;
            string          string_value            = 8;
            PropertySet     propertyset_value       = 9;
            PropertySetList propertysets_value      = 10;      // List of Property Values
            PropertyValueExtension extension_value  = 11;
        }

        message PropertyValueExtension {
            extensions                             1 to max;
        }
    }

    message PropertySet {
        repeated string        keys     = 1;         // Names of the properties
        repeated PropertyValue values   = 2;
        extensions                      3 to max;
    }

    message PropertySetList {
        repeated PropertySet propertyset = 1;
        extensions                       2 to max;
    }

    message MetaData {
        // Bytes specific metadata
        optional bool   is_multi_part   = 1;

        // General metadata
        optional string content_type    = 2;        // Content/Media type
        optional uint64 size            = 3;        // File size, String size, Multi-part size, etc
        optional uint64 seq             = 4;        // Sequence number for multi-part messages

        // File metadata
        optional string file_name       = 5;        // File name
        optional string file_type       = 6;        // File type (i.e. xml, json, txt, cpp, etc)
        optional string md5             = 7;        // md5 of data

        // Catchalls and future expansion
        optional string description     = 8;        // Could be anything such as json or xml of custom properties
        extensions                      9 to max;
    }

    message Metric {

        optional string   name          = 1;        // Metric name - should only be included on birth
        optional uint64   alias         = 2;        // Metric alias - tied to name on birth and included in all later DATA messages
        optional uint64   timestamp     = 3;        // Timestamp associated with data acquisition time
        optional uint32   datatype      = 4;        // DataType of the metric/tag value
        optional bool     is_historical = 5;        // If this is historical data and should not update real time tag
        optional bool     is_transient  = 6;        // Tells consuming clients such as MQTT Engine to not store this as a tag
        optional bool     is_null       = 7;        // If this is null - explicitly say so rather than using -1, false, etc for some datatypes.
        optional MetaData metadata      = 8;        // Metadata for the payload
        optional PropertySet properties = 9;

        oneof value {
            uint32   int_value                      = 10;
            uint64   long_value                     = 11;
            float    float_value                    = 12;
            double   double_value                   = 13;
            bool     boolean_value                  = 14;
            string   string_value                   = 15;
            bytes    bytes_value                    = 16;       // Bytes, File
            DataSet  dataset_value                  = 17;
            Template template_value                 = 18;
            MetricValueExtension extension_value    = 19;
        }

        message MetricValueExtension {
            extensions  1 to max;
        }
    }

    optional uint64   timestamp     = 1;        // Timestamp at message sending time
    repeated Metric   metrics       = 2;        // Repeated forever - no limit in Google Protobufs
    optional uint64   seq           = 3;        // Sequence number
    optional string   uuid          = 4;        // UUID to track message type in terms of schema definitions
    optional bytes    body          = 5;        // To optionally bypass the whole definition above
    extensions                      6 to max;   // For third party extensions
}
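
As a non-normative illustration, the Java classes generated by the protoc command shown above can be used to build and encode a Sparkplug B payload roughly as follows. This sketch assumes standard protobuf Java code generation with the SparkplugBProto outer class; it is illustrative only and not part of the specification.

import org.eclipse.tahu.protobuf.SparkplugBProto.DataType;
import org.eclipse.tahu.protobuf.SparkplugBProto.Payload;
import org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric;

public class PayloadEncodingSketch {
    public static void main(String[] args) throws Exception {
        long now = System.currentTimeMillis();

        // Build a single String metric named "My Metric" with alias 1
        Metric metric = Metric.newBuilder()
                .setName("My Metric")
                .setAlias(1)
                .setTimestamp(now)
                .setDatatype(DataType.String.getNumber())   // datatype is a uint32 in the schema
                .setStringValue("Test")
                .build();

        // Wrap the metric in a payload with a timestamp and sequence number
        Payload payload = Payload.newBuilder()
                .setTimestamp(now)
                .addMetrics(metric)
                .setSeq(2)
                .build();

        // The encoded bytes are what gets published as the MQTT message payload
        byte[] mqttPayload = payload.toByteArray();

        // A consuming application decodes the bytes back into a Payload
        Payload decoded = Payload.parseFrom(mqttPayload);
        System.out.println(decoded.getMetrics(0).getStringValue());
    }
}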

Payload Metric Naming Convention

For the remainder of this document JSON will be used to represent components of a Sparkplug B payload. It is important to note that the payload is a binary encoding and is not actually JSON. However, JSON representation is used in this document to represent the payloads in a way that is easy to read. For example, a simple Sparkplug B payload with a single metric can be represented in JSON as follows:

{
        "timestamp": <timestamp>,
        "metrics": [{
                "name": <metric_name>,
                "alias": <alias>,
                "timestamp": <timestamp>,
                "dataType": <datatype>,
                "value": <value>
        }],
        "seq": <sequence_number>
}

A simple Sparkplug B payload with values would be represented as follows:

{
        "timestamp": 1486144502122,
        "metrics": [{
                "name": "My Metric",
                "alias": 1,
                "timestamp": 1479123452194,
                "dataType": "String",
                "value": "Test"
        }],
        "seq": 2
}

Note that the ‘name’ of a metric may be hierarchical to build out proper folder structures for applications consuming the metric values. For example, in an application where an Edge Node is connected to several devices or data sources, the ‘name’ could represent discrete folder structures of:

‘Folder 1/Folder 2/Metric Name’

Using this convention in conjunction with the group_id, edge_node_id and device_id already defined in the Topic Namespace, consuming applications can organize metrics in the same hierarchical fashion:

Figure 9 – Payload Metric Folder Structure

plantuml::assets/plantuml/payload-metric-folder-structure.puml[format=svg, alt="Payload Metric Folder Structure"]

Sparkplug B v1.0 Payload Components

The Sparkplug specification Topics Section defines the Topic Namespace that Sparkplug uses to publish and subscribe between Edge Nodes and Host Applications within the MQTT infrastructure. Using that Topic Namespace, this section of the specification defines the actual payload contents of each message type in Sparkplug B v1.0.

Payload Component Definitions

Sparkplug B consists of a series of one or more metrics with metadata surrounding those metrics. The following definitions explain the components that make up a payload.

Payload

A Sparkplug B payload is the top-level component that is encoded and used in an MQTT message. It contains some basic information such as a timestamp and a sequence number as well as an array of metrics which contain key/value pairs of data. A Sparkplug B payload includes the following components.

  • payload

    • timestamp

      • This is the timestamp in the form of an unsigned 64-bit integer representing the number of milliseconds since epoch (Jan 1, 1970).

      • [tck-id-payloads-timestamp-in-UTC] This timestamp MUST be in UTC.

      • This timestamp represents the time at which the message was published.

    • metrics

      • This is an array of metrics representing key/value/datatype values. Metrics are further defined here.

    • seq

      • This is the sequence number, which is an unsigned 64-bit integer. A short sketch of the wrap-around behavior follows this list.

        • [tck-id-payloads-sequence-num-always-included] A sequence number MUST be included in the payload of every Sparkplug MQTT message from an Edge Node except NDEATH messages.

        • [tck-id-payloads-sequence-num-zero-nbirth] A NBIRTH message from an Edge Node MUST always contain a sequence number between 0 and 255 (inclusive).

        • [tck-id-payloads-sequence-num-incrementing] All subsequent messages after an NBIRTH from an Edge Node MUST contain a sequence number that is continually increasing by one in each message from that Edge Node until a value of 255 is reached. At that point, the sequence number of the following message MUST be zero.

    • uuid

      • This is a field which can be used to represent a schema or some other specific form of the message. Example usage would be to supply a UUID which represents an encoding mechanism of the optional array of bytes associated with a payload.

    • body

      • This is an array of bytes which can be used for any custom binary encoded data.
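
As a non-normative sketch of the sequence number behavior described above, an Edge Node could track its payload sequence number as follows (the class and method names are illustrative only):

// Illustrative only: the NBIRTH payload typically uses seq 0; every subsequent message from
// the Edge Node (except NDEATH, which carries no seq) increments the value by one,
// wrapping from 255 back to 0.
public class SequenceCounter {
    private long seq = 0;                 // value used in the NBIRTH payload

    public long next() {
        seq = (seq + 1) % 256;            // 255 is followed by 0
        return seq;
    }
}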

Metric

A Sparkplug B metric is a core component of data in the payload. It represents a key, value, timestamp, and datatype along with metadata used to describe the information it contains. These also represent 'tags' in classic SCADA systems. It includes the following components.

  • name

    • This is the friendly name of a metric. It should be represented as a forward-slash delimited UTF-8 string. The slashes in the string represent folders of the metric to represent hierarchical data structures. For example, ‘outputs/A’ would be a metric with a unique identifier of ‘A’ in the ‘outputs’ folder. There is no limit to the number of folders. However, across the infrastructure of MQTT publishers a defined folder should always remain a folder.

    • [tck-id-payloads-name-requirement] The name MUST be included with every metric unless aliases are being used.

    • All UTF-8 characters are allowed in the metric name. However, special characters including but not limited to the following are discouraged: . , \ @ # $ % ^ & * ( ) [ ] { } | ! ` ~ : ; ' " < > ?. This is because many Sparkplug Host Applications may have issues handling them.

  • alias

    • This is an unsigned 64-bit integer representing an optional alias for the metric. Aliases are optional and not required. If aliases are used, the following rules apply (an example of alias usage appears after this list).

      • [tck-id-payloads-alias-uniqueness] If supplied in an NBIRTH or DBIRTH it MUST be a unique number across this Edge Node’s entire set of metrics.

        • Non-normative comment: no two metrics for the same Edge Node can have the same alias. Upon being defined in the NBIRTH or DBIRTH, subsequent messages can supply only the alias instead of the metric friendly name to reduce overall message size.

      • [tck-id-payloads-alias-birth-requirement] NBIRTH and DBIRTH messages MUST include both a metric name and alias.

      • [tck-id-payloads-alias-data-cmd-requirement] NDATA, DDATA, NCMD, and DCMD messages MUST only include an alias and the metric name MUST be excluded.

  • timestamp

    • This is the timestamp in the form of an unsigned 64-bit integer representing the number of milliseconds since epoch (Jan 1, 1970).

    • [tck-id-payloads-name-birth-data-requirement] The timestamp MUST be included with every metric in all NBIRTH, DBIRTH, NDATA, and DDATA messages.

    • [tck-id-payloads-name-cmd-requirement] The timestamp MAY be included with metrics in NCMD and DCMD messages.

    • [tck-id-payloads-metric-timestamp-in-UTC] The timestamp MUST be in UTC.

      • Non-normative comment: This timestamp represents the time at which the value of a metric was captured.

  • datatype

    • [tck-id-payloads-metric-datatype-value-type] The datatype MUST be an unsigned 32-bit integer representing the datatype.

    • [tck-id-payloads-metric-datatype-value] The datatype MUST be one of the enumerated values as shown in the valid Sparkplug Data Types.

    • [tck-id-payloads-metric-datatype-req] The datatype MUST be included with each metric definition in NBIRTH and DBIRTH messages.

    • [tck-id-payloads-metric-datatype-not-req] The datatype SHOULD NOT be included with metric definitions in NDATA, NCMD, DDATA, and DCMD messages.

  • is_historical

    • This is a Boolean flag which denotes whether this metric represents a historical value. In some cases, it may be desirable to send metrics after they were acquired from a device or Edge Node. This can be done for batching, store and forward, or sending local backup data during network communication losses. This flag denotes that the message should not be considered a real time/current value.

  • is_transient

    • This is a Boolean flag which denotes whether this metric should be considered transient. Transient metrics are those that are of interest to Host Applications but should not be stored in a historian.

  • is_null

    • This is a Boolean flag which denotes whether this metric has a null value. This is Sparkplug B’s mechanism of explicitly denoting a metric’s value is actually null.

  • metadata

    • This is a MetaData object associated with the metric for dealing with more complex datatypes. This is covered in the metadata section.

  • properties

    • This is a PropertySet object associated with the metric for including custom key/value pairs of metadata associated with a metric. This is covered in the property set section.

  • value

    • The value of a metric utilizes the ‘oneof’ mechanism of Google Protocol Buffers. The value supplied with a metric MUST be one of the following types. Note that if the metric's is_null flag is set to true, the value can be omitted altogether. More information on the Google Protocol Buffer types can be found here: https://developers.google.com/protocol-buffers/docs/proto#scalar

      • Google Protocol Buffer Type: uint32

      • Google Protocol Buffer Type: uint64

      • Google Protocol Buffer Type: float

      • Google Protocol Buffer Type: double

      • Google Protocol Buffer Type: bool

      • Google Protocol Buffer Type: string

      • Google Protocol Buffer Type: bytes

      • Sparkplug DataSet

      • Sparkplug Template
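
As a non-normative example of the alias rules above, using the JSON notation from this document, an NBIRTH could define a metric with both a name and an alias:

{
        "name": "Supply Voltage",
        "alias": 1,
        "timestamp": 1486144502122,
        "dataType": "Float",
        "value": 12.1
}

A subsequent NDATA message would then reference the same metric by alias only, with the name and datatype omitted:

{
        "alias": 1,
        "timestamp": 1486144502122,
        "value": 12.3
}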

MetaData

A Sparkplug B MetaData object is used to describe different types of binary data. It is optional and includes the following components.

  • is_multi_part

    • A Boolean representing whether this metric contains part of a multi-part message. Breaking up large quantities of data can be useful for keeping MQTT messages flowing through the system. Because MQTT ensures in-order delivery of QoS 0 messages on the same topic, a single very large message can block delivery of other messages while it is being transmitted.

  • content_type

    • This is a UTF-8 string which represents the content type of a given metric value if applicable.

  • size

    • This is an unsigned 64-bit integer representing the size of the metric value. This is useful when metric values such as files are sent. This field can be used for the file size.

  • seq

    • If this is a multipart metric, this is an unsigned 64-bit integer representing the sequence number of this part of a multipart metric.

  • file_name

    • If this is a file metric, this is a UTF-8 string representing the filename of the file.

  • file_type

    • If this is a file metric, this is a UTF-8 string representing the type of the file.

  • md5

    • If this is a byte array or file metric that can have an md5sum, this field can be used as a UTF-8 string to represent it.

  • description

    • This is a freeform field with a UTF-8 string to represent any other pertinent metadata for this metric. It can contain JSON, XML, text, or anything else that can be understood by both the publisher and the subscriber.
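
As a non-normative illustration using the JSON notation from this document, a File metric carrying MetaData could be represented as follows (the metric name and metadata values are illustrative, and the field names follow the protobuf schema):

{
        "name": "New Program",
        "timestamp": <timestamp>,
        "dataType": "File",
        "metadata": {
                "is_multi_part": false,
                "content_type": "application/octet-stream",
                "size": <file_size_in_bytes>,
                "file_name": "program.bin",
                "file_type": "bin",
                "md5": <md5sum_of_the_file_bytes>
        },
        "value": <file_bytes>
}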

PropertySet

A Sparkplug B PropertySet object is used with a metric to add custom properties to the object. The PropertySet is a map expressed as two arrays of equal size, one containing the keys and one containing the values. It includes the following components.

  • keys

    • This is an array of UTF-8 strings representing the names of the properties in this PropertySet.

    • [tck-id-payloads-propertyset-keys-array-size] The array of keys in a PropertySet MUST contain the same number of values included in the array of PropertyValue objects.

  • values

    • This is an array of PropertyValue objects representing the values of the properties in the PropertySet.

    • [tck-id-payloads-propertyset-values-array-size] The array of values in a PropertySet MUST contain the same number of items that are in the keys array.

PropertyValue

A Sparkplug B PropertyValue object is used to encode the value and datatype of the value of a property in a PropertySet. It includes the following components.

  • type

    • [tck-id-payloads-metric-propertyvalue-type-type] This MUST be an unsigned 32-bit integer representing the datatype.

    • [tck-id-payloads-metric-propertyvalue-type-value] This value MUST be one of the enumerated values as shown in the Sparkplug Basic Data Types or the Sparkplug Property Value Data Types.

    • [tck-id-payloads-metric-propertyvalue-type-req] This MUST be included in Property Value Definitions in NBIRTH and DBIRTH messages.

  • is_null

    • This is a Boolean flag which denotes whether this property has a null value. This is Sparkplug B’s mechanism of explicitly denoting a property’s value is actually null.

  • value

    • The value of a property utilizes the ‘oneof’ mechanism of Google Protocol Buffers. The value supplied with a metric MUST be one of the following types. Note if the metrics is_null flag is set to true the value can be omitted altogether. More information on the Google Protocol Buffer types can be found here: https://developers.google.com/protocol-buffers/docs/proto#scalar

      • Google Protocol Buffer Type: uint32

      • Google Protocol Buffer Type: uint64

      • Google Protocol Buffer Type: float

      • Google Protocol Buffer Type: double

      • Google Protocol Buffer Type: bool

      • Google Protocol Buffer Type: string

      • Sparkplug PropertySet

      • Sparkplug PropertySetList

Quality Codes

There is one specific property key in Sparkplug called 'Quality'. This defines the quality of the value associated with the metric. This property is optional; it only needs to be included if the quality of the metric is not GOOD.

There are three possible quality code values. These are defined below with their associated meanings.

  • 0

    • BAD

  • 192

    • GOOD

  • 500

    • STALE

[tck-id-payloads-propertyset-quality-value-type] The 'type' of the Property Value MUST be a value of 3 which represents a Signed 32-bit Integer.

[tck-id-payloads-propertyset-quality-value-value] The 'value' of the Property Value MUST be an int_value and be one of the valid quality codes of 0, 192, or 500.
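
As a non-normative example using the JSON notation from this document, a metric reporting a STALE quality could carry a 'Quality' property in its PropertySet as follows (the metric name and value are illustrative, and the JSON layout of the keys/values arrays is illustrative only):

{
        "name": "Supply Voltage",
        "timestamp": 1486144502122,
        "dataType": "Float",
        "properties": {
                "keys": ["Quality"],
                "values": [{
                        "type": "Int32",
                        "value": 500
                }]
        },
        "value": 12.1
}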

PropertySetList

A Sparkplug B PropertySetList object is an array of PropertySet objects. It includes the following components.

  • propertyset

    • This is an array of PropertySet objects.

DataSet

A Sparkplug B DataSet object is used to encode matrices of data. It includes the following components.

  • num_of_columns

    • [tck-id-payloads-dataset-column-size] This MUST be an unsigned 64-bit integer representing the number of columns in this DataSet.

  • columns

    • This is an array of strings representing the column headers of this DataSet.

    • [tck-id-payloads-dataset-column-num-headers] The columns array MUST have the same number of elements that the types array contains.

  • types

    • [tck-id-payloads-dataset-types-def] This MUST be an array of unsigned 32 bit integers representing the datatypes of the columns.

    • [tck-id-payloads-dataset-types-num] The array of types MUST have the same number of elements that the columns array contains.

    • [tck-id-payloads-dataset-types-type] The values in the types array MUST be an unsigned 32-bit integer representing the datatype.

    • [tck-id-payloads-dataset-types-value] The values in the types array MUST be one of the enumerated values as shown in the Sparkplug Basic Data Types.

    • [tck-id-payloads-dataset-parameter-type-req] The types array MUST be included in all DataSets.

  • rows

    • This is an array of DataSet.Row objects. It contains the data that makes up the data rows of this DataSet.

DataSet.Row

A Sparkplug B DataSet.Row object represents a row of data in a DataSet. It includes the following components.

  • elements

    • This is an array of DataSet.DataSetValue objects. It represents the data contained within a row of a DataSet.

DataSet.DataSetValue

  • value

    • The value of a DataSet.DataSetValue utilizes the ‘oneof’ mechanism of Google Protocol Buffers.

    • [tck-id-payloads-template-dataset-value] The value supplied MUST be one of the following Google Protobuf types: uint32, uint64, float, double, bool, or string.

More information on the types above can be found here: https://developers.google.com/protocol-buffers/docs/proto#scalar
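
As a non-normative example using the JSON notation from this document, a two-column DataSet metric could be represented as follows (the metric and column names are illustrative, the field names follow the protobuf schema, and rows are shown as simple arrays of element values):

{
        "name": "My DataSet",
        "timestamp": 1486144502122,
        "dataType": "DataSet",
        "value": {
                "num_of_columns": 2,
                "columns": ["Timestamp", "Value"],
                "types": ["DateTime", "Float"],
                "rows": [
                        [1486144502122, 12.1],
                        [1486144503122, 12.3]
                ]
        }
}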

Template

A Sparkplug B Template is used for encoding complex datatypes in a payload. It is a type of metric and can be used to create custom datatype definitions and instances. These are also sometimes referred to as 'User Defined Types' or UDTs. There are two types of Templates.

  • Template Definition

    • This is the definition of a Sparkplug Template.

      • [tck-id-payloads-template-definition-nbirth-only] Template Definitions MUST only be included in NBIRTH messages.

      • [tck-id-payloads-template-definition-is-definition] A Template Definition MUST have is_definition set to true.

      • [tck-id-payloads-template-definition-ref] A Template Definition MUST omit the template_ref field.

      • [tck-id-payloads-template-definition-members] A Template Definition MUST include all member metrics that will ever be included in corresponding template instances.

      • [tck-id-payloads-template-definition-nbirth] A Template Definition MUST be included in the NBIRTH for all Template Instances that are included in the NBIRTH and DBIRTH messages.

        • A Template Instance can not reference a Template Definition that was not included in the NBIRTH.

      • [tck-id-payloads-template-definition-parameters] A Template Definition MUST include all parameters that will be included in the corresponding Template Instances.

      • [tck-id-payloads-template-definition-parameters-default] A Template Definition MAY include values for parameters in the Template Definition parameters.

        • These act as the defaults for any template instances that don’t include parameter values in the NBIRTH or DBIRTH messages.

  • Template Instance

    • This is an instance of a Sparkplug Template.

      • [tck-id-payloads-template-instance-is-definition] A Template Instance MUST have is_definition set to false.

      • [tck-id-payloads-template-instance-ref] A Template Instance MUST have template_ref set to the type of template definition it is.

        • It must be set to the name of the metric that represents the template definition.

      • [tck-id-payloads-template-instance-members] A Template Instance MUST include only members that were included in the corresponding template definition.

      • [tck-id-payloads-template-instance-members-birth] A Template Instance in a NBIRTH or DBIRTH message MUST include all members that were included in the corresponding Template Definition.

      • [tck-id-payloads-template-instance-members-data] A Template Instance in a NDATA or DDATA message MAY include only a subset of the members that were included in the corresponding template definition.

        • A Template Instance does not need to be a complete set of all member metrics that were included in the Template Definition.

      • [tck-id-payloads-template-instance-parameters] A Template Instance MAY include parameter values for any parameters that were included in the corresponding Template Definition.

        • If a parameter value was included in the Template Definition but not in the Template Instance the value of that parameter is implicitly the default value from the Template Definition.

A Sparkplug Template includes the following components.

  • version

    • This is an optional field and can be included in a Template Definition or Template Instance.

    • [tck-id-payloads-template-version] If included, the version MUST be a UTF-8 string representing the version of the Template.

  • metrics

    • This is an array of metrics representing the members of the Template. These can be primitive datatypes or other Templates as required.

  • parameters

    • This is an optional field and is an array of Parameter objects representing parameters associated with the Template.

  • template_ref

    • [tck-id-payloads-template-ref-definition] This MUST be omitted if this is a Template Definition.

    • [tck-id-payloads-template-ref-instance] This MUST be a UTF-8 string representing a reference to a Template Definition name if this is a Template Instance.

  • is_definition

    • This is a Boolean representing whether this is a Template definition or a Template instance.

    • [tck-id-payloads-template-is-definition] This MUST be included in every Template Definition and Template Instance.

    • [tck-id-payloads-template-is-definition-definition] This MUST be set to true if this is a Template Definition.

    • [tck-id-payloads-template-is-definition-instance] This MUST be set to false if this is a Template Instance.

Template.Parameter

A Sparkplug B Template.Parameter is a metadata field for a Template. It can be used to represent parameters that are common across a Template Definition but whose values are unique to each Template Instance. It includes the following components.

  • name

    • [tck-id-payloads-template-parameter-name-required] This MUST be included in every Template Parameter definition.

    • [tck-id-payloads-template-parameter-name-type] This MUST be a UTF-8 string representing the name of the Template parameter.

  • type

    • [tck-id-payloads-template-parameter-value-type] This MUST be an unsigned 32-bit integer representing the datatype.

    • [tck-id-payloads-template-parameter-type-value] This value MUST be one of the enumerated values as shown in the Sparkplug Basic Data Types.

    • [tck-id-payloads-template-parameter-type-req] This MUST be included in Template Parameter Definitions in NBIRTH and DBIRTH messages.

  • value

    • The value of a template parameter utilizes the ‘oneof’ mechanism of Google Protocol Buffers.

    • [tck-id-payloads-template-parameter-value] The value supplied MUST be one of the following Google Protocol Buffer types: uint32, uint64, float, double, bool, or string.

    • For a template definition, this is the default value of the parameter. For a template instance, this is the value unique to that instance.
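
As a non-normative sketch using the JSON notation from this document, an NBIRTH could carry a Template Definition metric and a corresponding Template Instance metric along the following lines. The 'Motor Type' and 'Motor 1' metric names, the 'Max RPM' parameter, and the member metrics are all illustrative.

The Template Definition metric:

{
        "name": "Motor Type",
        "timestamp": 1486144502122,
        "dataType": "Template",
        "value": {
                "version": "1.0",
                "is_definition": true,
                "parameters": [{
                        "name": "Max RPM",
                        "type": "Int32",
                        "value": 3000
                }],
                "metrics": [{
                        "name": "RPM",
                        "dataType": "Int32",
                        "value": 0
                }, {
                        "name": "Running",
                        "dataType": "Boolean",
                        "value": false
                }]
        }
}

The corresponding Template Instance metric, which references the definition via template_ref and overrides the parameter default:

{
        "name": "Motor 1",
        "timestamp": 1486144502122,
        "dataType": "Template",
        "value": {
                "version": "1.0",
                "is_definition": false,
                "template_ref": "Motor Type",
                "parameters": [{
                        "name": "Max RPM",
                        "type": "Int32",
                        "value": 3600
                }],
                "metrics": [{
                        "name": "RPM",
                        "dataType": "Int32",
                        "value": 1200
                }, {
                        "name": "Running",
                        "dataType": "Boolean",
                        "value": true
                }]
        }
}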

Data Types

Sparkplug defines the valid data types used for various Sparkplug constructs including Metric datatypes, Property Value types, DataSet types, and Template Parameter types. Datatypes are represented as an enum in Google Protobufs as shown below.

enum DataType {
    // Indexes of Data Types

    // Unknown placeholder for future expansion.
    Unknown         = 0;

    // Basic Types
    Int8            = 1;
    Int16           = 2;
    Int32           = 3;
    Int64           = 4;
    UInt8           = 5;
    UInt16          = 6;
    UInt32          = 7;
    UInt64          = 8;
    Float           = 9;
    Double          = 10;
    Boolean         = 11;
    String          = 12;
    DateTime        = 13;
    Text            = 14;

    // Additional Metric Types
    UUID            = 15;
    DataSet         = 16;
    Bytes           = 17;
    File            = 18;
    Template        = 19;

    // Additional PropertyValue Types
    PropertySet     = 20;
    PropertySetList = 21;

    // Array Types
    Int8Array = 22;
    Int16Array = 23;
    Int32Array = 24;
    Int64Array = 25;
    UInt8Array = 26;
    UInt16Array = 27;
    UInt32Array = 28;
    UInt64Array = 29;
    FloatArray = 30;
    DoubleArray = 31;
    BooleanArray = 32;
    StringArray = 33;
    DateTimeArray = 34;
}

Datatype Details

  • Basic Types

    • Unknown

      • Sparkplug enum value: 0

    • Int8

      • Signed 8-bit integer

      • Google Protocol Buffer Type: uint32

      • Sparkplug enum value: 1

    • Int16

      • Signed 16-bit integer

      • Google Protocol Buffer Type: uint32

      • Sparkplug enum value: 2

    • Int32

      • Signed 32-bit integer

      • Google Protocol Buffer Type: uint32

      • Sparkplug enum value: 3

    • Int64

      • Signed 64-bit integer

      • Google Protocol Buffer Type: uint64

      • Sparkplug enum value: 4

    • UInt8

      • Unsigned 8-bit integer

      • Google Protocol Buffer Type: uint32

      • Sparkplug enum value: 5

    • UInt16

      • Unsigned 16-bit integer

      • Google Protocol Buffer Type: uint32

      • Sparkplug enum value: 6

    • UInt32

      • Unsigned 32-bit integer

      • Google Protocol Buffer Type: uint32

      • Sparkplug enum value: 7

    • UInt64

      • Unsigned 64-bit integer

      • Google Protocol Buffer Type: uint64

      • Sparkplug enum value: 8

    • Float

      • 32-bit floating point number

      • Google Protocol Buffer Type: float

      • Sparkplug enum value: 9

    • Double

      • 64-bit floating point number

      • Google Protocol Buffer Type: double

      • Sparkplug enum value: 10

    • Boolean

      • Boolean value

      • Google Protocol Buffer Type: bool

      • Sparkplug enum value: 11

    • String

      • String value (UTF-8)

      • Google Protocol Buffer Type: string

      • Sparkplug enum value: 12

    • DateTime

      • Date time value as uint64 value representing milliseconds since epoch (Jan 1, 1970)

      • Google Protocol Buffer Type: uint64

      • Sparkplug enum value: 13

    • Text

      • String value (UTF-8)

      • Google Protocol Buffer Type: string

      • Sparkplug enum value: 14

  • Additional Types

    • UUID

      • UUID value as a UTF-8 string

      • Google Protocol Buffer Type: string

      • Sparkplug enum value: 15

    • DataSet

      • DataSet as defined here

      • Google Protocol Buffer Type: none – defined in Sparkplug

      • Sparkplug enum value: 16

    • Bytes

      • Array of bytes

      • Google Protocol Buffer Type: bytes

      • Sparkplug enum value: 17

    • File

      • Array of bytes representing a file

      • Google Protocol Buffer Type: bytes

      • Sparkplug enum value: 18

    • Template

      • Template as defined here

      • Google Protocol Buffer Type: none – defined in Sparkplug

      • Sparkplug enum value: 19

  • Additional PropertyValue Types

    • PropertySet

      • PropertySet as defined here

      • Google Protocol Buffer Type: none – defined in Sparkplug

      • Sparkplug enum value: 20

    • PropertySetList

      • PropertySetList as defined here

      • Google Protocol Buffer Type: none – defined in Sparkplug

      • Sparkplug enum value: 21

  • Array Types

All array types use the bytes_value field of the Metric value field. They are simply little-endian packed byte arrays.

For example, consider an Int32 array with two decimal values [123456789, 987654321]

Array converted to little endian hex: [0x15CD5B07, 0xB168DE3A]

The bytes_value of the Sparkplug Metric must be: [0x15, 0xCD, 0x5B, 0x07, 0xB1, 0x68, 0xDE, 0x3A]
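
A non-normative sketch of this packing in Java, reproducing the Int32 example above with a little-endian ByteBuffer:

import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class ArrayPacking {
    // Packs an Int32 array into the little-endian byte layout used for the Metric bytes_value field
    static byte[] packInt32Array(int[] values) {
        ByteBuffer buffer = ByteBuffer.allocate(values.length * 4).order(ByteOrder.LITTLE_ENDIAN);
        for (int v : values) {
            buffer.putInt(v);
        }
        return buffer.array();
    }

    public static void main(String[] args) {
        // Prints 0x15 0xCD 0x5B 0x07 0xB1 0x68 0xDE 0x3A
        for (byte b : packInt32Array(new int[] { 123456789, 987654321 })) {
            System.out.printf("0x%02X ", b);
        }
    }
}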

  • Int8Array

    • Int8Array as an array of packed little endian int8 bytes

    • Google Protocol Buffer Type: bytes

    • Sparkplug enum value: 22

    • Example (Decimal to Metric bytes_value): [-23, 123] → [0xE9, 0x7B]

  • Int16Array

    • Int16Array as an array of packed little endian int16 bytes

    • Google Protocol Buffer Type: bytes

    • Sparkplug enum value: 23

    • Example (Decimal to Metric bytes_value): [-30000, 30000] → [0xD0, 0x8A, 0x30, 0x75]

  • Int32Array

    • Int32Array as an array of packed little endian int32 bytes

    • Google Protocol Buffer Type: bytes

    • Sparkplug enum value: 24

    • Example (Decimal to Metric bytes_value): [-1, 315338746] → [0xFF, 0xFF, 0xFF, 0xFF, 0xFA, 0xAF, 0xCB, 0x12]

  • Int64Array

    • Int64Array as an array of packed little endian int64 bytes

    • Google Protocol Buffer Type: bytes

    • Sparkplug enum value: 25

    • Example (Decimal to Metric bytes_value): [-4270929666821191986, -3601064768563266876] → [0xCE, 0x06, 0x72, 0xAC, 0x18, 0x9C, 0xBA, 0xC4, 0xC4, 0xBA, 0x9C, 0x18, 0xAC, 0x72, 0x06, 0xCE]

  • UInt8Array

    • UInt8Array as an array of packed little endian uint8 bytes

    • Google Protocol Buffer Type: bytes

    • Sparkplug enum value: 26

    • Example (Decimal to Metric bytes_value): [23, 250] → [0x17, 0xFA]

  • UInt16Array

    • UInt16Array as an array of packed little endian uint16 bytes

    • Google Protocol Buffer Type: bytes

    • Sparkplug enum value: 27

    • Example (Decimal to Metric bytes_value): [30, 52360] → [0x1E, 0x00, 0x88, 0xCC]

  • UInt32Array

    • UInt32Array as an array of packed little endian uint32 bytes

    • Google Protocol Buffer Type: bytes

    • Sparkplug enum value: 28

    • Example (Decimal to Metric bytes_value): [52, 3293969225] → [0x34, 0x00, 0x00, 0x00, 0x49, 0xFB, 0x55, 0xC4]

  • UInt64Array

    • UInt64Array as an array of packed little endian uint64 bytes

    • Google Protocol Buffer Type: bytes

    • Sparkplug enum value: 29

    • Example (Decimal to Metric bytes_value): [52, 16444743074749521625] → [0x34, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0xD9, 0x9E, 0x02, 0xD1, 0xB2, 0x76, 0x37, 0xE4]

  • FloatArray

    • FloatArray as an array of packed little endian 32-bit float bytes

    • Google Protocol Buffer Type: bytes

    • Sparkplug enum value: 30

    • Example (Decimal to Metric bytes_value): [1.23, 89.341] → [0xA4, 0x70, 0x9D, 0x3F, 0x98, 0xAE, 0xB2, 0x42]

  • DoubleArray

    • DoubleArray as an array of packed little endian 64-bit float bytes

    • Google Protocol Buffer Type: bytes

    • Sparkplug enum value: 31

    • Example (Decimal to Metric bytes_value): [12.354213, 1022.9123213] → [0xD7, 0xA2, 0x05, 0x68, 0x5B, 0xB5, 0x28, 0x40, 0x8E, 0x17, 0x1C, 0x6F, 0x4C, 0xF7, 0x8F, 0x40]

  • BooleanArray

    • BooleanArray as an array of bit-packed bytes preceded by a 4-byte integer that represents the total number of boolean values (a packing sketch follows this list)

    • Google Protocol Buffer Type: bytes

    • Sparkplug enum value: 32

    • Example (boolean array to Metric bytes_value): [false, false, true, true, false, true, false, false, true, true, false, true] → [0x0C, 0x00, 0x00, 0x00, 0x34, 0xDX]

      • Note an 'X' above is a 'do not care'. It can be either 1 or 0 but must be present so the array ends on a byte boundary.

  • StringArray

    • StringArray as an array of null terminated strings

    • Google Protocol Buffer Type: bytes

    • Sparkplug enum value: 33

    • Example (string array to Metric bytes_value): [ABC, hello] → [0x41, 0x42, 0x43, 0x00, 0x68, 0x65, 0x6c, 0x6c, 0x6f, 0x00]

  • DateTimeArray

    • DateTimeArray as an array of packed little endian bytes where each Datetime value is an 8-byte value representing the number of milliseconds since epoch in UTC

    • Google Protocol Buffer Type: bytes

    • Sparkplug enum value: 34

    • Example (DateTime array → ms since epoch → Metric bytes_value): ['Wednesday, October 21, 2009 5:27:55.335 AM', 'Friday, June 24, 2022 9:57:55 PM'] → [1256102875335, 1656107875000] → [0xC7, 0xD0, 0x90, 0x75, 0x24, 0x01, 0x00, 0x00, 0xB8, 0xBA, 0xB8, 0x97, 0x81, 0x01, 0x00, 0x00]
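
The BooleanArray packing described above (a 4-byte little-endian count followed by bit-packed bytes, most significant bit first as implied by the BooleanArray example) can be sketched in Java as follows. This is a non-normative illustration that reproduces the BooleanArray example values.

import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class BooleanArrayPacking {
    // Packs a boolean array as a 4-byte little-endian count followed by MSB-first bit-packed bytes
    static byte[] packBooleanArray(boolean[] values) {
        int packedLength = (values.length + 7) / 8;           // round up to whole bytes
        ByteBuffer buffer = ByteBuffer.allocate(4 + packedLength).order(ByteOrder.LITTLE_ENDIAN);
        buffer.putInt(values.length);                         // total number of boolean values
        byte[] packed = new byte[packedLength];
        for (int i = 0; i < values.length; i++) {
            if (values[i]) {
                packed[i / 8] |= 1 << (7 - (i % 8));          // most significant bit first
            }
        }
        buffer.put(packed);
        return buffer.array();
    }

    public static void main(String[] args) {
        boolean[] values = { false, false, true, true, false, true, false, false,
                             true, true, false, true };
        // Prints 0x0C 0x00 0x00 0x00 0x34 0xD0 (the unused trailing bits are zero here)
        for (byte b : packBooleanArray(values)) {
            System.out.printf("0x%02X ", b);
        }
    }
}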

Payload Representation on Host Applications

Sparkplug B payloads in conjunction with the Sparkplug topic namespace result in hierarchical data structures that can be represented as folder structures containing metrics, which are often called tags.

NBIRTH

The NBIRTH is responsible for informing host applications of all of the information about the Edge Node. This includes every metric it will publish data for in the future.

There is a dependency on the MQTT CONNECT packet with regard to NBIRTH messages that are subsequently sent for that given MQTT Session. These requirements can be found in the Edge Node Session Establishment Section.

  • [tck-id-payloads-nbirth-timestamp] NBIRTH messages MUST include a payload timestamp that denotes the time at which the message was published.

  • [tck-id-payloads-nbirth-edge-node-descriptor] Every Edge Node Descriptor in any Sparkplug infrastructure MUST be unique in the system.

    • These are used like addresses and need to be unique as a result.

  • [tck-id-payloads-nbirth-seq] Every NBIRTH message MUST include a sequence number and it MUST have a value between 0 and 255 (inclusive).

  • [tck-id-payloads-nbirth-bdseq] Every NBIRTH message MUST include a bdSeq number metric.

  • [tck-id-payloads-nbirth-bdseq-repeat] The bdSeq number value MUST match the bdSeq number value that was sent in the prior MQTT CONNECT packet WILL Message.

    • Note if a new NBIRTH is being sent (due to a Rebirth request or any other reason) the Sparkplug Edge Node MQTT client MUST publish the same bdSeq number that was sent in the prior MQTT CONNECT packet Will Message payload. This is because the NDEATH bdSeq number MUST always match the bdSeq number of the associated NBIRTH that is stored in the MQTT Server via the MQTT Will Message.

  • [tck-id-payloads-nbirth-rebirth-req] Every NBIRTH MUST include a metric with the name 'Node Control/Rebirth' and have a boolean value of false.

    • This is used by Host Applications to force an Edge Node to send a new birth sequence (NBIRTH and DBIRTH messages) if errors are detected by the Host Application in the data stream.

  • [tck-id-payloads-nbirth-qos] NBIRTH messages MUST be published with the MQTT QoS set to 0.

  • [tck-id-payloads-nbirth-retain] NBIRTH messages MUST be published with the MQTT retain flag set to false.

The following is a representation of a simple NBIRTH message on the topic:

spBv1.0/Sparkplug B Devices/NBIRTH/Raspberry Pi

In the topic above the following information is known based on the Sparkplug topic definition:

  • The ‘Group ID’ is: Sparkplug B Devices

  • The ‘Edge Node ID’ is: Raspberry Pi

  • The 'Edge Node Descriptor' is the combination of the Group ID and Edge Node ID.

  • This is an NBIRTH message based on the 'NBIRTH' Sparkplug Verb

Consider the following Sparkplug B payload in the NBIRTH message shown above:

{
        "timestamp": 1486144502122,
        "metrics": [{
                "name": "bdSeq",
                "timestamp": 1486144502122,
                "dataType": "Int64",
                "value": 0
        }, {
                "name": "Node Control/Reboot",
                "timestamp": 1486144502122,
                "dataType": "Boolean",
                "value": false
        }, {
                "name": "Node Control/Rebirth",
                "timestamp": 1486144502122,
                "dataType": "Boolean",
                "value": false
        }, {
                "name": "Node Control/Next Server",
                "timestamp": 1486144502122,
                "dataType": "Boolean",
                "value": false
        }, {
                "name": "Node Control/Scan Rate",
                "timestamp": 1486144502122,
                "dataType": "Int64",
                "value": 3000
        }, {
                "name": "Properties/Hardware Make",
                "timestamp": 1486144502122,
                "dataType": "String",
                "value": "Raspberry Pi"
        }, {
                "name": "Properties/Hardware Model",
                "timestamp": 1486144502122,
                "dataType": "String",
                "value": "Pi 3 Model B"
        }, {
                "name": "Properties/OS",
                "timestamp": 1486144502122,
                "dataType": "String",
                "value": "Raspbian"
        }, {
                "name": "Properties/OS Version",
                "timestamp": 1486144502122,
                "dataType": "String",
                "value": "Jessie with PIXEL/11.01.2017"
        }, {
                "name": "Supply Voltage",
                "timestamp": 1486144502122,
                "dataType": "Float",
                "value": 12.1
        }],
        "seq": 0
}

This would result in a structure as follows on the Host Application.

Figure 10 – Sparkplug B Metric Structure 1

plantuml::assets/plantuml/sparkplugb-metric-structure-1.puml[format=svg, alt="Sparkplug B Metric Structure 1"]

DBIRTH

The DBIRTH is responsible for informing the Host Application of all of the information about the device. This includes every metric it will publish data for in the future.

  • [tck-id-payloads-dbirth-timestamp] DBIRTH messages MUST include a payload timestamp that denotes the time at which the message was published.

  • [tck-id-payloads-dbirth-seq] Every DBIRTH message MUST include a sequence number.

  • [tck-id-payloads-dbirth-seq-inc] Every DBIRTH message MUST include a sequence number value that is one greater than the previous sequence number sent by the Edge Node. This value MUST never exceed 255. If the previous sequence number sent by the Edge Node was 255, the next sequence number sent MUST have a value of 0.

  • [tck-id-payloads-dbirth-order] All DBIRTH messages sent by an Edge Node MUST be sent immediately after the NBIRTH and before any NDATA or DDATA messages are published by the Edge Node.

  • [tck-id-payloads-dbirth-qos] DBIRTH messages MUST be published with the MQTT QoS set to 0.

  • [tck-id-payloads-dbirth-retain] DBIRTH messages MUST be published with the MQTT retain flag set to false.

The following is a representation of a simple DBIRTH message on the topic:

spBv1.0/Sparkplug B Devices/DBIRTH/Raspberry Pi/Pibrella

In the topic above the following information is known based on the Sparkplug topic definition:

  • The ‘Group ID’ is: Sparkplug B Devices

  • The ‘Edge Node ID’ is: Raspberry Pi

  • The ‘Device ID’ is: Pibrella

  • This is a DBIRTH message based on the 'DBIRTH' Sparkplug Verb

Consider the following Sparkplug B payload in the DBIRTH message shown above:

{
        "timestamp": 1486144502122,
        "metrics": [{
                "name": "Inputs/A",
                "timestamp": 1486144502122,
                "dataType": "Boolean",
                "value": false
        }, {
                "name": "Inputs/B",
                "timestamp": 1486144502122,
                "dataType": "Boolean",
                "value": false
        }, {
                "name": "Inputs/C",
                "timestamp": 1486144502122,
                "dataType": "Boolean",
                "value": false
        }, {
                "name": "Inputs/D",
                "timestamp": 1486144502122,
                "dataType": "Boolean",
                "value": false
        }, {
                "name": "Inputs/Button",
                "timestamp": 1486144502122,
                "dataType": "Boolean",
                "value": false
        }, {
                "name": "Outputs/E",
                "timestamp": 1486144502122,
                "dataType": "Boolean",
                "value": false
        }, {
                "name": "Outputs/F",
                "timestamp": 1486144502122,
                "dataType": "Boolean",
                "value": false
        }, {
                "name": "Outputs/G",
                "timestamp": 1486144502122,
                "dataType": "Boolean",
                "value": false
        }, {
                "name": "Outputs/H",
                "timestamp": 1486144502122,
                "dataType": "Boolean",
                "value": false
        }, {
                "name": "Outputs/LEDs/Green",
                "timestamp": 1486144502122,
                "dataType": "Boolean",
                "value": false
        }, {
                "name": "Outputs/LEDs/Red",
                "timestamp": 1486144502122,
                "dataType": "Boolean",
                "value": false
        }, {
                "name": "Outputs/LEDs/Yellow",
                "timestamp": 1486144502122,
                "dataType": "Boolean",
                "value": false
        }, {
                "name": "Outputs/Buzzer",
                "timestamp": 1486144502122,
                "dataType": "Boolean",
                "value": false
        }, {
                "name": "Properties/Hardware Make",
                "timestamp": 1486144502122,
                "dataType": "String",
                "value": "Pibrella"
        }],
        "seq": 1
}

This would result in a structure as follows on the Host Application.

Figure 11 – Sparkplug B Metric Structure 2

plantuml::assets/plantuml/sparkplugb-metric-structure-2.puml[format=svg, alt="Sparkplug B Metric Structure 2"]

NDATA

NDATA messages are used to update the values of any Edge Node metrics that were originally published in the NBIRTH message. Any time an input changes on the Edge Node, an NDATA message should be generated and published to the MQTT Server. If multiple metrics on the Edge Node change, they can all be included in a single NDATA message. It is also important to note that changes can be aggregated and published together in a single NDATA message. Because the Sparkplug B payload uses an ordered list of metrics, multiple different change events for multiple different metrics can all be included in a single NDATA message.

  • [tck-id-payloads-ndata-timestamp] NDATA messages MUST include a payload timestamp that denotes the time at which the message was published.

  • [tck-id-payloads-ndata-seq] Every NDATA message MUST include a sequence number.

  • [tck-id-payloads-ndata-seq-inc] Every NDATA message MUST include a sequence number value that is one greater than the previous sequence number sent by the Edge Node. This value MUST never exceed 255. If the previous sequence number sent by the Edge Node was 255, the next sequence number sent MUST have a value of 0.

  • [tck-id-payloads-ndata-order] All NDATA messages sent by an Edge Node MUST NOT be sent until all the NBIRTH and all DBIRTH messages have been published by the Edge Node.

  • [tck-id-payloads-ndata-qos] NDATA messages MUST be published with the MQTT QoS set to 0.

  • [tck-id-payloads-ndata-retain] NDATA messages MUST be published with the MQTT retain flag set to false.

The following is a representation of a simple NDATA message on the topic:

spBv1.0/Sparkplug B Devices/NDATA/Raspberry Pi

In the topic above the following information is known based on the Sparkplug topic definition:

  • The ‘Group ID’ is: Sparkplug B Devices

  • The ‘Edge Node ID’ is: Raspberry Pi

  • This is an NDATA message based on the 'NDATA' Sparkplug Verb

Consider the following Sparkplug B payload in the NDATA message shown above:

{
        "timestamp": 1486144502122,
        "metrics": [{
                "name": "Supply Voltage",
                "timestamp": 1486144502122,
                "dataType": "Float",
                "value": 12.3
        }],
        "seq": 2
}

This would result in the host application updating the value of the 'Supply Voltage' metric.

DDATA

DDATA messages are used to update the values of any device metrics that were originally published in the DBIRTH message. Any time an input changes on the device, a DDATA message should be generated and published to the MQTT Server. If multiple metrics on the device change, they can all be included in a single DDATA message. It is also important to note that changes can be aggregated and published together in a single DDATA message. Because the Sparkplug B payload uses an ordered list of metrics, multiple different change events for multiple different metrics can all be included in a single DDATA message.

  • [tck-id-payloads-ddata-timestamp] DDATA messages MUST include a payload timestamp that denotes the time at which the message was published.

  • [tck-id-payloads-ddata-seq] Every DDATA message MUST include a sequence number.

  • [tck-id-payloads-ddata-seq-inc] Every DDATA message MUST include a sequence number value that is one greater than the previous sequence number sent by the Edge Node. This value MUST never exceed 255. If the previous sequence number sent by the Edge Node was 255, the next sequence number sent MUST have a value of 0.

  • [tck-id-payloads-ddata-order] All DDATA messages sent by an Edge Node MUST NOT be sent until all the NBIRTH and all DBIRTH messages have been published by the Edge Node.

  • [tck-id-payloads-ddata-qos] DDATA messages MUST be published with the MQTT QoS set to 0.

  • [tck-id-payloads-ddata-retain] DDATA messages MUST be published with the MQTT retain flag set to false.

The following is a representation of a simple DDATA message on the topic:

spBv1.0/Sparkplug B Devices/DDATA/Raspberry Pi/Pibrella

  • The ‘Group ID’ is: Sparkplug B Devices

  • The ‘Edge Node ID’ is: Raspberry Pi

  • The ‘Device ID’ is: Pibrella

  • This is a DDATA message based on the 'DDATA' Sparkplug Verb

Consider the following Sparkplug B payload in the DDATA message shown above:

{
        "timestamp": 1486144502122,
        "metrics": [{
                "name": "Inputs/A",
                "timestamp": 1486144502122,
                "dataType": "Boolean",
                "value": true
        }, {
                "name": "Inputs/C",
                "timestamp": 1486144502122,
                "dataType": "Boolean",
                "value": true
        }],
        "seq": 0
}

This would result in the Host Application updating the value of the ‘Inputs/A’ metric and ‘Inputs/C’ metric.
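As with NDATA, the key points are the shared per-Edge-Node sequence counter, QoS 0, and the retain flag set to false. The sketch below is non-normative and assumes the Eclipse Paho Python client (paho-mqtt, 1.x constructor shown) purely as a convenient transport; the broker address, client ID, and seq value are illustrative, and the JSON string stands in for the real protobuf-encoded payload.

import json
import time

import paho.mqtt.client as mqtt  # assumed transport library (paho-mqtt 1.x API)

GROUP_ID = "Sparkplug B Devices"
EDGE_NODE_ID = "Raspberry Pi"
DEVICE_ID = "Pibrella"

now = int(time.time() * 1000)
ddata_payload = {
    "timestamp": now,
    "metrics": [
        {"name": "Inputs/A", "timestamp": now, "dataType": "Boolean", "value": True},
        {"name": "Inputs/C", "timestamp": now, "dataType": "Boolean", "value": True},
    ],
    "seq": 1,  # one greater than the previous payload sent by this Edge Node
}

client = mqtt.Client(client_id="edge-raspberry-pi")  # illustrative client ID
client.connect("localhost", 1883)

topic = "spBv1.0/{}/DDATA/{}/{}".format(GROUP_ID, EDGE_NODE_ID, DEVICE_ID)
# DDATA messages are published with QoS 0 and the retain flag set to false.
client.publish(topic, json.dumps(ddata_payload), qos=0, retain=False)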

NCMD

NCMD messages are used by Host Applications to write to Edge Node outputs and send Node Control commands to Edge Nodes. Multiple metrics can be supplied in a single NCMD message.

  • [tck-id-payloads-ncmd-timestamp] NCMD messages MUST include a payload timestamp that denotes the time at which the message was published.

  • [tck-id-payloads-ncmd-seq] Every NCMD message MUST NOT include a sequence number.

  • [tck-id-payloads-ncmd-qos] NCMD messages MUST be published with the MQTT QoS set to 0.

  • [tck-id-payloads-ncmd-retain] NCMD messages MUST be published with the MQTT retain flag set to false.

The following is a representation of a simple NCMD message on the topic:

spBv1.0/Sparkplug B Devices/NCMD/Raspberry Pi

  • The ‘Group ID’ is: Sparkplug B Devices

  • The ‘Edge Node ID’ is: Raspberry Pi

  • This is an NCMD message based on the 'NCMD' Sparkplug Verb

Consider the following Sparkplug B payload in the NCMD message shown above:

{
        "timestamp": 1486144502122,
        "metrics": [{
                "name": "Node Control/Rebirth",
                "timestamp": 1486144502122,
                "dataType": "Boolean",
                "value": true
        }]
}

This NCMD payload tells the Edge Node to republish its NBIRTH and DBIRTH(s) messages. This can be requested if a Host Application gets an out of order seq number or if a metric arrives in an NDATA or DDATA message that was not provided in the original NBIRTH or DBIRTH messages.
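A minimal, non-normative sketch of how an Edge Node might react to this Rebirth request follows. The payload is assumed to have already been decoded into the dictionary form shown above, and republish_births() is a hypothetical callback standing in for whatever logic re-sends the NBIRTH and DBIRTH messages.

def handle_ncmd(payload, republish_births):
    """Triggers a rebirth when the 'Node Control/Rebirth' metric is true."""
    for metric in payload.get("metrics", []):
        if metric.get("name") == "Node Control/Rebirth" and metric.get("value") is True:
            # Re-send the NBIRTH (with seq reset to 0) followed by all
            # DBIRTH messages for attached devices.
            republish_births()

ncmd = {
    "timestamp": 1486144502122,
    "metrics": [{
        "name": "Node Control/Rebirth",
        "timestamp": 1486144502122,
        "dataType": "Boolean",
        "value": True,
    }],
}
handle_ncmd(ncmd, republish_births=lambda: print("republishing NBIRTH/DBIRTH"))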

DCMD

DCMD messages are used by Host Applications to write to device outputs and send Device Control commands to devices. Multiple metrics can be supplied in a single DCMD message.

  • [tck-id-payloads-dcmd-timestamp] DCMD messages MUST include a payload timestamp that denotes the time at which the message was published.

  • [tck-id-payloads-dcmd-seq] Every DCMD message MUST NOT include a sequence number.

  • [tck-id-payloads-dcmd-qos] DCMD messages MUST be published with the MQTT QoS set to 0.

  • [tck-id-payloads-dcmd-retain] DCMD messages MUST be published with the MQTT retain flag set to false.

The following is a representation of a simple DCMD message on the topic:

spBv1.0/Sparkplug B Devices/DCMD/Raspberry Pi/Pibrella

  • The ‘Group ID’ is: Sparkplug B Devices

  • The ‘Edge Node ID’ is: Raspberry Pi

  • The ‘Device ID’ is: Pibrella

  • This is a DCMD message based on the 'DCMD' Sparkplug Verb

Consider the following Sparkplug B payload in the DCMD message shown above:

{
        "timestamp": 1486144502122,
        "metrics": [{
                "name": "Outputs/LEDs/Green",
                "timestamp": 1486144502122,
                "dataType": "Boolean",
                "value": true
        }, {
                "name": "Outputs/LEDs/Yellow",
                "timestamp": 1486144502122,
                "dataType": "Boolean",
                "value": true
        }]
}

The DCMD payload tells the Edge Node to write true to the attached device’s green and yellow LEDs. As a result, the LEDs should turn on and the Edge Node should publish a DDATA message back to the MQTT Server confirming the new output values.
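The following non-normative sketch shows one way an Edge Node might apply this DCMD to the attached device and confirm the result. write_output() and publish_ddata() are hypothetical stand-ins for the device I/O layer and the Sparkplug publishing layer, and the payload is assumed to have been decoded into the dictionary form shown above.

def handle_dcmd(payload, write_output, publish_ddata):
    """Applies each metric in a decoded DCMD payload to the device outputs,
    then reports the resulting values back in a single DDATA message."""
    confirmed = []
    for metric in payload.get("metrics", []):
        # Write the commanded value to the physical output, e.g. an LED.
        write_output(metric["name"], metric["value"])
        confirmed.append((metric["name"], metric["dataType"], metric["value"]))
    # After the outputs change, report the new values back to the Host
    # Application in one DDATA message (QoS 0, retain false).
    publish_ddata(confirmed)

dcmd = {
    "timestamp": 1486144502122,
    "metrics": [
        {"name": "Outputs/LEDs/Green", "timestamp": 1486144502122,
         "dataType": "Boolean", "value": True},
        {"name": "Outputs/LEDs/Yellow", "timestamp": 1486144502122,
         "dataType": "Boolean", "value": True},
    ],
}
handle_dcmd(dcmd,
            write_output=lambda name, value: print("set {} -> {}".format(name, value)),
            publish_ddata=lambda metrics: print("DDATA: {}".format(metrics)))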

NDEATH

The NDEATH messages are registered with the MQTT Server in the MQTT CONNECT packet as the 'Will Message'. This is used by Host Applications to know when an Edge Node has lost its MQTT connection with the MQTT Server.

  • [tck-id-payloads-ndeath-seq] Every NDEATH message MUST NOT include a sequence number.

  • [tck-id-payloads-ndeath-will-message] An NDEATH message MUST be registered as a Will Message in the MQTT CONNECT packet.

  • [tck-id-payloads-ndeath-will-message-qos] The NDEATH message MUST set the MQTT Will QoS to 1 in the MQTT CONNECT packet.

  • [tck-id-payloads-ndeath-will-message-retain] The NDEATH message MUST set the MQTT Will Retained flag to false in the MQTT CONNECT packet.

  • [tck-id-payloads-ndeath-bdseq] The NDEATH message MUST include the same bdSeq number value that will be used in the associated NBIRTH message.

    • This is used by Host Applications to correlate the NDEATH messages with a previously received NBIRTH message.

    • It is important to note that any new CONNECT packet must increment the bdSeq number in the payload compared to what was in the previous CONNECT packet. This ensures that any Host Applications will be able to distinguish between current and old bdSeq numbers in the event that messages are delivered out of order. When incrementing the bdSeq number, if the previous value was 255, the next must be zero.

  • [tck-id-payloads-ndeath-will-message-publisher] An NDEATH message SHOULD be published by the Edge Node before it intentionally disconnects from the MQTT Server.

    • This allows Host Applications advanced notice that an Edge Node has disconnected rather than waiting for the NDEATH to be delivered by the MQTT Server based on an MQTT keep alive timeout.

  • [tck-id-payloads-ndeath-will-message-publisher-disconnect-mqtt311] If the Edge Node is using MQTT 3.1.1 and it sends an MQTT DISCONNECT packet, the Edge Node MUST publish an NDEATH message to the MQTT Server before it sends the MQTT DISCONNECT packet.

    • This is to ensure Host Applications are notified that the Edge Node is disconnecting. Because an MQTT DISCONNECT packet is sent, the MQTT Server will not deliver the Will Message/NDEATH on behalf of the disconnecting Edge Node.

  • [tck-id-payloads-ndeath-will-message-publisher-disconnect-mqtt50] If the Edge Node is using MQTT 5.0 and it sends an MQTT DISCONNECT packet, the MQTT v5.0 'Disconnect with Will Message' reason code MUST be set in the DISCONNECT packet.

    • This is to ensure Host Applications are notified that the Edge Node is disconnecting by the MQTT Server.

  • An NDEATH message MAY include a timestamp.

    • It should be noted that this timestamp is typically set at the time of the MQTT CONNECT message and as a result may not be useful to Host Applications. If the timestamp is set, Host Applications SHOULD NOT use it to determine corresponding NBIRTH messages. Instead, the bdSeq number used in the NBIRTH and NDEATH messages MUST be used to determine that an NDEATH matches a prior NBIRTH.

The following is a representation of an NDEATH message on the topic:

spBv1.0/Sparkplug B Devices/NDEATH/Raspberry Pi

  • The ‘Group ID’ is: Sparkplug B Devices

  • The ‘Edge Node ID’ is: Raspberry Pi

  • This is an NDEATH message based on the 'NDEATH' Sparkplug Verb

Consider the following Sparkplug B payload in the NDEATH message shown above:

{
        "timestamp": 1486144502122,
        "metrics": [{
                "name": "bdSeq",
                "timestamp": 1486144502122,
                "dataType": "UInt64",
                "value": 0
        }]
}

The payload metric named bdSeq allows a Host Application to reconcile this NDEATH with the NBIRTH that occurred previously.
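A non-normative sketch of the Will registration and bdSeq handling described above, assuming the Eclipse Paho Python client (paho-mqtt, 1.x constructor shown) as the transport. On the wire the NDEATH payload is a Sparkplug B protobuf; the JSON string is used here purely for illustration, and the host, port, and client ID are illustrative.

import json
import time

import paho.mqtt.client as mqtt  # assumed transport library (paho-mqtt 1.x API)

bd_seq = -1  # so the very first call to next_bd_seq() yields 0

def next_bd_seq():
    """Increments the bdSeq for each new MQTT CONNECT, wrapping 255 -> 0."""
    global bd_seq
    bd_seq = (bd_seq + 1) % 256
    return bd_seq

def connect_edge_node(host, group_id, edge_node_id):
    seq_for_session = next_bd_seq()
    now = int(time.time() * 1000)
    ndeath_payload = {
        "timestamp": now,
        "metrics": [{"name": "bdSeq",
                     "timestamp": now,
                     "dataType": "UInt64",
                     "value": seq_for_session}],
    }
    client = mqtt.Client(client_id="{}-{}".format(group_id, edge_node_id))
    # Register NDEATH as the Will Message: QoS 1, retain false.
    client.will_set("spBv1.0/{}/NDEATH/{}".format(group_id, edge_node_id),
                    payload=json.dumps(ndeath_payload),
                    qos=1,
                    retain=False)
    client.connect(host, 1883)
    return client, seq_for_session

The NBIRTH published immediately after connecting must carry the same bdSeq value returned here so that a Host Application can pair a later NDEATH with that NBIRTH.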

DDEATH

The DDEATH messages are published by an Edge Node on behalf of an attached device. If the Edge Node determines that a device is no longer accessible (e.g. it has turned off or stopped responding), the Edge Node should publish a DDEATH to denote that device connectivity has been lost.

  • [tck-id-payloads-ddeath-timestamp] DDEATH messages MUST include a payload timestamp that denotes the time at which the message was published.

  • [tck-id-payloads-ddeath-seq] Every DDEATH message MUST include a sequence number.

  • [tck-id-payloads-ddeath-seq-inc] Every DDEATH message MUST include a sequence number value that is one greater than the previous sequence number sent by the Edge Node. This value MUST never exceed 255. If the previous sequence number sent by the Edge Node was 255, the next sequence number sent MUST have a value of 0.

The following is a representation of a simple DDEATH message on the topic:

spBv1.0/Sparkplug B Devices/DDEATH/Raspberry Pi/Pibrella

  • The ‘Group ID’ is: Sparkplug B Devices

  • The ‘Edge Node ID’ is: Raspberry Pi

  • The ‘Device ID’ is: Pibrella

  • This is a DDEATH message based on the 'DDEATH' Sparkplug Verb

Consider the following Sparkplug B payload in the DDEATH message shown above:

{
        "timestamp": 1486144502122,
        "seq": 123
}

[tck-id-payloads-ddeath-seq-number] A sequence number MUST be included with the DDEATH messages so the Host Application can ensure order of messages and maintain the state of the data.
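A minimal, non-normative sketch of building a DDEATH payload when a device stops responding. The sequence value comes from the same per-Edge-Node counter used for NDATA and DDATA messages (next_seq() here is the hypothetical helper sketched earlier), and the JSON form again only represents the protobuf payload.

import time

def build_ddeath(next_seq):
    """DDEATH carries only a timestamp and the next Edge Node seq value."""
    return {
        "timestamp": int(time.time() * 1000),
        "seq": next_seq(),
    }

# Published on spBv1.0/Sparkplug B Devices/DDEATH/Raspberry Pi/Pibrella when
# the Pibrella stops responding.
print(build_ddeath(next_seq=lambda: 123))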

STATE

As noted previously, the STATE messages published by Sparkplug Host Applications do not use Sparkplug B payloads. STATE messages are used by Sparkplug Host Applications to indicate to Edge Nodes whether or not the Sparkplug Host Application is online and operational. A non-normative sketch of the connect/subscribe/publish sequence follows the requirements below.

  • [tck-id-payloads-state-will-message] Sparkplug Host Applications MUST register a Will Message in the MQTT CONNECT packet on the topic 'spBv1.0/STATE/[sparkplug_host_id]'.

    • The [sparkplug_host_id] should be replaced with the Sparkplug Host Application’s ID. This can be any UTF-8 string.

  • [tck-id-payloads-state-will-message-qos] The Sparkplug Host Application MUST set the MQTT Will QoS to 1 in the MQTT CONNECT packet.

  • [tck-id-payloads-state-will-message-retain] The Sparkplug Host Application MUST set the Will Retained flag to true in the MQTT CONNECT packet.

  • [tck-id-payloads-state-will-message-payload] The Death Certificate Payload MUST be JSON UTF-8 data. It MUST include two key/value pairs where one key MUST be 'online' and its value is a boolean 'false'. The other key MUST be 'timestamp' and the value MUST be a numeric value representing the current UTC time in milliseconds since Epoch.

  • [tck-id-payloads-state-subscribe] After establishing an MQTT connection, the Sparkplug Host Application MUST subscribe on its own 'spBv1.0/STATE/[sparkplug_host_id]' topic.

    • The [sparkplug_host_id] should be replaced with the Sparkplug Host Application’s ID. This can be any UTF-8 string.

    • Non-normative comment: This allows the Sparkplug Host Application to handle timing issues around STATE 'offline' messages being published on its behalf by the MQTT Server when it is in fact online.

  • [tck-id-payloads-state-birth] After subscribing on its own spBv1.0/STATE/[sparkplug_host_id] topic, the Sparkplug Host Application MUST publish an MQTT message on the topic 'spBv1.0/STATE/[sparkplug_host_id]' with a QoS of 1, and the retain flag set to true.

    • The [sparkplug_host_id] should be replaced with the Sparkplug Host Application’s ID. This can be any UTF-8 string.

  • [tck-id-payloads-state-birth-payload] The Birth Certificate Payload MUST be JSON UTF-8 data. It MUST include two key/value pairs where one key MUST be 'online' and its value is a boolean 'true'. The other key MUST be 'timestamp' and the value MUST match the timestamp value that was used in the immediately prior MQTT CONNECT packet Will Message payload.
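Unlike the other messages in this section, STATE payloads really are JSON on the wire, so the sketch below is close to what a Host Application would actually send. It is still non-normative and assumes the Eclipse Paho Python client (paho-mqtt, 1.x constructor shown), a hypothetical host ID of 'host-app-1', and an illustrative broker address.

import json
import time

import paho.mqtt.client as mqtt  # assumed transport library (paho-mqtt 1.x API)

HOST_ID = "host-app-1"  # hypothetical sparkplug_host_id
STATE_TOPIC = "spBv1.0/STATE/{}".format(HOST_ID)

# The same timestamp is used in the Will (Death Certificate) payload and in
# the Birth Certificate published after connecting, as required above.
connect_timestamp = int(time.time() * 1000)

death_payload = json.dumps({"online": False, "timestamp": connect_timestamp})
birth_payload = json.dumps({"online": True, "timestamp": connect_timestamp})

client = mqtt.Client(client_id=HOST_ID)

# Death Certificate registered as the Will Message: QoS 1, retain true.
client.will_set(STATE_TOPIC, payload=death_payload, qos=1, retain=True)

client.connect("localhost", 1883)

# Subscribe to our own STATE topic first, so a stale retained or queued
# 'offline' message can be detected and corrected while we are online.
client.subscribe(STATE_TOPIC, qos=1)

# Then publish the Birth Certificate: QoS 1, retain true.
client.publish(STATE_TOPIC, payload=birth_payload, qos=1, retain=True)

client.loop_start()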