
protobuf.js


Protocol Buffers are a language-neutral, platform-neutral, extensible way of serializing structured data for use in communications protocols, data storage, and more, originally designed at Google.

protobuf.js is a pure JavaScript implementation for node and the browser. It efficiently encodes plain objects and custom classes and works out of the box with .proto files.

Recommended read: Changes in protobuf.js 6.0

Features

Contents

  • Usage
    How to include protobuf.js in your project.

  • Examples
    A few examples to get you started.

  • Module Structure
    A brief introduction to the structure of the exported module.

  • Documentation
    A list of available documentation resources.

  • Command line
    How to use the command line utility.

  • Building
    How to build the library and its components yourself.

  • Performance
    A few internals and a benchmark on performance.

  • Compatibility
    Notes on compatibility regarding browsers and optional libraries.

Usage

node.js

$> npm install protobufjs
var protobuf = require("protobufjs");

Browsers

Development:

<script src="//cdn.rawgit.com/dcodeIO/protobuf.js/6.0.1/dist/protobuf.js"></script>

Production:

<script src="//cdn.rawgit.com/dcodeIO/protobuf.js/6.0.1/dist/protobuf.min.js"></script>

The protobuf namespace will be available globally.

NOTE: Remember to replace the version tag with the exact release your project depends upon.

Examples

Using .proto files

// awesome.proto
syntax = "proto3";

package awesomepackage;

message AwesomeMessage {
    string awesome_field = 1; // becomes awesomeField
}

protobuf.load("awesome.proto", function(err, root) {
    if (err) throw err;
    
    // Obtain a message type
    var AwesomeMessage = root.lookup("awesomepackage.AwesomeMessage");

    // Create a new message
    var message = AwesomeMessage.create({ awesomeField: "AwesomeString" });

    // Encode a message
    var buffer = AwesomeMessage.encode(message).finish();
    // ... do something with buffer

    // Or, encode a plain object
    var buffer = AwesomeMessage.encode({ awesomeField: "AwesomeString" }).finish();
    // ... do something with buffer

    // Decode a buffer
    var message = AwesomeMessage.decode(buffer);
    // ... do something with message

    // If your application uses length-delimited buffers, there is also encodeDelimited and decodeDelimited. 
});

You can also use promises by omitting the callback:

protobuf.load("awesome.proto")
    .then(function(root) {
       ...
    });
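The encodeDelimited and decodeDelimited variants mentioned above prefix each message with its byte length encoded as a varint. A minimal sketch of that framing over plain byte arrays, not the library's actual writer and reader classes:

```javascript
// Append value as an unsigned varint (base-128, little-endian) to bytes.
function writeVarint(value, bytes) {
    while (value >= 128) {
        bytes.push((value & 127) | 128);
        value >>>= 7;
    }
    bytes.push(value);
    return bytes;
}

// Frame a payload with its varint-encoded length, like encodeDelimited.
function frame(payload) {
    return writeVarint(payload.length, []).concat(payload);
}

// Read one length-delimited frame starting at offset, like decodeDelimited.
function readFrame(bytes, offset) {
    var length = 0, shift = 0;
    while (bytes[offset] & 128) {
        length |= (bytes[offset++] & 127) << shift;
        shift += 7;
    }
    length |= bytes[offset++] << shift;
    return { payload: bytes.slice(offset, offset + length), offset: offset + length };
}

var framed = frame([1, 2, 3]); // [3, 1, 2, 3]
```

Because each frame carries its own length, multiple messages can be concatenated into one buffer and read back by advancing the returned offset.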

Using reflection only

...
var Root  = protobuf.Root,
    Type  = protobuf.Type,
    Field = protobuf.Field;

var AwesomeMessage = new Type("AwesomeMessage").add(new Field(1, "awesomeField", "string"));

var root = new Root().define("awesomepackage").add(AwesomeMessage);

// Continue at "Create a new message" above
...

Using custom classes

...
var Prototype = protobuf.Prototype;

function AwesomeMessage(properties) {
    Prototype.call(this, properties);
}
protobuf.inherits(AwesomeMessage, root.lookup("awesomepackage.AwesomeMessage") /* or use reflection */);

var message = new AwesomeMessage({ awesomeField: "AwesomeString" });

// Continue at "Encode a message" above

Custom classes are automatically populated with static encode, encodeDelimited, decode, decodeDelimited and verify methods and reference their reflected type via the $type property. Note that there are no methods (just $type) on instances by default as method names might conflict with field names.

Usage with TypeScript

/// <reference path="node_modules/protobufjs/types/protobuf.js.d.ts" />

import * as protobuf from "protobufjs";
...

Module Structure

The library exports a flat protobuf namespace with the following members, ordered by category:

Parser

  • load(filename: string|Array, [root: Root], [callback: function(err: Error, [root: Root])]): Promise [source]
    Loads one or multiple .proto files into the specified root or creates a new one when omitted.

  • tokenize(source: string): Object [source]
    Tokenizes the given .proto source and returns an object with useful utility functions.

  • parse(source: string): Object [source]
    Parses the given .proto source and returns an object with the parsed contents.

    • package: string|undefined
      The package name, if declared.

    • imports: Array|undefined
      File names of imported files, if any.

    • weakImports: Array|undefined
      File names of weakly imported files, if any.

    • syntax: string|undefined
      Source syntax, if defined.

    • root: Root
      The root namespace.

Serialization

  • Writer [source]
    Wire format writer using Uint8Array if available, otherwise Array.

  • BufferWriter extends Writer [source]
    Wire format writer using node buffers.

  • Reader [source]
    Wire format reader using Uint8Array if available, otherwise Array.

  • BufferReader extends Reader [source]
    Wire format reader using node buffers.

  • Encoder [source]
    Wire format encoder using code generation on top of reflection.

  • Decoder [source]
    Wire format decoder using code generation on top of reflection.

  • Verifier [source]
    Runtime message verifier using code generation on top of reflection.

Reflection

  • ReflectionObject [source]
    Base class of all reflection objects.

  • Namespace extends ReflectionObject [source]
    Base class of all reflection objects containing nested objects.

  • Root extends Namespace [source]
    Root namespace.

  • Type extends Namespace [source]
    Reflected message type.

  • Field extends ReflectionObject [source]
    Reflected message field.

  • MapField extends Field [source]
    Reflected message map field.

  • Enum extends ReflectionObject [source]
    Reflected enum.

  • Service extends Namespace [source]
    Reflected service.

  • Method extends ReflectionObject [source]
    Reflected service method.

Runtime

  • inherits(clazz: Function, type: Type, [options: Object.<string,*>]): Prototype [source]
    Inherits a custom class from the message prototype of the specified message type.

  • Prototype [source]
    Runtime message prototype ready to be extended by custom classes or generated code.

Utility

  • util: Object [source]
    Utility functions.

  • common(name: string, json: Object) [source]
    Provides common type definitions.

  • types: Object [source]
    Common type constants.

Documentation

Data type recommendations

Value type               | protobuf type | Size / Notes
-------------------------|---------------|-------------
Unsigned 32 bit integers | uint32        | 1 to 5 bytes.
Signed 32 bit integers   | sint32        | 1 to 5 bytes. Do not use int32 (always encodes negative values as 10 bytes).
Unsigned 52 bit integers | uint64        | 1 to 10 bytes.
Signed 52 bit integers   | sint64        | 1 to 10 bytes. Do not use int64 (always encodes negative values as 10 bytes).
Unsigned 64 bit integers | uint64        | Use with long.js. 1 to 10 bytes.
Signed 64 bit integers   | sint64        | Use with long.js. 1 to 10 bytes. Do not use int64 (always encodes negative values as 10 bytes).
32 bit precision floats  | float         | 4 bytes.
64 bit precision floats  | double        | 8 bytes. Use float if 32 bits of precision are enough.
Boolean values           | bool          | 1 byte.
Strings                  | string        | 1 byte + utf8 byte length.
Buffers                  | bytes         | 1 byte + byte length.
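The advice above follows from varint encoding: each varint byte carries 7 payload bits, and int32 / int64 sign-extend negative values to 64 bits, which always costs 10 bytes, while sint32 / sint64 first zigzag-encode so that values of small magnitude stay small. A sketch of the two ingredients (illustrative, not the library's internals):

```javascript
// Zigzag-encode a signed 32 bit integer: 0→0, -1→1, 1→2, -2→3, ...
function zigzag32(n) {
    return ((n << 1) ^ (n >> 31)) >>> 0;
}

// Number of bytes a non-negative value occupies as a varint
// (7 payload bits per byte).
function varintSize(value) {
    var size = 1;
    while (value >= 128) {
        value = Math.floor(value / 128);
        size++;
    }
    return size;
}

varintSize(zigzag32(-1));    // 1 byte as sint32
// int32 sign-extends -1 to 2^64 - 1 (approximated here by a double),
// which always takes the full 10 varint bytes.
varintSize(Math.pow(2, 64)); // 10
```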

Command line

The pbjs command line utility can be used to bundle and translate between .proto and .json files.

Consolidates imports and converts between file formats.

  -t, --target    Specifies the target format. [json, proto2, proto3, static]
                  Also accepts a path to require a custom target.

  -p, --path      Adds a directory to the include path.

  -o, --out       Saves to a file instead of writing to stdout.

  -w, --wrap      Specifies an alternative wrapper for the static target.

usage: pbjs [options] file1.proto file2.json ...

For production environments it is recommended to bundle all your .proto files to a single .json file, which reduces the number of network requests and parser invocations required:

$> pbjs -t json file1.proto file2.proto > bundle.json

Now, either include this file in your final bundle:

var root = protobuf.Root.fromJSON(require("./bundle.json"));

or load it the usual way:

protobuf.load("bundle.json", function(err, root) {
    ...
});

Building

To build the library or its components yourself, clone it from GitHub and install the development dependencies:

$> git clone https://github.com/dcodeIO/protobuf.js.git
$> cd protobuf.js
$> npm install --dev

Building the development and production versions with their respective source maps to dist/:

$> npm run build

Building the documentation to docs/:

$> npm run docs

Building the TypeScript definition to types/:

$> npm run types

Browserify integration

protobuf.js integrates into any browserify build-process. There are a few possible tweaks:

  • If performance is a concern or IE8 support is required, you should make sure to exclude the browserified buffer module and let protobuf.js do its thing with Uint8Array/Array instead.
  • If you do not need int64 support, you can exclude the long module.
  • If your application does not rely on the following modules and/or package size is a concern, you can also exclude process, _process and fs.
  • If you have any special requirements, there is the bundler as a reference.

Performance

The package includes a benchmark that tries to compare performance to native JSON as far as this is possible. On an i7-2600K running node 6.9.1 it yields:

benchmarking encoding performance ...

Type.encode to buffer x 471,717 ops/sec ±1.30% (91 runs sampled)
JSON.stringify to string x 310,406 ops/sec ±1.00% (90 runs sampled)
JSON.stringify to buffer x 172,766 ops/sec ±1.20% (84 runs sampled)

      Type.encode to buffer was fastest
   JSON.stringify to string was 34.0% slower
   JSON.stringify to buffer was 63.3% slower

benchmarking decoding performance ...

Type.decode from buffer x 1,285,867 ops/sec ±0.70% (90 runs sampled)
JSON.parse from string x 292,106 ops/sec ±1.00% (89 runs sampled)
JSON.parse from buffer x 259,361 ops/sec ±0.92% (90 runs sampled)

    Type.decode from buffer was fastest
     JSON.parse from string was 77.4% slower
     JSON.parse from buffer was 79.9% slower

benchmarking combined performance ...

Type to/from buffer x 238,382 ops/sec ±0.96% (89 runs sampled)
JSON to/from string x 127,352 ops/sec ±0.73% (93 runs sampled)
JSON to/from buffer x 89,593 ops/sec ±0.85% (87 runs sampled)

        Type to/from buffer was fastest
        JSON to/from string was 46.5% slower
        JSON to/from buffer was 62.4% slower

Note that JSON is a native binding nowadays and as such is really fast. So, how can protobuf.js be faster?

  • The benchmark is somewhat flawed.
  • Reader and writer interfaces configure themselves according to the environment to eliminate redundant conditionals.
  • Node-specific reader and writer subclasses benefit from node's buffer binding.
  • Reflection has built-in code generation that builds type-specific encoders, decoders and verifiers at runtime.
  • Encoders and decoders do not verify that required fields are present (with proto3 this is dead code anyway). There is a verify method to check this manually instead - where applicable.
  • For entirely bogus values, encoders intentionally rely on runtime errors being thrown somewhere down the road.
  • Quite a bit of V8-specific profiling is accountable for everything else.

Note that code generation requires new Function(...) (basically eval) support and that an equivalent but slower fallback will be used where unsupported.
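The codegen idea can be illustrated with a toy encoder generator; the field descriptors and the flat output format here are made up for illustration and bear no relation to the actual wire format:

```javascript
// Build a specialized encoder for a fixed field list at runtime, in the
// spirit of reflection-driven code generation: the field loop runs once,
// at generation time, producing straight-line code with no conditionals.
function makeEncoder(fields) {
    var src = "var out = [];\n";
    fields.forEach(function(field) {
        src += "out.push(" + field.id + ", message." + field.name + ");\n";
    });
    src += "return out;";
    return new Function("message", src);
}

// Hypothetical field layout for demonstration purposes.
var encode = makeEncoder([
    { id: 1, name: "awesomeField" },
    { id: 2, name: "otherField" }
]);
encode({ awesomeField: 42, otherField: 7 }); // [1, 42, 2, 7]
```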

Also note that, as of this writing, the benchmark suite performs significantly slower on node 7.2.0 compared to 6.9.1 for reasons that have not been tracked down yet.

Compatibility


  • Because the internals of this package do not rely on google/protobuf/descriptor.proto, options are parsed and presented literally.
  • If typed arrays are not supported by the environment, plain arrays will be used instead.
  • Support for pre-ES5 environments like IE8 can be achieved by using a polyfill and, instead of using property getters and setters on reflection objects, calling the respective functions prefixed with get, set or is directly (i.e. calling Type#getFieldsById() instead of accessing Type#fieldsById).
  • If you need a proper way to work with 64 bit values (uint64, int64 etc.), you can install long.js alongside this library. All 64 bit numbers will then be returned as a Long instance instead of a possibly unsafe JavaScript number (see).
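The reason a plain number is unsafe: JavaScript numbers are IEEE 754 doubles and represent integers exactly only up to Number.MAX_SAFE_INTEGER (2^53 - 1). A quick demonstration, plus a hypothetical sketch (not long.js's API) of the two-32-bit-halves representation such libraries use:

```javascript
// Above 2^53 - 1, distinct 64 bit integers collapse to the same double:
var big = Math.pow(2, 53);
big === big + 1; // true

// Keeping a 64 bit value as two exact 32 bit halves avoids the problem.
// toHalves is a hypothetical helper for illustration, not part of long.js.
function toHalves(hex16) {
    return {
        high: parseInt(hex16.slice(0, 8), 16) | 0,
        low:  parseInt(hex16.slice(8), 16) | 0
    };
}
toHalves("0020000000000001"); // { high: 2097152, low: 1 } - exact
```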

License: Apache License, Version 2.0, bundled external libraries may have their own license
