NDN EBAMS node running in mini-ndn



Mini-bms is an update of the NDN BMS work; it features sensor data aggregation by type and location, and can optionally run in mini-ndn.

The design, implementation, and current deployment are explained in this poster at NDNComm 2015. A screenshot of the sample visualization unit is available here.


  • BMS publisher: gateway-publisher/bms_publisher.py
    • The gateway publisher reads sensor data from a file, parses each entry, generates an NDN data packet, and inserts it into its own memory content cache.
    • A sensor data entry in the input file looks like:
      [2015-02-04 02:00:52.986]: 128:8: Process message (point UCLA:YOUNG_LIBRY.B1716.CHWS.RT 1 -522.84515380859375 0 0 0 1423044067 419999837 577 192)
    • The publisher uses the result of csv_reader.py to decide the mapping from sensor data entries to NDN names. csv_reader.py reads bms-sensor-data-types-sanitized.csv to generate the mapping.
  • BMS node: bms_node.py
    • The BMS node reads a configuration file to determine its node name, which types of data it should request, and what aggregated data it should produce.
    • The BMS node uses config_split.py to parse the configuration file. Example configuration files are given in the confs folder.
  • BMS consumer: consumer/index.html
    • An in-browser NDN consumer and visualizer of the produced aggregated data.
  • BMS certification service: a quick hack of a BMS certificate-issuing web service, so that the aggregation nodes and the gateway publisher can have their certificates signed by BMS's root of trust. This service signs any certificate it receives. The consumer can then verify the BMS data, with the correct root of trust installed in certs/anchor.cert. This site is based on ndncert.
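As a rough sketch of the publisher's first step, a sensor log entry like the one shown above can be parsed and mapped to an NDN-style name. The regular expression, the /ndn/bms prefix, and the helper names below are illustrative assumptions; the real name mapping is produced by csv_reader.py from the CSV file.

```python
import re

# Hypothetical prefix; the real mapping comes from csv_reader.py.
BMS_PREFIX = "/ndn/bms"

LOG_RE = re.compile(
    r"\[(?P<timestamp>[^\]]+)\]:.*\(point (?P<point>\S+) (?P<fields>[^)]*)\)")

def parse_entry(line):
    """Parse one sensor log entry into (timestamp, point name, raw fields)."""
    m = LOG_RE.search(line)
    if m is None:
        return None
    return m.group("timestamp"), m.group("point"), m.group("fields").split()

def point_to_name(point):
    """Turn a point like 'UCLA:YOUNG_LIBRY.B1716.CHWS.RT' into an NDN-style name."""
    components = point.replace(":", ".").split(".")
    return BMS_PREFIX + "/" + "/".join(components)
```

For the example entry in the list above, `point_to_name` would produce a name such as `/ndn/bms/UCLA/YOUNG_LIBRY/B1716/CHWS/RT`, under which the publisher could then serve the reading.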

How to use

  • Start NFD on all nodes
  • (Optional) Run the BMS certification service.
    git clone https://github.com/zhehaowang/openmhealth-cert
    cd openmhealth-cert
    git checkout -b bms-cert-hack origin/bms-cert-hack
    cd www
    python ndncert-server.py
    • Running the certification service requires Python Flask and MongoDB. The MongoDB instance should have an "ndncert" database with at least one entry in the "operators" collection that looks like the following. The public/private key pair corresponding to the prefix should also be available on the system.
      { "_id" : "0", "site_prefix" : "/org/openmhealth", "site_name" : "dummy", "email" : "wangzhehao410305@gmail.com", "name" : "zhehao" }
    • After the certification service is running, you may need to change the hardcoded addresses here and here.
    • Dump the certificate that corresponds to the site_prefix, and copy the certificate to this file, so that it becomes your root of trust.
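For the certificate-dump step, the ndnsec-dump-certificate tool that ships with ndn-cxx can export a certificate in base64. A sketch, assuming the site identity from the operators entry above (the identity name and output path are placeholders; substitute your own):

```shell
# Export the default certificate of the site identity (name is a placeholder)
ndnsec-dump-certificate -i /org/openmhealth > certs/anchor.cert
```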
  • Start gateway publisher
    • To follow (tail -f) a file:
      python gateway-publisher/bms_publisher.py -f ucla-datahub.log
    • Or, to read all the data points from an existing file:
      python gateway-publisher/bms_publisher.py ucla-datahub.log
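The -f option behaves like tail -f. Conceptually, the follow loop can be sketched as below; the function name and polling interval are illustrative, not the actual bms_publisher.py implementation.

```python
import time

def read_entries(path, follow=False, poll_interval=0.5):
    """Yield sensor log lines; in follow mode, keep polling for appended lines."""
    with open(path) as f:
        while True:
            line = f.readline()
            if line:
                yield line.rstrip("\n")
            elif follow:
                # No new data yet: wait and try again, like `tail -f`.
                time.sleep(poll_interval)
            else:
                # Not following: stop at end of file.
                return
```

In follow mode the generator never terminates on its own, which matches tailing a live log such as ucla-datahub.log.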
  • Option 1: start bms nodes from a customized mini-ndn fork:

    • Install mini-ndn (this customized fork of mini-ndn contains a bms experiment and static routes configurable from a file); see the readme here for mini-ndn's complete installation instructions.
      git clone https://github.com/zhehaowang/mini-ndn
      cd mini-ndn
      sudo ./install.sh -i
    • Start bms experiment in mini-ndn
      sudo minindn confs/minindn-bms-topology.conf --experiment=bms
      Running in mini-ndn takes care of starting multiple BMS nodes, each with its corresponding configuration file, on one physical machine, and of configuring the static routes between them. All of this can also be done manually on separate machines.
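For orientation, mini-ndn topology files use Mininet-style sections. The fragment below is a hypothetical two-node sketch only; the node names, link parameters, and option names are illustrative, so refer to confs/minindn-bms-topology.conf for the real topology.

```
[nodes]
gateway: _
node1: _
[links]
gateway:node1 delay=10ms bw=1000
```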
  • Option 2: run individual bms node from command line

    python bms_node.py --conf=<configuration-file>
  • Start consumer: open consumer/index.html in a browser. Configure the consumer data names here.

  • Configure routes: Interests from BMS nodes should be able to reach the gateway publisher or the node's child nodes. Interests from the consumer should be able to reach the nodes that publish the expected aggregations.
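For example, with NFD's nfdc a static route toward the gateway publisher might be registered as below. The prefix and face URI are placeholders, not values from this project's configuration; use the names your nodes actually publish under.

```shell
# Hypothetical prefix and host; substitute your own.
nfdc register /ndn/bms/gateway udp://192.0.2.1
```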


Zhehao Wang: zhehao@remap.ucla.edu
Jiayi Meng: mjycom@hotmail.com