Using TensorBoard for Visualization

Vit Stepanovs edited this page Feb 23, 2017 · 8 revisions

TensorBoard is a suite of visualization tools that makes it easier to understand and debug deep learning programs. For example, it allows viewing the model graph, plotting various scalar values as the training progresses, and visualizing the embeddings.

The CNTK TensorBoardProgressWriter class in Python now supports output in the native TensorBoard format, enabling rich visualization capabilities for CNTK jobs. At present, TensorBoardProgressWriter can be used to:

  • Record the model graph.
  • Record arbitrary scalar values during training.
  • Automatically record the values of the loss function and error rate during training.

CNTK model graph as displayed in TensorBoard.

Loss and error rate logged from CNTK and displayed in TensorBoard.

The [Examples/Tensorboard/SimpleMNIST.py](https://github.com/Microsoft/CNTK/blob/master/Examples/Tensorboard/SimpleMNIST.py) script provides an example of how to generate output in TensorBoard format.

First, you need to instantiate a TensorBoardProgressWriter by providing some of the following arguments:

  • freq – how frequently to log to the output files, e.g. a value of 2 causes every second call to the update method to write to disk.
  • log_dir – the directory where the output files will be created.
  • rank – in distributed training, this should be set to the rank of the worker. When set, TensorBoardProgressWriter ensures that only progress from worker 0 is recorded.
  • model – a CNTK model to visualize.
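To make the freq argument concrete, the following CNTK-free sketch mimics the cadence described above: with freq=2, every second update call results in a write to disk (should_write and the surrounding names are illustrative, not part of the CNTK API).

```python
def should_write(update_count, freq):
    # Mimics the freq gating described above: with freq=N,
    # every N-th call to the update method results in a write to disk.
    return update_count % freq == 0

# With freq=2, updates 2, 4, 6, ... are written.
written = [n for n in range(1, 7) if should_write(n, 2)]
print(written)  # [2, 4, 6]
```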

For example, the line below instantiates a TensorBoardProgressWriter that will create files in the 'log' directory and write to disk on every 10th call. It will also persist my_model's graph for later visualization.

tensorboard_writer = TensorBoardProgressWriter(freq=10, log_dir='log', model=my_model)

You then need to provide the above object to Trainer upon construction:

trainer = Trainer(my_model, (ce, pe), learner, tensorboard_writer)

The Trainer object will update the TensorBoardProgressWriter with the values of the loss/evaluation metric after training/testing on each minibatch. Therefore, you do not need to call TensorBoardProgressWriter explicitly to record these values. To record any other scalar values, use the write_value() method, e.g.:

    # Log the mean of each parameter tensor, to confirm that the parameters do change.
    # Don't do this too often, though, to avoid spending too much time computing the means.
    if minibatch_idx % 10000 == 0:
        for p in my_model.parameters:
            tensorboard_writer.write_value(p.uid + "/mean", reduce_mean(p).eval(), minibatch_idx)
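The reduce_mean(p).eval() call above collapses a parameter tensor into a single scalar before it is logged. As a CNTK-free illustration of that reduction (tensor_mean is a hypothetical helper, not part of the CNTK API), the same computation on a nested list looks like:

```python
def tensor_mean(values):
    # Flatten an arbitrarily nested list and return the arithmetic mean,
    # mirroring the scalar that reduce_mean(p).eval() yields for a tensor.
    flat = []
    stack = [values]
    while stack:
        v = stack.pop()
        if isinstance(v, list):
            stack.extend(v)
        else:
            flat.append(v)
    return sum(flat) / len(flat)

print(tensor_mean([[1.0, 2.0], [3.0, 4.0]]))  # 2.5
```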

TensorBoard is not part of the CNTK package and must be installed separately. After installation, once your training job has started, you can launch TensorBoard to monitor its progress by running the following command:

    tensorboard --logdir=log

(assuming the command is run from the script's working directory) and then navigate to http://localhost:6006/ in your favorite web browser.
