
Output Documentation


DANNCE training and prediction produce several different output files, which are explained in more detail here.

dannce-train. Produces outputs in the dannce_train_dir indicated in your io.yaml file (Default: ./DANNCE/train_results).

  • logs (directory). Directory of tensorboard log files. To visualize your loss curves, launch tensorboard --logdir=./logs/ from the terminal when inside ./DANNCE/train_results
  • fullmodel_weights (directory). Directory containing a full model file at the end of training (i.e. architecture + weights + optimizer state).
  • copy_params.mat. File containing the values of all configuration parameters.
  • training.csv. File containing the loss and metric values for each epoch (see the plotting sketch after this list).
  • train_samples.pickle. File containing references to the training samples used.
  • val_samples.pickle. File containing references to the validation samples used. If this file is referenced in the load_valid argument in a subsequent training session, dannce-train will use these validation samples for validation. Alternatively, the data_split_seed can be specified as a configuration option to ensure validation reproducibility.
  • weights.{epoch}-{loss}.hdf5. Full model file for the indicated epoch. Only the best epoch, according to validation loss, will be saved here.
  • checkpoint_predictions_e{epoch}.mat. File containing DANNCE predictions over your validation dataset, at the indicated epoch. These files are saved every time the network reaches a lower validation loss value.
  • weights.checkpoint.epoch{epoch}.val_loss{val_loss}. Full model file saved every 250th epoch.
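If you prefer not to launch tensorboard, the per-epoch values in training.csv can be plotted directly. The snippet below is a minimal sketch: the exact column names depend on the metrics configured for your run (the "loss"/"val_loss" names here are assumptions), so inspect the header first.

```python
# Minimal sketch: plot loss curves from training.csv.
# Column names ("loss", "val_loss") are assumed; print the columns to check.
import pandas as pd
import matplotlib.pyplot as plt

history = pd.read_csv("./DANNCE/train_results/training.csv")
print(history.columns.tolist())  # check which loss/metric columns were logged

epochs = history["epoch"] if "epoch" in history.columns else history.index
plt.plot(epochs, history["loss"], label="train loss")
if "val_loss" in history.columns:
    plt.plot(epochs, history["val_loss"], label="val loss")
plt.xlabel("epoch")
plt.ylabel("loss")
plt.legend()
plt.show()
```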

dannce-predict. Produces outputs in the dannce_predict_dir indicated in your io.yaml file (Default: ./DANNCE/predict_results).

  • com3d_used.mat. File containing the 3D COM used by DANNCE to anchor 3D volumes.
  • save_data_AVG.mat. Checkpoint predictions, saved every 1000th batch. Can be deleted if prediction finished successfully.
  • save_data_AVG{start_batch}.mat. If prediction finished successfully, the final set of predictions will be in this file. The number corresponds to the starting batch used for prediction. This is useful for parallelizing prediction over multiple GPUs, where different start_batch values can be used to allocate different stretches of video to each GPU. The file contains the following variables (see the loading sketch after this list):
    • pred. n_frames x 3 x n_landmarks. DANNCE predictions.
    • data. n_frames x 3 x n_landmarks. Ground truth labels. This is only used when working with large motion capture datasets.
    • p_max. n_frames x n_landmarks. Peak output map confidence for each landmark in each frame.
    • sampleID. 1 x n_frames. Global sample identifier for each frame.
    • metadata. Struct of all configuration parameter values used for prediction.
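For downstream analysis in Python, this .mat file can be read with scipy.io.loadmat. The sketch below assumes the variable names and shapes listed above; the file name reflects your start_batch (assumed 0 here), and the 0.01 confidence threshold is purely illustrative.

```python
# Minimal sketch: load dannce-predict output and flag low-confidence landmarks.
# Assumes the variable names/shapes listed above and start_batch = 0.
from scipy.io import loadmat

d = loadmat("./DANNCE/predict_results/save_data_AVG0.mat")

pred = d["pred"]                      # n_frames x 3 x n_landmarks
p_max = d["p_max"]                    # n_frames x n_landmarks
sample_ids = d["sampleID"].squeeze()  # n_frames

print(pred.shape, p_max.shape, sample_ids.shape)

# Flag landmarks whose peak output-map confidence falls below a chosen threshold.
low_conf = p_max < 0.01               # boolean mask, n_frames x n_landmarks
print("fraction of low-confidence landmarks:", low_conf.mean())
```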

com-train. Produces outputs in the com_train_dir indicated in your io.yaml file (Default: ./COM/train_results).

  • logs (directory). Directory of tensorboard log files. To visualize your loss curves, launch tensorboard --logdir=./logs/ from the terminal when inside ./COM/train_results.
  • fullmodel_weights (directory). Directory containing a full model file at the end of training (i.e. architecture + weights + optimizer state).
  • copy_params.mat. File containing the values of all configuration parameters.
  • training.csv. File containing the loss and metric values for each epoch.
  • train_samples.pickle. File containing references to the training samples used.
  • val_samples.pickle. File containing references to the validation samples used. If this file is referenced in the load_valid argument in a subsequent training session, com-train will use these validation samples for validation. Alternatively, the data_split_seed can be specified as a configuration option to ensure validation reproducibility.
  • weights.{epoch}-{loss}.hdf5. Network weights for the indicated epoch.

com-predict. Produces outputs in the com_predict_dir indicated in your io.yaml file (Default: ./COM/predict_results).

  • com3d.mat. File containing 3D COM coordinates, sampleIDs, and associated metadata (see the loading sketch after this list).
  • com3d.pickle. An expanded file containing the individual 2D COM predictions in addition to the 3D triangulation. If this file is used in DANNCE, an expanded set of COM operations is enabled (e.g., filtering the COM based on network confidence).
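To inspect the 3D COM outside of DANNCE (e.g., for quick sanity checks), com3d.mat can be loaded the same way. The field names used below ("com", "sampleID") are assumptions based on the description above; check the keys of the loaded dict if your file differs.

```python
# Minimal sketch: inspect 3D COM predictions from com-predict.
# Field names ("com", "sampleID") are assumed; print d.keys() to verify.
from scipy.io import loadmat

d = loadmat("./COM/predict_results/com3d.mat")
print(d.keys())

com = d["com"]                        # assumed n_frames x 3 array of 3D COM coordinates
sample_ids = d["sampleID"].squeeze()  # one global sample identifier per frame
print(com.shape, sample_ids.shape)
```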