This repository has been archived by the owner on Mar 17, 2021. It is now read-only.

NiftyNet Dev Meeting 15th April 2019

csudre edited this page Apr 15, 2019 · 2 revisions

Attendance:

Wenqi, Felix, Tom Va., Jorge, Pedro, Ben, Carole

Strategy forward:

  • Discussion around how to cut the codebase into modular parts - particular issue of the default activation layer being coupled with the loss function, and the problem of the associated documentation
  • Discussion around the move towards TensorFlow 2.0 - details to be discussed at a dedicated meeting on 10th May 2019, 2pm - input from Tom Va. Changes affect the application driver (no sessions anymore), the collection of variables (no calls to get_collection), and the use of tf.layers instead of the current template op.
  • Move towards more abstracted use
  • Need to integrate as many of the recently developed features as possible
  • Need for detailed "how-to" pages with examples of actual code implementations for user-dependent features
  • Need to find ways to smooth the learning curve for new adopters of NiftyNet
  • Need to combine commented contrib systems with recently published applications - ALL to act on this

Currently developed features to be included

Current assignments are based on the main developers of the associated features.

Network

  • Hemis - Tom
  • VGG - Felix
  • Guotai's BRATS network

Custom layers

  • Gaussian Sampler - Ben
  • Stochastic filter group (Masking kernels) - Felix
  • Probability layer (use different probability distribution) - Felix

Preprocessing

  • Histogram regression to classification - Felix
  • MR Augmentation: motion, blurring, noise, RF spike - Richard
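The noise part of the MR augmentation listed above has a simple standard form; a minimal stdlib-only sketch (hypothetical helper, not NiftyNet's implementation) of additive Gaussian noise on a flat intensity list:

```python
import random

def add_gaussian_noise(image, sigma=0.05, seed=None):
    """Return a copy of a flat intensity list with additive Gaussian noise,
    a common MR data-augmentation step."""
    rng = random.Random(seed)
    return [v + rng.gauss(0.0, sigma) for v in image]

# Usage: perturb a constant-intensity patch with small noise
noisy = add_gaussian_noise([1.0] * 10, sigma=0.01, seed=0)
```

Motion, blurring, and RF-spike artefacts would each need their own transforms (in k-space for the spike model); this sketch only covers the intensity-noise case.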

Sampler

  • Uniform sampler with pad authorization - Ben
  • CSV reader - Carole/Tom
    • exists as a module but is still difficult to use directly (requires many modifications)

Handler - Tom

  • Performance-based gradient adjustments
    • Learning rate decay
    • Performance-based early stopping
    • Reduced learning rate on plateau
  • Online update
  • Whole volume validation (still needs review and testing)
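The performance-based adjustments above (reduce-on-plateau decay plus early stopping) can be sketched as a small handler class; this is a hypothetical illustration of the logic, not the NiftyNet event-handler API:

```python
class PlateauHandler:
    """Reduce the learning rate when validation loss plateaus,
    and request early stopping after too many reductions."""

    def __init__(self, lr=0.1, factor=0.5, patience=3, max_reductions=2):
        self.lr = lr
        self.factor = factor            # multiplicative LR decay on plateau
        self.patience = patience        # epochs without improvement tolerated
        self.max_reductions = max_reductions
        self.best = float('inf')
        self.bad_epochs = 0
        self.reductions = 0
        self.stop = False

    def update(self, val_loss):
        """Call once per validation round; returns the (possibly decayed) LR."""
        if val_loss < self.best:
            self.best = val_loss
            self.bad_epochs = 0
        else:
            self.bad_epochs += 1
            if self.bad_epochs >= self.patience:
                self.lr *= self.factor
                self.reductions += 1
                self.bad_epochs = 0
                if self.reductions > self.max_reductions:
                    self.stop = True    # performance-based early stopping
        return self.lr
```

In NiftyNet the equivalent logic would run inside an event handler reacting to validation iterations; the class above only shows the plateau bookkeeping.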

Custom applications - to use as demo / example

  • Customised application using CSV reader - Pedro / Ze upon publication
  • Multi-task - Felix upon publication
  • Reinforcement learning - Kerstin upon publication
  • Distillation loss - Irme upon publication
  • RCNN/object detection - Carole
  • Counting - Zach upon publication
  • VAE modification - Reuben upon publication

Losses

  • Quality Driven Loss - Zach
  • Cosine similarity gradient enforcement - Irme
  • Cosine loss for direction regression - Carole
  • Smooth L1 - Carole
  • Volume consistency - Carole
  • Volume existence - Carole
  • Variability loss - Carole
  • Weight batch - Carole
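Two of the losses listed above have standard closed forms; a minimal plain-Python sketch (not the NiftyNet implementations) of smooth L1 and a cosine loss for direction regression:

```python
import math

def smooth_l1(pred, target, delta=1.0):
    """Smooth L1 (Huber-style): quadratic near zero, linear beyond delta."""
    total = 0.0
    for p, t in zip(pred, target):
        d = abs(p - t)
        total += 0.5 * d * d / delta if d < delta else d - 0.5 * delta
    return total / len(pred)

def cosine_loss(u, v):
    """1 - cosine similarity: zero when the vectors point the same way."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return 1.0 - dot / (norm_u * norm_v)
```

The volume-consistency, volume-existence, and variability losses are application-specific and are not sketched here.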

Post-processing / Outputs

  • Model introspection - with standalone demo Jupyter notebook - Felix
  • Multiple outputs - Demo - Felix/Carole

Tensorboard

  • Histogram viewer

Documentation strategy - how-to pages

  • Loading model from different graph - Tom
  • Event handler - Tom
  • Multi output - Carole
  • Custom placeholders - Tom