wmh_segmentation_challenge

Documentation

Implementation

U-Net model for white matter hyperintensity (WMH) segmentation
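
The exact network is defined in the repository code; as a rough orientation, a minimal 2D U-Net sketch in Keras/TensorFlow is shown below. The framework, the input size (single-channel 200x200 FLAIR slices) and the filter counts are assumptions for illustration, not the repository's exact architecture.

  # Minimal 2D U-Net sketch (illustrative only; not the repository's exact
  # architecture). Framework, input size and filter counts are assumptions.
  import tensorflow as tf
  from tensorflow.keras import layers, Model

  def conv_block(x, filters):
      # Two 3x3 convolutions with ReLU, as in the original U-Net design
      x = layers.Conv2D(filters, 3, padding="same", activation="relu")(x)
      x = layers.Conv2D(filters, 3, padding="same", activation="relu")(x)
      return x

  def build_unet(input_shape=(200, 200, 1)):
      inputs = layers.Input(shape=input_shape)

      # Encoder: convolution blocks followed by 2x2 max pooling
      c1 = conv_block(inputs, 32)
      p1 = layers.MaxPooling2D(2)(c1)
      c2 = conv_block(p1, 64)
      p2 = layers.MaxPooling2D(2)(c2)

      # Bottleneck
      b = conv_block(p2, 128)

      # Decoder: transposed convolutions plus skip connections from the encoder
      u2 = layers.Conv2DTranspose(64, 2, strides=2, padding="same")(b)
      c3 = conv_block(layers.concatenate([u2, c2]), 64)
      u1 = layers.Conv2DTranspose(32, 2, strides=2, padding="same")(c3)
      c4 = conv_block(layers.concatenate([u1, c1]), 32)

      # 1x1 convolution + sigmoid for a per-pixel WMH probability
      outputs = layers.Conv2D(1, 1, activation="sigmoid")(c4)
      return Model(inputs, outputs)

  model = build_unet()
  model.compile(optimizer="adam", loss="binary_crossentropy")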

Results

Dataset

The dataset is the one used during the WMH Segmentation Challenge. In our repository, it should be placed under a datasets/ folder.

Generate the data for training

Before training the model, you need to generate the data that will be used. To do this, run:

  • python generate_data.py <path/to/datasets> <save/dir>

Note that <save/dir> is the directory where the generated train.pickle file will be stored.
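
As a rough, hypothetical illustration of what this step has to do, the sketch below walks a WMH-challenge-style layout (site folders such as Utrecht/, Singapore/ and GE3T/, each containing a per-subject pre/FLAIR.nii.gz volume and a wmh.nii.gz mask) and pickles the loaded volumes. The folder layout, the use of nibabel and the pickle schema are assumptions; the actual logic lives in generate_data.py.

  # Hypothetical sketch of the data-generation step; the real logic lives in
  # generate_data.py. File layout, nibabel usage and the pickle schema are
  # assumptions, not taken from the repository.
  import os
  import pickle
  import sys

  import nibabel as nib  # common library for reading .nii/.nii.gz volumes

  def collect_cases(dataset_root):
      cases = []
      for site in ("Utrecht", "Singapore", "GE3T"):  # WMH challenge sites (assumed layout)
          site_dir = os.path.join(dataset_root, site)
          if not os.path.isdir(site_dir):
              continue
          for subject in sorted(os.listdir(site_dir)):
              subj_dir = os.path.join(site_dir, subject)
              flair = os.path.join(subj_dir, "pre", "FLAIR.nii.gz")
              mask = os.path.join(subj_dir, "wmh.nii.gz")
              if os.path.exists(flair) and os.path.exists(mask):
                  cases.append({
                      "flair": nib.load(flair).get_fdata(),
                      "wmh": nib.load(mask).get_fdata(),
                  })
      return cases

  if __name__ == "__main__":
      dataset_root, save_dir = sys.argv[1], sys.argv[2]
      os.makedirs(save_dir, exist_ok=True)
      with open(os.path.join(save_dir, "train.pickle"), "wb") as f:
          pickle.dump(collect_cases(dataset_root), f)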

Usage

cd models/clean

Training

python3 main.py <path/to/dataset.pickle> <preprocess:0|1> <3d:2|3>

Please look at the file main.py for more information regarding the parameters; a purely illustrative sketch of how the three arguments line up is shown after the examples below.

Examples of training usage:

  • python3 main.py data/train.pickle 0 2
  • python3 main.py data/train.pickle 1 2
  • python3 main.py data/train.pickle 0 3
  • python3 main.py data/train.pickle 1 3
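
For orientation, here is a small, purely illustrative sketch of how the three positional arguments could be read. The interpretation of the flag values (1 = apply preprocessing, 3 = 3D model) is an assumption inferred from the usage string; main.py remains the authoritative reference.

  # Illustrative parsing of the three positional arguments used above;
  # main.py is the authoritative reference for what they actually control.
  import sys

  pickle_path = sys.argv[1]        # e.g. data/train.pickle
  preprocess = sys.argv[2] == "1"  # assumed: 0 = raw data, 1 = apply preprocessing
  use_3d = sys.argv[3] == "3"      # assumed: 2 = 2D model, 3 = 3D model

  print(f"data={pickle_path}, preprocess={preprocess}, 3d={use_3d}")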

Testing

python3 main.py <path/to/dataset.pickle> <preprocess:0|1> <3d:2|3>

<path/to/dataset.pickle> should be something like */test.pickle

Please look at the file main.py for more information regarding the parameters. After generating the weights and the result file, you can test the model:

python3 test_model.py <path/to/dataset> <path/to/test_pickle> <result.pickle>

With:

  • <path/to/dataset>: the root folder of the dataset
  • <path/to/test_pickle>: the test pickle file, i.e. the same file passed as <path/to/dataset.pickle> above
  • <result.pickle>: the result file generated by the "python3 main.py ..." command above
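
As a hedged illustration of what such an evaluation typically does for this challenge, the sketch below loads two pickles and reports a mean Dice score. The pickle layout (matching lists of binary masks) and the choice of metric are assumptions; test_model.py is the actual implementation.

  # Hedged sketch of a Dice-score evaluation; the pickle contents assumed here
  # (matching lists of binary ground-truth and predicted masks) may differ from
  # what test_model.py actually expects.
  import pickle
  import sys

  import numpy as np

  def dice(gt, pred, eps=1e-7):
      # Dice = 2 * |A ∩ B| / (|A| + |B|), with eps guarding the empty-mask case
      gt = np.asarray(gt).astype(bool)
      pred = np.asarray(pred).astype(bool)
      return (2.0 * np.logical_and(gt, pred).sum() + eps) / (gt.sum() + pred.sum() + eps)

  if __name__ == "__main__":
      with open(sys.argv[1], "rb") as f:  # e.g. test.pickle (ground-truth masks)
          truth = pickle.load(f)
      with open(sys.argv[2], "rb") as f:  # e.g. result.pickle (predicted masks)
          preds = pickle.load(f)
      scores = [dice(t, p) for t, p in zip(truth, preds)]
      print("mean Dice:", np.mean(scores))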
