Table of Contents
- Software
- Conda: Software and Python Environments
- Cluster
- unison
- Multi-Atlas Brain Segmentation (MABS)
Software app for running the PNL pipelines and for writing your own custom pipelines.
- homepage: https://github.com/reckbo/pnlpipe
Install into your project directory:
git clone https://github.com/reckbo/pnlpipe
See pnlpipe for full documentation.
pnlpipe has a set of recipes for building most of the software used at the PNL. It can
install multiple versions of the same package, which makes it easy to upgrade software for
your project without affecting anyone else's currently running projects. It installs the
software to $PNLPIPE_SOFT, which on the cluster is set to /data/pnl/soft and on the network
is set to /rfanfs/pnl-zorro/software. ($soft is also set to these directories.)
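You can check which location your shell is using:
echo $PNLPIPE_SOFT   # /data/pnl/soft on the cluster, /rfanfs/pnl-zorro/software on the network
ls $soft             # points to the same directory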
To get a list of available software packages:
cd /data/pnl/soft/pnlpipe
./pnlpipe install -h
Usage:
pnlpipe install [SWITCHES] softwareModule
where softwareModule is one of:
BRAINSTools
FreeSurfer
HCPPipelines
Slicer
UKFTractography
mrtrix3
nrrdchecker
tract_querier
trainingDataT1AHCC
trainingDataT2Masks
whitematteranalysis
To install one of them:
./pnlpipe install <software> [--version <version>]
For example,
./pnlpipe install UKFTractography --version DiffusionPropagator
./pnlpipe install BRAINSTools --version master
installs the DiffusionPropagator branch of UKFTractography and the master branch of BRAINSTools.
Each software module interprets version in its own way. Most of the time, --version expects
a GitHub revision, as in the examples above. However, the switch is optional; running install
without specifying a version will install the software's default version.
Here's an example of how to install Washington University's HCP Pipelines scripts:
./pnlpipe install HCPPipelines --version 3.22.0
The old bash scripts in pnlutil are deprecated and are replaced by pnlpipe/pnlscripts. The
new scripts are implemented in Python, using the plumbum shell scripting library. This makes
them easier to understand and modify, guarantees that temporary intermediate files are
cleaned up, and lets them accept non-standard file names. Some of the scripts have also been
updated to use the newer ANTs binaries/API, and some, like atlas.py, have additional options
and features.
Many of the scripts use ANTs, which we get via BRAINSTools. Before using these scripts, the
environment variable ANTSPATH needs to be set. You can do this by running
source $soft/BRAINSTools-bin-<commit>/env.sh
where <commit> is a GitHub revision or date. This will set up ANTs for that particular
version of BRAINSTools (see below for installing BRAINSTools).
Project tracker, to run in your project directories.
- homepage: https://github.com/reckbo/pnldash
- cluster: /data/pnl/soft/pnldash
- network: /rfanfs/pnl-zorro/software/pnldash
The database of projects that people push to is on the cluster at /data/pnl/soft/pnldash/db
(set export PNLDASH_DB=/data/pnl/soft/pnldash/db to use it).
- homepage: https://github.com/BRAINSia/BRAINSTools
- cluster: /data/pnl/soft/BRAINSTools-bin-*
- network: /rfanfs/pnl-zorro/software/BRAINSTools-bin-*
To install the latest and add it to your PATH and ANTSPATH:
cd /path/to/pnlpipe
./pnlpipe install BRAINSTools --version master
source $PNLPIPE_SOFT/BRAINSTools-bin-<commit>/env.sh
Sometimes installing BRAINSTools will give a build error; in this case, delete
$PNLPIPE_SOFT/BRAINSTools-build and try installing again.
- homepage: https://github.com/pnlbwh/ukftractography
- cluster: /data/pnl/soft/UKFTractography-*
- network: /rfanfs/pnl-zorro/software/UKFTractography-*
To install the latest:
cd /path/to/pnlpipe
./pnlpipe install UKFTractography --version master
Sometimes installing UKFTractography will give a build error; in this case, delete
$PNLPIPE_SOFT/UKFTractography-build and try installing again.
- homepage: www.slicer.org
- cluster: /data/pnl/soft/Slicer-*-linux-amd64
- network: /rfanfs/pnl-zorro/software/Slicer-*-linux-amd64
To install, for example, version 4.7.0:
cd /path/to/pnlpipe
./pnlpipe install Slicer --version 4.7.0
- homepage: https://github.com/SlicerDMRI/whitematteranalysis
- cluster: /data/pnl/soft/whitematteranalysis-*
- network: /rfanfs/pnl-zorro/software/whitematteranalysis-*
To install the latest and add it to the PATH and PYTHONPATH:
cd /path/to/pnlpipe
./pnlpipe install whitematteranalysis --version master
source $soft/whitematteranalysis-<commit>/env.sh
- homepage: https://github.com/demianw/tract_querier
- cluster: /data/pnl/soft/tract_querier-*
- network: /rfanfs/pnl-zorro/software/tract_querier-*
To install the latest and add it to the PATH and PYTHONPATH:
cd /path/to/pnlpipe
./pnlpipe install tract_querier --version master
source $soft/tract_querier-<commit>/env.sh
- homepage: https://github.com/Washington-University/Pipelines
- cluster: /data/pnl/soft/HCPPipelines-*
- network: /rfanfs/pnl-zorro/software/HCPPipelines-*
To install version 3.22.0:
cd /path/to/pnlpipe
./pnlpipe install HCPPipelines --version 3.22.0
Haskell program that compares the nrrd headers of two or more nrrd images and prints or saves a csv with one row per key.
- homepage: https://github.com/reckbo/nrrdchecker
- cluster: /data/pnl/soft/nrrdchecker-*/nrrdchecker # binaries
- cluster: /data/pnl/soft/nrrdchecker # source directory
- network: /rfanfs/pnl-zorro/software/nrrdchecker-553c310/nrrdchecker
To install/update on the cluster:
cd /path/to/pnlpipe
./pnlpipe install nrrdchecker
To manually build on the cluster:
module load stack
cd /data/pnl/soft/nrrdchecker
stack build
stack exec -- nrrdchecker -h
You can copy the output to the PNL local network to use it there (where stack or cabal
isn't installed).
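For example, a sketch of copying the built binary over (the destination directory name is
illustrative and should match the commit you built):
cp "$(stack path --local-install-root)/bin/nrrdchecker" /rfanfs/pnl-zorro/software/nrrdchecker-<commit>/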
- homepage: https://github.com/MRtrix3/mrtrix3
- cluster: /data/pnl/soft/mrtrix3-*
- network: /rfanfs/pnl-zorro/software/mrtrix3-*
To install the latest version:
cd /path/to/pnlpipe
./pnlpipe install mrtrix3 --version master
On the cluster, you can load software using module. For example, for matlab:
module unload matlab
module load matlab/2017a
To see a list of available modules:
module avail
pnlpipe has a pipeline called DWIConvertTest that tests the BRAINSTools DWI conversion
binary by comparing the two nrrds that result from these conversions:
DWI -> nrrd
DWI -> nifti -> nrrd
They should be the same, but this isn't the case (see BRAINSia/BRAINSTools#350): 'space
origin' and 'thicknesses' are not preserved in commits 2d5eccb, 41353e8, and b195aae, the
commits I tested.
I suggest that when upgrading to a new version of DWIConvert you run this test first.
On the cluster:
cd /data/pnl/soft/pnlpipe-dwiconverttest
# add new commit to pnlpipe_params/DWIConvertTest.params
./runme.sh
This generates an html report at _data/DWIConvertTest.html. See the latest at
/data/pnl/soft/pnlpipe-dwiconverttest/_data/DWIConvertTest.html
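For reference, the comparison the test performs is roughly the following; the file names
below are made up, and the DWIConvert flags should be checked against --help for the
version you are testing:
# direct route: DICOM -> nrrd
DWIConvert --conversionMode DicomToNrrd --inputDicomDirectory dicom/ --outputVolume direct.nrrd
# indirect route: DICOM -> nifti -> nrrd
DWIConvert --conversionMode DicomToFSL --inputDicomDirectory dicom/ --outputVolume dwi.nii.gz
DWIConvert --conversionMode FSLToNrrd --inputVolume dwi.nii.gz --inputBValues dwi.bval --inputBVectors dwi.bvec --outputVolume viaNifti.nrrd
# the two nrrd headers are then compared (e.g. with nrrdchecker)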
These are some scripts I wrote a long time ago that apply our eddy current correction to multi-shell DWIs.
/projects/schiz/software/scripts/diffusion/dwipipeline_multishell
This software generates a report of the PNL's disk usage.
- homepage: https://github.com/pnlbwh/diskusage-logging
- network: /rfanfs/pnl-zorro/software/diskusage-logging
The scripts here are called daily and weekly by a cron job that is defined in
/rfanfs/pnl-zorro/software/cron/daily.sh
/rfanfs/pnl-zorro/software/cron/weekly.sh
To make sure the cron job is active on a machine, run
crontab -l
If it is active and running correctly, every week a report of the PNL's disk
usage will be generated and saved as
/rfanfs/pnl-zorro/software/diskusage-logging/_data/report.html. The cron job is also
configured to email the report to the head RA and Sylvain, although this can be unreliable
at times.
If no cron job is detected and you want to start one on a particular machine, run
crontab ./pnl.crontab
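As an illustration, crontab entries for these scripts would look something like this (the
times here are made up; see pnl.crontab for the real schedule):
# minute hour day-of-month month day-of-week  command
0 2 * * *  /rfanfs/pnl-zorro/software/cron/daily.sh
0 3 * * 0  /rfanfs/pnl-zorro/software/cron/weekly.sh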
Make sure to check regularly that the report is being generated every week.
It is recommended that each project have its own isolated software environment. The advantages of this over a global environment are that it protects the project from changes to the global environment, allows projects to use different software configurations, and records what software was used to generate the data.
To help with this, we are transitioning to using pnlpipe to run our pipelines and install MRI processing software, and conda for setting up our python environments.
Conda is a general-purpose package manager whose advantage over virtualenv is that it is not limited to Python packages. This makes it easier to manage Python packages that have external dependencies, such as NumPy, SciPy, VTK, and Matplotlib. Conda is installed on the cluster as well as on the local network.
To set up a conda environment, each project lists its dependencies in a file called
environment.yml. Here's an example for a project that uses whitematteranalysis:
name: wma-664bb45
dependencies:
- nibabel
- numpy
- vtk
- joblib
- matplotlib
- pip:
- git+https://github.com/SlicerDMRI/whitematteranalysis.git@664bb45b34003689f0dccbed45cf864bb11ce4a5
To create and activate this environment, you would run
conda env create -f environment.yml
source activate wma-664bb45
Note that you can also install whitematteranalysis using pnlpipe; see above.
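If the project's dependencies change later, conda can update the environment from the file
or snapshot the current state back into it; a minimal example, assuming the environment
name above:
conda env update --file environment.yml                 # apply changes in environment.yml to the existing env
conda env export --name wma-664bb45 > environment.yml   # record the exact installed versions
source deactivate                                       # leave the environment when done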
Listing jobs:
binfo # lists PNL node availability
bjobs # lists your LSF jobs
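Jobs themselves are submitted with LSF's bsub; a minimal sketch (the queue name and script
are placeholders, so check which queues you may use):
bsub -q normal -J myjob -o myjob.%J.log ./run_my_step.sh   # submit a script as a batch job
bjobs                                                      # then monitor it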
See what software packages are available:
module avail
Loading and unloading a software package:
module unload matlab
module load matlab/2017a
The PNL-wide bashrc is located in /data/pnl/soft/bashrc. Your personal ~/.bashrc should
source this file, i.e. put source /data/pnl/soft/bashrc at the end of ~/.bashrc.
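For example, you can append it once with:
echo 'source /data/pnl/soft/bashrc' >> ~/.bashrc   # then open a new shell or run: source ~/.bashrc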
I used to use unison to sync data between the local PNL network and the cluster, primarily
for INTRuST but also for other datasets. The configuration for each dataset is saved in
this directory:
/rfanfs/pnl-zorro/software/.unison
For example, to sync INTRuST, from anywhere on the network run
unison -perms 0 int # -perms 0 means ignore change in permissions
It is a bidirectional sync, so changes can be pushed in either direction. Press '/' to skip a change, 'f' to accept a proposed change, and '<' or '>' to push a change from or to the cluster.
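Each dataset's configuration in that directory is a small profile file; a hypothetical
sketch of what one like int.prf might contain (the roots and hostname here are made up,
check the actual profiles):
# hypothetical profile, e.g. /rfanfs/pnl-zorro/software/.unison/int.prf
root = /rfanfs/pnl-zorro/projects/INTRuST
root = ssh://cluster-host//data/pnl/INTRuST
# ignore permission changes, matching the -perms 0 flag above
perms = 0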
The old script to run MABS, mabs.sh in pnlutil, has been replaced by atlas.py in
pnlpipe/pnlscripts. This script includes an option to fuse labelmaps using antsJointFusion,
a method that outperformed several other fusion strategies that I tested in a DWI baseline
mask prediction task. It also fixes a bug: it now deletes its temporary output when
interrupted, so it won't fill up the cluster's /tmp directories.
It accepts its training data either as command line arguments or as a csv file.
Command line arguments:
./pnlscripts/atlas.py args -h # to set up ANTSPATH, run source $soft/BRAINSTools-bin-<commit>/env.sh
Specify training images and labelmaps via commandline arguments.
Usage:
atlas.py args [SWITCHES]
Meta-switches
-h, --help Prints this help message and quits
--help-all Print help messages of all subcommands and quit
-v, --version Prints the program's version and quits
Switches
--fusion VALUE:{'avg', 'antsJointFusion'} Also create predicted labelmap(s) by fusing the atlas labelmaps; may be given multiple times
-i, --images VALUE:str list of images in quotations, e.g. "img1.nrrd img2.nrrd"; required
-l, --labels VALUE:str list of labelmap images in quotations, e.g. "mask1.nrrd mask2.nrrd cingr1.nrrd cingr2.nrrd"; required
-n, --names VALUE:str list of names for generated labelmaps, e.g. "atlasmask atlascingr"; required
-o, --out VALUE:str output directory; required
-t, --target VALUE:ExistingFile target image; required
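For example, a run with two training cases and a single labelmap per case might look like
this (the file names are hypothetical):
./pnlscripts/atlas.py args \
    -t target-dwi-baseline.nrrd \
    -i "train1-t1.nrrd train2-t1.nrrd" \
    -l "train1-mask.nrrd train2-mask.nrrd" \
    -n atlasmask \
    --fusion antsJointFusion \
    -o atlas-output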
Csv file argument:
./pnlscripts/atlas.py csv -h # to set up ANTSPATH, run source $soft/BRAINSTools-bin-<commit>/env.sh
Specify training images and labelmaps via a csv file. The names in the header row will be used to name the generated atlas labelmaps.
Usage:
atlas.py csv [SWITCHES] csv
Meta-switches
-h, --help Prints this help message and quits
--help-all Print help messages of all subcommands and quit
-v, --version Prints the program's version and quits
Switches
--fusion VALUE:{'avg', 'antsJointFusion'} Also create predicted labelmap(s) by averaging the atlas labelmaps; may be given multiple times
-o, --out VALUE:str output directory; required
-t, --target VALUE:ExistingFile target image; required
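Based on the help text, the csv lists one training case per row, with each labelmap column
named by the header you want for the generated atlas labelmap. An illustrative (unverified)
layout and call, with hypothetical file names:
image,atlasmask,atlascingr
train1-t1.nrrd,train1-mask.nrrd,train1-cingr.nrrd
train2-t1.nrrd,train2-mask.nrrd,train2-cingr.nrrd
./pnlscripts/atlas.py csv -t target-dwi-baseline.nrrd -o atlas-output training.csv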