DeepLoc

This repository contains the code, pretrained models, and datasets for the paper: "Automated analysis of high-content microscopy data with deep learning", Kraus, O.Z., Grys, B.T., Ba, J., Chong, Y., Frey, B.J., Boone, C., & Andrews, B.J., Molecular Systems Biology 13.4 (2017): 924. http://msb.embopress.org/content/13/4/924

The commands below assume you are starting from the 'DeepLoc' directory.

REQUIREMENTS

For the training and evaluation scripts:

Python 2.7 64-bit: http://www.python.org/getit/

CUDA 8.0+ SDK (for GPU support): https://developer.nvidia.com/cuda-downloads

cuDNN 5.1 (for GPU support): https://developer.nvidia.com/cudnn

TensorFlow v1.0+: https://www.tensorflow.org/install

You can use the following command to install the dependencies (other than TensorFlow):

pip install -r requirements.txt

Or, if you are running the Anaconda Python distribution, use the following to create an environment with the dependencies (recommended):

conda env create -f environment.yml
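
After installing, you can sanity-check that TensorFlow (and, with a CUDA build, the GPU) is visible to Python. This is a minimal check, not part of the DeepLoc scripts:

  # sanity check: confirm TensorFlow imports and can run a trivial op
  import tensorflow as tf

  print(tf.__version__)  # should report 1.0 or newer

  # with a GPU build, the device placement log confirms CUDA/cuDNN are found
  with tf.Session(config=tf.ConfigProto(log_device_placement=True)) as sess:
      print(sess.run(tf.constant('TensorFlow is working')))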

GETTING THE FULL DATA SETS

The datasets and pretrained models are too large to store in the repository. Please download them using the instructions below.

To download and unzip the datasets and pretrained model, please run:

  bash download_datasets.sh

Otherwise, the datasets are available at: http://spidey.ccbr.utoronto.ca/~okraus/DeepLoc_full_datasets.zip

and the pretrained model is available at: http://spidey.ccbr.utoronto.ca/~okraus/pretrained_DeepLoc.zip

If you download them manually, place the datasets in the 'datasets' subdirectory.
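
If you prefer to stay in Python, the same downloads can be scripted with the standard library. A minimal sketch (Python 2.7), assuming the two archives unpack into 'datasets' and 'pretrained_DeepLoc' at the repository root:

  # rough Python 2.7 equivalent of download_datasets.sh
  import urllib
  import zipfile

  urls = ['http://spidey.ccbr.utoronto.ca/~okraus/DeepLoc_full_datasets.zip',
          'http://spidey.ccbr.utoronto.ca/~okraus/pretrained_DeepLoc.zip']

  for url in urls:
      fname = url.split('/')[-1]
      print('downloading %s ...' % fname)
      urllib.urlretrieve(url, fname)   # the datasets are large; this can take a while
      with zipfile.ZipFile(fname) as zf:
          zf.extractall('.')           # unpack into the current (DeepLoc) directory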

TRAINING DeepLoc on Chong et al., 2015 DATA (CELL, doi:10.1016/j.cell.2015.04.051)

To train DeepLoc on the Chong et al. dataset run:

python DeepLoc_train.py --logdir path/to/log-directory
  • the argument passed to --logdir indicates where to save the resulting models and model predictions (default is './logs')
  • download the datasets as described above and store them in './datasets'
  • by default, models are saved every 500 iterations, and a test batch is evaluated every 50 iterations
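
The checkpoints saved during training can be listed with TensorFlow's checkpoint utilities, for example to pick one for evaluation. A small sketch, assuming the default './logs' directory:

  # list the checkpoints written by DeepLoc_train.py
  import tensorflow as tf

  ckpt = tf.train.get_checkpoint_state('./logs')
  if ckpt:
      print('latest checkpoint: %s' % ckpt.model_checkpoint_path)
      for path in ckpt.all_model_checkpoint_paths:
          print('  %s' % path)
  else:
      print('no checkpoints found in ./logs')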

To evaluate the performance of different DeepLoc checkpoints run:

python DeepLoc_eval.py --logdir path/to/log-directory
  • the argument to --logdir should be the same path used for training
  • writes a Python cPickle file called 'test_acc_deploy_results.pkl' containing training and test accuracy for the full datasets; a snippet for reading it back is shown below
  • default output stored in './logs'
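
The results file can be read back with the standard pickle module. The exact structure of the pickled object is defined in DeepLoc_eval.py, so the sketch below (assuming the default './logs' directory) simply dumps it:

  # load the results written by DeepLoc_eval.py
  import cPickle as pickle  # Python 2.7; on Python 3 use 'import pickle'

  with open('./logs/test_acc_deploy_results.pkl', 'rb') as f:
      results = pickle.load(f)

  print(results)  # layout is defined in DeepLoc_eval.py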

VISUALIZING DeepLoc GRAPH AND TRAINING PERFORMANCE

The DeepLoc model and training performance can be visualized using TensorBoard (https://www.tensorflow.org/how_tos/summaries_and_tensorboard/)

To start a TensorBoard session, run:

tensorboard --logdir=path/to/log-directory

Once TensorBoard is running, navigate your web browser to localhost:6006 to view the dashboard.

DEPLOYING DeepLoc TO A SAMPLE IMAGE FROM AN ENTIRE SCREEN

DeepLoc can be deployed to an entire automated microscopy screen using the demo script:

python DeepLoc_eval_sample_image.py
  • assumes model saved in './pretrained_DeepLoc/pretrained_models/model.ckpt-5000'
  • default output stored as a CSV file in './sample_image' (a snippet for reading it is shown below)
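
The output is plain CSV and can be skimmed with the standard csv module. A minimal sketch; the file name below is a placeholder, so substitute the file actually written to './sample_image':

  # skim the first few rows of a DeepLoc output CSV
  import csv

  with open('./sample_image/predictions.csv') as f:  # placeholder name; use the actual output file
      for i, row in enumerate(csv.reader(f)):
          print(row)
          if i >= 4:  # only show the first five rows
              break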

VISUALIZING DeepLoc CLASSES AND FEATURES

Patterns that maximally activate DeepLoc output classes can be visualized using:

python DeepLoc_visualize_classes.py
  • assumes model saved in './pretrained_DeepLoc/pretrained_models/model.ckpt-5000'
  • default output stored in './output_figures/generated_cells.png'
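
The generated figure is an ordinary PNG and can be viewed with any image viewer, or from Python with matplotlib:

  # display the figure written by DeepLoc_visualize_classes.py
  import matplotlib.pyplot as plt
  import matplotlib.image as mpimg

  img = mpimg.imread('./output_figures/generated_cells.png')
  plt.imshow(img)
  plt.axis('off')
  plt.show()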

TRANSFERRING DeepLoc TO wt2017 AND SWAT_RFP DATASETS

DeepLoc can be loaded and fine-tuned on the wt2017 dataset using:

python DeepLoc_transfer_wt2017.py
  • assumes model saved in './pretrained_DeepLoc/pretrained_models/model.ckpt-5000'
  • default output stored in './logs/transfer_wt2017'

and fine-tuned on the SWAT_RFP dataset using:

python DeepLoc_transfer_SWAT_RFP.py
  • assumes model saved in './pretrained_DeepLoc/pretrained_models/model.ckpt-5000'
  • default output stored in './logs/transfer_SWAT_RFP'
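
Before fine-tuning, it can be useful to see which variables the pretrained checkpoint contains (for example, to understand which layers the transfer scripts reuse). A small sketch using TensorFlow's checkpoint reader, assuming the pretrained model has been downloaded to the default path above:

  # list the variables stored in the pretrained DeepLoc checkpoint
  import tensorflow as tf

  reader = tf.train.NewCheckpointReader('./pretrained_DeepLoc/pretrained_models/model.ckpt-5000')
  for name, shape in sorted(reader.get_variable_to_shape_map().items()):
      print('%-50s %s' % (name, shape))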