Merge branch 'master' of https://github.com/angelolab/ark-analysis into update_seg_save
alex-l-kong committed Aug 24, 2022
2 parents e385d4f + c94b48a commit 3c077d6
Showing 6 changed files with 79 additions and 42 deletions.
108 changes: 73 additions & 35 deletions README.md
@@ -3,15 +3,51 @@

# ark-analysis

Toolbox for analyzing multiplexed imaging data.

Full documentation for the project can be found [here](https://ark-analysis.readthedocs.io/en/latest/).

## Table of Contents
- [Getting Started](#getting-started)
- [Overview](#overview)
- [1. Segmentation](#1-segmentation)
- [2. Pixel clustering with Pixie](#2-pixel-clustering-with-pixie)
- [3. Cell clustering with Pixie](#3-cell-clustering-with-pixie)
- [4. Spatial analysis](#4-spatial-analysis)
- [Installation Steps](#installation-steps)
- [Download the Repo](#download-the-repo)
- [Getting the Docker Image](#getting-the-docker-image)
- [Running on Windows](#running-on-windows)
- [Using the Repository (Running the Docker)](#using-the-repository-running-the-docker)
- [External Tools](#external-tools)
- [Mantis](#mantis)
- [External Hard Drives and Google File Stream](#external-hard-drives-and-google-file-stream)
- [Updating the Repository](#updating-the-repository)
- [Questions?](#questions)
- [Want to contribute?](#want-to-contribute)
- [How to Cite](#how-to-cite)

## Getting Started

### Overview
This repo contains tools for analyzing multiplexed imaging data. We assume you've already performed any necessary image processing on your data (such as denoising, background subtraction, and autofluorescence correction) and that it is ready to be analyzed. For MIBI data, we recommend the [toffy](https://github.com/angelolab/toffy) processing pipeline.

#### 1. Segmentation
The [**segmentation notebook**](./templates_ark/1_Segment_Image_Data.ipynb) will walk you through the process of using [Mesmer](https://www.nature.com/articles/s41587-021-01094-0) to segment your image data. This includes selecting the appropriate channel(s) for segmentation, running your data through the network, and then extracting single-cell statistics from the resulting segmentation mask.
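
The notebook handles all of this for you. Purely to illustrate what "extracting single-cell statistics" means, here is a minimal sketch (not the `ark` implementation; the function and variable names are hypothetical) that sums one marker's signal over every cell in a labeled segmentation mask:

```python
# Illustrative sketch only -- not the ark implementation.
import numpy as np

def per_cell_marker_totals(seg_mask: np.ndarray, marker_img: np.ndarray) -> dict:
    """Return {cell_label: total marker intensity}; label 0 is assumed to be background."""
    totals = {}
    for label in np.unique(seg_mask):
        if label == 0:
            continue
        totals[int(label)] = float(marker_img[seg_mask == label].sum())
    return totals
```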

#### 2. Pixel clustering with Pixie
The first step in the [Pixie](https://www.biorxiv.org/content/10.1101/2022.08.16.504171v1) pipeline is to run the [**pixel clustering notebook**](./templates_ark/2_Cluster_Pixels.ipynb). The notebook walks you through generating pixel clusters for your data: you specify which markers to use, train a model, use it to classify every pixel in your dataset, and generate pixel cluster overlays. The notebook also includes a GUI for manual cluster adjustment and annotation.
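
Pixie itself trains a self-organizing map followed by consensus clustering, all handled inside the notebook. Purely as a conceptual sketch of what pixel clustering means, the toy example below clusters a pixels × markers matrix with k-means as a stand-in (the names and the normalization choice are assumptions, not the Pixie method):

```python
# Conceptual sketch only -- NOT the Pixie algorithm (which uses a SOM plus
# consensus clustering). img is assumed to be a preprocessed
# (rows, cols, n_markers) array.
import numpy as np
from sklearn.cluster import MiniBatchKMeans

def toy_pixel_cluster_map(img: np.ndarray, n_clusters: int = 20) -> np.ndarray:
    pixels = img.reshape(-1, img.shape[-1])         # (n_pixels, n_markers)
    pixels = pixels / (pixels.max(axis=0) + 1e-12)  # simple per-marker scaling (an assumption)
    labels = MiniBatchKMeans(n_clusters=n_clusters, random_state=0).fit_predict(pixels)
    return labels.reshape(img.shape[:2])            # per-pixel cluster assignments
```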

#### 3. Cell clustering with Pixie
The second step in the [Pixie](https://www.biorxiv.org/content/10.1101/2022.08.16.504171v1) pipeline is to run the [**cell clustering notebook**](./templates_ark/3_Cluster_Cells.ipynb). This notebook uses the pixel clusters generated in the previous notebook to cluster the cells in your dataset, walking you through cell cluster generation and cell cluster overlays. The notebook also includes a GUI for manual cluster adjustment and annotation.
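
Again, the notebook (SOM plus consensus clustering) does the real work. Conceptually, each cell is described by the fraction of its pixels assigned to each pixel cluster, and cells are then clustered on those composition vectors; a toy sketch with hypothetical names:

```python
# Conceptual sketch only -- NOT the Pixie cell-clustering implementation.
import numpy as np
from sklearn.cluster import MiniBatchKMeans

def toy_cell_clusters(seg_mask, pixel_cluster_map, n_pixel_clusters, n_cell_clusters=15):
    cell_labels = np.unique(seg_mask)
    cell_labels = cell_labels[cell_labels != 0]  # 0 assumed to be background
    # One composition vector per cell: fraction of its pixels in each pixel cluster.
    comp = np.zeros((len(cell_labels), n_pixel_clusters))
    for i, cell in enumerate(cell_labels):
        counts = np.bincount(pixel_cluster_map[seg_mask == cell], minlength=n_pixel_clusters)
        comp[i] = counts / max(counts.sum(), 1)
    assignments = MiniBatchKMeans(n_clusters=n_cell_clusters, random_state=0).fit_predict(comp)
    return dict(zip(cell_labels.tolist(), assignments.tolist()))
```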

#### 4. Spatial analysis
TBD once notebooks are finished


### Installation Steps

#### Download the Repo

Open a terminal and navigate to the directory where you want the code stored.

@@ -21,19 +57,26 @@

Then input the command:

```
git clone https://github.com/angelolab/ark-analysis.git
```

#### Getting the Docker Image

Next, you'll need to set up the Docker image with all of the required dependencies:
- First, [download](https://hub.docker.com/?overlay=onboarding) Docker Desktop.
- Once it's successfully installed, make sure it is running by looking in the toolbar for the Docker whale icon.
- Once it's running, enter the following commands into the terminal:

```
cd ark-analysis
docker pull angelolab/ark-analysis:latest
```

You can now start to analyze your multiplexed imaging data!


#### Running on Windows

Our repo runs best on Linux-based systems (including MacOS). If you need to run on Windows, please consult our [Windows guide](https://ark-analysis.readthedocs.io/en/latest/_rtd/windows_setup.html) for additional instructions.

#### Using the Repository (Running the Docker)

Enter the following command into terminal from the same directory you ran the above commands:

@@ -43,27 +86,29 @@

This will generate a link to a Jupyter notebook server. Copy the last URL (the one that starts with `127.0.0.1:8888`) into your web browser.

Be sure to keep this terminal open. **Do not exit the terminal or enter `control-c` until you are finished with the notebooks**.

**NOTE:**

If you already have a Jupyter session open when you run `./start_docker.sh`, you will receive a couple of additional prompts.

Copy the URL listed after `Enter this URL instead to access the notebooks:`

You will need to authenticate. Note the last URL (the one with `127.0.0.1:8888` at the beginning), copy the token that appears there (it will be after `token=` in the URL), paste it into the password prompt of the Jupyter notebook, and log in.

You can shut down the notebooks and close Docker by entering `control-c` in the terminal window.

**REMEMBER TO DUPLICATE AND RENAME NOTEBOOKS**

If you didn't change the name of any of the notebooks within the `templates_ark` folder, they will be overwritten when you update the repo. Read about updating Ark [here](#updating-the-repository).

## External Tools

### Mantis

We use [Mantis Viewer](https://mantis.parkerici.org) to visualize the segmented images.
Below is the structure a Mantis project needs so that it can be opened in the visualization tool.


Mantis Project Structure:
```sh
@@ -87,7 +132,7 @@ mantis_project
└── ...
```

### External Hard Drives and Google File Stream

To configure external hard drive (or Google File Stream) access, you will have to add the drive to Docker's file paths in the Preferences menu.

@@ -108,11 +153,11 @@ bash start_docker.sh -e 'path/added/to/preferences'

to mount the drive into the virtual `/data/external` path inside the Docker container.

## Updating the Repository

This project is still under development, and we are making frequent changes and improvements. If you want to update the version on your computer to have the latest changes, perform the following steps. Otherwise, we recommend waiting for new releases.

First, get the latest version of the repository.

```
git pull
@@ -133,19 +178,11 @@ or
./start_docker.sh -u
```

### WARNING

If you didn't change the name of any of the notebooks within the `templates_ark` folder, they will be overwritten by the command above!

If you have made changes to these notebooks that you would like to keep (specific file paths, settings, custom routines, etc.), rename them before updating!


For example, rename your existing copy of `1_Segment_Image_Data.ipynb` to `1_Segment_Image_Data_old.ipynb`. Then, after running the update command, a new version of `1_Segment_Image_Data.ipynb` will be created with the newest code, and your old copy will exist with the new name that you gave it.

After updating, you can copy any important paths or modifications from the old notebooks into the new ones.

## Questions?

@@ -155,7 +192,8 @@ If you run into trouble, please first refer to our [FAQ](https://ark-analysis.re

If you would like to help make `ark` better, please take a look at our [contributing guidelines](https://ark-analysis.readthedocs.io/en/latest/_rtd/contributing.html).

## How to Cite
Please cite the following papers if you found our repo useful!

1. [Greenwald, Miller et al. Whole-cell segmentation of tissue images with human-level performance using large-scale data annotation and deep learning [2021]](https://www.nature.com/articles/s41587-021-01094-0)
2. [Liu, Greenwald et al. Robust phenotyping of highly multiplexed tissue imaging data using pixel-level clustering [2022]](https://www.biorxiv.org/content/10.1101/2022.08.16.504171v1)
13 changes: 6 additions & 7 deletions ark/utils/notebooks_test.py
@@ -12,12 +12,11 @@


SEGMENT_IMAGE_DATA_PATH = os.path.join(os.path.dirname(os.path.realpath(__file__)),
                                       '..', '..', 'templates_ark', '1_Segment_Image_Data.ipynb')
PIXEL_CLUSTER_PATH = os.path.join(os.path.dirname(os.path.realpath(__file__)),
                                  '..', '..', 'templates_ark', '2_Cluster_Pixels.ipynb')
CELL_CLUSTER_PATH = os.path.join(os.path.dirname(os.path.realpath(__file__)),
                                 '..', '..', 'templates_ark', '3_Cluster_Cells.ipynb')


def _exec_notebook(nb_filename, base_folder):
@@ -32,11 +31,11 @@ def _exec_notebook(nb_filename, base_folder):

# test runs with default inputs
def test_segment_image_data():
    _exec_notebook('1_Segment_Image_Data.ipynb', 'templates_ark')


def test_example_pairwise_spatial_enrichment():
    _exec_notebook('example_pairwise_spatial_enrichment.ipynb', 'templates_ark')
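
# Note: the body of _exec_notebook is collapsed in this diff. Purely for
# context, a minimal notebook-execution helper in this style (an assumption,
# not the actual ark implementation; the name below is hypothetical) could use
# nbformat plus nbconvert's ExecutePreprocessor:
import os
import nbformat
from nbconvert.preprocessors import ExecutePreprocessor

def _run_notebook_sketch(nb_filename, base_folder):
    nb_path = os.path.join(base_folder, nb_filename)
    with open(nb_path, encoding='utf-8') as f:
        nb = nbformat.read(f, as_version=4)
    # Execute every cell with the notebook's folder as the working directory;
    # any error raised by a cell fails the calling test.
    ep = ExecutePreprocessor(timeout=600, kernel_name='python3')
    ep.preprocess(nb, {'metadata': {'path': base_folder}})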


def test_example_neighborhood_analysis():
File renamed without changes.
File renamed without changes.
File renamed without changes.
