# cell-division-dl-plugin

Deep learning plugin to detect cell divisions and find their orientations.

----------------------------------

See https://napari.org/stable/plugins/index.html for more information on napari plugins.

## Installation

Install by cloning this repository. Once inside the repository, create a new virtual environment and activate it by running:

For macOS:
```sh
python3 -m venv venv
source venv/bin/activate
```
For Windows:
```sh
python3 -m venv venv
.\venv\Scripts\activate
```

Next, install napari and the plugin:
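A typical way to do this with pip from the repository root is sketched below (this is a sketch, assuming napari is installed with its default Qt backend and that the plugin is installed in editable mode from the cloned repository):

```sh
# Install napari together with its default Qt GUI backend
python -m pip install "napari[all]"
# Install this plugin from the cloned repository (editable mode)
python -m pip install -e .
```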
Models (UNetCellDivision10.pth.tar and UNetOrientation.pth.tar) needed to run the plugin can be downloaded from [Zenodo].

## Usage

Napari can be loaded by typing `napari` in your terminal within the virtual environment; this will open the napari GUI. The video in which cell divisions are to be detected can then be dragged and dropped onto the GUI. We have added a demo video (focusWoundL18h10-short.tif) to the dat folder, which can be used to demonstrate the method. The plugin can then be started.
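As an alternative to dragging and dropping, napari can also be launched with the demo video already open, assuming the command is run from the repository root where the dat folder is located:

```sh
# Launch the napari GUI with the demo video loaded
napari dat/focusWoundL18h10-short.tif
```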

After loading the plugin and clicking the "Detect divisions" button, there are 3 options: "Division heatmap", "Division database" and "Division & orientation database". Each will produce a different output.

### Division heatmap
"Division heatmap" loads and runs the UNetCellDivision10.pth.tar model on the video. It displays the division prediction heatmap showing areas of the video it has detected cell divisions. (Note some of the output signal can be weak in some division and is still detected in the next steps but is by eye more difficult to see. Increasing the bightness/contrast can clearly show these detected divisons)
"Division heatmap" loads and runs the UNetCellDivision10.pth.tar model on the video. It displays the division prediction heatmap showing areas of the video where it has detected cell divisions. (Note the output signal for some divisions can be weak and therefore difficult to see by eye. Increasing the brightness/contrast can make the divisions more visible.)

<p align="center">
![github](https://github.com/turleyjm/cell-division-dl-plugin/assets/68009166/5bc88141-a4d5-485d-a620-b9790e3db654)
Video of plugin output with a blue overlay showing the model's division event predictions.
</p>

### Division database
"Division database" will do the first step as above and also locate these divisions from the prediction heatmap and genate a database of cell divisions finding their location in space and time. Also saves a display video in the folder "output" showing detected cell divisions via a blue overlay.
"Division database" will do the same first step as above and also locate these divisions from the prediction heatmap and generate a database of cell divisions finding their locations in space and time. Also saves a display video in the folder "output" showing detected cell divisions via a blue overlay.

### Division & orientaton database
"Division & orientaton database" follows the same steps as above then loads and runs the UNetOrientation.pth.tar model on each of the detected cell divisions to determine the orientation of cell divisions and saves this in a updated database. Also saves a display video in the folder "output" showing detected cell divisions and their orientation via a blue overlay.
"Division & orientaton database" follows the same steps as above then loads and runs the UNetOrientation.pth.tar model on each of the detected cell divisions to determine the orientation of cell divisions and saves this in an updated database. Also saves a display video in the folder "output" showing detected cell divisions and their orientation via a blue overlay.

Extracting divisions from the video can take some time. As 5-frame clips are used to detect cell divisions, cells dividing in the first or last 3 frames of the video will not have their divisions detected.

## Retrain

This model could be used for other 2D cell-division data, but retraining is likely to be needed for highly accurate models. Here we have used 2 fluorescent channels showing cell nuclei and boundaries to supply dynamic information to the model. Other systems may use different markers to label these parts of the cell, or use different labelled molecules altogether (e.g. tubulin).

In the folder trainModels, we have included the code to train, test and evaluate each of the deep learning models we have used in the paper. The datasets used to train the models can be found on [Zenodo]. Also included are the saved parameters of the trained models.

Retraining a model on a new set of experimental data can be done by (1) generating a set of training data following the examples provided in [Zenodo]; then (2) using the provided Jupyter notebooks to train the individual models. For those less familiar with training deep learning models, we recommend the freely available [fast.ai] training course.
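A minimal way to open the training notebooks, assuming Jupyter is installed in the same virtual environment (install it with pip if it is not):

```sh
# Install Jupyter into the active virtual environment if needed
python -m pip install notebook
# Open the training notebooks in the trainModels folder
jupyter notebook trainModels/
```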

## Experimental data and analysis

The data used in the analysis shown in our [paper] can be downloaded from [Zenodo]. The scripts in the folder analysis process the databases and make the figures found in the paper. In addition, the files called "focus{sampleName}.tif" and similar videos can be used as inputs for the plugin.

## License

Distributed under the terms of the [Mozilla Public License 2.0] license, "cell-division-dl-plugin" is free and open source software.

## Issues

If you encounter any problems, please [file an issue] along with a detailed description.
[Mozilla Public License 2.0]: https://www.mozilla.org/media/MPL/2.0/index.txt
[cookiecutter-napari-plugin]: https://github.com/napari/cookiecutter-napari-plugin
[paper]: https://www.biorxiv.org/content/10.1101/2023.03.20.533343v3.abstract
[Zenodo]: https://zenodo.org/records/10846684
[fast.ai]: https://course.fast.ai/Lessons/lesson1.html


[file an issue]: https://github.com/turleyjm/cell-division-dl-plugin/issues
