FOCT_kidney

Code for the paper: "Deep-learning-aided forward optical coherence tomography endoscope for percutaneous nephrostomy guidance" [1]. The following Python scripts and Jupyter notebooks were used for the paper. The architectures used were:

  • ResNet34
  • ResNet50 and MobileNetV2, with and without initial weights pretrained on the ImageNet dataset.

The dataset can be found in [2].

Prerequisites

The code is written in Python and uses TensorFlow 2.3.
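
Below is a minimal sketch, assuming the standard tf.keras.applications backbones, of how a 3-class classifier can be built on ResNet50 or MobileNetV2 with or without ImageNet weights; the input shape, optimizer, and loss here are illustrative and are not taken from the repository scripts.

import tensorflow as tf

def build_classifier(backbone="MobileNetV2", pretrained=True, input_shape=(224, 224, 3)):
    # backbone: "MobileNetV2" or "ResNet50"; pretrained toggles ImageNet weights
    weights = "imagenet" if pretrained else None
    base_cls = getattr(tf.keras.applications, backbone)
    base = base_cls(include_top=False, weights=weights,
                    input_shape=input_shape, pooling="avg")
    outputs = tf.keras.layers.Dense(3, activation="softmax")(base.output)  # 3 tissue classes
    model = tf.keras.Model(base.input, outputs)
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model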

Structure:

  • 0-Read_images.ipynb
    Converts the images from JPEG files to NumPy ndarray binaries (see the sketch after this list).

  • ResNet34/

    • Cross-validation
      • archResNet_p1.py
      • archResNet_p2.py
      • archResNet_p3.py
      • archResNet_p4.py

    It uses the ResNet34 architecture to predict the tissue type (3 categories). It is split into 4 files so that they can be run independently.

  • PT_MobileNetv2/

    • Cross-validation
      • PT_MobileNetv2_batch/
        • mobilenetv2_tl_arg_simult_vC.batch
      • PT_MobileNetv2_python/
        • mobilenetv2_tl_arg_vC.py
  • ResNet50/

    • Cross-validation
      • Resnet50_batch/
        • resnet50_arg_simult.batch
      • Resnet50_python/
        • archResNet50_arg.py
    • Cross-testing
      • Resnet50_batch/
        • resnet50_arg_outer_simult.batch
      • Resnet50_python/
        • archResNet50_arg_outer.py
  • PT_ResNet50/

    • Cross-validation
      • PT_Resnet50_batch/
        • resnet50_tl_arg_simult.batch
      • PT_Resnet50_python/
        • archResNet50_tl_arg.py
    • Cross-testing
      • PT_Resnet50_batch/
        • resnet50_tl_arg_outer_simult.batch
      • PT_Resnet50_python/
        • archResNet50_tl_arg_outer.py
  • Processing_results.ipynb
    Processing of the results to obtain the accuracies and epochs of all the combinations. Time is calculated for a few combinations.

  • Processing_predictions.ipynb
    Processing of the predictions to obtain the ROC curves

  • Processing_time.ipynb
    Complete processing of time for cross-validation.

  • Grad-CAM.ipynb
    Implementation of visual explanations using Grad-CAM [3] for the models obtained (sketched below).
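
The following is a minimal Grad-CAM sketch for a Keras model, following [3]; the convolutional layer name, image shape, and normalization are placeholders and not the exact settings of Grad-CAM.ipynb.

import tensorflow as tf

def grad_cam(model, image, conv_layer_name, class_index=None):
    # Build a model that outputs both the chosen conv feature maps and the predictions
    grad_model = tf.keras.Model(model.input,
                                [model.get_layer(conv_layer_name).output, model.output])
    with tf.GradientTape() as tape:
        conv_out, preds = grad_model(image[None, ...])   # add a batch dimension
        if class_index is None:
            class_index = tf.argmax(preds[0])            # explain the predicted class
        score = preds[:, class_index]
    grads = tape.gradient(score, conv_out)               # d(score) / d(feature maps)
    weights = tf.reduce_mean(grads, axis=(1, 2))         # global-average-pool the gradients
    cam = tf.reduce_sum(conv_out * weights[:, None, None, :], axis=-1)[0]
    cam = tf.nn.relu(cam)                                # keep only positive influence
    return (cam / (tf.reduce_max(cam) + 1e-8)).numpy()   # normalized heatmap, shape (H, W)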

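For 0-Read_images.ipynb, a minimal sketch of the JPEG-to-ndarray step is shown below; the folder layout, grayscale conversion, and target size are assumptions rather than the notebook's exact settings.

import glob
import numpy as np
from PIL import Image

def jpegs_to_ndarray(jpeg_dir, out_path, size=(224, 224)):
    # Read all JPEG frames in a folder, resize them, and save one .npy binary
    files = sorted(glob.glob(f"{jpeg_dir}/*.jpg"))
    frames = [np.asarray(Image.open(f).convert("L").resize(size), dtype=np.float32)
              for f in files]
    arr = np.stack(frames) / 255.0        # shape (N, H, W), values in [0, 1]
    np.save(out_path, arr)
    return arr
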
For ResNet34, run the Python scripts directly; for the rest, you need to pass arguments. The Python files are used as:

archResNet50_arg.py testing_kidney validation_kidney

e.g.

archResNet50_arg.py 1 2

The batch files were used on the Summit supercomputer.
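
As a rough illustration, assuming the two positional arguments are the indices of the cross-testing and cross-validation folds, such a script could read them as follows; the variable names mirror the usage line above, but the parsing code itself is hypothetical.

import sys

testing_kidney = int(sys.argv[1])      # fold held out for cross-testing
validation_kidney = int(sys.argv[2])   # fold held out for cross-validation
print(f"cross-testing fold: {testing_kidney}, cross-validation fold: {validation_kidney}")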

Paper

[1] Chen Wang, Paul Calle, Nu Bao Tran Ton, Zuyuan Zhang, Feng Yan, Anthony M. Donaldson, Nathan A. Bradley, Zhongxin Yu, Kar-ming Fung, Chongle Pan, and Qinggong Tang, "Deep-learning-aided forward optical coherence tomography endoscope for percutaneous nephrostomy guidance," Biomed. Opt. Express 12, 2404-2418 (2021)

Paper link

Dataset

[2] Chen Wang, Paul Calle, Qinggong Tang, & Chongle Pan. (2022). OCT porcine kidney dataset for percutaneous nephrostomy guidance [Data set]. Zenodo. https://doi.org/10.5281/zenodo.7113948

References

[3] Selvaraju, R. R., Cogswell, M., Das, A., Vedantam, R., Parikh, D., & Batra, D. (2017). Grad-CAM: Visual explanations from deep networks via gradient-based localization. In Proceedings of the IEEE international conference on computer vision (pp. 618-626).

Contact

Paul Calle - pcallec@ou.edu
Project link: https://github.com/thepanlab/FOCT_kidney
