OrganoID

REQUIREMENTS

OrganoID was run with the following software configuration:

  • Windows 10 64-bit
  • Python 3.9

INSTALLATION

Overview: to set up OrganoID's dependencies, create a fresh conda environment (e.g., with Miniconda or Anaconda) and install all packages listed in requirements.txt.

NOTE: OrganoID uses TensorFlow for neural network predictions. TensorFlow will automatically run on your GPU if compatible libraries are installed for your graphics card (e.g. NVIDIA CUDA). See tensorflow.org/install for guidance.

  1. Install Anaconda (https://www.anaconda.com/products/distribution).
  2. Open Anaconda Prompt and create a new environment:
    >> conda create -n OrganoID python=3.9
    >> conda activate OrganoID
    
  3. Download OrganoID and extract it to a directory of your choosing (https://github.com/jono-m/OrganoID/archive/refs/heads/master.zip). You may also clone the repository instead.
  4. In Anaconda Prompt, navigate to the OrganoID root directory (which contains OrganoID.py):
    >> cd path/to/OrganoID/directory
    
  5. If you would like to run TensorFlow on your GPU (which can be faster for batch processing), go to https://www.tensorflow.org/install/pip and follow the GPU setup instructions for your operating system, if GPU mode is supported (e.g., Step 5 for Windows Native). Skip this step otherwise. (A quick check that TensorFlow can see your GPU is sketched after this list.)
  6. Install all OrganoID requirements:
    >> pip install -r requirements.txt
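
Once installation is complete, you can quickly confirm that TensorFlow imports correctly and (if you followed step 5) that it can see your GPU. The short check below is a minimal sketch that only uses standard TensorFlow calls; it is not part of OrganoID itself. Run it with python from the activated OrganoID environment:

    # check_env.py -- hypothetical helper, not shipped with OrganoID
    import tensorflow as tf  # installed via requirements.txt

    print("TensorFlow version:", tf.__version__)
    gpus = tf.config.list_physical_devices("GPU")
    print("GPUs visible to TensorFlow:", gpus if gpus else "none (running on CPU)")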
    

USAGE

The OrganoID distribution comes with an optimized TensorFlow Lite model, OptimizedModel. This model can be used for most applications. Here is an example of usage (run in Anaconda Prompt from the directory that contains OrganoID.py):

python OrganoID.py run OptimizedModel /path/to/images /path/to/outputFolder

This command goes through each image in the /path/to/images folder and produces a labeled grayscale image, where the intensity at each pixel is the organoid "ID". These images are saved in the /path/to/outputFolder directory. You can also output other versions of this image with command options, such as --binary, --belief, or --colorize to generate black-and-white masks, detection belief images, or color-labeled images, respectively. To see all options with instructions, run the following command:

python OrganoID.py run -h
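
Because the labeled output is an ordinary grayscale image in which nonzero pixel intensity encodes the organoid ID, per-organoid measurements can be made with standard Python tools. The sketch below assumes NumPy and Pillow are available and uses a hypothetical output file name; it is illustrative only and not part of the OrganoID command-line tool:

    # analyze_labels.py -- illustrative only; the output file name is a placeholder
    import numpy as np
    from PIL import Image

    # Hypothetical labeled image produced by the run command.
    labels = np.array(Image.open("path/to/outputFolder/labeled_image.png"))
    # Nonzero pixel intensities are organoid IDs; count pixels per ID to get areas.
    ids, areas = np.unique(labels[labels > 0], return_counts=True)
    print(f"Detected {len(ids)} organoids")
    for organoid_id, area in zip(ids, areas):
        print(f"  Organoid {organoid_id}: {area} px")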

If you would like to tune model performance for particular applications, the included model TrainableModel can be re-trained through this tool. Run the following command to view training instructions:

python OrganoID.py train -h

For example:

python OrganoID.py train /path/to/trainingData /path/to/outputFolder NewModelName -M TrainableModel
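
When re-training, it is common to hold out some images for validation. The folder layout and options expected by the train command are described in its help text (see above); the snippet below is only a generic, hypothetical way to split an image folder into training and validation subsets with the Python standard library, and its folder names are placeholders rather than OrganoID conventions:

    # split_dataset.py -- hypothetical helper; folder names are placeholders
    import random
    import shutil
    from pathlib import Path

    source = Path("path/to/allImages")                # hypothetical folder of raw images
    train_dir = Path("path/to/trainingData/train")    # placeholder layout
    val_dir = Path("path/to/trainingData/validation")
    train_dir.mkdir(parents=True, exist_ok=True)
    val_dir.mkdir(parents=True, exist_ok=True)

    images = sorted(source.glob("*.tif"))
    random.seed(0)                                    # reproducible split
    random.shuffle(images)
    split = int(0.8 * len(images))                    # 80% train / 20% validation
    for path in images[:split]:
        shutil.copy(path, train_dir / path.name)
    for path in images[split:]:
        shutil.copy(path, val_dir / path.name)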

USER INTERFACE

OrganoID now includes a user interface. To start the interface, run:

python OrganoID_UI.py

The parameters in the interface correspond to those in the command-line tool.

DATASET

The dataset for model training and all validation/testing from the OrganoID publication is openly available here: https://osf.io/xmes4/
