aarpon/qute

Framework to support deep-learning based computer-vision research in microscopy image analysis. Leverages and extends several PyTorch-based frameworks and tools.

Installation

Install prerequisites

Install qute

$ git clone https://github.com/aarpon/qute
$ cd qute
$ uv venv --python 3.12      # Create a virtual environment with Python 3.12
$ uv sync                    # Install production dependencies
$ uv sync --group dev        # (Optional) Install development dependencies
$ uv run pre-commit install  # (Optional) Only if you want to push commits to GitHub
$ source .venv/bin/activate  # (Optional) Activate virtual environment on Linux/macOS
$ .venv\Scripts\activate     # (Optional) Activate virtual environment on Windows

For simplicity, all of the following commands assume that the virtual environment is activated (see above); otherwise, prepend uv run to each command.

Test if GPU acceleration is available

  • Linux and Windows:
$ python -c "import torch; print(torch.cuda.is_available())"
True
  • macOS M1/M2:
$ python -c "import torch; print(torch.backends.mps.is_available())"
True
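The two checks above can be folded into a single device-selection helper. The following is a minimal sketch; the pick_device function is not part of qute, and in practice the two flags would come from torch.cuda.is_available() and torch.backends.mps.is_available():

```python
def pick_device(cuda_available: bool, mps_available: bool) -> str:
    """Return the best available accelerator name, falling back to CPU.

    The flags would normally come from torch.cuda.is_available() and
    torch.backends.mps.is_available().
    """
    if cuda_available:
        return "cuda"  # NVIDIA GPU (Linux/Windows)
    if mps_available:
        return "mps"   # Apple Silicon GPU (macOS M1/M2)
    return "cpu"       # no accelerator found
```

The returned string can then be passed to torch.device() to place models and tensors on the selected hardware.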

How to use

Command-line

The highest-level way to access qute is via its command-line interface, but it is still in (very) early development. Most functionalities are not yet exposed; the few that are can be accessed as follows:

$ qute --help
 Usage: qute [OPTIONS] COMMAND [ARGS]...

Command-line interface to run various qute jobs.

╭─ Options ─────────────────────────────────────────────────────────────────╮
│ --help          Show this message and exit.                               │
╰───────────────────────────────────────────────────────────────────────────╯
╭─ Commands ────────────────────────────────────────────────────────────────╮
│ run       Run experiment specified by a configuration file.               │
│ version   Print (detailed) version information.                           │
│ config    Manage configuration options.                                   │
╰───────────────────────────────────────────────────────────────────────────╯ 

You can create a classification (segmentation) configuration file with:

$ qute config create --category classification --target /path/to/my/config.ini

The category maps to the underlying Director as explained in the High-level API section below.

You can edit the generated configuration file as you see fit (the template should be mostly self-explanatory) and then run the job with:

$ qute run --config /path/to/my/config.ini --num_workers 24
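Since the configuration is a standard INI file, it can also be inspected programmatically with Python's configparser. The section and key names below are hypothetical placeholders; the actual template generated by qute config create may use different ones:

```python
import configparser

# Hypothetical excerpt of a generated configuration file; the actual
# sections and keys produced by `qute config create` may differ.
sample = """
[settings]
trainer_mode = train
num_epochs = 100
batch_size = 8
"""

config = configparser.ConfigParser()
config.read_string(sample)

# Typed accessors convert the stored strings to the expected types.
num_epochs = config.getint("settings", "num_epochs")
batch_size = config.getint("settings", "batch_size")
print(num_epochs, batch_size)  # → 100 8
```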

More detailed instructions will follow.

High-level API

The high-level qute API provides easy to use objects that manage whole training, fine-tuning and prediction workflows following a user-defined configuration file. Configuration templates can be found in config_samples/.


To get started with the high-level API, try:

$ python qute/examples/cell_segmentation_demo_unet.py

Configuration parameters are explained in config_samples/.
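The Director pattern driving these workflows can be illustrated with a stripped-down mock. The class and method names below are illustrative only and are not qute's actual API:

```python
import configparser


class DemoDirector:
    """Toy stand-in for a qute Director: it reads a configuration and
    drives a training run. Names and structure are illustrative only."""

    def __init__(self, config_text: str) -> None:
        parser = configparser.ConfigParser()
        parser.read_string(config_text)
        self.num_epochs = parser.getint("settings", "num_epochs")

    def run(self) -> list[str]:
        # A real Director would set up the data module, the model, and
        # the training loop; here we just log each epoch.
        return [f"epoch {e + 1}/{self.num_epochs}" for e in range(self.num_epochs)]


director = DemoDirector("[settings]\nnum_epochs = 3\n")
print(director.run())  # → ['epoch 1/3', 'epoch 2/3', 'epoch 3/3']
```

The point of the pattern is that the user only edits the configuration file; the Director translates it into a complete workflow.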

To follow the training progress in Tensorboard, run:

$ tensorboard --logdir ${HOME}/Documents/qute/

and then open TensorBoard on http://localhost:6006/.

Low-level API

The low-level API allows easy extension of qute for research and prototyping. You can find the detailed API documentation here.

Hyperparameter optimization

For an example of how to use ray[tune] to optimize hyperparameters, see examples/cell_segmentation_demo_unet_hyperparameters.py.
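The repository example relies on ray[tune]; as a dependency-free illustration of the underlying idea, here is a tiny random-search sketch over two hyperparameters. The validation_loss function is a toy stand-in for an actual training-and-evaluation run:

```python
import math
import random


def validation_loss(learning_rate: float, batch_size: int) -> float:
    """Toy objective standing in for a real train/validate cycle;
    its minimum lies near learning_rate=1e-3, batch_size=16."""
    return (math.log10(learning_rate) + 3) ** 2 + abs(batch_size - 16) / 16


random.seed(0)
best = None
for _ in range(20):
    lr = 10 ** random.uniform(-5, -1)   # sample the learning rate log-uniformly
    bs = random.choice([4, 8, 16, 32])  # sample the batch size from a fixed grid
    loss = validation_loss(lr, bs)
    if best is None or loss < best[0]:
        best = (loss, lr, bs)

print(f"best loss={best[0]:.3f} lr={best[1]:.2e} batch_size={best[2]}")
```

Tools like ray[tune] add smarter search strategies, early stopping of bad trials, and parallel execution on top of this basic loop.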
