TransNuSeg: A Lightweight Multi-Task Transformer for Nuclei Segmentation

This is the official implementation of TransNuSeg: A Lightweight Multi-Task Transformer for Nuclei Segmentation (MICCAI 2023).

Introduction

This paper proposes TransNuSeg, a lightweight multi-task framework for nuclei segmentation and the first attempt at an entirely Swin-Transformer-driven architecture. To alleviate prediction inconsistency between branches, we propose a self-distillation loss that regularizes the consistency between the nuclei decoder and the normal edge decoder. In addition, an attention-sharing scheme that shares attention heads among all decoders is employed to exploit the high correlation between the tasks.

The overall architecture is demonstrated in the figure below.
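For intuition, the self-distillation loss can be read as a soft agreement term between the two decoder branches. The snippet below is a minimal PyTorch sketch under that reading; the function name, the temperature, and the choice of KL divergence are illustrative assumptions, not the exact formulation used in the paper.

import torch.nn.functional as F

def consistency_loss(nuclei_logits, edge_logits, temperature=2.0):
    # Illustrative self-distillation term (assumption, not the paper's exact loss):
    # soften both branch outputs with a temperature, then penalize the divergence
    # between the nuclei decoder and the normal edge decoder.
    log_p_nuclei = F.log_softmax(nuclei_logits / temperature, dim=1)
    p_edge = F.softmax(edge_logits / temperature, dim=1)
    return F.kl_div(log_p_nuclei, p_edge, reduction="batchmean") * temperature ** 2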

Dataset

In this paper, we evaluate our model on the Fluorescence Microscopy Image Dataset and the Histology Image Dataset from ClusterSeg. Both datasets are available here.

Quick Start

1. Download the datasets from the link above and place them under the data folder.

cd data
unzip histology.zip
unzip fluorescence.zip

2. Modify the hyperparameters alpha, beta, gamma, and sharing_ratio in main.sh, or keep the default values (a sketch of how these weights typically enter the loss appears after this section).

3. Run main.sh to train the model.

sh main.sh

Two folders named log and saved will be automatically created to store logging information and the trained model.
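As a rough guide to what the hyperparameters control, a weighted multi-task objective of the following shape is a common way to combine the branch losses. The pairing of alpha, beta, and gamma with specific task losses below is an assumption; check the training script for the actual formulation.

def total_loss(nuclei_loss, normal_edge_loss, cluster_edge_loss,
               consistency_term, alpha, beta, gamma):
    # Hypothetical weighted sum of the per-decoder losses plus the
    # self-distillation consistency term; mirrors the alpha/beta/gamma
    # hyperparameters exposed in main.sh but is not the paper's exact loss.
    return (alpha * nuclei_loss
            + beta * normal_edge_loss
            + gamma * cluster_edge_loss
            + consistency_term)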

Environment

The code was developed on a single NVIDIA RTX 3090 GPU with 24 GB of memory and tested with Python 3.8.10 and PyTorch 1.13.1.

How to cite

You may cite us as

@InProceedings{transnuseg,
    author="He, Zhenqi
    and Unberath, Mathias
    and Ke, Jing
    and Shen, Yiqing",
    title="TransNuSeg: A Lightweight Multi-task Transformer for Nuclei Segmentation",
    booktitle="Medical Image Computing and Computer Assisted Intervention -- MICCAI",
    year="2023",
    pages="206--215",
    isbn="978-3-031-43901-8"
}