
Prior-RadGraphFormer: A Prior-Knowledge-Enhanced Transformer for Generating Radiology Graphs from X-Rays

[arXiv] [BibTex]

Code release for the GRAIL @ MICCAI 2023 paper "Prior-RadGraphFormer: A Prior-Knowledge-Enhanced Transformer for Generating Radiology Graphs from X-Rays".

Prior-RadGraphFormer is a transformer-based network that directly generates radiology graphs from X-ray images. The generated graphs can be used for multiple downstream tasks, such as free-text report generation and pathology classification.
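A radiology graph consists of labeled entities and typed relations between them. As a rough illustration (not the repository's exact data format), the label names below follow the RadGraph scheme, e.g. `OBS-DP` for an observation that is definitely present, `ANAT-DP` for anatomy, and `located_at` for a relation:

```python
# A radiology graph encoded as a list of labeled entities plus typed
# relations between entity indices. This is an illustrative sketch of
# the graph structure, not the repository's internal representation.
entities = [
    {"tokens": "opacity", "label": "OBS-DP"},          # observation, definitely present
    {"tokens": "left lower lobe", "label": "ANAT-DP"}, # anatomy, definitely present
]
relations = [
    (0, 1, "located_at"),  # opacity -- located_at --> left lower lobe
]

for src, dst, rel in relations:
    print(f'{entities[src]["tokens"]} --{rel}--> {entities[dst]["tokens"]}')
```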

Installation

We recommend using Python 3.8 and the following scripts to install the required Python packages and compile the CUDA operators:

python -m venv /path/to/new/virtual/environment
source /path/to/new/virtual/environment/bin/activate
pip install -r requirements.txt

cd ./models/ops
python setup.py install

How to train the Prior-RadGraphFormer

Preparing the data

See details here.

Preparing the config file

The config file can be found at ./configs/radgraph.yaml. Make custom changes if necessary. In particular, to train a vanilla RadGraphFormer (without the prior-knowledge module), set MODEL.ASM to False.
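Config keys such as MODEL.ASM follow a nested SECTION.KEY convention. A small helper like the one below (a sketch, not part of the repository) shows how a dotted override maps onto the nested YAML structure:

```python
def set_by_dotted_key(cfg, dotted_key, value):
    """Set a nested dict entry addressed by a dotted key like 'MODEL.ASM'."""
    *parents, leaf = dotted_key.split(".")
    node = cfg
    for name in parents:
        node = node.setdefault(name, {})
    node[leaf] = value
    return cfg

# Hypothetical fragment mirroring the structure of configs/radgraph.yaml.
config = {"MODEL": {"ASM": True}}
set_by_dotted_key(config, "MODEL.ASM", False)  # train a vanilla RadGraphFormer
print(config["MODEL"]["ASM"])  # False
```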

Training

python train.py

Visualization

Run python util/viz_graph.py. Note that you may need to edit the directory paths in the script to match your setup.

Downstream tasks evaluation

See details here.

Pretrained checkpoints

checkpoint

Citation

If you find this code helpful, please consider citing:

@misc{xiong2023priorradgraphformer,
      title={Prior-RadGraphFormer: A Prior-Knowledge-Enhanced Transformer for Generating Radiology Graphs from X-Rays}, 
      author={Yiheng Xiong and Jingsong Liu and Kamilia Zaripova and Sahand Sharifzadeh and Matthias Keicher and Nassir Navab},
      year={2023},
      eprint={2303.13818},
      archivePrefix={arXiv},
      primaryClass={cs.CV}
}

@inproceedings{xiong2023prior,
  title={Prior-RadGraphFormer: A Prior-Knowledge-Enhanced Transformer for Generating Radiology Graphs from X-Rays},
  author={Xiong, Yiheng and Liu, Jingsong and Zaripova, Kamilia and Sharifzadeh, Sahand and Keicher, Matthias and Navab, Nassir},
  booktitle={International Conference on Medical Image Computing and Computer-Assisted Intervention},
  pages={54--63},
  year={2023},
  organization={Springer}
}

Acknowledgement

This code borrows heavily from Relationformer and Classification by Attention. We thank the authors for their great work.

The authors gratefully acknowledge the financial support by the Federal Ministry of Education and Research of Germany (BMBF) under project DIVA (FKZ 13GW0469C). Kamilia Zaripova was partially supported by the Linde & Munich Data Science Institute, Technical University of Munich Ph.D. Fellowship.
