TFormer

MIT License

Introduction

The official implementation of "TFormer: A throughout fusion transformer for multi-modal skin lesion diagnosis"

Our Network Structure

Environments

  • Both Windows and Linux are supported
  • Python 3.9
  • PyTorch 1.12.1
  • torchvision
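
To confirm that your environment matches the versions listed above, you can run a quick check like the following (a minimal sketch, not part of the repository):

    import torch
    import torchvision

    # Verify that the installed versions match the ones listed above.
    print("PyTorch:", torch.__version__)          # expected 1.12.1
    print("torchvision:", torchvision.__version__)
    print("CUDA available:", torch.cuda.is_available())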

Prepare dataset

First download the Derm7pt dataset, then download the Swin-Tiny model pretrained on ImageNet-1K from GitHub. Save the pretrained model in the folder "./models/swin_transformer".
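
Before training, a small sanity check like the one below can confirm that both downloads are in place. This is only a sketch: the dataset path and the checkpoint filename swin_tiny_patch4_window7_224.pth are assumptions, so adjust them to match your actual downloads.

    import os

    # Hypothetical locations -- adjust to wherever you saved the data and weights.
    dataset_dir = "/path/to/derm7pt"  # assumed; this is the path passed as --dir_release below
    ckpt_path = "./models/swin_transformer/swin_tiny_patch4_window7_224.pth"  # assumed filename

    for path in (dataset_dir, ckpt_path):
        status = "found" if os.path.exists(path) else "MISSING"
        print(f"{status}: {path}")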

Run details

To train our TFormer, run:

python train.py --dir_release "your dataset path" --epochs 100 --batch_size 32 --learning_rate 1e-4 --cuda True
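
For reference, the flags above correspond to an argument parser roughly like the sketch below. This illustrates the command-line interface only; it is not the repository's actual train.py, and the defaults and help strings are assumptions.

    import argparse

    def str2bool(v):
        # Accept common truthy strings so "--cuda True" / "--cuda False" behave as expected.
        return str(v).lower() in ("true", "1", "yes")

    parser = argparse.ArgumentParser(description="TFormer training (illustrative interface only)")
    parser.add_argument("--dir_release", type=str, required=True, help="path to the Derm7pt dataset")
    parser.add_argument("--epochs", type=int, default=100)
    parser.add_argument("--batch_size", type=int, default=32)
    parser.add_argument("--learning_rate", type=float, default=1e-4)
    parser.add_argument("--cuda", type=str2bool, default=True, help="train on GPU if available")
    args = parser.parse_args()
    print(args)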

License

This project is licensed under the MIT License. See LICENSE for details.

Acknowledgement

Our code borrows heavily from:
