FTNet

This repository is the official PyTorch implementation of the paper "FTNet: Feature Transverse Network for Thermal Image Semantic Segmentation".

[Main figure from the paper]

We provide scripts for the models from our paper. You can train your own model from scratch, or use the pretrained models for testing.

FTNet Model Weights

Model weights are provided for ResNeXt50 and ResNeXt101.

For user convenience, the thermal Cityscapes pretrained model and the weights for all datasets are provided here.

This link also provides the semantic maps generated during the testing phase.

Highlights:

  • Built entirely on PyTorch Lightning, with a well-designed code structure and built-in DistributedDataParallel and DataParallel support.
  • All initialization models, trained models, and predictions are available.
  • New models can be plugged in with minimal changes (a minimal sketch follows this list).
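
As an illustration of the plug-in point, here is a minimal sketch of how a new segmentation network could be wrapped in a PyTorch Lightning module. The class, loss, and optimizer choices below are assumptions for illustration only, not the repository's actual training module (see Codes/src for that).

import pytorch_lightning as pl
import torch
import torch.nn as nn

class SegmentationModule(pl.LightningModule):
    """Illustrative wrapper; names and losses are assumptions, not the repo's API."""

    def __init__(self, model: nn.Module, lr: float = 1e-3):
        super().__init__()
        self.model = model                                 # any network producing per-pixel logits
        self.criterion = nn.CrossEntropyLoss(ignore_index=255)
        self.lr = lr

    def forward(self, x):
        return self.model(x)

    def training_step(self, batch, batch_idx):
        image, mask = batch
        logits = self(image)                               # (N, num_classes, H, W)
        loss = self.criterion(logits, mask)
        self.log("train_loss", loss)
        return loss

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=self.lr, momentum=0.9)

# DistributedDataParallel / DataParallel support comes from the Trainer flags
# (exact flag names depend on the Lightning version), e.g.:
# trainer = pl.Trainer(gpus=2, accelerator="ddp")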

Requirements

  • Hardware: 1-2 GPUs (preferably with >= 11 GB of GPU memory)
  • Python 3.8
  • PyTorch >= 1.6 (code tested on 1.6); an optional environment check is sketched below
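
A quick, generic way to verify that your environment meets these requirements (this snippet is not part of the repository):

import sys
import torch

print("Python:", sys.version.split()[0])          # expect 3.8.x
print("PyTorch:", torch.__version__)              # expect >= 1.6
print("CUDA available:", torch.cuda.is_available())
for i in range(torch.cuda.device_count()):
    props = torch.cuda.get_device_properties(i)
    print(f"GPU {i}: {props.name}, {props.total_memory / 1024 ** 3:.1f} GiB")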

Code

Clone this repository to any location you like.

git clone https://github.com/shreyaskamathkm/FTNet.git
cd FTNet

Dependencies

Run the following to install the model's dependencies:

pip install -r requirements.txt

Setting up the environment for training and testing

We train and test the models on three datasets: SODA, SCUT-Seg, and MFN. A thermal Cityscapes dataset is additionally used for pretraining.

Extracting Dataset

Please download all the datasets from the links provided above. Once downloaded, run the following commands to organize the data into the structure shown below.

For simplicity's sake, assume all the images are downloaded to a folder named Raw_Dataset. The remaining steps are as follows:

For Cityscapes thermal dataset

cd Codes/src/datasets/utils/  # You are now in */src/datasets/utils/
python Cityscape_folderMap.py --input-image-path /raw_dataset/SODA-20211127T202136Z-001/SODA/TIR_leftImg8bit/ --save-path /Processed_dataset/

For SODA thermal dataset

cd Codes/src/datasets/utils/  # You are now in */src/datasets/utils/
python SODA_folderMap.py --input-image-path /raw_dataset/SODA-20211127T202136Z-001/SODA/InfraredSemanticLabel/ --save-path /Processed_dataset/

For SCUTSeg thermal dataset

cd Codes/src/datasets/utils/  # You are now in */src/datasets/utils/
python scutseg_foldermap.py --input-image-path /raw_dataset/SCUT-SEG/ --save-path /Processed_dataset/

For MFN thermal dataset

cd Codes/src/datasets/utils/  # You are now in */src/datasets/utils/
python MFNDataset_folderMap.py --input-image-path /raw_dataset/ir_seg_dataset/ --save-path /Processed_dataset/

Generating Edges

Please note: the current implementation requires MATLAB to generate the edges.

cd Codes/src/datasets/edge_generation/
Change the path in the main.m file and run it to generate the edges.
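
If MATLAB is not available, the general idea, extracting class-boundary pixels from the ground-truth masks, can be approximated in Python. The sketch below is only an illustration and is not the repository's edge-generation code, so the resulting edge maps may differ from the official ones.

import cv2
import numpy as np

def mask_to_edges(mask_path, edge_path, thickness=1):
    # Approximate label boundaries with a morphological gradient on the class-id mask.
    mask = cv2.imread(mask_path, cv2.IMREAD_GRAYSCALE)
    kernel = np.ones((3, 3), np.uint8)
    gradient = cv2.morphologyEx(mask, cv2.MORPH_GRADIENT, kernel, iterations=thickness)
    edges = (gradient > 0).astype(np.uint8) * 255
    cv2.imwrite(edge_path, edges)

# Hypothetical file names, for illustration only:
mask_to_edges("Processed_dataset/SODA/mask/train/example.png",
              "Processed_dataset/SODA/edges/train/example.png")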

Dataset Structure

Once extraction and edge generation are complete, the dataset should look similar to the structure below:

├── ...
├── Processed_dataset                          # Dataset folder
│   ├── Cityscapes_thermal
│   │   └── CITYSCAPE_5000
│   │       ├── edges
│   │       │   └── train
│   │       ├── image
│   │       │   └── train
│   │       └── mask
│   │           └── train
│   ├── SODA
│   │   ├── edges
│   │   │   ├── train
│   │   │   ├── val
│   │   │   └── test
│   │   ├── image
│   │   │   ├── train
│   │   │   ├── val
│   │   │   └── test
│   │   └── mask
│   │       ├── train
│   │       ├── val
│   │       └── test
│   ├── MFNDataset
│   │   ├── edges
│   │   │   ├── train
│   │   │   ├── val
│   │   │   └── test
│   │   ├── image
│   │   │   ├── train
│   │   │   ├── val
│   │   │   └── test
│   │   └── mask
│   │       ├── train
│   │       ├── val
│   │       └── test
│   └── SCUTSEG
│       ├── edges
│       │   ├── train
│       │   └── val
│       ├── image
│       │   ├── train
│       │   └── val
│       └── mask
│           ├── train
│           └── val
└── ...
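
Optionally, you can sanity-check the processed layout by counting the files per split. The snippet below is a generic check based on the tree above, not a script shipped with the repository; adjust ROOT to wherever your processed dataset lives.

from pathlib import Path

ROOT = Path("/Processed_dataset")                     # adjust to your processed-dataset path
DATASETS = {
    "Cityscapes_thermal/CITYSCAPE_5000": ["train"],
    "SODA": ["train", "val", "test"],
    "MFNDataset": ["train", "val", "test"],
    "SCUTSEG": ["train", "val"],
}

for dataset, splits in DATASETS.items():
    for kind in ("image", "mask", "edges"):
        for split in splits:
            folder = ROOT / dataset / kind / split
            count = len(list(folder.glob("*"))) if folder.is_dir() else "MISSING"
            print(f"{folder}: {count}")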

The processed dataset is what the training pipeline uses, so you can now train FTNet yourself. Training and testing scripts are provided in the */FTNet/Codes/src/bash folder; fill in the appropriate details in the .sh files before you run them.

cd /Codes/src/bash             # You are now in */src/bash/
bash Train_and_test.sh         # Train and test a single dataset, e.g. SODA

cd /Codes/src/bash             # You are now in */src/bash/
bash Train_and_test_all.sh     # Train and test multiple datasets, e.g. SODA, MFN, SCUT-Seg
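
If you test with the released weights rather than training from scratch, a generic way to inspect a checkpoint before wiring it into the test script is sketched below; the file name is a placeholder and the exact checkpoint layout may differ from this example.

import torch

ckpt = torch.load("ftnet_resnext50.ckpt", map_location="cpu")    # placeholder file name
# Lightning-style checkpoints nest the weights under "state_dict"; plain state dicts do not.
state_dict = ckpt["state_dict"] if isinstance(ckpt, dict) and "state_dict" in ckpt else ckpt
print(f"{len(state_dict)} tensors in the checkpoint")
for name in list(state_dict)[:5]:                                # peek at the first few keys
    print(name, tuple(state_dict[name].shape))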

License

Please read the LICENSE file in the repository

Citation

If you find the code or trained models useful, please consider citing:

@ARTICLE{9585453,
  author={Panetta, Karen and Shreyas Kamath, K. M. and Rajeev, Srijith and Agaian, Sos S.},
  journal={IEEE Access},
  title={FTNet: Feature Transverse Network for Thermal Image Semantic Segmentation},
  year={2021},
  volume={9},
  number={},
  pages={145212-145227},
  doi={10.1109/ACCESS.2021.3123066}
}

References