BASeg

The code for the paper "BASeg: Boundary Aware Semantic Segmentation for Autonomous Driving" (Neural Networks, 2023)

Introduction

This repository is an official PyTorch implementation of BASeg for semantic segmentation.

Usage

  1. Requirements:

    • Hardware: 4-8 GPUs (preferably with >=11 GB of memory each)
    • Software: PyTorch>=1.1.0, Python3, tensorboardX (a quick environment check is sketched after the Usage steps)
  2. Clone the repository:

    git clone git@github.com:YangParky/BASeg.git
  3. Data preparation:

    • Download the related datasets (ADE20K, Cityscapes, CamVid) and symlink them into the layout below (alternatively, modify the relevant paths specified in the config folder):

    • To avoid slowing training down, prepare the boundary ground truth in advance from here (a generation sketch is given after the Usage steps).

    • The directory structure follows the standard torchvision layout:

      /Dataset/
        ADE20K/
          Scene-Parsing/
             ADEChallengeData2016/
               images/
               bound/
               annotations/
        Cityscapes/
          bound/
          gtFine/
          leftImg8bit/
        CamVid/
          bound/
          CamVid_Label/
          CamVid_RGB/
      /Model
      /Project
        /BASeg/
      
  4. Train:

    • Download the ImageNet pre-trained models and put them under the model folder for weight initialization.
    • For full training:

      ADE20K:

      sh tools/trainade.sh ade20k baseg101

      Cityscapes:

      sh tools/traincityscapes.sh cityscapes baseg101

      CamVid:

      sh tools/traincamvid.sh camvid baseg101
  5. Test:

    • Download the trained segmentation models and put them under the folder specified in config, or modify the specified paths.

    • For full testing (to reproduce the listed performance):

      Validation on ADE20K

      sh tools/testade.sh ade20k baseg101

      Test on Cityscapes

      sh tools/testcityscapes.sh cityscapes baseg101

      Validation on CamVid

      sh tools/testcamvid.sh camvid baseg101
    • For boundary evaluation (a sketch of the metric is given after the Usage steps):

      Evaluation of the boundary F1-score

      python util/f_boundary.py

      Evaluation of the interior F1-score

      python util/f_interior.py
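
The requirements in step 1 can be checked with a short script like the one below. This is a minimal sketch and not part of the repository; the file name is hypothetical, and the version and memory thresholds simply restate the requirements listed above.

    # check_env.py (hypothetical helper, not shipped with BASeg)
    import torch

    print("PyTorch:", torch.__version__)               # the code targets PyTorch >= 1.1.0
    print("CUDA available:", torch.cuda.is_available())

    for i in range(torch.cuda.device_count()):
        props = torch.cuda.get_device_properties(i)
        mem_gb = props.total_memory / 1024 ** 3
        # Training is stated to work best with >= 11 GB of memory per GPU.
        print("GPU %d: %s, %.1f GB" % (i, props.name, mem_gb))

    try:
        import tensorboardX                             # required for training logs
        print("tensorboardX import OK")
    except ImportError:
        print("tensorboardX is not installed (pip install tensorboardX)")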
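
For step 3, the boundary ground truth stored under the bound/ folders can be derived from the semantic label maps. The function below is a minimal sketch of one common way to do this (a thin band around class transitions); the dilation radius, ignore index, script name, and file handling are assumptions, not the authors' exact settings.

    # make_boundary.py (hypothetical helper, not the repository's own script)
    import sys
    import numpy as np
    import cv2

    def label_to_boundary(label, radius=2, ignore_index=255):
        """Return a binary map that is 1 within `radius` pixels of a class boundary."""
        label = label.astype(np.int32)
        edge = np.zeros(label.shape, dtype=np.uint8)
        # A pixel lies on a boundary if any 4-neighbour carries a different class id.
        edge[:, 1:] |= (label[:, 1:] != label[:, :-1]).astype(np.uint8)
        edge[1:, :] |= (label[1:, :] != label[:-1, :]).astype(np.uint8)
        edge[label == ignore_index] = 0
        # Thicken the 1-pixel edge so the boundary branch has a usable target.
        kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE,
                                           (2 * radius + 1, 2 * radius + 1))
        return cv2.dilate(edge, kernel)

    if __name__ == "__main__":
        # usage: python make_boundary.py <label_png> <output_png>
        label = cv2.imread(sys.argv[1], cv2.IMREAD_GRAYSCALE)
        cv2.imwrite(sys.argv[2], label_to_boundary(label) * 255)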
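
Finally, util/f_boundary.py reports a boundary F1-score, i.e. the precision and recall of boundary pixels matched within a small pixel tolerance. The snippet below restates that metric generically; it is not the repository's implementation, and the function name and default tolerance are assumptions.

    # Boundary F1 sketch (generic form of the metric, not util/f_boundary.py itself)
    import numpy as np
    from scipy.ndimage import binary_dilation, generate_binary_structure

    def boundary_f1(pred_bound, gt_bound, tolerance=2):
        """pred_bound, gt_bound: binary HxW boundary maps."""
        pred_bound = pred_bound.astype(bool)
        gt_bound = gt_bound.astype(bool)
        struct = generate_binary_structure(2, 1)
        # Accept matches within `tolerance` pixels of the other map's boundary.
        gt_zone = binary_dilation(gt_bound, struct, iterations=tolerance)
        pred_zone = binary_dilation(pred_bound, struct, iterations=tolerance)
        precision = (pred_bound & gt_zone).sum() / max(pred_bound.sum(), 1)
        recall = (gt_bound & pred_zone).sum() / max(gt_bound.sum(), 1)
        if precision + recall == 0:
            return 0.0
        return 2 * precision * recall / (precision + recall)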

Citation

If you find the code or trained models useful, please consider citing:

@article{xiao2023baseg,
  title={BASeg: Boundary aware semantic segmentation for autonomous driving},
  author={Xiao, Xiaoyang and Zhao, Yuqian and Zhang, Fan and Luo, Biao and Yu, Lingli and Chen, Baifan and Yang, Chunhua},
  journal={Neural Networks},
  volume={157},
  pages={460--470},
  year={2023},
  publisher={Elsevier}
}

Acknowledgement

The code is based on semseg by its first author.

License

This repository is released under the MIT License (see the LICENSE file for details).
