StreakNet

Introduction

In this paper, we introduce StreakNet-Arch, a novel signal processing architecture designed for Underwater Carrier LiDAR-Radar (UCLR) imaging systems, to address the limitations in scatter suppression and real-time imaging. StreakNet-Arch formulates signal processing as an end-to-end binary classification task, enabling real-time image acquisition. To achieve this, we leverage Self-Attention networks and propose a novel Double Branch Cross Attention (DBC-Attention) mechanism that surpasses the performance of traditional methods. Furthermore, we present a method for embedding streak-tube camera images into attention networks, effectively acting as a learned bandpass filter. To facilitate further research, we contribute a publicly available streak-tube camera image dataset containing 2,695,168 real-world underwater 3D point cloud data points. These advancements significantly improve UCLR capabilities, enhancing its performance and applicability in underwater imaging tasks.
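
As a rough intuition for the double-branch idea, the sketch below wires two standard multi-head cross-attention layers so that a signal branch and a template branch each attend to the other. This is an illustrative PyTorch sketch under our own assumptions (layer names, shapes), not the authors' exact DBC-Attention; see the paper for the real design.

import torch
import torch.nn as nn

class DoubleBranchCrossAttention(nn.Module):
    # Illustrative only: two cross-attention layers, one per branch.
    def __init__(self, dim: int, heads: int = 4):
        super().__init__()
        self.sig2tmp = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.tmp2sig = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, signal, template):
        # Each branch queries the other branch's features.
        s, _ = self.sig2tmp(signal, template, template)  # signal queries template
        t, _ = self.tmp2sig(template, signal, signal)    # template queries signal
        return s, t

# Example: batch 2, sequence length 16, embedding dim 64.
sig, tmp = torch.randn(2, 16, 64), torch.randn(2, 16, 64)
s, t = DoubleBranchCrossAttention(64)(sig, tmp)
print(s.shape, t.shape)  # both torch.Size([2, 16, 64])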

For further details, please refer to our paper.

Dataset

Introduction

StreakNet-Dataset is an underwater laser imaging dataset for UCLR systems. It comprises streak-tube images captured by a UCLR system at distances of 10m, 13m, 15m, and 20m. The table below gives further details of the dataset.

Distance | Number of streak-tube images | Resolution | Data type | Training set | Validation set | Test set
10m      | 400 | 2048x2048 | uint16 | 315,200 | 40,800 | 819,200
13m      | 349 | 2048x2048 | uint16 | 281,992 | 47,530 | 714,752
15m      | 300 | 2048x2048 | uint16 | 245,400 | 39,200 | 614,400
20m      | 267 | 2048x2048 | uint16 | 229,086 | 31,240 | 546,816

(Set sizes count individual signal samples, not images: each test set equals its image count times 2,048 columns, and the four test sets together hold the 2,695,168 samples cited in the Introduction.)
Download

You can download StreakNet-Dataset for free from HuggingFace or ModelScope via Git.

First, install git-lfs.

curl -s https://packagecloud.io/install/repositories/github/git-lfs/script.deb.sh | sudo bash
sudo apt update
sudo apt install git-lfs
sudo git lfs install --system

Then, download StreakNet-Dataset into the StreakNet working directory from either source:

# From HuggingFace: For Global Users
cd StreakNet
git clone https://huggingface.co/datasets/Coder-AN/StreakNet-Dataset ./datasets

# From ModelScope: For Chinese Users
cd StreakNet
git clone https://www.modelscope.cn/datasets/CoderAN/StreakNet-Dataset.git ./datasets
Organizational Structure

After downloading StreakNet-Dataset from HuggingFace or ModelScope, you will see the following directory structure.

datasets
    |- clean_water_10m      # The directory of data taken at a distance of 10m
    |   |- data             # Original streak images
    |   |   |- 001.tif
    |   |   |- 002.tif
    |   |   |- 003.tif
    |   |   |- ...
    |   |
    |   |- groundtruth.npy  # The ground truth of the final reconstructed image
    |   |- preview.jpg      # A preview of the ground-truth
    |
    |- clean_water_13m      # The directory of data taken at a distance of 13m (has the same structure as 10m)
    |- clean_water_15m      # The directory of data taken at a distance of 15m (has the same structure as 10m)
    |- clean_water_20m      # The directory of data taken at a distance of 20m (has the same structure as 10m)
    |- template.npy         # The 1-D time sequence of the template signal
    |- test_config.yaml     # The config file of test-set
    |- train_config.yaml    # The config file of training-set
    |- valid_config.yaml    # The config file of validation-set
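
As a quick sanity check after downloading, the minimal Python sketch below inspects the files above. It assumes only NumPy and Pillow; the shapes in the comments are expectations based on the table above, not guarantees.

import numpy as np
from PIL import Image

template = np.load("datasets/template.npy")  # 1-D template signal
print("template:", template.shape, template.dtype)

gt = np.load("datasets/clean_water_10m/groundtruth.npy")
print("ground truth:", gt.shape, gt.dtype)

streak = np.array(Image.open("datasets/clean_water_10m/data/001.tif"))
print("streak image:", streak.shape, streak.dtype)  # expect (2048, 2048) uint16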

Quick Start

Installation
  • Step1. Create a conda environment and activate it.
conda create -n streaknet python=3.10
conda activate streaknet
  • Step2. Install StreakNet from source.
git clone https://github.com/BestAnHongjun/StreakNet.git
cd StreakNet
pip install -e .
Prepare Dataset
  • Step1. Install the StreakNet module by following the 'Installation' section.

  • Step2. Download the StreakNet-Dataset by following the 'Download' section, then you will see the following directory structure.

StreakNet
    |- datasets
    |   |- clean_water_10m
    |   |- clean_water_13m
    |   |- clean_water_15m
    |   |- ...
    |
    |- assets
    |- exps
    |- scripts
    |- streaknet
    |- ...
Train Models
  • Step1. Install the StreakNet module by following the 'Installation' section.

  • Step2. Prepare the StreakNet-Dataset by following the 'Prepare Dataset' section.

  • Step3. Run the following commands in the root directory to train the respective models.

python tools/train_streaknet.py -b 512 -f exps/streaknet/streaknet_s.py --cache
                                                         streaknet_m.py
                                                         streaknet_l.py
                                                         streaknet_x.py
python tools/train_streaknet.py -b 512 -f exps/streaknetv2/streaknetv2_s.py --cache
                                                           streaknetv2_m.py
                                                           streaknetv2_l.py
                                                           streaknetv2_x.py

Arguments:
-b: set the batch-size when training.
-f: specify the experiment profile.
--cache: cache the dataset in RAM during training.

Attention:

(1) When you enable the --cache option, the program will preload the dataset into RAM to accelerate training. Please ensure that your server has at least 25GB of free RAM before using this option. If your RAM is insufficient, disable --cache; the program will then load data directly from disk as needed, which typically results in roughly 10x longer training time.

(2) The program will utilize CUDA to accelerate training. Please ensure that your server is equipped with at least one NVIDIA GPU with more than 2GB of graphics memory.
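
Before launching a long run, a quick pre-flight check like the one below can confirm the GPU requirement; it assumes PyTorch is already installed as a StreakNet dependency.

import torch

print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    prop = torch.cuda.get_device_properties(0)
    # Report the GPU name and total VRAM in GB.
    print(f"{prop.name}: {prop.total_memory / 1024**3:.1f} GB VRAM")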

python tools/train.py -b 512 -f exps/streaknet/streaknet_s.py
                                               streaknet_m.py
                                               streaknet_l.py
                                               streaknet_x.py
python tools/train.py -b 512 -f exps/streaknetv2/streaknetv2_s.py
                                                 streaknetv2_m.py
                                                 streaknetv2_l.py
                                                 streaknetv2_x.py
  • Step4. Real-time training status will be saved to the StreakNet_outputs folder. Run TensorBoard to visualize the training process.
tensorboard --logdir=StreakNet_outputs
Demo
  • Step1. Download our pretrained model weights from HuggingFace or ModelScope.
# From HuggingFace: For Global Users
cd StreakNet
git clone https://huggingface.co/Coder-AN/StreakNet-Models ./checkpoints

# From ModelScope: For Chinese Users
cd StreakNet
git clone https://www.modelscope.cn/CoderAN/StreakNet-Models.git ./checkpoints
  • Step2. Run the following command to launch the StreakNet demo:
python tools/demo_streaknet.py -b 2 \
  --path datasets/clean_water_13m \
  -f exps/streaknet/streaknet_s.py \
  -c checkpoints/streaknet_s_ckpt.pth \
  --device "cuda:0" \
  --cache --real-time

Arguments:
--path: path to the dataset.
-f: specify the experiment profile.
-b: set the batch-size when inferring.
-c: specify the model weights when inferring.
--device: specify the GPU when inferring.
--cache: use the RAM cache when inferring.
--real-time: enable real-time preview.
--save: save imaging results.

Attention: If you omit the -c option, the program will automatically use the 'best_ckpt.pth' file located in the 'StreakNet_outputs' directory, i.e., the model you just trained in the 'Train Models' section.

python tools/demo_streaknet.py -b 2 \
  --path datasets/clean_water_13m \
  -f exps/streaknet/streaknet_s.py \
  --device "cuda:0" \
  --save
  • Step3. Run the following command to launch the traditional bandpass-filter demo:
python tools/demo_bandpass.py -b 2 --path datasets/clean_water_13m --device "cuda:0" --cache

Arguments:
--path: path to the dataset.
-b: set the batch-size when inferring.
--device: specify the GPU when inferring.
--cache: use the RAM cache when inferring.
--save: save imaging results.
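
For intuition about what the traditional pipeline computes, here is a toy frequency-domain correlation of one streak-image column against the template signal. This is an illustrative sketch, not the repo's demo_bandpass.py implementation; the synthetic sizes and the bandpass_response helper are our own.

import numpy as np

def bandpass_response(column, template):
    # Circular cross-correlation via the FFT: multiply the column spectrum
    # by the conjugate of the template spectrum, then transform back.
    n = len(column)
    col_f = np.fft.rfft(column, n)
    tmp_f = np.fft.rfft(template, n)
    return np.fft.irfft(col_f * np.conj(tmp_f), n)

rng = np.random.default_rng(0)
template = rng.standard_normal(64)
column = np.zeros(2048)
column[500:564] = template          # plant an echo at sample 500
resp = bandpass_response(column, template)
print("echo detected at sample:", int(np.argmax(resp)))  # 500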

  • Step4. Use FDEL as an equivalent bandpass filter:
python tools/demo_bandpass.py -b 2 \
  --path datasets/clean_water_13m \
  -f exps/streaknet/streaknet_s.py \
  -c checkpoints/streaknet_s_ckpt.pth \
  --device "cuda:0" --cache

Arguments:
--path: path to the dataset.
-f: specify the experiment profile.
-b: set the batch-size when inferring.
-c: specify the model weights when inferring.
--device: specify the GPU when inferring.
--cache: use the RAM cache when inferring.
--save: save imaging results.

Evaluation
  • Step4. Evaluate StreakNet models:
python tools/valid_streaknet.py -b 2 \
  -f exps/streaknet/streaknet_s.py \
  -c checkpoints/streaknet_s_ckpt.pth \
  -d "cuda:0" --cache

Arguments:
-f: specify the experiment profile.
-b: set the batch-size when inferring.
-c: specify the model weights when inferring.
-d: specify the GPU when inferring.
--cache: use the RAM cache when inferring.
--save: save imaging results.

  • Step5. Evaluate the traditional bandpass-filter algorithm:
python tools/valid_bandpass.py -b 2 -d "cuda:0" --cache

Arguments:
-b: set the batch-size when inferring.
-d: specify the GPU when inferring.
--cache: use the RAM cache when inferring.
--save: save imaging results.

  • Step6. Evaluate the equivalent bandpass filter:
python tools/valid_bandpass.py -b 2 \
  -f exps/streaknet/streaknet_s.py \
  -c checkpoints/streaknet_s_ckpt.pth \
  -d "cuda:0" --cache

Arguments:
-f: specify the experiment profile.
-b: set the batch-size when inferring.
-c: specify the model weights when inferring.
-d: specify the GPU when inferring.
--cache: use the RAM cache when inferring.
--save: save imaging results.

Test speed benchmark
  • Step3. Test the AIT (average imaging time) of StreakNet:
python tools/benchmark_streaknet.py -f exps/streaknet/streaknet_s.py -d "cuda:0" --save
  • Step4. Test the AIT of the traditional bandpass-filter algorithm:
python tools/benchmark_bandpass.py -d "cuda:0" --save

Cite StreakNet

If you use StreakNet in your research, please cite our work by using the following BibTeX entry:

@misc{li2024streaknetarch,
      title={StreakNet-Arch: An Anti-scattering Network-based Architecture for Underwater Carrier LiDAR-Radar Imaging}, 
      author={Xuelong Li and Hongjun An and Guangying Li and Xing Wang and Guanghua Cheng and Zhe Sun},
      year={2024},
      eprint={2404.09158},
      archivePrefix={arXiv},
      primaryClass={cs.CV}
}

Respect to Predecessors

  • During the development of this open-source project, we drew inspiration from the excellent engineering architecture of the YOLOX project by Megvii Technology. The YOLOX project was led by Dr. Jian Sun (1976.10-2022.6.14), a respected scientist, who made significant contributions to the advancement of computer vision.🕯️🕯️🕯️
  • We were deeply saddened to hear the news of the passing of Prof. Xiaoou Tang (1968.1-2023.12.15) on December 16, 2023, shortly after completing all the preliminary experiments for this project. Prof. Tang devoted his entire life to computer science research and made outstanding contributions to the advancement of computer vision and artificial intelligence. We express our utmost respect to Prof. Tang.🕯️🕯️🕯️

Copyright


Copyright © School of Artificial Intelligence, OPtics and ElectroNics (iOPEN), Northwestern Polytechnical University.
All rights reserved.
