This dataset and code is an implementation of "NIRPed: A Novel Benchmark for Nighttime Pedestrian and Its Distance Joint Detection" published in IEEE-TITS.


NIRPed-JointDetection (Implementation based on Tensorflow & Keras)

I. NIRPed dataset

NIRPed is already split into training, validation, and testing subsets, so no further division is needed.
For compatibility with existing frameworks, NIRPed's annotations are provided in MS-COCO format (JSON).
To make the dataset more accessible, we have also created a mini version, miniNIRPed, and uploaded it to this repository.
Because of capacity limits, it is impractical to upload the full 150 GB of data to GitHub.
We therefore provide a link to the complete NIRPed dataset on the Cloud Disk of Central South University (CSU), which is open to the world.
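
Because the annotations are standard MS-COCO JSON, they can be inspected with pycocotools before any training. A minimal sketch (paths follow the folder layout used later in this README; the per-annotation "distance" key is an assumption, so check the JSON for the actual field name):

```python
from pycocotools.coco import COCO

# Load the COCO-format training annotations (path per the layout below).
coco = COCO("./data/NIRPed/labels/train.json")

# Categories and overall counts.
print({c["id"]: c["name"] for c in coco.loadCats(coco.getCatIds())})
img_ids = coco.getImgIds()
print(f"{len(img_ids)} images, {len(coco.getAnnIds())} annotations")

# Boxes (and, presumably, distances) of the first image.
for a in coco.loadAnns(coco.getAnnIds(imgIds=img_ids[:1])):
    print(a["bbox"], a.get("distance"))  # 'distance' key name is an assumption
```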

A. Data of NIRPed

PNG / PICKLE / JSON (Python):

  - Training: images (60 GB), pickle files (1.6 GB), annotations (38 MB)
  - Validation: images (38 GB), pickle files (1 GB), annotations (25 MB)
  - Testing: images (39 GB), pickle files (1 GB), image information only, without annotations (9 MB)
You can project the LiDAR point cloud saved in a PICKLE file onto its matched image by running "LiDAR2Img.py".
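
The script performs the actual projection; the sketch below only illustrates the underlying pinhole projection. The file path, pickle layout, and calibration matrices here are assumptions, not the repository's real interface:

```python
import pickle
import numpy as np

# Load the LiDAR points of one frame. The path and pickle layout are
# assumptions; see "LiDAR2Img.py" for the repository's actual format.
with open("./data/NIRPed/pickles/train/000001.pickle", "rb") as f:
    points = np.asarray(pickle.load(f))        # assumed shape (N, 3), LiDAR frame

# Hypothetical calibration: LiDAR-to-camera extrinsics and camera intrinsics.
T_cam_lidar = np.eye(4)                        # placeholder; use the real extrinsics
K = np.array([[720.0,   0.0, 320.0],
              [  0.0, 720.0, 128.0],
              [  0.0,   0.0,   1.0]])          # placeholder intrinsics

pts_h = np.hstack([points, np.ones((len(points), 1))])   # homogeneous coordinates
cam = (T_cam_lidar @ pts_h.T)[:3]              # points in the camera frame
cam = cam[:, cam[2] > 0]                       # keep points in front of the camera
uv = (K @ cam) / cam[2]                        # pinhole projection, divide by depth
print(uv[:2].T)                                # (u, v) pixel coordinates
```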

Please use Google Chrome or Microsoft Edge to download the NIRPed dataset via: https://pan.csu.edu.cn/#/link/3F35F56A95E21A7D2BDE30B3A431936B?path=NIR_PED%2FNIRPed

B. Data of miniNIRPed

PNG / PICKLE / JSON (Python):

  - Training: images (284 MB), pickle files (8 MB), annotations (290 KB)
  - Validation: images (172 MB), pickle files (5 MB), annotations (183 KB)
  - Testing: images (177 MB), pickle files (5 MB), image information only, without annotations (40 KB)

You can also use Google Chrome or Microsoft Edge to download the miniNIRPed dataset via: https://pan.csu.edu.cn/#/link/3F35F56A95E21A7D2BDE30B3A431936B?path=NIR_PED%2FminiNIRPed

C. License

This dataset is made freely available to academic and non-academic entities for non-commercial purposes such as academic research, teaching, scientific publications, or personal experimentation. Permission is granted to use the data provided that you agree:

  1. The dataset comes "AS IS", without express or implied warranty. Although every effort has been made to ensure accuracy, we do not accept any responsibility for errors or omissions.
  2. You will include a reference to the NIRPed dataset in any work that makes use of it.
  3. You will not distribute this dataset or modified versions of it. It is permissible to distribute derivative works that are abstract representations of this dataset (such as models trained on it, or additional annotations that do not directly include any of our data) and that do not allow the dataset, or anything similar in character, to be recovered.
  4. You may not use the dataset or any derivative work for commercial purposes, such as licensing or selling the data, or using the data with the purpose of procuring commercial gain.
  5. All rights not expressly granted to you are reserved by us.

II. JointDetector

NIRPed-JointDetector is implemented in Python using TensorFlow and Keras.

A. Environment

  1. python == 3.9
  2. tensorflow-gpu == 2.9.0
  3. keras == 2.9.0
  4. Please refer to requirements.txt for the full list of dependencies.
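
A quick sanity check that the environment matches (nothing repository-specific):

```python
import tensorflow as tf
import keras

# Both should report 2.9.0 for the configuration above.
print("tensorflow:", tf.__version__)
print("keras:", keras.__version__)
print("GPUs:", tf.config.list_physical_devices("GPU"))
```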

B. Download

1. Weights

The required network weights (NIRPed_weights_resnet50.h5) can be downloaded from our GitHub repository.

2. Data

NIRPed ships with training, validation, and testing subsets; no further division is needed.

C. How2train

1. Training on NIRPed

Data preparation
Before training, download the NIRPed training subset and unzip the images into the folder "./data/NIRPed/images/train";
download the COCO-format annotation file train.json into the folder "./data/NIRPed/labels".
Configuration
Open "./keras_frcnn/config.py", modify self.train_img_dir to the training image path (./data/NIRPed/images/train);
Open "./keras_frcnn/config.py", modify self.train_anno to the training annotation path (./data/NIRPed/labels/train.json).
Begin training
Run "train_JointDetector.py" to start training.

2. Training on your own dataset

Data preparation
Collect images together with the distance of each target in them, and create a COCO-format annotation file.
Before training, put the PNG image files into the folder "./data/yourDataset/images/train";
put the annotation file train.json into the folder "./data/yourDataset/labels".

Configuration
Open "./keras_frcnn/config.py", modify self.train_img_dir to the training image path (./data/yourDataset/images/train);
modify self.train_anno to the training annotation path (./data/yourDataset/labels/train.json);
modify self.class_mapping according to your tasks;
modify other parameters according to your tasks.
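
A sketch of what these edits might look like for a custom dataset (the class names and the exact form of self.class_mapping are assumptions; check the shipped config.py for the expected format):

```python
# ./keras_frcnn/config.py (excerpt for a custom dataset; illustrative values)
self.train_img_dir = "./data/yourDataset/images/train"
self.train_anno = "./data/yourDataset/labels/train.json"
# Map category names to contiguous indices; a trailing 'bg' (background)
# class is the usual Faster R-CNN convention.
self.class_mapping = {"pedestrian": 0, "cyclist": 1, "bg": 2}
```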

Begin training

Run "train_JointDetector.py" to start training. During the training stage, weights will be saved in the folder of "./model_data".

D. How2predict

1. Using our weights

Data preparation
Before prediction, download the NIRPed validation or testing subset and unzip the images into the folder "./data/NIRPed/images/val" or "./data/NIRPed/images/test";
download the COCO-format annotation file val.json or test.json into the folder "./data/NIRPed/labels";
download the optimized weight file (NIRPed_weights_resnet50.h5) from the CSU cloud disk into "./model_data".
Configuration
Open "./keras_frcnn/config.py", modify self.val_img_dir or self.test_img_dir to the image path ("./data/NIRPed/images/val" or "./data/NIRPed/images/test");
modify self.model_path to the model path (./model_data/NIRPed_weights_resnet50.h5).
Open "Test_JointDetector.py", modify results_dir to the results-saving path ("./results_NIRPed").
Begin prediction
Run "Test_JointDetector.py" to start prediction. During the prediction stage, results will be saved in the folder of "./results_NIRPed".

2. Using your own weights

Data preparation
After optimizing the weights on your own data, put the weight file in the folder "./model_data".
Before prediction, put the PNG image files into the folder "./data/yourDataset/images/val" or "./data/yourDataset/images/test";
put the COCO-format annotation file val.json or test.json into the folder "./data/yourDataset/labels".

Configuration
Open "./keras_frcnn/config.py", modify self.val_img_dir or self.test_img_dir to the image path ("./data/yourDataset/images/val" or "./data/yourDataset/images/test");
modify self.val_anno or self.test_anno to the annotation paths ("./data/yourDataset/labels/val.json" or "./data/yourDataset/labels/test.json");
modify self.class_mapping according to your tasks;
modify other parameters according to your tasks.

Open "Test_JointDetector.py", modify results_dir to the results-saving path ("./results_yourDataset").**
Begin prediction
Run "Test_JointDetector.py" to start prediction. During the prediction stage, results will be saved in the folder of "./results_yourDataset".

E. How2eval

1. Evaluation on NIRPed validation or testing subset

Data preparation
After prediction, put the results in the folder "./results_NIRPed";
before evaluation, put the PNG image files into the folder "./data/NIRPed/images/val" or "./data/NIRPed/images/test";
put the COCO-format annotation file val.json or test.json into the folder "./data/NIRPed/labels".

Configuration
Open "./keras_frcnn/config.py", modify self.val_img_dir or self.test_img_dir to the image path ("./data/NIRPed/images/val" or "./data/NIRPed/images/test").
Open "Evaluate_JointDetector.py", modify Detection_results_dir to the results-saving path ("./results_NIRPed/dt_results_val_B300_001");
modify other parameters in the "Evaluate_JointDetector.py" according to your tasks.

Begin evaluation
Run "Evaluate_JointDetector.py" to start evaluation. During evaluation, results will be saved in the folder "./results_NIRPed/dt_results_val_B300_001".
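
"Evaluate_JointDetector.py" implements the repository's own metrics. If your detections are also exported in the standard COCO results format, AP can be cross-checked with pycocotools; a sketch under that assumption (the detections file name is hypothetical):

```python
from pycocotools.coco import COCO
from pycocotools.cocoeval import COCOeval

gt = COCO("./data/NIRPed/labels/val.json")
# Hypothetical COCO-format detections file; adapt to your actual results.
dt = gt.loadRes("./results_NIRPed/dt_results_val_B300_001/detections.json")

ev = COCOeval(gt, dt, iouType="bbox")
ev.evaluate()
ev.accumulate()
ev.summarize()  # prints AP metrics, including AP at IoU=0.5
```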

2. Evaluation on your own dataset (yourDataset)

Data preparation
After prediction, put the results in the folder "./results_yourDataset";
before evaluation, put the PNG image files into the folder "./data/yourDataset/images/val" or "./data/yourDataset/images/test";
put the COCO-format annotation file val.json or test.json into the folder "./data/yourDataset/labels".

Configuration
Open "./keras_frcnn/config.py", modify self.val_img_dir or self.test_img_dir to the image path ("./data/yourDataset/images/val" or "./data/yourDataset/images/test").
Open "Evaluate_JointDetector.py", modify Detection_results_dir to the results-saving path ("./results_yourDataset/dt_results_val_B300_001");
modify other parameters in the "Evaluate_JointDetector.py" according to your tasks.

Begin evaluation
Run "Evaluate_JointDetector.py" to start evaluation. During evaluation, results will be saved in the folder "./results_yourDataset/dt_results_val_B300_001".

III. Performance

| Train dataset | Weights | Test dataset | Input image size | MR-2 (%) | AP@0.5 (%) | MAER (%) |
|---------------|---------|--------------|------------------|----------|------------|----------|
| NIRPed | NIRPed_weights_resnet50.h5 | NIRPed-val | 640×256 | 6.5 | 92.4 | 5.46 |
| NightOwls | NightOwls_weights_resnet50.h5 | NightOwls-val | 640×256 | 17.2 | 77.7 | - |
| ECP | ECP_weights_resnet50.h5 | ECP-val | 960×256 | 21.1 | 81.9 | - |
| KAIST | KAIST_weights_resnet50.h5 | KAIST-test | 640×256 | 37.3 | 69.8 | - |
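
For reference: MR-2 is the log-average miss rate over nine FPPI points log-spaced in [10^-2, 10^0] (the Caltech evaluation protocol), AP@0.5 is average precision at IoU 0.5, and MAER is the distance-estimation error metric defined in the paper. A minimal sketch of the MR-2 computation from a miss-rate/FPPI curve (assumes fppi is sorted ascending and the two arrays are aligned):

```python
import numpy as np

def log_average_miss_rate(fppi, miss_rate):
    """MR-2: geometric mean of the miss rate sampled at nine FPPI values
    log-spaced in [1e-2, 1e0] (Caltech protocol)."""
    refs = np.logspace(-2.0, 0.0, num=9)
    samples = []
    for r in refs:
        # Miss rate at the largest FPPI not exceeding the reference point.
        idx = np.searchsorted(fppi, r, side="right") - 1
        samples.append(miss_rate[max(idx, 0)])
    return np.exp(np.log(np.maximum(samples, 1e-10)).mean())
```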

IV. Acknowledgement

This work builds on many excellent prior works, including:

  1. https://github.com/jinfagang/keras_frcnn
  2. https://github.com/chenyuntc/simple-faster-rcnn-pytorch

V. Cite our Dataset

If you find NIRPed Dataset useful in your research, please consider citing:
@ARTICLE{10077447,
  author={Dai, Xiaobiao and Hu, Junping and Luo, Chunlei and Zerfa, Houcine and Zhang, Hai and Duan, Yuxia},
  journal={IEEE Transactions on Intelligent Transportation Systems},
  title={NIRPed: A Novel Benchmark for Nighttime Pedestrian and Its Distance Joint Detection},
  year={2023},
  volume={},
  number={},
  pages={1-11},
  doi={10.1109/TITS.2023.3257079}}

^_^

Contributions welcome.

If you encounter any problem, feel free to open an issue, or contact me directly via email: 3622@hnsyu.edu.cn or 289234100@qq.com.

Correct me if anything is wrong or unclear.
