OpenPCDet is a clear, simple, self-contained open-source project for LiDAR-based 3D object detection. This repository is dedicated solely to inference with the CenterPoint-PointPillar model.
- Base image: nvcr.io/nvidia/tensorrt:23.04-py3
- OS: Ubuntu 20.04
- CUDA: 12.1.0
- cuDNN: 8.9.0
- TensorRT: 8.6.1
- Python: 3.8
- PyTorch: 2.1.1
- Please refer to the Docker docs for more details.
- If you get a `permission denied` error on `/var/run/docker.sock` after installing Docker, run:
  ```bash
  sudo chmod 666 /var/run/docker.sock
  ```
- Build the Docker base image:
  ```bash
  docker build -f docker/env.Dockerfile -t openpcdet-env docker/
  ```
- Start the container:
  ```bash
  docker compose up --build -d
  ```
- Please refer to docker/README.md for more details.
- For Waymo datasets, install the official waymo-open-dataset package by running the following commands:
  ```bash
  docker exec -it centerpoint bash
  pip install --upgrade pip
  sudo apt install python3-testresources
  pip install waymo-open-dataset-tf-2-12-0==1.6.4
  ```
- Extract the point cloud data from the tfrecord files and generate the data infos by running the following commands (this takes several hours; you can check data/waymo/waymo_processed_data_v0_5_0 to see how many records have been processed):
  ```bash
  # only for single-frame setting: without 'elongation' in the 'used_feature_list'
  python -m pcdet.datasets.waymo.waymo_dataset --func create_waymo_infos \
      --cfg_file tools/cfgs/dataset_configs/waymo_dataset_use_feature_no_elongation.yaml

  # only for single-frame setting
  python -m pcdet.datasets.waymo.waymo_dataset --func create_waymo_infos \
      --cfg_file tools/cfgs/dataset_configs/waymo_dataset.yaml

  # for single-frame or multi-frame setting
  python -m pcdet.datasets.waymo.waymo_dataset --func create_waymo_infos \
      --cfg_file tools/cfgs/dataset_configs/waymo_dataset_multiframe.yaml

  # Ignore the 'CUDA_ERROR_NO_DEVICE' error, as this process does not require a GPU.
  ```
- Please refer to docs/GETTING_STARTED.md for more details.
- Execute the container:
  ```bash
  docker exec -it centerpoint bash
  ```
- Install OpenPCDet:
  ```bash
  cd ~/OpenPCDet
  sudo python setup.py develop
  ```
- To build the Python module, you have to install pybind11 and wrap the C++ code into a Python API:
  ```bash
  cd ~/
  git clone https://github.com/pybind/pybind11.git
  cd pybind11
  cmake .
  sudo make install
  cd ~/OpenPCDet/centerpoint/pybind
  cmake -BRelease
  cmake --build Release
  ```
- If you use PyTorch 1.x, you have to use `python -m torch.distributed.launch`, i.e., tools/scripts/dist_X.sh.
- If you use PyTorch 2.x, you have to use `torchrun`, i.e., tools/scripts/torch_train_X.sh.
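The launcher choice above can be sketched as a small shell helper. This is only an illustration: `TORCH_MAJOR` is hardcoded here to match the PyTorch 2.1.1 environment listed earlier, but in practice you would query it from the installed package.

```shell
# Pick the distributed launcher based on the PyTorch major version.
# In practice you could obtain it with:
#   python -c "import torch; print(torch.__version__.split('.')[0])"
TORCH_MAJOR=2   # the environment above ships PyTorch 2.1.1

if [ "$TORCH_MAJOR" -ge 2 ]; then
  LAUNCHER="torchrun"                             # tools/scripts/torch_train_X.sh
else
  LAUNCHER="python -m torch.distributed.launch"   # tools/scripts/dist_X.sh
fi

echo "Using launcher: $LAUNCHER"
```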
- Multi-GPU training:
  ```bash
  cd ~/OpenPCDet
  ln -s /Dataset/Train_Results/CenterPoint/ output  # you can replace `/Dataset/Train_Results/CenterPoint/` with any directory you want
  cd tools/
  sh scripts/torch_train.sh 2 --cfg_file ./cfgs/waymo_models/centerpoint_pillar_train.yaml --batch_size 24
  ```
- Single-GPU training:
  ```bash
  cd ~/OpenPCDet
  ln -s /Dataset/Train_Results/CenterPoint/ output
  cd tools/
  CUDA_VISIBLE_DEVICES=1 python train.py --cfg_file ./cfgs/waymo_models/centerpoint_pillar_train.yaml --batch_size 16  # replace `CUDA_VISIBLE_DEVICES=1` with the GPU number you want
  ```
```bash
docker exec -it centerpoint bash
cd /Dataset
ros2 bag play segment-10359308928573410754_720_000_740_000_with_camera_labels/  # ros2 bag play <folder_with_ros2_bag>
```
```bash
docker exec -it centerpoint bash
cd ~/OpenPCDet/tools/
python ros2_demo.py --cfg_file cfgs/waymo_models/centerpoint_pillar_inference.yaml --ckpt ../ckpt/checkpoint_epoch_24.pth
```
```bash
docker exec -it centerpoint bash
rviz2
```
- Fixed Frame: base_link
- Add -> By display type -> PointCloud2 -> Topic: /lidar/top/pointcloud, Size (m): 0.03
- Add -> By topic -> /boxes/MarkerArray
- If you already installed pybind11 in 1.5 PCDET Installation, please skip this step.
- Install pybind11:
  ```bash
  cd ~/
  git clone git@github.com:pybind/pybind11.git
  cd pybind11
  cmake .
  make install
  ```
- To evaluate the TensorRT results, you have to wrap the C++ code into a Python API.
- Build the Python module:
  ```bash
  cd centerpoint/pybind
  cmake -BRelease
  cmake --build Release
  ```
```bash
docker exec -it centerpoint bash
cd ~/OpenPCDet/tools
python export_onnx.py --cfg_file cfgs/waymo_models/centerpoint_pillar_inference.yaml --ckpt ../ckpt/checkpoint_epoch_24.pth
```
As a result, three ONNX files are created in CenterPoint/onnx:
- model_raw.onnx: the raw version, converted directly from the .pth checkpoint to ONNX
- model_sim.onnx: the version simplified with the ONNX graph-simplification library
- model.onnx: the final version, produced by modifying the simplified model with graph surgeon; graph surgeon is required to use the TensorRT plugin
```bash
cd ~/OpenPCDet/
cp onnx/model.onnx centerpoint/model/
```
- Build the ROS2 package in your ROS2 workspace:
  ```bash
  cd ~/ && mkdir -p ros2_ws/src && cd ros2_ws/ && colcon build --symlink-install
  cd src && ln -s OPENPCDET_PATH/centerpoint .
  cd src/centerpoint && mkdir model
  cd ~/ros2_ws && colcon build --symlink-install
  source ~/ros2_ws/install/setup.bash
  ros2 launch centerpoint centerpoint.launch.py
  ```
- Once the ROS2 centerpoint node is running, it automatically creates the TensorRT engine file in the same folder as the ONNX file.
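That build-once behavior is a standard engine-caching pattern: look for an engine file next to the ONNX file, and build it only on the first run. A minimal Python sketch of the idea follows; the `.trt` suffix and the `build_engine`/`stub_build` helpers are hypothetical, standing in for the node's actual TensorRT builder.

```python
import os
import tempfile

def get_engine_path(onnx_path, build_engine):
    """Return the engine path next to the ONNX file,
    building the engine only if it does not already exist."""
    engine_path = os.path.splitext(onnx_path)[0] + ".trt"
    if not os.path.exists(engine_path):
        build_engine(onnx_path, engine_path)  # expensive step, runs only once
    return engine_path

# Hypothetical usage with a stub builder standing in for TensorRT:
def stub_build(src, dst):
    with open(dst, "w") as f:
        f.write("engine")

workdir = tempfile.mkdtemp()
onnx_file = os.path.join(workdir, "model.onnx")
open(onnx_file, "w").close()
engine_file = get_engine_path(onnx_file, stub_build)
print(engine_file)  # ends with model.trt, in the same folder as model.onnx
```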
```bash
docker exec -it centerpoint bash
cd /Dataset
ros2 bag play segment-10359308928573410754_720_000_740_000_with_camera_labels/  # ros2 bag play <folder_with_ros2_bag>
```
```bash
docker exec -it centerpoint bash
rviz2
```
- Fixed Frame: base_link
- Add -> By display type -> PointCloud2 -> Topic: /lidar/top/pointcloud, Size (m): 0.03
- Add -> By topic -> /boxes/MarkerArray
- If you already set this up in Section 4.0, please jump to Section 5.1.
- If you already installed pybind11 in 1.5 PCDET Installation, please skip this step.
- Install pybind11:
  ```bash
  cd ~/
  git clone git@github.com:pybind/pybind11.git
  cd pybind11
  cmake .
  make install
  ```
- To evaluate the TensorRT results, you have to wrap the C++ code into a Python API.
- Build the Python module:
  ```bash
  cd centerpoint/pybind
  cmake -BRelease
  cmake --build Release
  ```
```bash
docker exec -it centerpoint bash
cd ~/OpenPCDet/tools/
python test.py --cfg_file cfgs/waymo_models/centerpoint_pillar_inference.yaml --ckpt ../ckpt/checkpoint_epoch_24.pth
```
- Results as shown:
```
2024-07-08 07:59:21,802 INFO
OBJECT_TYPE_TYPE_VEHICLE_LEVEL_1/AP: 0.6204
OBJECT_TYPE_TYPE_VEHICLE_LEVEL_1/APH: 0.6137
OBJECT_TYPE_TYPE_VEHICLE_LEVEL_1/APL: 0.6204
OBJECT_TYPE_TYPE_VEHICLE_LEVEL_2/AP: 0.5417
OBJECT_TYPE_TYPE_VEHICLE_LEVEL_2/APH: 0.5358
OBJECT_TYPE_TYPE_VEHICLE_LEVEL_2/APL: 0.5417
OBJECT_TYPE_TYPE_PEDESTRIAN_LEVEL_1/AP: 0.5329
OBJECT_TYPE_TYPE_PEDESTRIAN_LEVEL_1/APH: 0.2887
OBJECT_TYPE_TYPE_PEDESTRIAN_LEVEL_1/APL: 0.5329
OBJECT_TYPE_TYPE_PEDESTRIAN_LEVEL_2/AP: 0.4553
OBJECT_TYPE_TYPE_PEDESTRIAN_LEVEL_2/APH: 0.2468
OBJECT_TYPE_TYPE_PEDESTRIAN_LEVEL_2/APL: 0.4553
OBJECT_TYPE_TYPE_SIGN_LEVEL_1/AP: 0.0000
OBJECT_TYPE_TYPE_SIGN_LEVEL_1/APH: 0.0000
OBJECT_TYPE_TYPE_SIGN_LEVEL_1/APL: 0.0000
OBJECT_TYPE_TYPE_SIGN_LEVEL_2/AP: 0.0000
OBJECT_TYPE_TYPE_SIGN_LEVEL_2/APH: 0.0000
OBJECT_TYPE_TYPE_SIGN_LEVEL_2/APL: 0.0000
OBJECT_TYPE_TYPE_CYCLIST_LEVEL_1/AP: 0.3267
OBJECT_TYPE_TYPE_CYCLIST_LEVEL_1/APH: 0.2730
OBJECT_TYPE_TYPE_CYCLIST_LEVEL_1/APL: 0.3267
OBJECT_TYPE_TYPE_CYCLIST_LEVEL_2/AP: 0.3141
OBJECT_TYPE_TYPE_CYCLIST_LEVEL_2/APH: 0.2625
OBJECT_TYPE_TYPE_CYCLIST_LEVEL_2/APL: 0.3141
```
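If you want to post-process such logs (e.g., to diff two runs), the `NAME: value` metric lines can be parsed into a dict with a few lines of standard-library Python. A sketch, using the log format shown above:

```python
def parse_waymo_metrics(log_text):
    """Parse 'OBJECT_TYPE_.../METRIC: value' lines from an eval log into a dict."""
    metrics = {}
    for line in log_text.splitlines():
        line = line.strip()
        if line.startswith("OBJECT_TYPE_") and ": " in line:
            name, value = line.rsplit(": ", 1)
            metrics[name] = float(value)
    return metrics

sample = """
2024-07-08 07:59:21,802 INFO
OBJECT_TYPE_TYPE_VEHICLE_LEVEL_1/AP: 0.6204
OBJECT_TYPE_TYPE_VEHICLE_LEVEL_1/APH: 0.6137
"""
print(parse_waymo_metrics(sample))
# {'OBJECT_TYPE_TYPE_VEHICLE_LEVEL_1/AP': 0.6204, 'OBJECT_TYPE_TYPE_VEHICLE_LEVEL_1/APH': 0.6137}
```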
- If you set `test: 25000` for `MAX_NUMBER_OF_VOXELS` in cfgs/waymo_models/centerpoint_pillar_inference.yaml, matching the TensorRT setting (centerpoint/config.yaml),
- you can get results closer to the TensorRT ones, as shown:
```
2024-07-08 09:57:04,120 INFO
OBJECT_TYPE_TYPE_VEHICLE_LEVEL_1/AP: 0.6199
OBJECT_TYPE_TYPE_VEHICLE_LEVEL_1/APH: 0.6132
OBJECT_TYPE_TYPE_VEHICLE_LEVEL_1/APL: 0.6199
OBJECT_TYPE_TYPE_VEHICLE_LEVEL_2/AP: 0.5413
OBJECT_TYPE_TYPE_VEHICLE_LEVEL_2/APH: 0.5353
OBJECT_TYPE_TYPE_VEHICLE_LEVEL_2/APL: 0.5413
OBJECT_TYPE_TYPE_PEDESTRIAN_LEVEL_1/AP: 0.5327
OBJECT_TYPE_TYPE_PEDESTRIAN_LEVEL_1/APH: 0.2885
OBJECT_TYPE_TYPE_PEDESTRIAN_LEVEL_1/APL: 0.5327
OBJECT_TYPE_TYPE_PEDESTRIAN_LEVEL_2/AP: 0.4552
OBJECT_TYPE_TYPE_PEDESTRIAN_LEVEL_2/APH: 0.2466
OBJECT_TYPE_TYPE_PEDESTRIAN_LEVEL_2/APL: 0.4552
OBJECT_TYPE_TYPE_SIGN_LEVEL_1/AP: 0.0000
OBJECT_TYPE_TYPE_SIGN_LEVEL_1/APH: 0.0000
OBJECT_TYPE_TYPE_SIGN_LEVEL_1/APL: 0.0000
OBJECT_TYPE_TYPE_SIGN_LEVEL_2/AP: 0.0000
OBJECT_TYPE_TYPE_SIGN_LEVEL_2/APH: 0.0000
OBJECT_TYPE_TYPE_SIGN_LEVEL_2/APL: 0.0000
OBJECT_TYPE_TYPE_CYCLIST_LEVEL_1/AP: 0.3262
OBJECT_TYPE_TYPE_CYCLIST_LEVEL_1/APH: 0.2729
OBJECT_TYPE_TYPE_CYCLIST_LEVEL_1/APL: 0.3262
OBJECT_TYPE_TYPE_CYCLIST_LEVEL_2/AP: 0.3137
OBJECT_TYPE_TYPE_CYCLIST_LEVEL_2/APH: 0.2625
OBJECT_TYPE_TYPE_CYCLIST_LEVEL_2/APL: 0.3137
```
```bash
docker exec -it centerpoint bash
cd ~/OpenPCDet/tools/
python test.py --cfg_file cfgs/waymo_models/centerpoint_pillar_inference.yaml --TensorRT
```
- Results as shown:
```
2024-07-08 09:08:53,073 INFO
OBJECT_TYPE_TYPE_VEHICLE_LEVEL_1/AP: 0.5755
OBJECT_TYPE_TYPE_VEHICLE_LEVEL_1/APH: 0.5697
OBJECT_TYPE_TYPE_VEHICLE_LEVEL_1/APL: 0.5755
OBJECT_TYPE_TYPE_VEHICLE_LEVEL_2/AP: 0.4995
OBJECT_TYPE_TYPE_VEHICLE_LEVEL_2/APH: 0.4944
OBJECT_TYPE_TYPE_VEHICLE_LEVEL_2/APL: 0.4995
OBJECT_TYPE_TYPE_PEDESTRIAN_LEVEL_1/AP: 0.5283
OBJECT_TYPE_TYPE_PEDESTRIAN_LEVEL_1/APH: 0.2872
OBJECT_TYPE_TYPE_PEDESTRIAN_LEVEL_1/APL: 0.5283
OBJECT_TYPE_TYPE_PEDESTRIAN_LEVEL_2/AP: 0.4513
OBJECT_TYPE_TYPE_PEDESTRIAN_LEVEL_2/APH: 0.2454
OBJECT_TYPE_TYPE_PEDESTRIAN_LEVEL_2/APL: 0.4513
OBJECT_TYPE_TYPE_SIGN_LEVEL_1/AP: 0.0000
OBJECT_TYPE_TYPE_SIGN_LEVEL_1/APH: 0.0000
OBJECT_TYPE_TYPE_SIGN_LEVEL_1/APL: 0.0000
OBJECT_TYPE_TYPE_SIGN_LEVEL_2/AP: 0.0000
OBJECT_TYPE_TYPE_SIGN_LEVEL_2/APH: 0.0000
OBJECT_TYPE_TYPE_SIGN_LEVEL_2/APL: 0.0000
OBJECT_TYPE_TYPE_CYCLIST_LEVEL_1/AP: 0.2991
OBJECT_TYPE_TYPE_CYCLIST_LEVEL_1/APH: 0.2508
OBJECT_TYPE_TYPE_CYCLIST_LEVEL_1/APL: 0.2991
OBJECT_TYPE_TYPE_CYCLIST_LEVEL_2/AP: 0.2876
OBJECT_TYPE_TYPE_CYCLIST_LEVEL_2/APH: 0.2412
OBJECT_TYPE_TYPE_CYCLIST_LEVEL_2/APL: 0.2876
```
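To quantify the PyTorch-vs-TensorRT gap, the LEVEL_2 AP values can be compared directly. A pure-Python sketch, with the numbers copied from the two logs above (PyTorch checkpoint run vs. TensorRT run):

```python
# LEVEL_2 AP from the PyTorch (ckpt) evaluation vs. the TensorRT evaluation above.
pytorch_ap = {"VEHICLE": 0.5417, "PEDESTRIAN": 0.4553, "CYCLIST": 0.3141}
tensorrt_ap = {"VEHICLE": 0.4995, "PEDESTRIAN": 0.4513, "CYCLIST": 0.2876}

for cls in pytorch_ap:
    delta = pytorch_ap[cls] - tensorrt_ap[cls]
    print(f"{cls:10s} LEVEL_2 AP drop: {delta:.4f}")
```

The pedestrian class loses the least accuracy from the TensorRT conversion; the vehicle class loses the most.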