A ROS2 package for configuring, testing, and operating sensors:
- 📸 ZED2 Camera (Monocular Mode)
- 🔄 Ouster OS-1 LiDAR
- Operating System: Ubuntu 22.04 LTS (Jammy Jellyfish)
- ROS2 Distribution: Humble Hawksbill
- Python: 3.10 or higher (tested with 3.10.12)
- CUDA: 12.0 or higher (required for ZED SDK)
- Network: Ethernet port for LiDAR connection
- Download the ZED SDK for Ubuntu 22.04:

```bash
wget https://download.stereolabs.com/zedsdk/4.2/cu12/ubuntu22 -O zed_sdk.run
```

- Make the installer executable and run it:

```bash
chmod +x zed_sdk.run
./zed_sdk.run
```

- Install the Ouster SDK using pip:
```bash
pip3 install ouster-sdk
```

- 🎥 Initial Setup: Follow the Ouster Connection Tutorial to properly connect your LiDAR via Ethernet.
- 🎥 Visualization: For testing visualization with ouster-cli, watch the Ouster Visualization Guide.
- Create and enter a ROS2 workspace:

```bash
mkdir -p ~/ros2_sensor_ws/src
cd ~/ros2_sensor_ws/src
```

- Clone the repository:

```bash
git clone https://github.com/AV-Lab/Sensor_Setup .
```

- Build and source:

```bash
cd ~/ros2_sensor_ws
colcon build && source install/setup.bash
```

Start all sensors (Ouster and ZED) with a single command:

```bash
ros2 launch sensors launch_all_sensors.py
```

Run the LiDAR or a camera independently:

```bash
# Start Ouster LiDAR
ros2 run sensors ouster_node --ros-args -p use_sim_time:=false

# Start ZED Camera
ros2 run sensors zed_node --ros-args -p use_sim_time:=false

# Start ELP Camera
ros2 run sensors elp_node --ros-args -p use_sim_time:=false
```

## 📊 Synchronized Data Collection
- The first step in calibration is collecting synchronized data from all sensors
- The save_node enables synchronized capture of image frames and pointcloud data
- Data format: `.png` for images and `.pcd` for pointclouds
- 📁 Update topic names in `sensors/config/save_sample.yaml`
- Configurable parameters include:
  - Synchronization threshold
  - Sample folder paths
  - Delay between frames
  - Number of samples
  - Other sampling parameters
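One common way a synchronization threshold is applied is to pair each image with the nearest pointcloud in time and discard pairs whose gap exceeds the threshold. A minimal sketch of that idea (the function name and timestamp values are illustrative, not the actual `save_node` code):

```python
def match_pairs(image_stamps, cloud_stamps, threshold):
    """Pair each image timestamp with the closest pointcloud timestamp,
    keeping only pairs whose time gap is within `threshold` seconds."""
    pairs = []
    for img_t in image_stamps:
        closest = min(cloud_stamps, key=lambda c: abs(c - img_t))
        if abs(closest - img_t) <= threshold:
            pairs.append((img_t, closest))
    return pairs

# Illustrative timestamps in seconds, with a 0.05 s threshold
images = [0.00, 0.10, 0.20]
clouds = [0.02, 0.13, 0.40]
print(match_pairs(images, clouds, 0.05))  # the 0.20 s image finds no match
```

Tightening the threshold drops more pairs but reduces motion-induced misalignment between the image and the pointcloud.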
```bash
# From rosbag playback
ros2 run sensors save_node --ros-args -p use_sim_time:=true

# Real-time sampling
ros2 run sensors save_node --ros-args -p use_sim_time:=false
```
LiDAR frame (ROS convention):

- X ➡️ Forward (depth)
- Y ⬅️ Left
- Z ⬆️ Up

Camera optical frame:

- X ➡️ Right
- Y ⬇️ Down
- Z ➡️ Forward (depth)
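Converting between these two conventions is a pure axis permutation; any actual offset and rotation between the sensors is handled separately by the extrinsic calibration. A minimal sketch (the function name is illustrative):

```python
def lidar_to_camera_axes(x, y, z):
    """Re-map a point from the LiDAR frame (X forward, Y left, Z up)
    to the camera optical frame (X right, Y down, Z forward).
    Axis permutation only -- no extrinsic rotation/translation applied."""
    return (-y, -z, x)

# A point 5 m ahead, 1 m left, and 0.2 m above the LiDAR:
print(lidar_to_camera_axes(5.0, 1.0, 0.2))  # -> (-1.0, -0.2, 5.0)
```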
The ZED camera operates in monocular mode using the left lens and publishes:
- Camera Image: Raw image feed
- Camera Info: Camera intrinsic parameters (`CameraInfo` type)
- 📁 Config file: `sensors/config/zed_config.yaml`
- Uses default settings if distortion parameters aren't specified
- Identity rectification matrix (monocular mode)
- Compatible with OS-1 Ouster
- 📁 Config file: `sensors/config/ouster_config.yaml`
- Publishes `PointCloud2` messages (x, y, z, intensity)
- Update LiDAR IP/hostname in config file
- FPS is tied to LiDAR mode (e.g., 512x20 = 20 fps)
- Check sensor status in ouster-cli before launching ROS2 nodes
This implementation focuses on two key calibration procedures:
- 📸 Camera intrinsic calibration
- 🔄 Camera-to-LiDAR extrinsic calibration
- Use a large checkerboard (at least 30x30cm) for reliable detection
- 🔗 Generate pattern: calib.io Pattern Generator
- 📏 Mount on rigid, flat surface (foam board works well)
- ⚠️ Verify printout dimensions are exact
- Black squares should be truly black (matte finish preferred)
- ☀️ Good lighting but avoid direct sunlight (causes glare)
- 🧹 Clear area of objects with checkerboard-like patterns
- Ensure even lighting but avoid reflective surfaces
- 📸 Capture checkerboard at various:
- Distances: 1-5 meters (start close, then move back)
- Angles: 15-45 degrees from sensor axis
- Positions: cover entire sensor field of view
- 🎯 Aim for 15-50 diverse, high-quality frame pairs
- 🖐️ Hold pattern still during capture (motion blur affects accuracy)
- Check image exposure - avoid over/underexposed areas
- Mark floor positions for repeatable captures
- Test detection in MATLAB with a few samples before full collection
- Back up raw data before processing
Camera intrinsic calibration determines the internal parameters affecting 3D-to-2D projection.
- 📏 Focal length (fx, fy)
- 🎯 Principal point (cx, cy)
- 🔧 Distortion coefficients
- Radial (k1, k2)
- Tangential (p1, p2)
- Capture checkerboard images
- Detect corners
- Apply Zhang's method
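These parameters enter the standard pinhole-plus-distortion (plumb-bob) projection model. A minimal sketch of that projection, with illustrative parameter values:

```python
def project_point(X, Y, Z, fx, fy, cx, cy, k1=0.0, k2=0.0, p1=0.0, p2=0.0):
    """Project a 3D camera-frame point to pixel coordinates using the
    pinhole model with radial (k1, k2) and tangential (p1, p2) distortion."""
    x, y = X / Z, Y / Z                      # normalized image coordinates
    r2 = x * x + y * y
    radial = 1 + k1 * r2 + k2 * r2 * r2      # radial distortion factor
    x_d = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x * x)
    y_d = y * radial + p1 * (r2 + 2 * y * y) + 2 * p2 * x * y
    return fx * x_d + cx, fy * y_d + cy      # apply focal length + principal point

# With zero distortion, a point on the optical axis lands at the principal point
print(project_point(0.0, 0.0, 2.0, fx=700.0, fy=700.0, cx=640.0, cy=360.0))
```

Calibration estimates (fx, fy, cx, cy) and the distortion coefficients so that this model best explains the detected checkerboard corners.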
After calibration, update the results in the config files (Zed_camera, Elp_camera):

```yaml
# Maps 3D camera coordinates → 2D image points
distortion: [k1, k2, p1, p2, k3]
camera_matrix_K: [fx, 0, cx, 0, fy, cy, 0, 0, 1]
rectification: [1.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 1.0]
```

```bash
# Install ROS calibration package
sudo apt-get install ros-<ros2-distro>-camera-calibration

# Run calibration node for monocular camera
ros2 run camera_calibration cameracalibrator --size 8x6 --square 0.108 image:=/camera/image_raw camera:=/camera/camera_info

# Parameters:
# --size: Number of inner corners (width x height)
# --square: Size of each square in meters
# /camera/image_raw: Raw image topic
# /camera/camera_info: Camera info topic
```

Follow the calibration steps:
- Move checkerboard to fill calibration bars
- Click CALIBRATE when ready
- Click SAVE after successful calibration
- Find results in ~/.ros/camera_info/
```matlab
% Launch Camera Calibrator App
cameraCalibrator
% Or use Apps tab -> Camera Calibrator
```

For a detailed MATLAB calibration workflow (covering both intrinsic and extrinsic calibration), see MATLAB Based Calibration in the Camera-to-LiDAR section.
Determines geometric transformation between sensors for point cloud projection.
- 🔄 Rotation matrix (R)
- 📏 Translation vector (t)
```bash
# Configure in save_sample.yaml first!
ros2 run sensors save_node --ros-args -p use_sim_time:=false
```

MATLAB can perform both intrinsic and extrinsic calibration together:
- Launch Calibrator:

  ```matlab
  % Option 1: Command line
  lidarCameraCalibrator   % For both calibrations

  % Option 2: Apps tab in MATLAB
  % Click 'Lidar Camera Calibrator'
  ```

- Load and Process Data:
  - Select synchronized image-pointcloud pairs
  - Specify checkerboard dimensions
  - Enter square size in meters

- Troubleshooting Detection:
  - If plane detection fails, select the plane manually:
    - Click the 'Select Region' button
    - Draw a polygon around the checkerboard in the pointcloud view
    - Adjust the selection until the plane is well-defined
  - To adjust detection parameters:
    - Open 'Settings'
    - Modify 'Plane Detection Threshold' (try the range 0.01-0.1)
    - Adjust 'Refinement Parameters' if needed

- Export Results:
  - Click 'Export Parameters'
  - Choose YAML format
  - Save for ROS2 usage
```yaml
projection: [P11, P12, P13, P14, P21, P22, P23, P24, P31, P32, P33, P34]
```

Before using `interactive_node`, update the initial parameters in `calibrate.yaml`:
```yaml
# sensors/config/calibrate.yaml
transform:
  # Update with initial intrinsic parameters (from ROS2 or MATLAB calibration)
  intrinsic_k: [fx, 0, cx, 0, fy, cy, 0, 0, 1]
  # Update with initial extrinsic parameters (from MATLAB calibration)
  lidar_camera: [R11, R12, R13, t1,
                 R21, R22, R23, t2,
                 R31, R32, R33, t3,
                 0,   0,   0,   1]
```

Then run:

```bash
ros2 run sensors interactive_node
```

Translation Controls:
r - Move right l - Move left
u - Move up d - Move down
Rotation Controls:
rl - Rotate left (Z-axis) rr - Rotate right (Z-axis)
ru - Rotate up (Y-axis) rd - Rotate down (Y-axis)
Commands:
s - Save current transformation
n - Next image pair
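Under the hood, each adjustment amounts to re-projecting LiDAR points through the current 4x4 `lidar_camera` transform and the intrinsic matrix. A minimal sketch of that projection chain, with illustrative values (not the actual `interactive_node` implementation; distortion is omitted for brevity):

```python
def project_lidar_point(pt, T, K):
    """Apply a 4x4 extrinsic transform T (row-major, 16 values) to a LiDAR
    point, then project with the 3x3 intrinsic matrix K (row-major, 9 values)."""
    x, y, z = pt
    # Homogeneous transform into the camera frame
    Xc = T[0]*x + T[1]*y + T[2]*z + T[3]
    Yc = T[4]*x + T[5]*y + T[6]*z + T[7]
    Zc = T[8]*x + T[9]*y + T[10]*z + T[11]
    # Pinhole projection to pixel coordinates
    u = (K[0]*Xc + K[1]*Yc + K[2]*Zc) / Zc
    v = (K[4]*Yc + K[5]*Zc) / Zc
    return u, v

# Identity rotation with a 0.5 m offset along camera Z (illustrative values)
T = [1, 0, 0, 0,   0, 1, 0, 0,   0, 0, 1, 0.5,   0, 0, 0, 1]
K = [700.0, 0, 640.0,   0, 700.0, 360.0,   0, 0, 1]
print(project_lidar_point((0.0, 0.0, 2.0), T, K))  # -> (640.0, 360.0)
```

Each key press nudges one entry of the rotation or translation in T; the overlay of projected points on the image shows whether the nudge improved the alignment.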
- Start with small adjustment values:
- Translation: try 0.01
- Rotation: try 0.001
- Check alignment at different depths
- Verify with multiple image pairs
- Save progress frequently
- 🎯 Check alignment at corners and edges
- 📏 Verify at multiple distances
- 🔄 Test with dynamic scenes
- ⚡ Watch for temporal synchronization issues
- 🔍 Look for consistent offsets
- Configure time source:

```bash
# Set system time for recording
ros2 param set /your_node use_sim_time false

# Or when starting the node:
ros2 run sensors node_to_run --ros-args -p use_sim_time:=false
```

- Start recording:

```bash
ros2 bag record -a -o my_rosbag
```

- Set simulation time:

```bash
# Enable simulation time for playback
ros2 param set /your_node use_sim_time true

# Or when starting the node:
ros2 run sensors save_node --ros-args -p use_sim_time:=true
```

- Play recorded data:

```bash
ros2 bag play my_rosbag --clock 100
```

- ✅ Maintain consistent time sources across nodes
- ✅ Set `use_sim_time` in the node constructor for new nodes
- ✅ Configure timing in launch files when possible
- ✅ Verify settings: `ros2 param get /your_node use_sim_time`
- ✅ Review all config files before starting sensors
For issues or feature requests, please open an issue on our GitHub repository.