Automatic Spatial Calibration of Near-Field MIMO Radar With Respect to Optical Depth Sensors

License: CC BY-NC 4.0

This is the official code repository accompanying the paper Automatic Spatial Calibration of Near-Field MIMO Radar With Respect to Optical Depth Sensors.

This repository performs spatial calibration of MIMO radars in conjunction with various optical depth sensors. We tested our calibration with the following sensor systems:

  • 📡 [MIMO Radar] Rohde & Schwarz's QAR50 radar submodule
  • 📷 [RGB-D] Microsoft Azure Kinect
  • 📷 [RGB-D] Intel RealSense D435i
  • 📷 [RGB-D] Stereolabs ZED X Mini
  • 📷 [Multi-View Stereo] Ground-truth multi-view stereo reconstructions with Agisoft Metashape

Furthermore, it includes the reconstruction code for the MIMO imaging radar measurements.

Dependencies

  • All basic dependencies are listed in code/setup.py. To install the nfcalib package, run:
cd code;
python3 -m pip install .

[Optional]: To visualize all sensor data, the installation requires several sensor-specific packages:

  • pykinect_azure (Microsoft Azure Kinect)
  • pyzed (Stereolabs ZED)
  • pyrealsense2 (Intel RealSense)

If you only want to visualize photogrammetry and radar data, you can skip the installation of these dependencies. In that case, make sure to adjust the sensors_in_use option in your configuration file (see configuration). Further installation instructions for these additional packages are provided below.
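
As an illustration only: restricting the sensors could look roughly like the snippet below in configs/script_config.json. The exact key layout is defined by the shipped configuration file and the example data, so treat this as a hedged sketch rather than the authoritative format.

{
    "sensors_in_use": ["radar", "photogrammetry"]
}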

Pykinect Azure (Optional)

If you wish to include the Microsoft Kinect, you need to download version 1.3.0 of the Azure Kinect Sensor SDK. Installation instructions for Linux and Windows are provided here; however, they might not work on your particular Linux system (see the instructions below).

As installation via the package manager is not supported beyond Ubuntu 18.04, it is advisable to build the SDK from source. First, make sure the following components are installed:

sudo apt-get install ninja-build

Then, install version 1.3.0 of the SDK via:

git clone https://github.com/microsoft/Azure-Kinect-Sensor-SDK.git
cd Azure-Kinect-Sensor-SDK
git checkout v1.3.0
mkdir build && cd build
cmake .. -GNinja
ninja
sudo ninja install

After that, you should have two libraries: libk4a and libk4a-dev

To have a Python wrapper for the Kinect SDK, this repository adapts code from Ibai Gorordo's pyKinectAzure, which is located in external/pykinect_azure. To install this package, run:

cd external/pykinect_azure;
python -m pip install .

Pyzed (4.2.0) (Optional)

Installation instructions are provided here.

Install the following packages:

python -m pip install cython numpy opencv-python pyopengl

To install the ZED Python package, you need to download the ZED SDK here. If this causes any problems, please note that the currently tested version for this setup is 4.2.0. After installation, you can obtain the Python package pyzed by invoking the respective installation script in the SDK installation directory:

python get_python_api.py

Pyrealsense2 (Optional)

The package can simply be installed with:

python -m pip install pyrealsense2
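
As an optional sanity check (not part of the original instructions), you can verify that the package imports correctly and lists connected cameras:

import pyrealsense2 as rs

# Enumerate connected RealSense devices to confirm the installation works
ctx = rs.context()
print(ctx.query_devices().size(), "RealSense device(s) found")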

Data Structure

Example data is provided in data/example_calib. Please make sure to unzip all .zip files in the photogrammetry directories first.

To calibrate a pair of sensors, the repository expects the following data structure:

|--> <capturename1>
    |--> radar_72.0_82.0_128
    |--> photogrammetry
    |--> kinect
    |--> zed
    |--> realsense
|--> <capturename2>
|--> ...

Radar

The radar directory structure looks like this:

|--> radar_72.0_82.0_128
    # contains, for each frame, the raw radar data, which is a tensor of 94x94x128 complex numbers
    |--> calibrated_data
        |--> 000000.npy
        |--> ...
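
As a quick sanity check, a single raw frame can be inspected with numpy (a minimal sketch, assuming it is run from within a capture directory):

import numpy as np

# Load the raw radar data of frame 0; per the structure above, each frame
# is a tensor of 94x94x128 complex samples.
frame = np.load("radar_72.0_82.0_128/calibrated_data/000000.npy")
print(frame.shape, frame.dtype)  # expected: (94, 94, 128) with a complex dtype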

Photogrammetry

The photogrammetry directory structure looks like this:

|--> photogrammetry
    # contains the extrinsic and intrinsic parameters of the cameras
    # in Agisoft Metashape readable format
    |--> cams.xml 
    # original, unfiltered mesh obtained from Agisoft Metashape
    |--> mesh.obj + mesh.mtl
    # contains the captured RGB images from each DSLR camera
    |--> rgb
        |--> <camera-name>.jpg
        |--> ...
    # contains the reconstructed depth maps from Agisoft Metashape
    |--> depth
        |--> <camera-name>.tif
        |--> ...
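
The .tif depth maps can be loaded without extra tooling, for example with OpenCV (a minimal sketch; the depth units and scaling depend on the Metashape export settings):

import cv2

# IMREAD_UNCHANGED preserves the original bit depth of the Metashape depth map
depth = cv2.imread("photogrammetry/depth/<camera-name>.tif", cv2.IMREAD_UNCHANGED)
print(depth.shape, depth.dtype)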

Kinect

The kinect directory structure looks like this:

|--> kinect
    # camera parameters of kinect's k4a SDK
    |--> calibration.json
    # color frames
    |--> rgb
        |--> 000000.png
        |--> ...
    # depth frames
    |--> depth
        |--> 000000.png
        |--> ...

Realsense

The realsense directory structure looks like this:

|--> realsense
    # camera parameters 
    |--> calibration.json
    # color frames
    |--> rgb
        |--> 000000.png
        |--> ...
    # depth frames
    |--> depth
        |--> 000000.png
        |--> ...

ZED

The ZED directory structure looks like this:

|--> zed
    # camera parameters 
    |--> calibration.json
    # color frames
    |--> rgb
        # frames of the left RGB camera
        |--> left
            |--> 000000.png
            |--> ...
        # frames of the right RGB camera
        |--> right
            |--> 000000.png
            |--> ...    
    # depth frames
    |--> depth
        |--> left
            |--> 000000.png
            |--> ...
        |--> right
            |--> 000000.png
            |--> ...
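
For the Kinect, RealSense, and ZED captures, the depth frames are stored as PNG files next to a calibration.json with the camera parameters. A minimal loading sketch, assuming 16-bit depth encoded in millimeters (the exact encoding may differ per sensor):

import json
import cv2

# Per-sensor camera parameters
with open("kinect/calibration.json", "r") as f:
    calib = json.load(f)

# 16-bit depth frame; IMREAD_UNCHANGED preserves the raw integer values
depth_raw = cv2.imread("kinect/depth/000000.png", cv2.IMREAD_UNCHANGED)
# Assumption: depth is stored in millimeters
depth_m = depth_raw.astype("float32") / 1000.0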

Calibration Output

The alignment.json file contains intermediate data about the calibration procedure as well as the calibration results, which are transformation matrices between sensor spaces:

{
    # 4x4 matrix in row-first order that transforms from kinect space to photogrammetry space
    "kinect2photogrammetry": [[0,0,0,0], [0,0,0,0], [0,0,0,0], [0,0,0,1]],
    # transforms from kinect -> radar
    "kinect2radar": [[0,0,0,0], [0,0,0,0], [0,0,0,0], [0,0,0,1]],
    # transforms from the original sensor space to an intermediate, so-called 'world space',
    # which is the same for all sensors and has the following coordinate axes:
    # Y
    # ^
    # |   Z
    # |  /
    # | /
    # \------> X
    "kinect2world": [[0,0,0,0], [0,0,0,0], [0,0,0,0], [0,0,0,1]],
    # some intermediate data that was used for calibration
    "kinect_calib": {},

    ...

    "photogrammetry2kinect": ...,
    "photogrammetry2realsense": ...,
    "photogrammetry2zed": ...,
    "photogrammetry2radar": ...,
    "photogrammetry2world": ...,
    "photo_calib": ...,

    ... analogous for all other sensors
}
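
Since the matrices are stored in row-first order, applying one of them to a pointcloud is a standard homogeneous transform. A minimal numpy sketch (the helper in nfcalib.utils.spatial_alignment, shown further below, is the intended interface):

import json
import numpy as np

with open("alignment.json", "r") as f:
    calib = json.load(f)

# 4x4 row-first matrix, e.g. kinect -> radar
T = np.asarray(calib["kinect2radar"], dtype=np.float64).reshape(4, 4)

points = np.zeros((100, 3))  # hypothetical (N, 3) pointcloud in kinect space
points_h = np.concatenate([points, np.ones((len(points), 1))], axis=1)
points_radar = (points_h @ T.T)[:, :3]  # points expressed in radar space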

Configuration

Configure the location of the to-be-calibrated sensor data in the configuration file configs/script_config.json. A short explanation of all important configuration settings is provided here:

  • base_path: The base path where the data lies, e.g. <your-path-to>/data. This path has to be absolute.
  • calibration_path: If you enabled use_relative_paths, put the directory name of the coarse calibration object here, e.g. registration_coarse. If use_relative_paths=False, you have to specify the full path <your-path-to>/data/registration_coarse in this argument.
  • fine_calibration_path: As mentioned in the paper, the calibration process can be refined with a fine calibration step, where we put a metal plate in front of the sensors. This step is optional and you can leave the path empty in case you don't need it. If you enabled use_relative_paths, put the directory name of the fine calibration object here, e.g. registration_fine. If use_relative_paths=False, you have to specify the full path <your-path-to>/data/registration_fine in this argument.
  • reconstruction_path: If you enabled use_relative_paths, put the capture directory name here, e.g. registration_coarse. If use_relative_paths=False, you have to specify the full path <your-path-to>/data/registration_coarse in this argument.
  • sensors_in_use: Specify which sensors should be visualized. Some sensors may require additional dependencies (see dependencies).
  • <optical-sensor>:circle_detection: The parameters required for OpenCV's Hough circle transform.
  • <optical-sensor>:vertex_normal_filter: Filters the masked pointcloud of the calibration object spheres according to the z-value of the view-space normals.
  • <optical-sensor>:plane_detection:padding: Adds extra padding to the detected 2D bounding box of the styrofoam plane.
  • <optical-sensor>:plane_detection:relative_red_val: The styrofoam plane is primarily green, so we compare the green channel to the red channel. If the red channel of a pixel falls below the specified ratio relative to the green channel, the pixel is classified as part of the styrofoam board.
  • <optical-sensor>:plane_detection:relative_blue_val: The styrofoam plane is primarily green, so we compare the green channel to the blue channel. If the blue channel of a pixel falls below the specified ratio relative to the green channel, the pixel is classified as part of the styrofoam board.
  • <optical-sensor>:plane_fit: The optimization parameters that are necessary to perform a plane fit with RANSAC, based on the pointcloud that is given for the styrofoam surface (Equation 7 of the paper).
  • <optical-sensor>:sphere_fit: The optimization parameters that are necessary to perform a sphere fit with RANSAC, based on the pointcloud that is given for the four previously detected spheres (Equation 2 of the paper).
  • radar:force_redo: By default, the radar dataset loader uses the cached reconstructions within the dataset to load the respective reconstruction volume (volume), pointcloud (xyz), depth map (depth), or amplitude map (maxproj). If you explicitly want to trigger the reconstruction algorithms, set this flag to true.
  • radar:use_intrinsic_parameters: Once a radar reconstruction is done, the hyperparameters are stored in an intrinsic file. These parameters are used to load the cached reconstruction files and, in case the radar:reconstruction_reco_params have changed, to check whether the reconstruction has to be redone.
  • radar:calibration_filter_threshold_maxproj: For calibration with main.py: the decibel threshold that is applied to the 2D depth map after maximum projection, to filter out background noise and sidelobes of the calibration object such that only valid radar depth values remain.
  • radar:calibration_filter_threshold_pc: For calibration with main.py: the decibel threshold that is applied to the volumetric reconstruction after performing backprojection, to filter out background noise and sidelobes of the calibration object such that only valid radar depth values remain.
  • radar:amplitude_filter_threshold_dB: Only for reconstruction with visualize.py after successful calibration: the decibel threshold that is applied to the 2D depth map after maximum projection, to filter out background noise and sidelobes of the to-be-reconstructed object such that only valid radar depth values remain.
  • radar:farthest_scatterer_detection: Detection hyperparameters, inducing prior knowledge to find the metal sphere mounted on the styrofoam board, i.e. the farthest scatterer.
  • radar:cluster_assignment: Defines the set of points within one cluster, one for each of the detected metal spheres.
  • radar:sphere_cluster_detection: Optimization hyperparameters to find the optimal point candidate that forms the center of each of the four metal spheres (Equation 4 of the paper).
  • radar:plane_fit: The optimization parameters that are necessary to perform a plane fit with RANSAC, based on the point candidates that were detected for the metal spheres (Equation 7 of the paper).
  • radar:calibration_capture_params: Configures the frequency steps of the FSCW radar setup that is used for calibration with main.py.
  • radar:reconstruction_capture_params: Configures the frequency steps of the FSCW radar setup that is used, after calibration, for reconstruction with visualize.py.
  • radar:calibration_coarse_reco_params: The voxel extents as well as the voxel density that are used for the radar reconstruction of the coarse calibration object. For example, xmin = -0.15, xmax = 0.15 and xsteps = 301 constructs 301 voxels, with their centers spaced within [-0.15 m, 0.15 m] (here: with 0.001 m distance to each other).
  • radar:calibration_fine_reco_params: The voxel extents as well as the voxel density that are used for the radar reconstruction of the fine calibration object.
  • radar:reconstruction_reco_params: The voxel extents as well as the voxel density that are used for the radar reconstruction after calibration.
  • camera_sphere_radius_meter: The radius of the styrofoam spheres of the calibration object, given in meters.
  • radar_sphere_radius_meter: The radius of the metal spheres of the calibration object, given in meters.
  • sphere_centers_distances_meter: The distance between the metal spheres, i.e. the styrofoam sphere centroids, along the x- and y-axes of the styrofoam board.
  • sphere_center_dist_prior_meter: [Optional] A rough prior on the distance of the metal spheres along both the x- and y-axes (as our calibration object is symmetric). It is used to estimate the exact parameters for sphere_centers_distances_meter in case we have a ground-truth setup, i.e. the photogrammetry. This parameter is only used by the CameraRegistration class, which performs sphere detection and fitting for the ground-truth MVS setup. All other registration classes assume that sphere_centers_distances_meter is given.
  • num_spheres: Number of styrofoam spheres.
  • spheres_per_row: Number of styrofoam spheres along the x-axis.
  • spheres_per_col: Number of styrofoam spheres along the y-axis.
  • circle_id_multiple: Continuous number that is used as an increment to assign different mask IDs to the detected spheres.
  • plane_id: Mask ID of the styrofoam plane.
  • calibration_filename: The name of the calibration file that is used to export the transformation matrices which align the sensor coordinate systems, after executing main.py.
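
To make the *_reco_params example above concrete, the implied voxel spacing can be computed as follows (a small sketch, assuming the voxel centers are spaced evenly over the closed interval [xmin, xmax]):

# Values from the calibration_coarse_reco_params example above
xmin, xmax, xsteps = -0.15, 0.15, 301
spacing = (xmax - xmin) / (xsteps - 1)
print(spacing)  # 0.001 m between neighboring voxel centers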

Execution

The code uses pycuda to accelerate some portions of the code that operate on large-scale data. The respective *.cu files are located in nfcalib/cuda. To save time at runtime (and disable pycuda's JIT compiler), you can precompile the CUDA files beforehand:

cd nfcalib/cuda;
./precompile.sh # make sure to adjust the GPU architecture to your hardware first
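
If you are unsure which architecture to set, you can query your GPU's compute capability through pycuda (a quick check, assuming pycuda is already installed):

import pycuda.driver as drv

drv.init()
print(drv.Device(0).compute_capability())  # e.g. (8, 6) corresponds to sm_86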

The calibration can be executed with:

cd code;
python main.py

After calibration, the fused data of any other data capture can be visualized with this script:

cd code;
python visualize.py

Transforming Data Between Sensor Spaces

Transforming 3D data (e.g. a pointcloud or a backprojected depth map) from sensor space src to sensor space dest can be done as follows:

import os
import json

# set object path, for example:
object_path = '<path_to_calibrated_data>/registration_coarse'
calib = {}
# open the 'alignment.json' file that lies within the object path
with open(os.path.join(object_path, 'alignment.json'), "r") as f:
    calib = json.load(f)

# get 'src' dataloader, examples are listed in 'get_data' of nfcalib/utils/data_utils.py
# example here for 'src'='radar':
from nfcalib.sensors.radar_data_loader import RadarDataLoader
radar_loader = RadarDataLoader(os.path.join(object_path, "radar_72.0_82.0_128"))
radar_loader.read_radar_frames()

# get the 0th frame
radar_points, radar_depth, radar_intensity = radar_loader.get_frame(0)
# reshape points from (W,H,3) to (W*H, 3)
radar_points = radar_points.reshape(-1,3)

# transform into 'dest' space, for example 'dest'='realsense'
import nfcalib.utils.spatial_alignment as salign
radar_points_in_realsense_space = salign.transform(
    radar_points, "radar", "realsense", calib)

Acknowledgements

The authors would like to thank the Rohde & Schwarz GmbH & Co. KG (Munich, Germany) for providing the radar imaging devices.

This work was funded by the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) – SFB 1483 – Project-ID 442419336, EmpkinS.

The authors gratefully acknowledge the scientific support and HPC resources provided by the Erlangen National High Performance Computing Center of the Friedrich-Alexander-Universität Erlangen-Nürnberg.

Citation

If you are using any of the code provided, please cite:

@INPROCEEDINGS{wirth2024nfcalib,
  author={Wirth, Vanessa and Bräunig, Johanna and Khouri, Danti and Gutsche, Florian and Vossiek, Martin and Weyrich, Tim and Stamminger, Marc},
  booktitle={2024 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)}, 
  title={Automatic Spatial Calibration of Near-Field MIMO Radar With Respect to Optical Depth Sensors}, 
  year={2024},
  volume={},
  number={},
  pages={8322-8329},
  keywords={Laser radar;MIMO radar;Optical imaging;Optical variables control;Adaptive optics;Sensor systems;Calibration;Sensors;Optical sensors;Optical reflection},
  doi={10.1109/IROS58592.2024.10801705}}
