
Overview

The MARS logger records RGB camera frames at 30 Hz and IMU readings at 50+ Hz; the two data streams are synced to one clock.
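
As a quick sanity check of these rates on a recorded session, a short Python sketch like the one below may help. The comma-separated layout, one-line header, and nanosecond timestamps assumed for gyro_accel.csv and frame_timestamps.txt are assumptions, not guarantees; adapt them to the actual Data Format description.

# Sketch: check the IMU and camera rates of one recorded session.
# Assumed (verify against the Data Format section): gyro_accel.csv is comma-separated
# with a one-line header and nanosecond timestamps in the first column;
# frame_timestamps.txt holds one nanosecond frame timestamp per line.
import numpy as np

session = "/path/to/data/session"
imu_t = np.genfromtxt(session + "/gyro_accel.csv", delimiter=",", skip_header=1)[:, 0]
cam_t = np.genfromtxt(session + "/frame_timestamps.txt")

print("IMU rate [Hz]:", 1e9 / np.median(np.diff(imu_t)))
print("Frame rate [Hz]:", 1e9 / np.median(np.diff(cam_t)))
# If the two streams share one clock, their time spans should largely overlap.
print("Overlap [s]:", (min(imu_t[-1], cam_t[-1]) - max(imu_t[0], cam_t[0])) / 1e9)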

Installation

Data Format

Record data

Android

Open the Mars Logger app, press the Record button to start logging, and then press the Stop button to finish logging.

By default upon start, auto exposure and auto focus are on. Tap the screen to lock focus on a particular area and lock the exposure. Note that after locking focus on an area, the recorded focus distance remains fixed even though the AF mode is not set to the locked state (i.e., CONTROL_AF_MODE_OFF).

iOS

Open the Mars Logger app, press the Record button to start logging, and press the Stop button to finish logging.

By default upon start, auto focus and auto exposure are on. Tap the screen to lock focus on a particular area in the camera view and fix the exposure duration. Long press the screen to unlock the focus and exposure duration.

Once finished, you may press the Export button to email the video frame metadata and inertial data to a user-specified mailbox. This is optional because you may also transfer the files by AirDrop from inside the iOS Files app or by connecting the device to a computer (see the Transfer data section below).

Transfer data

Clear data

Android

Using a file manager app, go into Internal storage and navigate to Android/data/edu.osu.pcv.marslogger/files/data; there you may delete any folder at will.

If ADB (android-tools-adb) is installed on your computer, you may open a terminal and run the following commands.

adb shell
cd /sdcard/Android/data/edu.osu.pcv.marslogger/files/data
ls
rm -rf FOLDER_TO_REMOVE

iOS

The recorded videos can be deleted by using the Camera app. The recorded csv files can be deleted by using the Files app for iOS 11+.

Convert to Rosbag

Use the Python bagcreator script to create a rosbag from the recorded visual-inertial data. For instance, for one Android data session,

BAG_PYTHON=/vio_common/python/kalibr_bagcreater.py
ANDROID_DATA_DIR=/path/to/android/data/session
python $BAG_PYTHON --video $ANDROID_DATA_DIR/movie.mp4 \
--imu $ANDROID_DATA_DIR/gyro_accel.csv \
--video_time_file $ANDROID_DATA_DIR/frame_timestamps.txt \
--output_bag $ANDROID_DATA_DIR/movie.bag

For one iOS data session,

BAG_PYTHON=/vio_common/python/kalibr_bagcreater.py
IOS_DATA_DIR=/path/to/ios/data/session
python $BAG_PYTHON --video $IOS_DATA_DIR/IMG_4302.MP4 \
--imu $IOS_DATA_DIR/gyro_accel.csv \
--video_time_file $IOS_DATA_DIR/movie_metadata.csv \
--output_bag $IOS_DATA_DIR/movie.bag

By default, the video frames are downsampled by a factor of 2. To prevent downsampling, pass "--max_video_frame_height=10000" to the above bagcreater commands.
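
After a bag is created, it can optionally be inspected with the rosbag Python API as in the sketch below. The expected topic names (e.g., /cam0/image_raw and /imu0) are an assumption about the bagcreator's defaults; check them against your own output.

# Sketch: list the topics, message counts, and rates in the produced bag.
import rosbag

bag = rosbag.Bag("/path/to/data/session/movie.bag")
topics = bag.get_type_and_topic_info().topics
for name, meta in topics.items():
    print(name, meta.msg_type, meta.message_count, "msgs", meta.frequency, "Hz")
bag.close()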

Process the Rosbag with a VIO method

We show how to configure a VIO (visual inertial odometry) method to process the rosbag.

Configure IMU parameters

For IMU noise parameters, I often find good results with the following values in metric units.

VIO         gyro white noise    accelerometer white noise    gyro random walk    accelerometer random walk
OKVIS       24.0e-4             16.0e-3                      2.0e-5              5.5e-5
VINS-Mono   24.0e-3             16.0e-2                      2.0e-4              5.5e-4

The values for VINS-Mono are those for OKVIS scaled up by the square root of the IMU sampling rate in Hz (a factor of 10 for a 100 Hz IMU, matching the table above), because the two methods use different IMU integration schemes. Due to implementation variations, VINS methods may need different noise parameters for the same IMU. Given a VINS method, I suggest tuning the IMU noise parameters starting from the values provided for its sample dataset.
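
As a convenience, the tabulated values can be kept in code and scaled as described; a minimal sketch follows. The parameter names are descriptive only and are not the actual keys of the OKVIS or VINS-Mono configuration files.

# Sketch: the tabulated IMU noise values, with the VINS-Mono row derived from
# the OKVIS row by the factor-of-10 scaling discussed above.
okvis_noise = {
    "gyro_white_noise":   24.0e-4,
    "accel_white_noise":  16.0e-3,
    "gyro_random_walk":    2.0e-5,
    "accel_random_walk":   5.5e-5,
}
vins_mono_noise = {key: 10.0 * value for key, value in okvis_noise.items()}
print(vins_mono_noise)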

From my experience with consumer smartphones (> $200), IMU systematic errors, including scale factor errors, misalignment, and g-sensitivity, can usually be safely ignored.

Configure camera parameters

The camera projection parameters should be set according to the values provided in movie_metadata.csv. If the above kalibr_bagcreater downsamples the frames by a factor of 2 (the default), then the projection parameters should also be downscaled by 2,

f_x / 2, f_y / 2, c_x / 2, c_y / 2.

A radial-tangential distortion model as in OpenCV can be used for the wide-angle lenses of the rear color cameras on smartphones, and oftentimes the distortion parameters can be set to zero.

The frame readout time can either be read from movie_metadata.csv or set to 1.0/30 s.

The camera time offset relative to the IMU can be set to zero as long as the camera and IMU timestamps do not differ significantly (they may differ when, e.g., an Android phone timestamps camera frames and IMU readings with different clocks).
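
Putting the camera-side settings together, the sketch below shows one way to collect them before filling in a VIO configuration. The numeric intrinsics are hypothetical placeholders (read the real ones from movie_metadata.csv), and the variable names are not tied to any particular VIO's config keys.

# Sketch: camera parameters for a VIO config, assuming the default 2x downsampling.
scale = 0.5  # frames are halved by the bagcreater by default

fx, fy, cx, cy = 1500.0, 1500.0, 960.0, 540.0  # hypothetical full-resolution values
projection = [fx * scale, fy * scale, cx * scale, cy * scale]

distortion = [0.0, 0.0, 0.0, 0.0]    # radial-tangential (k1, k2, p1, p2), often zero
frame_readout_time = 1.0 / 30        # seconds, or read it from movie_metadata.csv
cam_imu_time_offset = 0.0            # seconds, see the note above

print(projection, distortion, frame_readout_time, cam_imu_time_offset)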

Configure camera-IMU extrinsic parameters

On nearly all smartphones (Android or iOS), the IMU and camera have the same nominal relative orientation, as shown in the Android doc and the drawing below.

[drawing: nominal camera and IMU axes on a smartphone]

Because the IMU and camera are close to each other (within 10 cm), their relative translation can be safely ignored for practical VINS. Thus, their relative pose is

T_imu_cam =
[0, -1, 0, 0;
 -1, 0, 0, 0;
 0, 0, -1, 0;
 0, 0, 0, 1]

such that a point in homogeneous coordinates in the IMU frame equals T_imu_cam times the same point in homogeneous coordinates in the camera frame, i.e., p_imu = T_imu_cam * p_cam.
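
The same transform written out with numpy serves as a quick consistency check; the example point is arbitrary.

# Sketch: verify the nominal T_imu_cam and transform a sample camera-frame point.
import numpy as np

T_imu_cam = np.array([[0., -1., 0., 0.],
                      [-1., 0., 0., 0.],
                      [0., 0., -1., 0.],
                      [0., 0., 0., 1.]])

R = T_imu_cam[:3, :3]
assert np.allclose(R.dot(R.T), np.eye(3)) and np.isclose(np.linalg.det(R), 1.0)

p_cam = np.array([0.1, 0.2, 1.0, 1.0])  # hypothetical point, homogeneous, camera frame
p_imu = T_imu_cam.dot(p_cam)            # the same point expressed in the IMU frame
print(p_imu)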

Run VINS-Mono

After the configuration file is prepared as above, you can try out VINS-Mono.