
Some questions on your test process and result (TUM RGB-D Benchmark) #28

bjornph opened this issue May 13, 2015 · 14 comments
bjornph commented May 13, 2015

I am testing ORB-SLAM (and LSD-SLAM) on the TUM RGB-D Benchmark dataset, and I have some questions about your test process:

ORB questions:

  • Did you test only with keyframes, or with the poses of all the frames?
  • Did you match on timestamps to find the correct correspondence between ORB-SLAM poses and ground-truth poses?
  • Does hardware performance affect the RMSE error?
  • Does ORB-SLAM rectify images by itself, or do they have to be pre-rectified?
  • Did you use the standard ROS Kinect intrinsic parameters, or the fr(1,2,3) calibrated parameters?

LSD-SLAM questions:

  • Did you use only keyframes, or the poses of all the frames?

Thank you


Gingol commented Jun 19, 2015

Hi bjornph,
did you successfully test LSD-SLAM with the TUM RGB-D Benchmark dataset?
I'm trying to, but the software doesn't initialize well from the first images and therefore loses track very soon.


bjornph commented Jun 24, 2015

I had to stop working on it for a while to try something else. However, I've looked at it again over the last couple of days. Have you gotten it to work?

I got it to work well on some datasets and not on others. Which datasets are you trying to run? The ORB-SLAM paper has a nice overview in Table III.

My tests:

  • fr3_sit_xyz: I get an error of 8.5 cm compared to their 7.73 cm, which is pretty close. All of the fr3_... datasets are pre-rectified, so I just used the standard parameters and the camera info.
  • fr2_desk: The best I've gotten is an error of ~30 cm, compared to their 4.57 cm, which is quite the difference. With ORB-SLAM I got an error of ~2 cm.

My test setup:
To record the needed parameters, I use:

  • rostopic echo -p /lsd_slam/keyframes/camToWorld > lsd_camToWorld.txt
  • rostopic echo -p /lsd_slam/keyframes/time > lsd_time.txt
    I then match the recorded keyframes against the ground truth on time, and run a script to obtain the scale difference. To get the absolute trajectory error, I use the Python scripts from http://vision.in.tum.de/data/datasets/rgbd-dataset/online_evaluation.
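The timestamp-matching step can be sketched as follows. This is a minimal nearest-neighbour association in the spirit of the benchmark's associate.py, not the actual script; the timestamps and the 0.02 s tolerance are illustrative:

```python
# Greedy nearest-timestamp association, as used when pairing LSD-SLAM
# keyframe times with ground-truth times. Hedged sketch, not associate.py.

def associate(first_stamps, second_stamps, max_difference=0.02):
    """Pair timestamps from two trajectories, smallest time gap first."""
    candidates = sorted(
        (abs(a - b), a, b)
        for a in first_stamps
        for b in second_stamps
        if abs(a - b) < max_difference
    )
    used_a, used_b, matches = set(), set(), []
    for diff, a, b in candidates:
        # Each timestamp may be matched at most once.
        if a not in used_a and b not in used_b:
            used_a.add(a)
            used_b.add(b)
            matches.append((a, b))
    return sorted(matches)

pairs = associate([0.00, 0.10, 0.20], [0.01, 0.11, 0.35])
print(pairs)  # [(0.0, 0.01), (0.1, 0.11)]
```

The third estimated timestamp (0.20) finds no ground-truth stamp within tolerance and is dropped, which is the behaviour you want when the two recordings don't overlap perfectly.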

Notes/questions:
fr2_desk: The dataset is compressed; I don't know if this affects performance. You can decompress it - try "rosbag decompress -h" for info. The included camera_info is the standard Kinect one, from http://vision.in.tum.de/data/datasets/rgbd-dataset/file_formats#intrinsic_camera_calibration_of_the_kinect. I have also tried replacing it with the fr2 parameters, but that does not give good results. I find their use of calib files confusing, so when I try to undistort I run the images through image_proc.

Hope this helps some. Keep me updated if you get any breakthrough. I promise it won't take as long to answer next time.


Gingol commented Jun 26, 2015

Thanks for your reply.
I'm trying it with almost all the datasets, but the tracking is often lost very soon unless I set a large value for the KFUsageWeight and KFDistWeight thresholds. However, this results in a very poor-quality map.
Which values did you use for the keyframe thresholds?
Furthermore, when the algorithm completes the sequence and I evaluate the ATE with the online tool, the keyframe poses and the ground truth don't have the same "shape" and the error is at least 1 meter. To gather the keyframe poses I use the same command you wrote, so I don't understand why this happens.

For the calibration I'm using the ROS default parameters for all the datasets, because they are the recommended ones on the TUM website.

Would you be so kind as to give me the script to calculate the scale? I've tried to write it myself, but without success.


bjornph commented Jun 27, 2015

@Gingol

  • I'm using the default parameters for KFUsageWeight and KFDistWeight.
  • On calibration: "We recommend to use the ROS default parameter set (i.e., without undistortion), as undistortion of the pre-registered depth images is not trivial" - my interpretation is that as long as you don't use the depth images, it should not be a problem to use rectified color images.
  • Scale and ATE calculation: see the small repository I uploaded [1]. I use a MATLAB script [2] and link to it in my script "ate_test.py". Steps:
    1. To record keyframes I use "rostopic echo -p /lsd_slam/keyframes/camToWorld > lsd_camToWorld.txt" and "rostopic echo -p /lsd_slam/keyframes/time > lsd_time.txt".
    2. Process these with "python lsd_to_readable.py lsd_time.txt lsd_camToWorld.txt lsd.txt" (the last argument is the output text file).
    3. Process everything with "python ate_test.py lsd.txt gt.txt" (e.g. gt.txt = rgbd_dataset_freiburg3_sitting_xyz-groundtruth.txt). This first uses "associate.py" to find corresponding timestamps, then the MATLAB script "getScale.m", which in turn calls "absor.m" [2], to get the scale. Finally it performs the ATE test with "evaluate_ate.py".
  • To link MATLAB and Python, see [3].
  • I have included my test results from fr3/sit_xyz in [1] so you can play around.
  • There are some hardcoded parameters and all-around lazy code in this, as I thought I was only going to use it myself. You have to change the path in "ate_test.py" at least. Ask if there's any confusion.
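The scale-recovery step can also be sketched directly in Python. This is a simplified alternative to the Horn absolute-orientation fit done by getScale.m/absor.m, not that script: it assumes the two trajectories are already associated point-for-point, and the example coordinates are illustrative:

```python
import numpy as np

# Recover the similarity scale between a monocular trajectory and ground
# truth as the ratio of RMS spreads about the centroids. Hedged sketch;
# the full Horn method additionally recovers rotation and translation.

def trajectory_scale(estimated, ground_truth):
    """estimated, ground_truth: (N, 3) arrays of associated positions."""
    est = np.asarray(estimated, dtype=float)
    gt = np.asarray(ground_truth, dtype=float)
    est_c = est - est.mean(axis=0)   # center both trajectories
    gt_c = gt - gt.mean(axis=0)
    # Ratio of RMS distances from the centroid gives the scale factor.
    return float(np.sqrt((gt_c ** 2).sum() / (est_c ** 2).sum()))

gt = np.array([[0, 0, 0], [1, 0, 0], [1, 1, 0]])
est = 0.5 * gt  # monocular estimate at half the metric scale
print(trajectory_scale(est, gt))  # 2.0
```

Multiplying the estimated positions by the returned scale before running evaluate_ate.py is what makes a monocular trajectory comparable to metric ground truth.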

Hope this helps

[1] https://github.com/bjornph/lsd_files
[2] http://www.mathworks.com/matlabcentral/fileexchange/26186-absolute-orientation-horn-s-method
[3] http://se.mathworks.com/help/matlab/matlab_external/install-the-matlab-engine-for-python.html#responsive_offcanvas


Gingol commented Jul 8, 2015

@bjornph
Please excuse me for taking so long to answer.
I was quite busy these days, so I only tried your code yesterday... and it works!
Thanks a lot, you've been very helpful.
I have one question left: in the LSD-SLAM paper, when they tried the TUM benchmark dataset, they used the depth image for the initialization. Did you do the same?
Thanks again


bjornph commented Jul 10, 2015

Great that it worked! A couple of questions:

  • Which datasets did you try, and what results did you get?
  • Did you do any undistortion of the images? If so, how - which parameters, etc.?

Your question: I am not sure what you mean, but if this is the part you are referring to: "For comparison we show respective results from semi-dense mono-VO [9], keypoint-based mono-SLAM [15], direct RGB-D SLAM [14] and keypoint-based RGB-D SLAM [7]. Note that [14] and [7] use depth information from the sensor, while the others do not.", then it says that RGB-D SLAM uses depth information, and the others, including LSD-SLAM, do not.


weichnn commented Apr 1, 2016

May I ask a question?
When I run "rostopic echo -p /lsd_slam/keyframes/camToWorld > lsd_camToWorld.txt"
I get: ERROR: Cannot load message class for [lsd_slam_viewer/keyframeMsg]. Are your messages built?
Thanks. @bjornph


bjornph commented Apr 1, 2016

@weichnn I have not looked at this in quite some time, so I am not sure. I would ask this question on the lsd_slam GitHub page. My guess is that you don't source the correct repos in your bash file.

Good luck


weichnn commented Apr 1, 2016

From the project readme: "Instead, this is solved in LSD-SLAM by publishing keyframes and their poses separately:
keyframeGraphMsg contains the updated pose of each keyframe, nothing else.
keyframeMsg contains one frame with its pose, and - if it is a keyframe - its points in the form of a depth map."

So I use the topic /lsd_slam/graph now. Thanks. @bjornph


dawei22 commented Apr 23, 2016

@bjornph Sorry, did you do any pre-processing of the ground-truth data or the keyframeTrajectory that ORB-SLAM provides before using the "online evaluation"? I got very bad results with the freiburg1 sequences.


bjornph commented Apr 26, 2016

@dawei22 I do not recall the specific details of my test. What kind of results are you getting?


ibenj93 commented Jun 6, 2017

Hi @bjornph,
when I do the first steps to record data from the different topics you mentioned, I get weird time data. The lsd_to_readable script works fine, but then I get an error related to associate.py:

Traceback (most recent call last):
File "associate.py", line 117, in
second_list = read_file_list(args.second_file)
File "associate.py", line 68, in read_file_list
list = [(float(l[0]),l[1:]) for l in list if len(l)>1]
ValueError: could not convert string to float: {\fonttbl\f0\fmodern\fcharset0

(attached files: lsd_poses.txt, time.txt)

Do you have any idea how to solve this? How did you manage to get time data in your example approximately the same as in the ground truth?

thanks !!


haidela commented Oct 23, 2017

Hello, I am trying the first steps: to record keyframes I use "rostopic echo -p /lsd_slam/keyframes/camToWorld > lsd_camToWorld.txt" and "rostopic echo -p /lsd_slam/keyframes/time > lsd_time.txt".

My files camToWorld.txt and time.txt are empty.

PARAMETERS
 * /rosdistro: indigo
 * /rosversion: 1.11.21

NODES

auto-starting new master
process[master]: started with pid [81362]
ROS_MASTER_URI=http://haidara-virtual-machine:11311/
setting /run_id to dc6eec44-b7ba-11e7-8504-000c2967f3d5
process[rosout-1]: started with pid [81376]
started core service [/rosout]

haidara@haidara-virtual-machine:/catkin_ws$ rosrun lsd_slam_core dataset _files:='/home/haidara/Downloads/fr1_rgb_calibration' _hz:=0 _calib:='/home/haidara/Downloads/fr1_rgb_calibration/cameraCalibration.cfg'
Reading Calibration from file /home/haidara/Downloads/fr1_rgb_calibration/cameraCalibration.cfg ... found!
found ATAN camera model, building rectifier.
Input resolution: 640 480
In: 0.262383 -0.953104 -0.005358 0.002628 1.163314
Out: 0.262383 -0.953104 -0.005358 0.002628 1.163314
Output resolution: 640 480
Prepped Warp matrices
Started mapping thread!
Started constraint search thread!
Started optimization thread
found 68 image files in folder /home/haidara/Downloads/fr1_rgb_calibration!
failed to load image /home/haidara/Downloads/fr1_rgb_calibration/cameraCalibration.cfg! skipping.
failed to load image /home/haidara/Downloads/fr1_rgb_calibration/cameraCalibration.cfg
! skipping.
Doing Random initialization!
started image display thread!
Done Random initialization!
warning: reciprocal tracking on new frame failed badly, added odometry edge (Hacky).
TRACKING LOST for frame 22 (0.47% good Points, which is 52.07% of available points, DIVERGED)!
failed to load image /home/haidara/Downloads/fr1_rgb_calibration/ost.txt! skipping.
Finalizing Graph... finding final constraints!!
Optizing Full Map!
Done optizing Full Map! Added 0 constraints.
Finalizing Graph... optimizing!!
doing final optimization iteration!
Finalizing Graph... publishing!!
Done Finalizing Graph.!!
... waiting for SlamSystem's threads to exit
Exited mapping thread
Exited constraint search thread
Exited optimization thread
DONE waiting for SlamSystem's threads to exit
waiting for image display thread to end!
ended image display thread!
done waiting for image display thread to end!
haidara@haidara-virtual-machine:~/catkin_ws$

haidara@haidara-virtual-machine:~$ rostopic echo -p /lsd_slam/keyframes/camToWorld > lsd_camToWorld.txt

Can someone please help me out on this one.
Thanks in advance.


caap95 commented Feb 28, 2018

@bjornph

Hello bjorn, I'm also trying to use the TUM online tool to evaluate LSD-SLAM, and I followed your instructions from the previous comments. But I ran into a problem: my lsd_time.txt is being generated with wrong information, or at least not in the format that the TUM tools use for comparison. Comparing with your file, my "field" values in lsd_time are wrong; I get values like 1.79999876022 while you get values like 1341845820.99. Since my lsd_time is in the wrong format, the TUM tool can't find any timestamp correspondences. I also ran your scripts and got the error "Index exceeds matrix dimensions" due to the values in lsd_time; when I ran them with your files everything worked fine.

Do you know why I'm getting these wrong values in lsd_time?

Thanks
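One plausible cause (an assumption, not confirmed in this thread): "rostopic echo -p" splits a ROS time into separate CSV columns (e.g. a seconds field and a nanoseconds field), so reading a single column in isolation yields small, meaningless floats. Recombining the two fields gives the epoch timestamp the TUM tools expect:

```python
# Combine ROS time fields into the floating-point epoch timestamp used by
# the TUM ground-truth files. The column names (secs/nsecs) are assumptions
# about what "rostopic echo -p" emitted, not verified against this setup.

def ros_stamp_to_seconds(secs, nsecs):
    """Return secs + nsecs as fractional seconds since the epoch."""
    return secs + nsecs * 1e-9

print(ros_stamp_to_seconds(1341845820, 990000000))  # about 1341845820.99
```

Checking which columns your lsd_time.txt actually contains (open it and look at the header row) should tell you whether this is the mismatch.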
