
Calibration and setting suggestions #89

Open
durn2000 opened this issue May 5, 2016 · 31 comments
Comments

@durn2000

durn2000 commented May 5, 2016

I am currently running ORB_SLAM2 with a ZED stereo camera on an NVIDIA TX1 embedded computer and I have a number of questions regarding setting up the system.

  • In general, which settings can be changed to speed up the algorithm? I have already lowered the camera resolution to its smallest native resolution, 640x480.
  • I am using a ZED stereo camera and passing rectified images to ORB_SLAM2 with online rectification turned off. In this case, which camera settings does the algorithm use? I assume the stereo baseline is the only one being used?
  • My stereo camera has a 120 mm baseline, and I assumed this is what the calibration parameter Camera.bf was; however, the comment above the parameter says "stereo baseline times fx". Can you please explain this parameter?
  • Are there certain algorithm parameters you would recommend for indoor settings such as your MAV dataset?
  • Can you explain the close/far threshold parameter?

Thank you

@antithing

Hi, I am also using the ZED camera, so I can help you with the baseline question. The value should be the baseline in METERS multiplied by fx. So if your focal length is 700 pixels, bf = 700 * 0.120 = 84.
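In code form, the conversion described here is a single multiplication (the 700 px focal length is an example value, not a real calibration):

```python
# Camera.bf = horizontal focal length (pixels) * stereo baseline (meters).
fx_pixels = 700.0   # example focal length from a calibration
baseline_m = 0.120  # ZED baseline: 120 mm = 0.120 m

bf = fx_pixels * baseline_m
print(bf)  # 84.0
```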

@durn2000
Author

durn2000 commented May 9, 2016

Thank you, antithing. I will make the changes. Do you think you could send me a copy of your settings file so I can check mine against it? You emailed me directly a while ago, but unfortunately I accidentally deleted your email address.

@liyuming1978

Camera.bf = color camera focal length * baseline
ThDepth: what counts as "close" and what counts as "far"? How do I get this value?
DepthMapFactor: how do I get this value?

It is hard to figure out these three values.
If someone can explain, please kindly give an answer.

@raulmur
Owner

raulmur commented May 20, 2016

Camera.bf is the horizontal focal length (in pixels) multiplied by the baseline (in meters).
ThDepth can be safely set to around 50. This will be explained in the paper about this stereo/RGB-D version of ORB-SLAM, which is not yet published.
DepthMapFactor is a scale factor applied to the input depth map (if needed) when you are using an RGB-D camera. It is used in the TUM RGB-D dataset.

The best way to understand the settings file is to look at the different examples.
Settings for already rectified stereo (KITTI): https://github.com/raulmur/ORB_SLAM2/blob/master/Examples/Stereo/KITTI04-12.yaml
Settings for non-rectified stereo (EuRoC): https://github.com/raulmur/ORB_SLAM2/blob/master/Examples/Stereo/EuRoC.yaml
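To make DepthMapFactor concrete, here is a small sketch of the conversion it implies, assuming the standard TUM encoding where the raw 16-bit depth values divided by the factor give meters (the raw values below are toy numbers):

```python
import numpy as np

# DepthMapFactor rescales the raw 16-bit depth image to meters.
# TUM documents 5000 (5000 raw units == 1 m); ~5208 was found to work
# better for the fr2 sequences because of a calibration error.
depth_map_factor = 5208.0

raw_depth = np.array([[5208, 10416],
                      [0,    2604]], dtype=np.uint16)  # toy values

depth_m = raw_depth.astype(np.float64) / depth_map_factor
print(depth_m)  # 1 m and 2 m on the first row, 0 m and 0.5 m on the second
```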

@liyuming1978

Thanks for your answer, but DepthMapFactor is still not clear to me.
For example, from http://vision.in.tum.de/data/datasets/rgbd-dataset/download#freiburg2_xyz
how can I know that DepthMapFactor is 5208.0?

@raulmur
Owner

raulmur commented May 20, 2016

See here: http://vision.in.tum.de/data/datasets/rgbd-dataset/file_formats
While the documentation says the factor should be 5000, we saw that for the fr2 sequences we got a scale error and needed to correct the factor (it seems to be a calibration error).

@liyuming1978

Got it! A pixel value of 5000 in the depth image corresponds to a distance of 1 meter from the camera.
That's the answer. Thanks so much!

@aorait

aorait commented Jun 2, 2016

I am using the ZED camera too. Everything works fine, but why is there only one set of camera calibration parameters in the settings file for stereo cameras? The actual calibration values for the left and right cameras are quite different. Currently I am using the left camera's parameters. What is the correct way to proceed, and has this difference been accounted for somewhere?

@aseyfi

aseyfi commented Jun 14, 2016

I have the same question about the stereo camera parameters. Did you find out why the software uses the parameters of only one of the cameras?

@raulmur
Owner

raulmur commented Jun 14, 2016

Parameters for just one camera are used when the images are already rectified. See for instance the KITTI dataset example: https://github.com/raulmur/ORB_SLAM2/blob/master/Examples/Stereo/KITTI04-12.yaml.

If images are not rectified you need to provide the intrinsic and extrinsic calibration of the stereo pair, as in the EuRoC dataset example: https://github.com/raulmur/ORB_SLAM2/blob/master/Examples/Stereo/EuRoC.yaml.

About rectification: http://docs.opencv.org/2.4/modules/calib3d/doc/camera_calibration_and_3d_reconstruction.html#stereorectify

@aseyfi

aseyfi commented Jun 22, 2016

Has anybody successfully used ORB_SLAM2 with Jetson TX1 board with ZED? What is the frame rate?

@aorait

aorait commented Jun 22, 2016

@aseyfi Yes. I am getting update rates of 2-4 Hz. But I have GPU-accelerated a section of the code to obtain 25-30 Hz with a monocular camera. I haven't tested it with the ZED yet.

@aseyfi

aseyfi commented Jun 22, 2016

Is your code publicly available?

@aorait

aorait commented Jun 22, 2016

Yes, I used the same calibration. Sorry, I can't share the code publicly.

@dcs0002

dcs0002 commented Aug 10, 2016

Can you share which part of the code you chose to GPU-accelerate?

@M1234Thomas

M1234Thomas commented Nov 16, 2016

I am using a ZED camera as well.
Can someone please tell me how to do stereo rectification and synchronization?
I am not able to find the required parameters to do the online rectification mentioned in the code.
Also, I believe the ZED images are already rectified. Please correct me if I am wrong.
I would really appreciate any help.

@aseyfi

aseyfi commented Nov 16, 2016

Hello,

If you are using zed-ros-wrapper, the images are already undistorted and rectified. The only thing you need are the camera parameters corresponding to the rectified images. They are published as two ROS topics, one each for the left and right cameras. I think the topic names are left_calibration and right_calibration. You can see the list of topics by running "rostopic list" on the command line while the ZED wrapper is running.

Ahmad


@M1234Thomas

Unfortunately I am not using the wrapper.
I am working on Windows using Visual Studio 2013.

@sebsuk

sebsuk commented Dec 14, 2016

Hello,
to help you with the calibration parameters, I'll share my ZED parameters with you. I got them using the MATLAB Stereo Calibration Tool and they seem to work well. They are for the 2K mode at 15 fps. I did not change the algorithm parameters (except for the number of features), only the camera parameters:

%YAML:1.0

#--------------------------------------------------------------------------------------------
# Camera Parameters. Adjust them!
#--------------------------------------------------------------------------------------------

# Camera calibration and distortion parameters (OpenCV) 
Camera.fx: 1354.53380
Camera.fy: 1354.5338
Camera.cx: 1105.07919
Camera.cy: 620.120018

Camera.k1: 0.0
Camera.k2: 0.0
Camera.p1: 0.0
Camera.p2: 0.0

Camera.width: 2208
Camera.height: 1242

# Camera frames per second 
Camera.fps: 15.0

# stereo baseline times fx
Camera.bf: 162.703890988

# Color order of the images (0: BGR, 1: RGB. It is ignored if images are grayscale)
Camera.RGB: 1

# Close/Far threshold. Baseline times.
ThDepth: 35

#--------------------------------------------------------------------------------------------
# Stereo Rectification. Only if you need to pre-rectify the images.
# Camera.fx, .fy, etc must be the same as in LEFT.P
#--------------------------------------------------------------------------------------------
LEFT.height: 1242
LEFT.width: 2208
LEFT.D: !!opencv-matrix
   rows: 1
   cols: 5
   dt: d
   data: [-0.1698, 0.0228, 0.0, 0.0, 0.0]
LEFT.K: !!opencv-matrix
   rows: 3
   cols: 3
   dt: d
   data: [1390.246,0,1142.087,0,1391.216,635.691,0,0,1]
LEFT.R:  !!opencv-matrix
   rows: 3
   cols: 3
   dt: d
   data: [0.99989096, -0.002573, 0.01454137, 0.00247185, 0.99997266, 0.00696985, -0.01455891, -0.00693315, 0.99986998]
LEFT.P:  !!opencv-matrix
   rows: 3
   cols: 4
   dt: d
   data: [1.35453380e+03, 0.0, 1.10507919e+03, 0.0, 0.0, 1.3545338e+03, 6.20120018e+02, 0.0, 0.0, 0.0, 1.0, 0.0]
   
   
RIGHT.height: 1242
RIGHT.width: 2208
RIGHT.D: !!opencv-matrix
   rows: 1
   cols: 5
   dt: d
   data: [-0.1702, 0.0231, 0.0, 0.0, 0.0]
RIGHT.K: !!opencv-matrix
   rows: 3
   cols: 3
   dt: d
   data: [1391.096,0,1104.712,0,1391.801,604.589,0,0,1]
RIGHT.R:  !!opencv-matrix
   rows: 3
   cols: 3
   dt: d
   data: [0.99995885, -0.00188973, 0.00887258, 0.00195136, 0.99997399, -0.0069431, -0.00885923, 0.00696013, 0.99993653]
   
RIGHT.P:  !!opencv-matrix
   rows: 3
   cols: 4
   dt: d
   data: [1.35453380e+03, 0.0, 1.10507919e+03, -1.68893794e+02, 0.0, 1.3545338e+03, 6.20120018e+02, 0.0, 0.0, 0.0, 1.0, 0.0]
   
#--------------------------------------------------------------------------------------------
# ORB Parameters
#--------------------------------------------------------------------------------------------

# ORB Extractor: Number of features per image
ORBextractor.nFeatures: 2500

# ORB Extractor: Scale factor between levels in the scale pyramid 	
ORBextractor.scaleFactor: 1.2

# ORB Extractor: Number of levels in the scale pyramid	
ORBextractor.nLevels: 8

# ORB Extractor: Fast threshold
# Image is divided in a grid. At each cell FAST are extracted imposing a minimum response.
# Firstly we impose iniThFAST. If no corners are detected we impose a lower value minThFAST
# You can lower these values if your images have low contrast			
ORBextractor.iniThFAST: 20
ORBextractor.minThFAST: 7

#--------------------------------------------------------------------------------------------
# Viewer Parameters
#--------------------------------------------------------------------------------------------
Viewer.KeyFrameSize: 0.05
Viewer.KeyFrameLineWidth: 1
Viewer.GraphLineWidth: 0.9
Viewer.PointSize: 2
Viewer.CameraSize: 0.08
Viewer.CameraLineWidth: 3
Viewer.ViewpointX: 0
Viewer.ViewpointY: -0.7
Viewer.ViewpointZ: -1.8
Viewer.ViewpointF: 500

Greetings
Sebastian
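As a quick sanity check on the numbers above, dividing Camera.bf by Camera.fx should recover the ZED's nominal 120 mm baseline:

```python
# Recover the baseline implied by the settings: Camera.bf = fx * baseline.
camera_fx = 1354.5338
camera_bf = 162.703890988

baseline_m = camera_bf / camera_fx
print(round(baseline_m * 1000, 1))  # 120.1  (millimeters, as expected)
```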

@M1234Thomas

@sebsuk: thanks for the configuration data. Is it working well for you?
I am trying to run the camera in 720p mode outdoors, but it only works well with the VGA configuration.
Did you face anything similar?

@sebsuk

sebsuk commented Feb 4, 2017

Hello @M1234Thomas,

if I understand you correctly, you are using my data at a lower resolution? That won't work, because the calibration parameters depend strictly on the camera resolution. Furthermore, the parameters fit only my individual camera. Manufacturing variation between units is the reason you have to do your own camera calibration; otherwise Stereolabs could simply ship everybody the right parameters.

I would recommend (if you don't have MATLAB with the toolbox) using the ROS stereo calibration tool. It is easy to use and gives good results in the format you need.

Greetings
Sebastian

@M1234Thomas

M1234Thomas commented Feb 6, 2017

@sebsuk,
I do not use your data; I calibrated the camera on my own.
When I run ORB_SLAM2 outdoors with the configuration calibrated at 720 resolution, tracking is lost all the time, with frequent attempts to relocalize. But the images captured at 720 resolution work well with the VGA configuration.
This is confusing me. Indoors, images at 720 work well with the 720 configuration itself.

@mtee

mtee commented Mar 2, 2017

I am also using the ZED camera, but I cannot get the camera calibration to produce parameters where the projection matrices (LEFT.P and RIGHT.P) contain the same values as fx, fy, cx, cy in the camera matrix. How strict is the requirement that the values be equal? Or should I just take the values from the projection matrix and use them as fx, fy, cx, cy as well?

For the record, here are the results I got from the ROS stereo calibration tool:
Right camera:

camera matrix
701.806163 0.000000 625.376720
0.000000 703.708909 357.035448
0.000000 0.000000 1.000000
projection
680.194661 0.000000 628.896538 -82.198969
0.000000 680.194661 353.685852 0.000000
0.000000 0.000000 1.000000 0.000000

that is, instead of 701, P[0, 0] is 680, and so on.
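For what it's worth, after rectification it is the projection matrix that describes the images the SLAM system sees, so its values (not the raw camera matrix) are the natural source for fx, fy, cx, cy and Camera.bf. A small sketch using the right-camera projection printed above:

```python
import numpy as np

# Rectified right-camera projection matrix from the ROS calibration output.
P_right = np.array([
    [680.194661, 0.0,        628.896538, -82.198969],
    [0.0,        680.194661, 353.685852,   0.0],
    [0.0,        0.0,          1.0,        0.0],
])

fx = P_right[0, 0]      # rectified focal length (pixels)
cx = P_right[0, 2]
bf = -P_right[0, 3]     # Camera.bf = fx * baseline
baseline_m = bf / fx

print(fx, bf, round(baseline_m, 4))  # 680.194661 82.198969 0.1208
```

The recovered baseline of roughly 0.12 m matches the ZED's physical 120 mm baseline, which is a useful consistency check on a calibration.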

@diesbot

diesbot commented Mar 6, 2017

Please correct me if I'm wrong, but according to the ZED SDK API, images retrieved from the ZED using the grab() and retrieveImage() functions are already rectified.
Is there any benefit in doing it manually?

@mtee

mtee commented Mar 6, 2017

As far as I know, for the ZED camera the only out-of-the-box way to get rectified images is the ZED ROS wrapper: there are ROS topics with rectified images. However, I would like to avoid using ROS for that. The standalone tools supplied with the SDK return raw footage (not rectified, not undistorted). Meanwhile I will look into writing a small program that uses the grab() function. Thank you for the hint.

@arkrobo

arkrobo commented May 16, 2017

Hi, I installed ZED SDK v2.0.0 and CUDA 8 (Dell Precision 7710 laptop with an NVIDIA Quadro M5000M graphics card, Windows 8.1 Pro). I have updated the ZED camera firmware to zed_fw_v1142_spi through ZED Explorer.

When I run the Depth Viewer I get the message "Unable to open the ZED. Error Code: calibration file not available".

I downloaded the SN1947.conf file from "https://www.stereolabs.com/developers/calib/" and pasted it into the "C:\Users\Elgi\AppData\Roaming\Stereolabs\settings" folder. I also generated a conf file using ZED Calibration, but the same error persists.

Can you suggest how to overcome this?

@antithing

Hi, try running the ZED calibration tool that ships with the SDK; it will create a calibration file for you.

@ArtlyStyles

Hi, I am also thinking of using the ZED with ORB_SLAM2. I am wondering whether you use ROS to link the ZED and ORB_SLAM2 or write C++ code to integrate the two components directly? For my application I hope to run the SLAM algorithm in real time. I have a ZED and an NVIDIA TX2 board on a "MIT RaceCar" (RC car) operated by ROS, so using ROS to connect the two is easy for me. However, I noticed the ZED images arriving through the ROS node have a large latency, which may impact real-time performance.

@rhklite

rhklite commented Oct 9, 2018

@ArtlyStyles I managed to hook up ORB_SLAM2 with the ZED via this wrapper by ETHZ. Make sure you make the appropriate changes to the .yaml files for your camera parameters and to the .launch files for the topics ORB_SLAM2 subscribes to. I sent already rectified images to ORB_SLAM2.

@karankatiyar92

> (quoting @sebsuk's full ZED configuration file from the comment above)

What are k1, k2, p1, and p2, and how do I get them?

@virajawate

You can get those from the /zed2i/zed_node/left/camera_info topic.
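For context, in a ROS sensor_msgs/CameraInfo message with the plumb_bob distortion model, k1 and k2 are radial distortion coefficients and p1 and p2 are tangential ones; they are the first four entries of the D array. A minimal sketch of unpacking them (the numbers are placeholders matching sebsuk's LEFT.D above):

```python
# CameraInfo.D for the plumb_bob model is [k1, k2, p1, p2, k3]:
# k1, k2, k3 are radial and p1, p2 tangential distortion coefficients.
camera_info_D = [-0.1698, 0.0228, 0.0, 0.0, 0.0]  # placeholder values

k1, k2, p1, p2, k3 = camera_info_D
print(k1, k2, p1, p2)  # -0.1698 0.0228 0.0 0.0
```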
