Calibration and setting suggestions #89
Hi, I am also using the ZED camera, so I can help you with the baseline question: the value should be the baseline in meters * fx. So if your focal value is 700, the bf is 700 * 0.120 = 84.
Thank you antithing. I will make the changes. Do you think you could send me a copy of your settings file to check against? You had emailed me directly a while ago, but unfortunately I accidentally deleted your email address.
Camera.bf = color camera focal length * baseline? It's hard to get a clear picture of what these three values mean.
Camera.bf is the horizontal focal length (in pixels) multiplied by the baseline (in meters). The best way to understand the settings file is to look at the different examples.
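To make the formula above concrete, here is a minimal sketch of how Camera.bf is computed. The numbers (a 700 px focal length and the ZED's 120 mm baseline) come from the earlier comment in this thread; your own calibration values will differ.

```python
# Sketch: computing the Camera.bf entry for an ORB_SLAM2 settings file.
# Camera.bf = horizontal focal length (pixels) * stereo baseline (meters).
fx = 700.0          # horizontal focal length in pixels (from calibration)
baseline_m = 0.120  # ZED stereo baseline is 120 mm = 0.120 m

bf = fx * baseline_m  # value to put in the Camera.bf field
print(round(bf, 3))   # 84.0
```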
Thanks for your answer, but I still haven't understood DepthMapFactor.
see here: http://vision.in.tum.de/data/datasets/rgbd-dataset/file_formats |
got it! a pixel value of 5000 in the depth image corresponds to a distance of 1 meter from the camera |
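The scaling described above (the TUM RGB-D convention linked in the previous comment) can be sketched in a few lines. The function name here is illustrative, not part of ORB_SLAM2's API:

```python
# Sketch of how DepthMapFactor is applied: with the TUM RGB-D convention,
# 16-bit depth images are scaled by 5000, so a pixel value of 5000 means
# the point is 1 meter from the camera. A value of 0 means "no depth".
DEPTH_MAP_FACTOR = 5000.0

def raw_depth_to_meters(pixel_value):
    """Convert a raw 16-bit depth pixel to a distance in meters."""
    if pixel_value == 0:
        return None  # missing measurement
    return pixel_value / DEPTH_MAP_FACTOR

print(raw_depth_to_meters(5000))  # 1.0
print(raw_depth_to_meters(2500))  # 0.5
```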
I am using the ZED camera too. Everything works fine but why is there only one set of camera calibration parameters in the settings file for stereo cameras? The actual calibration values for left and right camera are quite different. Currently I am using the left camera's parameters. What is the correct way to proceed and has this difference been accounted for somewhere?
I have the same question about the stereo camera parameters. Did you find out why the software uses the parameters of only one of the cameras?
Parameters for just one camera are used when the images are already rectified. See for instance the KITTI dataset example: https://github.com/raulmur/ORB_SLAM2/blob/master/Examples/Stereo/KITTI04-12.yaml. If images are not rectified you need to provide the intrinsic and extrinsic calibration of the stereo pair, as in the EuRoC dataset example: https://github.com/raulmur/ORB_SLAM2/blob/master/Examples/Stereo/EuRoC.yaml. About rectification: http://docs.opencv.org/2.4/modules/calib3d/doc/camera_calibration_and_3d_reconstruction.html#stereorectify
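To summarize the two styles mentioned above, here is a hedged sketch of the relevant settings-file fields (all numbers are placeholders, not real calibration values; see the linked KITTI and EuRoC YAML files for the authoritative layouts):

```yaml
# Style 1: images already rectified (KITTI-style) -- one intrinsics set.
Camera.fx: 700.0      # placeholder values, use your own calibration
Camera.fy: 700.0
Camera.cx: 640.0
Camera.cy: 360.0
Camera.bf: 84.0       # fx * baseline (baseline in meters)

# Style 2: raw images (EuRoC-style) -- per-camera calibration so
# ORB_SLAM2 can rectify internally. For each of LEFT and RIGHT:
#   K = 3x3 intrinsic matrix, D = distortion coefficients,
#   R = rectification rotation, P = 3x4 projection after rectification.
# These are stored as opencv-matrix blocks; see EuRoC.yaml for the syntax.
```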
Has anybody successfully used ORB_SLAM2 with Jetson TX1 board with ZED? What is the frame rate? |
@aseyfi Yes. I am getting update rates of 2-4 Hz, but I have GPU-accelerated a section of the code to obtain 25-30 Hz with a monocular camera. I haven't tested it with the ZED yet.
Is your code publicly available? |
Yes I used the same calibration. Sorry I can't share the code publicly. |
Can you share which area of the code you chose to implement the GPU acceleration in? |
I am using ZED camera as well. |
Hello, if you are using zed-ros-wrapper, the images are already undistorted and rectified. The only thing you need is the parameters of the camera corresponding to the rectified images. They are published as two ROS topics for the left and right cameras separately. I think the topic names are left_calibration and right_calibration. You can see the list of topics using "rostopic list" on the command line while the ZED wrapper is running. Ahmad
Unfortunately I am not using the wrapper. |
Hello,
Greetings
@sebsuk : thanks for the configuration data. Is it working well for you? |
Hello @M1234Thomas, if I understand you right, you are using my data at a lower resolution? This won't work, because the calibration parameters depend strictly on the camera resolution. Furthermore, the parameters fit only my particular camera: manufacturing deviation is the reason you have to do your own camera calibration, otherwise ZED could simply ship everybody the right parameters. If you don't have Matlab with the calibration toolbox, I would recommend the ROS stereo calibration tool. It is easy to use and gives good results in the format you need. Greetings
@sebsuk, |
I am also using the ZED camera, but my calibration does not produce projection matrices (LEFT.P and RIGHT.P) that contain the same values as the fx, fy, cx, cy in the camera matrix. How strict is the requirement that the values be equal? Or should I just take the values from the projection matrix and use them as fx, fy, cx, cy? For the record, here are the results I got from the ROS stereo calibration tool:
that is, instead of 701 the P[0, 0] is 680, and so on.
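One way to think about the question above: after rectification, the projection matrix P holds the rectified intrinsics, which generally differ from the raw camera matrix K, and in the ROS convention the right camera's P[0][3] equals -fx * baseline. Here is a sketch of extracting ORB_SLAM2 settings from P; the matrix values are illustrative placeholders, not real calibration data:

```python
# Sketch: pulling settings-file values out of a ROS-style 3x4 projection
# matrix P for the rectified right camera. All numbers are placeholders.
RIGHT_P = [
    [680.0,   0.0, 640.0, -81.6],  # P[0][3] = -fx * baseline (Tx term)
    [  0.0, 680.0, 360.0,   0.0],
    [  0.0,   0.0,   1.0,   0.0],
]

fx, fy = RIGHT_P[0][0], RIGHT_P[1][1]   # rectified focal lengths
cx, cy = RIGHT_P[0][2], RIGHT_P[1][2]   # rectified principal point
bf = -RIGHT_P[0][3]                     # Camera.bf = fx * baseline
baseline_m = bf / fx                    # recover the baseline in meters

print(fx, fy, cx, cy, bf)
print(round(baseline_m, 3))  # 0.12
```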
Please correct me if I'm wrong, but according to the ZED SDK API, images received from the ZED using the
As far as I know, the only way to get rectified images from the ZED camera out of the box is to use the ROS wrapper for the ZED: there are ROS topics with rectified images. However, I would like to avoid using ROS for that. The standalone tools supplied with the SDK return raw footage (not rectified, not undistorted). Meanwhile I will look into writing a small program that uses the grab() function, thank you for the hint.
Hi, I installed ZED SDK v2.0.0 and CUDA 8 (Dell Precision 7710 laptop, NVIDIA Quadro M5000M graphics card, Windows 8.1 Pro). I updated the ZED camera firmware to zed_fw_v1142_spi through ZED Explorer. When I run Depth Viewer I get the message "Unable to open the ZED. Error code: calibration file not available". I downloaded the SN1947.conf file from https://www.stereolabs.com/developers/calib/ and pasted it into the "C:\Users\Elgi\AppData\Roaming\Stereolabs\settings" folder. I also generated a conf file using ZED Calibration, but the same error still persists. Can you suggest how to overcome this?
Hi, try running the ZED calibration tool that ships with the SDK. It will create a calibration file for you.
Hi, I am also thinking of using the ZED with ORB SLAM. I am wondering whether you use ROS to link the ZED and ORB SLAM, or write C++ code to integrate the two components directly? For my application, I hope to run the SLAM algorithm in real time. I have a ZED and an NVIDIA TX2 board on a "MIT RaceCar" (RC car) which is operated by ROS, so using ROS to connect the two is easy for me. However, I noticed the ZED images published through the ROS node have significant latency, which may hurt real-time performance.
@ArtlyStyles I managed to hook up ORBSLAM2 with ZED via this wrapper by ETHZ. Make sure you make the appropriate changes to the .yaml files for your camera parameters and .launch files for the topics subscribed by ORBSLAM2. I sent the already rectified images to ORBSLAM2. |
What are k1, k2 and p1, p2 called? How do I get them?
You can get those from the /zed2i/zed_node/left/camera_info topic.
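For context on what those coefficients are: in the "plumb bob" model used by OpenCV and ROS, k1 and k2 are radial distortion coefficients and p1 and p2 are tangential ones; they appear in the D array of the camera_info message. Here is a sketch of the model applied to normalized image coordinates (the function name is illustrative):

```python
# Sketch of the plumb-bob distortion model (OpenCV/ROS convention):
# k1, k2 = radial distortion coefficients, p1, p2 = tangential ones.
# Input (x, y) are normalized image coordinates (after dividing by z).

def distort(x, y, k1, k2, p1, p2):
    """Apply radial + tangential distortion to a normalized point."""
    r2 = x * x + y * y
    radial = 1 + k1 * r2 + k2 * r2 * r2
    x_d = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x * x)
    y_d = y * radial + p1 * (r2 + 2 * y * y) + 2 * p2 * x * y
    return x_d, y_d

# With all coefficients zero the point is unchanged:
print(distort(0.1, 0.2, 0.0, 0.0, 0.0, 0.0))  # (0.1, 0.2)
```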
I am currently running ORB_SLAM2 with a ZED stereo camera on an NVIDIA TX1 embedded computer and I have a number of questions regarding setting up the system.
Thank you