
ORB_SLAM2 ROS node is not estimating trajectory (Gazebo Simulation) #6

Open
alejodosr opened this issue Jan 29, 2016 · 9 comments

@alejodosr

Hi,

I have successfully built the ORB-SLAM2 library (both the normal and the ROS variant). Using the Linux version (without ROS) on a predefined dataset (TUM2, for example), it works perfectly. However, after building the ROS node and running it with a proper /camera/image_raw topic being published, I don't get any odometry, trajectory estimation, or mapping. This video shows what I have just described:

https://youtu.be/3OC2ARB9baI

This is the configuration of the Gazebo camera:

  <sensor name='camera' type='camera'>
      <update_rate>30</update_rate>
      <camera name='head'>
        <horizontal_fov>1.5708</horizontal_fov>
        <image>
          <width>1280</width>
          <height>1024</height>
          <format>R8G8B8</format>
        </image>
        <clip>
          <near>1</near>
          <far>500</far>
        </clip>
        <noise>
          <type>gaussian</type>
          <mean>0</mean>
          <stddev>0.007</stddev>
        </noise>
      </camera>
      <plugin name='camera_controller' filename='libgazebo_ros_camera.so'>
        <alwaysOn>true</alwaysOn>
        <updateRate>0.0</updateRate>
        <cameraName>camera</cameraName>
        <imageTopicName>image_raw</imageTopicName>
        <cameraInfoTopicName>camera_info</cameraInfoTopicName>
        <frameName>camera_link</frameName>
        <hackBaseline>0.07</hackBaseline>
        <Fx>517.306408</Fx>
        <Fy>516.469215</Fy>
        <Cx>318.643040</Cx>
        <Cy>255.313989</Cy>
        <distortionK1>0.0</distortionK1>
        <distortionK2>0.0</distortionK2>
        <distortionK3>0.0</distortionK3>
        <distortionT1>0.0</distortionT1>
        <distortionT2>0.0</distortionT2>
        <robotNamespace>/</robotNamespace>
      </plugin>
    </sensor>

This is part of the calibration file:

#--------------------------------------------------------------------------------------------
# Camera Parameters. Adjust them!
#--------------------------------------------------------------------------------------------

# Camera calibration and distortion parameters (OpenCV)
Camera.fx: 517.306408
Camera.fy: 516.469215
Camera.cx: 318.643040
Camera.cy: 255.313989

Camera.k1: 0.0
Camera.k2: 0.0
Camera.p1: 0.0
Camera.p2: 0.0
Camera.k3: 0.0

# Camera frames per second
Camera.fps: 30.0

# Color order of the images (0: BGR, 1: RGB. It is ignored if images are grayscale)
Camera.RGB: 1
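
For reference, the stock ORB-SLAM2 settings files (e.g. the TUM examples) also contain an ORB extractor section that affects initialization; a fragment with the shipped default values (adjust for your own images) looks like this:

```yaml
#--------------------------------------------------------------------------------------------
# ORB Parameters (default values from the stock TUM settings files)
#--------------------------------------------------------------------------------------------

# Number of ORB features per image
ORBextractor.nFeatures: 1000

# Scale factor between levels in the image pyramid
ORBextractor.scaleFactor: 1.2

# Number of levels in the image pyramid
ORBextractor.nLevels: 8

# FAST thresholds; lowering minThFAST can help in low-texture scenes
# such as a sparsely decorated Gazebo world
ORBextractor.iniThFAST: 20
ORBextractor.minThFAST: 7
```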

And the Vocabulary file I used was the one located at Vocabulary/ORBvoc.txt.

I hope I explained it properly, tell me if you need further information. Thank you.

@MarcGyongyosi

Have you tried taking off and hovering in place for a little while so that ORB-SLAM can properly initialize? From what I can tell from the video, you fly forward right away, which might not be ideal.

@alejodosr
Author

Hi @MarcGyongyosi ,

Thanks for the answer. I have already hovered for a while and moved gently around the hovering point, but with no success in initializing the algorithm. Maybe it is related to the vocabulary file. I am trying to generate another vocabulary using some frames from the simulation as input to the DBoW2 library. However, I still have to figure out how to generate a serialized *.txt vocabulary file similar to the one the algorithm uses.

@AHolliday

I've been having similar issues with initialization using ROS. I've been running ORB SLAM 1 on some bagfiles of mine and it works great, initializing within the first two seconds every time and tracking all the way through. But ORB SLAM 2 mostly fails to initialize on the same bagfiles, and usually gets lost quickly on the times it does initialize. Is it possible I haven't converted the settings.yaml correctly?

Here's the parameter file used with ORB SLAM 1 for the bagfile:
floorSettings.txt

And here's the file used with ORB SLAM 2:
floorSettings2.txt
If you'd like me to provide an example bagfile let me know.

@MarcGyongyosi

I've had the exact same experience; see issue #11.

@alejodosr
Author

Hi @AHolliday ,

Could you provide the code you use to generate those bagfiles from a set of images? I'd like to generate some bagfiles myself to test whether that is affecting the algorithm's initialization. Thanks.

@AHolliday

Hi @alejodosr, I generated the bagfiles directly from a ROS camera node, so I don't have any code for building a bagfile from a set of images. It shouldn't be too hard to do, though: the rosbag API is pretty easy to use and provides facilities for creating new bagfiles and writing individual messages to them.
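
To illustrate the approach, packing a directory of images into a new bagfile with the rosbag API might look roughly like this (a minimal sketch assuming a ROS 1 environment with `rosbag` and `cv_bridge` installed; the directory layout, topic name, frame rate, and frame id are illustrative only):

```python
#!/usr/bin/env python
# Sketch: write a folder of images into a bagfile on /camera/image_raw.
# Assumes a sourced ROS 1 install; paths and rates below are made up.
import glob

import cv2
import rosbag
import rospy
from cv_bridge import CvBridge

bridge = CvBridge()
fps = 30.0  # synthetic timestamp spacing

with rosbag.Bag('images.bag', 'w') as bag:
    for i, path in enumerate(sorted(glob.glob('frames/*.png'))):
        img = cv2.imread(path)                # BGR numpy array
        stamp = rospy.Time.from_sec(i / fps)  # fake 30 Hz clock
        msg = bridge.cv2_to_imgmsg(img, encoding='bgr8')
        msg.header.stamp = stamp
        msg.header.frame_id = 'camera_link'
        bag.write('/camera/image_raw', msg, stamp)
```

Playing the result back with `rosbag play images.bag` should then feed the ORB-SLAM2 node the same way a live camera would.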

@raulmur
Owner

raulmur commented Feb 3, 2016

@LinHuican

Hi,
I run the RGB-D node with ROS. For my Kinect2 things go well; however, for rgbd_dataset_freiburg1_room.bag things go wrong.
The algorithm cannot estimate the trajectory, as shown here:
https://plus.google.com/photos/photo/117456307544477709329/6307912839097699730?icm=false&authkey=CMOVmP6dsYmaYA

Any suggestion?

@wsAndy

wsAndy commented Jan 15, 2017

@LinHuican check your .yaml file and set DepthMapFactor to 1.
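
For context on that suggestion: ORB-SLAM2 divides each raw depth reading by DepthMapFactor to obtain meters. The TUM RGB-D bagfiles encode depth as 16-bit integers scaled by 5000 (so the stock TUM settings use DepthMapFactor: 5000.0), while drivers that already publish depth in meters (as a Kinect2 pipeline typically does) need a factor of 1. A toy illustration of the scaling:

```python
# Illustration of DepthMapFactor: ORB-SLAM2 rescales raw depth
# values to meters by dividing by this factor.

def depth_to_meters(raw_depth, depth_map_factor):
    """Convert a raw depth reading to meters, ORB-SLAM2 style."""
    return raw_depth / depth_map_factor

# TUM bagfile: 16-bit value 5000 with factor 5000 -> 1.0 m
tum_depth = depth_to_meters(5000, 5000.0)

# Driver already publishing meters: the factor must be 1
live_depth = depth_to_meters(1.25, 1.0)

print(tum_depth, live_depth)  # 1.0 1.25
```

Using the wrong factor makes every depth off by orders of magnitude, which explains trajectories failing entirely rather than just drifting.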
