
no output from ethzasl_sensor_fusion #28

Open
pelicanunimd opened this issue Aug 11, 2014 · 10 comments

Comments

@pelicanunimd

Hello,
we have a troubling problem here. We've set up ethzasl_ptam and asctec_mav_framework to use them on our Pelican with ethzasl_sensor_fusion.
ptam and mav_framework deliver the expected data on their outputs (/vslam/pose, /fcu/ekf_state_out), and we have mapped the sensor_fusion inputs to these outputs.
Everything looks fine and there are no errors, but sensor_fusion doesn't deliver any data on its outputs, neither on /fcu/ekf_state_in nor on /ssf_core/pose. An echo call stays empty. We also configured the parameter files as described in the tutorial.
Can anybody tell us what the problem is?
We attached our rqt_graph.

@pelicanunimd
Author

After some research we discovered that our problem is that the init function is never called, so the variable initialized_ is never set to true.
Maybe now somebody can help us?

@simonlynen
Contributor

Usually we trigger the filter initialization using dynamic reconfigure or a service call. Do you see the init-checkbox in dynamic reconfigure?

@pelicanunimd
Author

Thanks for the hint, the init is done and we get an output.
But now we have a new issue: the output stays constantly at zero for position and orientation when we echo /ssf_core/pose; only the covariance matrix changes.
In ekf_state_in only the state changes; the values for angular_velocity and linear_acceleration also stay at zero.
Do you have any more hints?

@stephanweiss
Contributor

Please verify that the MAV is switched on and that the parameters are correctly set in the dynamic reconfigure GUI:
http://wiki.ros.org/ethzasl_sensor_fusion/Tutorials/sfly_framework (particularly section 4.3)

Best,
Stephan



@pelicanunimd
Author

Thanks for your answer.

We have set all parameters as described in the tutorial.
mav_framework is running and delivering data that appears to be correct.

@stephanweiss
Contributor

I read your previous post too fast: it is not ekf_state_in but ekf_state_out that you need to map to hl_state_input (ekf_state_in is the update the filter sends to the HL controller for state integration, so it is OK if the accelerations and gyros are zero in that message).

See the sample rxgraph here:
http://wiki.ros.org/ethzasl_sensor_fusion/Tutorials/sfly_framework?action=AttachFile&do=view&target=sfly_rxgraph.eps

hope this helps.

Best,
Stephan



@pelicanunimd
Author

We have already mapped ekf_state_out to hl_state_input (see the rqt_graph attached to our first post).

Do you have any more hints on what the reason for our problem could be?

@pelicanunimd
Author

Hey again,
first of all, thanks for your hints: we got the issue fixed (there was a mistake in the parameter file).

With the camera setup we ran into a further issue. Our axes are aligned differently from your description, so we rotated the axes of our Kinect, causing the x axis to be inverted. But after this the z axis is inverted as well.
Is there a possibility to invert just one axis?
If we didn't describe the problem well, please feel free to ask us for more details; we don't know how to describe it better.

@stephanweiss
Contributor

The rotation and translation between camera and IMU are captured in the q_ic and p_ic parameters in the YAML file, respectively. If it is easier for you, you can express the rotation between the two sensors as a rotation matrix and convert it (e.g. with MATLAB) to a quaternion. There is no need to have the setup look exactly like ours.

Best,
Stephan
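As a sketch of that conversion outside MATLAB (this is not part of ethzasl_sensor_fusion, just a generic numpy-based helper; the (w, x, y, z) ordering is assumed to match the w/x/y/z keys in the parameter file):

```python
import numpy as np

def rotmat_to_quat(R):
    """Convert a 3x3 rotation matrix to a quaternion (w, x, y, z).

    Uses the standard trace-based method; assumes R is a proper
    rotation matrix (orthonormal, det = +1).
    """
    tr = np.trace(R)
    if tr > 0:
        s = np.sqrt(tr + 1.0) * 2.0          # s = 4*w
        w = 0.25 * s
        x = (R[2, 1] - R[1, 2]) / s
        y = (R[0, 2] - R[2, 0]) / s
        z = (R[1, 0] - R[0, 1]) / s
    elif R[0, 0] > R[1, 1] and R[0, 0] > R[2, 2]:
        s = np.sqrt(1.0 + R[0, 0] - R[1, 1] - R[2, 2]) * 2.0  # s = 4*x
        w = (R[2, 1] - R[1, 2]) / s
        x = 0.25 * s
        y = (R[0, 1] + R[1, 0]) / s
        z = (R[0, 2] + R[2, 0]) / s
    elif R[1, 1] > R[2, 2]:
        s = np.sqrt(1.0 + R[1, 1] - R[0, 0] - R[2, 2]) * 2.0  # s = 4*y
        w = (R[0, 2] - R[2, 0]) / s
        x = (R[0, 1] + R[1, 0]) / s
        y = 0.25 * s
        z = (R[1, 2] + R[2, 1]) / s
    else:
        s = np.sqrt(1.0 + R[2, 2] - R[0, 0] - R[1, 1]) * 2.0  # s = 4*z
        w = (R[1, 0] - R[0, 1]) / s
        x = (R[0, 2] + R[2, 0]) / s
        y = (R[1, 2] + R[2, 1]) / s
        z = 0.25 * s
    return np.array([w, x, y, z])

# Example: a 90-degree rotation about the z axis
Rz90 = np.array([[0.0, -1.0, 0.0],
                 [1.0,  0.0, 0.0],
                 [0.0,  0.0, 1.0]])
q = rotmat_to_quat(Rz90)  # approx. (0.7071, 0, 0, 0.7071)
```

Libraries such as SciPy's scipy.spatial.transform.Rotation offer the same conversion, but note that some of them return quaternions in (x, y, z, w) order rather than (w, x, y, z).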



@pelicanunimd
Author

Hey again,

we now have a working setup with the camera mounted pointing upward. Our camera is actually looking sideways (toward positive y), but we want it to look forward (toward positive x).

Setup 1 is the working setup for the camera looking sideways: there we rotated the camera and IMU by 270 degrees around the x axis, so it points upward.

Setup 1:
initialization of camera w.r.t. IMU
init/q_ci/w: -0.7071678118
init/q_ci/x: 0.71
init/q_ci/y: 0.0
init/q_ci/z: 0.0

init/p_ci/x: 0.0
init/p_ci/y: 0.0
init/p_ci/z: 0.0

initialization of world w.r.t. vision
init/q_wv/w: -0.7071678118
init/q_wv/x: 0.71
init/q_wv/y: 0.0
init/q_wv/z: 0.0


In this setup we want the camera to look forward. For this we further rotated the camera and IMU by 90 degrees around the z axis, giving the following setup:

Setup 2:
initialization of camera w.r.t. IMU
init/q_ci/w: -0.5
init/q_ci/x: 0.5
init/q_ci/y: 0.5
init/q_ci/z: -0.5

init/p_ci/x: 0.0
init/p_ci/y: 0.0
init/p_ci/z: 0.0

initialization of world w.r.t. vision
init/q_wv/w: -0.5
init/q_wv/x: 0.5
init/q_wv/y: 0.5
init/q_wv/z: -0.5

Our questions are:

1.) Did we do the transformations right for the described setups?

2.) If they are right, why are we receiving only NaN errors?

Best
...
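The quoted quaternions can be sanity-checked numerically. Below is a minimal numpy sketch, assuming the Hamilton product with (w, x, y, z) ordering; the multiplication order and the filter's exact convention are assumptions and should be checked against the ethzasl_sensor_fusion documentation. One thing the check highlights: x = 0.71 in Setup 1 is not the exact value sin(135°) ≈ 0.70710678, so that quaternion is not exactly normalized, which is one plausible source of NaNs in a filter that expects unit quaternions.

```python
import numpy as np

def quat_mul(q1, q2):
    """Hamilton product of two quaternions given as (w, x, y, z)."""
    w1, x1, y1, z1 = q1
    w2, x2, y2, z2 = q2
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def quat_about_axis(angle_deg, axis):
    """Unit quaternion for a rotation of angle_deg about a unit axis."""
    half = np.deg2rad(angle_deg) / 2.0
    return np.concatenate([[np.cos(half)], np.sin(half) * np.asarray(axis)])

# Exact 270-degree rotation about x (Setup 1):
q_x270 = quat_about_axis(270.0, [1.0, 0.0, 0.0])  # (-0.7071, 0.7071, 0, 0)

# The values quoted in Setup 1 (w = -0.7071678118, x = 0.71) are not
# exactly normalized:
q_setup1 = np.array([-0.7071678118, 0.71, 0.0, 0.0])
norm_err = abs(np.linalg.norm(q_setup1) - 1.0)  # about 2e-3, not zero

# Composing the additional 90-degree rotation about z; applying the
# z-rotation on the left of the Hamilton product reproduces the
# Setup 2 values quoted above:
q_z90 = quat_about_axis(90.0, [0.0, 0.0, 1.0])
q_setup2 = quat_mul(q_z90, q_x270)  # (-0.5, 0.5, 0.5, -0.5)
```

So the Setup 2 numbers are self-consistent under this convention, while the hand-rounded Setup 1 quaternion is slightly off unit length; writing the exact value 0.7071067812 instead of 0.71 (or normalizing before use) removes that discrepancy.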
