Odometry drifts with Euroc dataset Machine Hall (MH_01_easy) and Mynteye Camera #1

Closed
ghost opened this issue Feb 18, 2019 · 3 comments

Comments


ghost commented Feb 18, 2019

No description provided.

ghost changed the title from "Odometry drifts with Euroc dataset and Mynteye Camera" to "Odometry drifts with Euroc dataset Machine Hall (MH_01_easy) and Mynteye Camera" on Feb 18, 2019

huaizheng (Member) commented Feb 18, 2019

i. As you may notice, each sequence in Machine Hall begins with a "stop -> move -> stop" phase that is specifically intended for map initialization in visual(-inertial) SLAM. R-VIO does not need such a procedure to initialize the estimator, because it does not include map points (features) in the state vector. Also, since R-VIO is a pure visual-inertial odometry focused on high-precision motion tracking and quick initialization, for the Machine Hall sequences we typically start the estimation after that particular phase to avoid the possible drift.
ii. For the Mynteye camera issue, since you did not provide any details, I currently have no clue about the cause. In general, the most important thing for running on your own sensor is a correct config file (*.yaml), especially the fixed sensor parameters (e.g., camera intrinsics, camera-IMU extrinsics, and IMU noise); the other (tunable) parameters can be adjusted according to the type of motion and environment. A small sketch of loading such parameters is shown below.
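As an illustration only (this is not R-VIO's code), here is a minimal sketch of reading the fixed sensor parameters from a YAML config with OpenCV's cv::FileStorage; the key names (Camera.fx, IMU.sigma_g, Camera.T_cam_imu, ...) are placeholders, so use the names that actually appear in R-VIO's config files:

```cpp
// Minimal sketch: sanity-check the sensor parameters loaded from a YAML config.
// The key names are illustrative placeholders, not necessarily R-VIO's real keys.
#include <iostream>
#include <opencv2/core/core.hpp>

int main(int argc, char** argv)
{
    if (argc < 2)
    {
        std::cerr << "Usage: check_config <path-to-yaml>" << std::endl;
        return 1;
    }

    cv::FileStorage fs(argv[1], cv::FileStorage::READ);
    if (!fs.isOpened())
    {
        std::cerr << "Failed to open config file: " << argv[1] << std::endl;
        return 1;
    }

    // Fixed sensor parameters: these must match your Mynteye calibration.
    double fx = fs["Camera.fx"];            // focal length (placeholder key)
    double fy = fs["Camera.fy"];
    double sigma_g = fs["IMU.sigma_g"];     // gyroscope noise density (placeholder key)
    double sigma_a = fs["IMU.sigma_a"];     // accelerometer noise density (placeholder key)

    cv::Mat T_cam_imu;                      // camera-IMU extrinsic transform (placeholder key)
    fs["Camera.T_cam_imu"] >> T_cam_imu;

    std::cout << "fx=" << fx << ", fy=" << fy
              << ", sigma_g=" << sigma_g << ", sigma_a=" << sigma_a << std::endl;
    std::cout << "T_cam_imu=" << std::endl << T_cam_imu << std::endl;

    fs.release();
    return 0;
}
```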

@poaongithub

Hi, is there a solution to R-VIO drifting in the stationary situation? Thanks a lot. @huaizheng

huaizheng (Member) commented Apr 5, 2019

@poaongithub First of all, the stationary case is a degenerate motion case for VINS and may cause drift. As shown in the paper, its influence can be mitigated by R-VIO's inverse-depth-based measurement model, which tends to update only the orientation when the sensor is motionless (for example, in the urban driving test we had to stop for a while at a traffic light, yet the translational drift remained very small). However, at the very beginning, when the state covariance has not yet well converged, it is hard to guarantee that such drift can be prevented. So in this code I implemented a simple check that detects whether the sensor is moving, using thresholds you can tune yourself, to decide when to start the estimation (see the sketch below). There are also other solutions that use map information (in SLAM-based VINS) to handle the stationary phase during navigation, for example in OKVIS and VINS-Mono.
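To make that logic concrete, here is a minimal sketch of such a moving-or-not check (this is not the actual R-VIO implementation; the MotionDetector class, the window size, and the gyro/accel standard-deviation thresholds are all illustrative and would need tuning for your own IMU):

```cpp
// Minimal sketch of a "is the sensor moving?" check over a short IMU window.
// Motion is declared once the spread of the gyro or accel measurements exceeds
// user-tuned thresholds; the estimator would only be started after that point.
#include <cmath>
#include <cstddef>
#include <deque>

struct ImuSample
{
    double wx, wy, wz;  // angular velocity [rad/s]
    double ax, ay, az;  // linear acceleration [m/s^2]
};

class MotionDetector
{
public:
    MotionDetector(std::size_t window_size, double gyro_thresh, double acc_thresh)
        : window_size_(window_size), gyro_thresh_(gyro_thresh), acc_thresh_(acc_thresh) {}

    // Feed one IMU sample; returns true once motion is detected.
    bool Update(const ImuSample& s)
    {
        buffer_.push_back(s);
        if (buffer_.size() > window_size_)
            buffer_.pop_front();
        if (buffer_.size() < window_size_)
            return false;

        // Compare the standard deviation of the gyro/accel norms to the thresholds.
        return StdOfNorm(true) > gyro_thresh_ || StdOfNorm(false) > acc_thresh_;
    }

private:
    double StdOfNorm(bool use_gyro) const
    {
        double mean = 0.0;
        for (const ImuSample& s : buffer_)
            mean += Norm(s, use_gyro);
        mean /= buffer_.size();

        double var = 0.0;
        for (const ImuSample& s : buffer_)
        {
            const double d = Norm(s, use_gyro) - mean;
            var += d * d;
        }
        return std::sqrt(var / buffer_.size());
    }

    static double Norm(const ImuSample& s, bool use_gyro)
    {
        return use_gyro ? std::sqrt(s.wx * s.wx + s.wy * s.wy + s.wz * s.wz)
                        : std::sqrt(s.ax * s.ax + s.ay * s.ay + s.az * s.az);
    }

    std::deque<ImuSample> buffer_;
    std::size_t window_size_;
    double gyro_thresh_;
    double acc_thresh_;
};
```

For example, MotionDetector(200, 0.05, 0.2) would wait until a 200-sample window's gyro-norm or accel-norm standard deviation exceeds 0.05 rad/s or 0.2 m/s^2; those numbers are placeholders, not R-VIO's defaults.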
