
Example to use OKVIS and SuperPoint + BRISK on maplab 2.0 #389

Open
ryanESX opened this issue Apr 23, 2023 · 10 comments

ryanESX commented Apr 23, 2023

After reviewing the maplab 2.0 paper, I want to try using OKVIS + SuperPoint for mapping, but I couldn't find anything about it in the documentation or code. The closest thing is this launch file (https://github.com/ethz-asl/maplab/blob/master/maplab-launch/launch/alphasense-dev-kit/alphasense-dev-kit-maplab-node-w-okvis.launch), but I think it is for the original OKVIS. Is there a launch file or any other way to do this?
Also, maplab 2.0 has a very modular design. Is it possible to use another odometry source such as ORB-SLAM3 or VINS-Fusion?

smauq commented Apr 23, 2023

Hi! I'm still working on the documentation and Docker deployment; I'll be presenting at ICRA and still have a bunch of things to do until then. What you describe is totally possible. To get you started, you can do things separately:

  • Running OKVIS so that it works with maplab. You can use this branch, https://github.com/ethz-asl/okvis_ros/commits/feature/maplab_18.04, which publishes the correct odometry message for maplab. I would suggest using a separate workspace for OKVIS, as it has some compile issues when built together with maplab. You also need to include the message repo in the OKVIS workspace: https://github.com/ethz-asl/maplab_msgs/tree/devel/external_features. Please make sure you pull the correct branches. You should end up with:
    ├── okvis_ws/src
    │ ├── catkin_simple/
    │ ├── maplab_msgs/
    │ └── okvis_ros/
    You can see here that OKVIS now publishes a maplab odometry message on the topic okvis_maplab_odometry. Similarly, if you want to use for example ORB-SLAM3, you can just modify its standard odometry output to also include the IMU biases from the estimator (a rough sketch of what that could look like follows after this list). Here's, for example, how I did it for FAST-LIO2.
  • Change the odometry source in maplab: just change the topic in the maplab config file. For example, if you were running EuRoC, you would change this line to the OKVIS odometry topic.
  • External features: which OS do you have? It would be easiest on Ubuntu 20.04 (or otherwise running in Docker). Let me know that first and I can tell you how to run the external features plugin.
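For the ORB-SLAM3 / other-estimator case above, here is a very rough, untested sketch of the shape such a bridge could take. Treat the message type maplab_msgs::OdometryWithImuBiases and its bias field names as placeholders and check the maplab_msgs branch linked above for the real definition; in practice you would publish this from inside the estimator, where the bias estimates actually live:

```cpp
// Placeholder sketch: republish a generic estimator's nav_msgs/Odometry as a
// maplab-style odometry-with-biases message. The message type and the
// accel_bias / gyro_bias field names are assumed here -- check maplab_msgs
// for the actual definition.
#include <nav_msgs/Odometry.h>
#include <ros/ros.h>

#include <maplab_msgs/OdometryWithImuBiases.h>  // assumed header/message name

ros::Publisher maplab_odom_pub;

void odometryCallback(const nav_msgs::Odometry::ConstPtr& odom) {
  maplab_msgs::OdometryWithImuBiases msg;
  msg.header = odom->header;                 // keep timestamp and frame
  msg.child_frame_id = odom->child_frame_id;
  msg.pose = odom->pose;                     // pose + covariance
  msg.twist = odom->twist;                   // velocity + covariance

  // The whole point of this message: attach the estimator's current IMU bias
  // estimates. A pure republisher cannot know them, so these are placeholders;
  // inside ORB-SLAM3 etc. you would fill them from the estimator state.
  msg.accel_bias.x = 0.0;
  msg.accel_bias.y = 0.0;
  msg.accel_bias.z = 0.0;
  msg.gyro_bias.x = 0.0;
  msg.gyro_bias.y = 0.0;
  msg.gyro_bias.z = 0.0;

  maplab_odom_pub.publish(msg);
}

int main(int argc, char** argv) {
  ros::init(argc, argv, "odometry_to_maplab_bridge");
  ros::NodeHandle nh;
  maplab_odom_pub = nh.advertise<maplab_msgs::OdometryWithImuBiases>(
      "okvis_maplab_odometry", 10);
  ros::Subscriber odom_sub = nh.subscribe("odometry", 10, odometryCallback);
  ros::spin();
  return 0;
}
```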

@zjtde1990

I have a similar problem. I want to change the keypoint detection and matching method, for example to deep-learning-based feature extraction and matching algorithms such as SuperPoint. How can I do this on an Ubuntu 20.04 computer? Thank you very much, I look forward to your reply.

smauq commented Apr 28, 2023

Hi, it should be relatively easy. Download the repo that runs the feature detection: https://github.com/ethz-asl/maplab_features. If you want to use OpenCV features, I suggest a separate workspace (where you copy over the maplab_msgs and catkin_simple packages from the dependencies), as there are still some conflicts on 20.04 that I'm trying to fix.

Afterward you have this example:

To switch to SuperPoint, you can look at the other config and launch file in maplab_features that is for HILTI; it's relatively straightforward. You also have to adapt the name of the features in the maplab calibration file.

After making the map, you can run ms in the console to see whether you have other types of features in the map. A few other useful commands would be:

  • v --vis_show_only_landmark_type=SuperPoint
  • lc --lc_feature_type=SuperPoint --lc_detector_engine=hnsw
  • optvi --ba_feature_type=SuperPoint

Let me know if you have any issues, I'll be working on improving stuff until ICRA.

ryanESX commented May 3, 2023

@smauq Thanks for your suggestions. I'll try them and work on reproducing your setup. I will close this issue for now; if there are any updates, I will give you feedback.

ryanESX closed this as completed May 3, 2023
ryanESX commented May 4, 2023

@smauq I have another, similar question. Is it possible to use another VIO (like ORB-SLAM3 or VINS-Fusion) in localization mode, as in the following tutorial but with a different VIO source?
https://maplab.asl.ethz.ch/docs/master/pages/tutorials-rovioli/C_Running-ROVIOLI-in-Localization-mode.html

ryanESX reopened this May 4, 2023
smauq commented May 6, 2023

@ryanESX Not yet implemented, but it's on my todo list. It would not be a huge task to add a small section in maplab_node that gives you raw localizations (I'll see when I get to that). However, these would be raw localizations, i.e. unfiltered, so they could be jumping all over the place. Integrating them into ORB-SLAM / VINS would be a more involved process, as you would need to change their code to take localizations from another framework and somehow integrate them into their state.

If you just want raw localizations and get to the point where you need them, I can have a look at the easiest changes I can make to output them from maplab_node.
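For reference, the usual loosely-coupled pattern on the consumer side looks roughly like the sketch below. This is not existing maplab or VINS code; the topic names, the geometry_msgs/PoseStamped localization type, and the 2 m gating threshold are made up for illustration. The idea is to keep the VIO in its own drifting odometry frame and only maintain a map-to-odom correction from each raw localization, rejecting obvious jumps instead of feeding them straight into the state:

```cpp
// Illustrative sketch only: consume raw (unfiltered) map localizations next to
// a running VIO by estimating a map->odom correction and gating out jumps.
// Topic names and message types are assumptions; time sync is ignored.
#include <geometry_msgs/PoseStamped.h>
#include <nav_msgs/Odometry.h>
#include <ros/ros.h>
#include <tf2/LinearMath/Transform.h>

tf2::Transform T_odom_base;    // latest VIO pose in its odometry frame
tf2::Transform T_map_odom;     // drift correction maintained from localizations
bool have_odom = false;
bool have_correction = false;

tf2::Transform poseToTf(const geometry_msgs::Pose& p) {
  return tf2::Transform(
      tf2::Quaternion(p.orientation.x, p.orientation.y, p.orientation.z,
                      p.orientation.w),
      tf2::Vector3(p.position.x, p.position.y, p.position.z));
}

void odomCallback(const nav_msgs::Odometry::ConstPtr& msg) {
  T_odom_base = poseToTf(msg->pose.pose);
  have_odom = true;
}

void localizationCallback(const geometry_msgs::PoseStamped::ConstPtr& msg) {
  if (!have_odom) return;
  const tf2::Transform T_map_base = poseToTf(msg->pose);

  // Correction implied by this localization sample.
  const tf2::Transform T_map_odom_new = T_map_base * T_odom_base.inverse();

  // Very naive gating: drop localizations that would move the correction by
  // more than an arbitrary threshold; a real system would filter or optimize.
  if (have_correction) {
    const double jump =
        (T_map_odom_new.getOrigin() - T_map_odom.getOrigin()).length();
    if (jump > 2.0) {
      ROS_WARN("Rejecting raw localization jump of %.2f m", jump);
      return;
    }
  }
  T_map_odom = T_map_odom_new;
  have_correction = true;
}

int main(int argc, char** argv) {
  ros::init(argc, argv, "raw_localization_fusion_sketch");
  ros::NodeHandle nh;
  T_map_odom.setIdentity();
  ros::Subscriber odom_sub = nh.subscribe("odometry", 10, odomCallback);
  ros::Subscriber loc_sub =
      nh.subscribe("maplab_raw_localization", 10, localizationCallback);
  ros::spin();
  return 0;
}
```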

ryanESX commented May 18, 2023

@smauq Sorry for the late reply. I think the first step can be raw localization, and the VIO can benefit from a loosely-coupled approach. It would be great if maplab_node could provide raw localizations; even if maplab provides a "jumping" position, the VIO can try to fuse it. There are some examples from OpenVINS: https://github.com/rpng/ov_secondary (loosely-coupled), and a tightly-coupled example, https://github.com/HKUST-Aerial-Robotics/VINS-Fusion/tree/master/loop_fusion.
Of course, a tightly-coupled approach like loop_fusion in VINS-Fusion could give better results, but as you said, the easiest way is to use raw localization, and raw localization would be easy to integrate with different VIO/VSLAM systems.

@cheng-chi

Hi @smauq, is there any update on the Docker deployment? Even a semi-finished working development branch would be very helpful! Thanks!

smauq commented Jun 6, 2023

@cheng-chi You can find one here https://github.com/ethz-asl/maplab/tree/wiki_update2/docs/pages/installation
Before I can merge that, I still need to make one change regarding OpenCV and Noetic; otherwise, the feature detection module and maplab can't coexist in the same Docker environment, which is very annoying. I'll ping you here as soon as that's done; I was just working on it now.

@cheng-chi

@smauq Thank you so much for the update!
