Issue with pose in own datasets #4
Thank you for your attention and testing. I have checked the pose file and in_STD.bag. It seems the timestamps of the point clouds recorded in the bag are inconsistent with the timestamps of the poses.
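One way to check for such a mismatch is to compare the two timestamp lists directly. A minimal sketch (the tolerance value is my own choice, and reading the timestamps out of the bag and the pose file is left to the reader):

```python
# Sketch: find pose timestamps that have no point-cloud timestamp within a
# small tolerance. Timestamps are plain floats in seconds; the tolerance
# of 1 ms is an assumption, not a value taken from STD.

def unmatched_timestamps(cloud_times, pose_times, tol=1e-3):
    """Return the pose timestamps with no cloud timestamp within tol seconds."""
    return [t for t in pose_times
            if not any(abs(t - c) <= tol for c in cloud_times)]
```

If this returns a non-empty list for your data, the poses and clouds are being recorded with different clocks or stamps.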
I'd like to ask: does the lidar topic in the rosbag have to be "sensor_msgs/PointCloud2"? The "livox_ros_driver/CustomMsg" topic doesn't seem to load successfully (I changed this
Dear hku-mars team:
First of all: congratulations on your wonderful work.
I'm trying to test STD on my own datasets recorded with a Livox Horizon, and I ran into an issue that I haven't been able to resolve.
To get the undistorted cloud and the poses I'm trying to use FAST-LIO2.
This is what I do:
Once the process is finished, I have all the FAST-LIO2 topics in a bag file (I know it's not an efficient way to do it).
Then I need to process the following topics:
So, to rename "/cloud_registered_body" to "/cloud_undistort", set the frame to "camera_init" as in the example datasets, and extract the poses, I wrote the following Python script:
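The actual script is in the Drive folder linked below; for context, a minimal sketch of what such a script might look like (the "/cloud_registered_body" and "camera_init" names follow the description above, but the "/Odometry" topic name and the "timestamp x y z qx qy qz qw" pose-line layout are my assumptions):

```python
# Sketch: rename /cloud_registered_body to /cloud_undistort, set its
# frame_id to "camera_init", and dump odometry poses to a text file.
# NOTE: the /Odometry topic name and the pose-line field order below are
# assumptions; adjust them to whatever STD actually expects.

def format_pose_line(stamp, position, orientation):
    """Format one pose as 'time x y z qx qy qz qw' (assumed layout)."""
    x, y, z = position
    qx, qy, qz, qw = orientation
    return "%.6f %.6f %.6f %.6f %.6f %.6f %.6f %.6f" % (
        stamp, x, y, z, qx, qy, qz, qw)

def main(in_bag="in.bag", out_bag="in_STD.bag", pose_file="outposes.txt"):
    import rosbag  # requires a sourced ROS environment

    with rosbag.Bag(out_bag, "w") as out, open(pose_file, "w") as poses:
        for topic, msg, t in rosbag.Bag(in_bag).read_messages():
            if topic == "/cloud_registered_body":
                # Re-frame and rename the undistorted cloud topic.
                msg.header.frame_id = "camera_init"
                out.write("/cloud_undistort", msg, t)
            elif topic == "/Odometry":
                # Extract the pose from the nav_msgs/Odometry message.
                p = msg.pose.pose.position
                q = msg.pose.pose.orientation
                poses.write(format_pose_line(
                    msg.header.stamp.to_sec(),
                    (p.x, p.y, p.z),
                    (q.x, q.y, q.z, q.w)) + "\n")
```

Call `main()` from an environment where ROS and the `rosbag` Python package are available.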
This outputs a correct poses file with lines like the following:
I then run roslaunch std_detector demo_livox.launch configured to use the in_STD.bag and the outposes.txt obtained with the python script.
No errors, and everything in the console looks fine, but in rviz the poses are not being applied to the point cloud, so all scans overlap without movement, even though it finds matches:
I don't know the cause and I'm losing perspective...
You can find the bags, the poses and the python script in the next GoogleDrive folder:
https://drive.google.com/drive/folders/1LZQAjIAIoE35cfyi_7Z5IxWl1W1ilotM?usp=sharing
If anybody can optimize the script or the workflow, that would be great.
Thanks in advance