
Problems with /tf topic (erratic or nonexistent messages) #415

Closed

zbynekwinkler opened this issue May 10, 2020 · 4 comments

@zbynekwinkler
There seems to be some problem with the /tf topic. Sometimes (I don't have deterministic reproduction steps) the message stream is erratic, and sometimes messages stop arriving entirely.

$ rostopic hz /tf
subscribed to [/tf]
WARNING: may be using simulated time
average rate: 2045.699
        min: 0.000s max: 0.080s std dev: 0.00399s window: 762
average rate: 1482.576
        min: 0.000s max: 0.192s std dev: 0.00717s window: 1958
average rate: 1133.260
        min: 0.000s max: 0.492s std dev: 0.01289s window: 2059
no new messages
no new messages
average rate: 580.160
        min: 0.000s max: 2.788s std dev: 0.05181s window: 3048
average rate: 651.546
        min: 0.000s max: 2.788s std dev: 0.04612s window: 3879
no new messages
no new messages
average rate: 482.710
        min: 0.000s max: 2.960s std dev: 0.06197s window: 4440
average rate: 564.297
        min: 0.000s max: 2.960s std dev: 0.05468s window: 5732
average rate: 569.754
        min: 0.000s max: 2.960s std dev: 0.05421s window: 5833
no new messages
average rate: 464.726
        min: 0.000s max: 2.960s std dev: 0.06387s window: 6035
average rate: 525.869
        min: 0.000s max: 2.960s std dev: 0.05787s window: 7380
average rate: 540.293
        min: 0.000s max: 2.960s std dev: 0.05654s window: 7738
no new messages
no new messages
average rate: 454.804
        min: 0.000s max: 3.368s std dev: 0.06658s window: 8142
average rate: 498.630
        min: 0.000s max: 3.368s std dev: 0.06185s window: 9465
average rate: 508.680
        min: 0.000s max: 3.368s std dev: 0.06101s window: 9729
no new messages
no new messages
average rate: 444.620
        min: 0.000s max: 3.404s std dev: 0.06870s window: 10133
average rate: 462.969
        min: 0.000s max: 3.404s std dev: 0.06619s window: 10953

While this is the output for /tf, the /clock topic is rock solid at 250 Hz.
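For reference, the clock rate was checked with the same tool, so a tooling problem would have shown up in both measurements:

$ rostopic hz /clock    # steady at ~250 Hz, unlike /tf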

This was collected with the latest Docker image release, 2020-05-06 (by the way: thanks for the hashes on the release_notes page):

$ docker images --digests osrf/subt-virtual-testbed | grep 'cloudsim.*latest'
osrf/subt-virtual-testbed   cloudsim_sim_latest          sha256:d7f7bda4319a0640ecf36e28506f64a43b147b0cc3da0f8c2e71c84e6cc43749   20116f354c1e        3 days ago          9.38GB
osrf/subt-virtual-testbed   cloudsim_bridge_latest       sha256:856d404f2f2413cabe5f0a85547483eefb15d3ca36026a83efc0df8bfd2cb8fd   a92cfd712966        4 days ago          4.15GB

The simulation and the bridge have been started with a single robot of the ROBOTIKA_X2 config 1 type; a launch sketch follows below. Several team members have reported this problem independently (at least three different setups are affected).
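For completeness, the launch followed the usual SubT run.bash pattern, roughly like this (a sketch only: the world name and robot arguments shown here are illustrative, not a verbatim copy of our scripts):

$ # terminal 1: simulation
$ ./run.bash osrf/subt-virtual-testbed:cloudsim_sim_latest cloudsim_sim.ign \
    worldName:=simple_cave_01 robotName1:=X2 robotConfig1:=ROBOTIKA_X2_SENSOR_CONFIG_1
$ # terminal 2: bridge, started once the simulation is up
$ ./run.bash osrf/subt-virtual-testbed:cloudsim_bridge_latest cloudsim_bridge.ign \
    worldName:=simple_cave_01 robotName1:=X2 robotConfig1:=ROBOTIKA_X2_SENSOR_CONFIG_1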

We had not been using this topic until now, but we are trying to integrate some existing SLAM modules that depend on it, which is how we discovered the problem.
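For anyone trying to reproduce: the standard tf command-line tools make the dropouts visible as well (a suggested diagnostic, not part of the original report; the frame names here are hypothetical and depend on the robot configuration):

$ rosrun tf tf_monitor                  # per-frame publish rates and delays across /tf
$ rosrun tf tf_echo world X2/base_link  # stalls whenever /tf goes quiet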

@zbynekwinkler
Author

The problem is most likely to surface in the cave qualification world with the SSCI_X4_SENSOR_CONFIG_2 robot. If the simulation is started first and given enough time to come up, and the bridge is started only after some delay, then even in this world with this robot everything seems to run fine. The problem appears only when the bridge is started right after the simulation. Also, roswtf reports that pose_tf_broadcaster's get/set logger level services are not responding (probing commands below).
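Those services can also be probed directly (a minimal sketch; it assumes the node is reachable as /pose_tf_broadcaster, i.e. in the root namespace, which may differ in the actual deployment):

$ rosnode ping /pose_tf_broadcaster
$ rosservice call /pose_tf_broadcaster/get_loggers "{}"
$ rosservice call /pose_tf_broadcaster/set_logger_level "{logger: 'ros', level: 'debug'}"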

@nkoenig
Contributor

nkoenig commented May 15, 2020

A solution has been deployed to Cloudsim. Please give it a try.
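For anyone verifying the fix, re-running the original check should now show a steady rate with no "no new messages" gaps:

$ rostopic hz /tf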

@zbynekwinkler
Author

So far it seems to be working. We will keep testing to see if the problem is indeed gone for good.

@nkoenig
Contributor

nkoenig commented Aug 17, 2020

I'm closing this issue in order to triage the issue tracker. Please re-open or create a new issue if this problem persists.

nkoenig closed this as completed on Aug 17, 2020.