
Hardware and Software Synchronization - Realsense Multicam #11669

Closed
kameel2311 opened this issue Apr 11, 2023 · 43 comments

@kameel2311

Required Info
Camera Model D455 and D435
Firmware Version 05.13.00.50 and 05.11.11.100
Operating System & Version Linux Ubuntu 21
Kernel Version (Linux Only) 5.15.0-69-generic
Platform PC
SDK Version 2.53.1
Language c++
Segment Robot

Issue Description

I am trying to synchronize a D455 and a D435 so that their depth frames correspond to the same scene. While researching, I found many issues and articles dating as far back as 2018, so much of that material may be outdated with newer SDK versions. I thought we could use this issue to summarize all topics regarding both methods of synchronization, and I will start with my understanding and trials.

Nominal Approach:
The Multicam Example proposes using one pipeline per device. Under the hood, each pipeline pulls the streams and syncs the depth and color frames using the Syncer class, producing a frameset. The framesets from different devices will not share the same timestamp, since they are not triggered by a common source, but the syncer finds the best possible match.

Software Approach:
After ensuring that the timestamps of the various cameras come from the Global Time Domain, many issues suggest using the timestamps to match frames across devices, but I did not manage to find any example code for that. Moreover, the notion of finding the initial time delay between frames of different devices, to be subtracted later, was only vaguely described, which makes the problem more complex.
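To make the idea concrete, here is a rough sketch of what I imagine such timestamp-based matching could look like (my own standalone attempt, not SDK code): walk the two timestamp lists in order and pair each frame from camera A with the nearest frame from camera B, accepting a pair only when the gap is below a threshold such as half a frame period.

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>
#include <utility>
#include <vector>

// Hypothetical helper (not a librealsense API): pair frames from two devices
// by nearest global-time timestamp (milliseconds). A pair is accepted only
// when the gap is within max_gap_ms. Both lists are assumed sorted in time.
std::vector<std::pair<std::size_t, std::size_t>>
match_by_timestamp(const std::vector<double>& ts_a,
                   const std::vector<double>& ts_b,
                   double max_gap_ms)
{
    std::vector<std::pair<std::size_t, std::size_t>> pairs;
    std::size_t j = 0;
    for (std::size_t i = 0; i < ts_a.size(); ++i) {
        // Advance j while the next B frame is at least as close to A[i].
        while (j + 1 < ts_b.size() &&
               std::fabs(ts_b[j + 1] - ts_a[i]) <= std::fabs(ts_b[j] - ts_a[i]))
            ++j;
        if (j < ts_b.size() && std::fabs(ts_b[j] - ts_a[i]) <= max_gap_ms)
            pairs.emplace_back(i, j);
    }
    return pairs;
}
```

For 30 FPS streams, a max_gap_ms of roughly 16.6 (half the 33.3 ms frame period) would reject pairs that are more than half a frame apart.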

One idea that came up, with an implementation using the Syncer class as mentioned in #4158 (comment), does not work because the frameset produced by the syncer object contains frames from one camera only, and it is not consistent about which camera's frames are returned. My idea was to create two pipelines, one per device, and build a custom frameset holding two depth frames, using the approach from #5847, and then feed that frameset to the syncer. The issue is that the syncer would not have a buffer to align frames against, so I am not sure this even makes sense.

Hardware Approach:
Next, I read up on the hardware approach, but I was confused about why we cannot sync the RGB frames with the depth frames of other devices. The wiring is clearly explained in the whitepaper, but on the code side it is only mentioned to set one camera as master and the other as slave. Should we still use a pipeline per device, and what should the code look like? Moreover, to verify HW sync there should be an increasing time shift, but between which frames exactly? Also, does having the same FPS for each sensor mean that HW sync is successful?

Last but not least, it would be great to know what can be synced between a D455 and a D435 and what to expect from using these devices together. Thanks a lot!

@MartyG-RealSense
Collaborator

Hi @kameel2311 Whilst multiple RealSense cameras can be streamed simultaneously, they do not necessarily need to have their timestamps synchronized with hardware sync. Unsynched multicam is a very viable choice, as demonstrated by the RealSense SDK's C++ rs-multicam example program that you mentioned, which indeed does create a separate pipeline for each camera.

https://github.com/IntelRealSense/librealsense/tree/master/examples/multicam

#2219 is one of the best references for multicam C++ scripts that do not use hardware synchronization.


Global Time attempts to provide a common timestamp for multiple cameras - as described at #3909 - and is enabled by default on RealSense 400 Series cameras such as D435 and D455. So it would be viable to not use hardware synchronization but have Global Time enabled for all cameras that are attached to the same computer.


Regarding RGB sync, on the D435 and D455 camera models the RGB sensor module is attached to the camera's PCB circuit board separately via a cable. Because the RGB sensor is not integrated on the camera circuit board, RGB cannot be hardware synced.

If you need to use both RGB and depth then #10926 may be a useful reference regarding aligning hardware synced depth with non-synced RGB.


It is possible to hardware sync cameras of different models but it is not recommendable to do so due to the possibility of irregularities occurring because of performance differences in the hardware of each model.


In regard to verifying hardware sync, a high-tech way to do so is to use a high speed LED panel, as described at the link below.

https://support.intelrealsense.com/hc/en-us/community/posts/360049401673-Multi-sensor-synchronization-validation


Having the same FPS does not determine whether hardware sync is successful. When using Inter Cam Sync Mode 1 and 2 (master and slave), the measure of sync success is whether the timestamps of each camera gradually move apart over time. If the timestamps are perfectly synced all the time then hardware sync is not working. This is discussed in the section of Intel's multicam paper linked to below, under the paragraph heading Now to the somewhat counter intuitive aspect of time stamps.

https://dev.intelrealsense.com/docs/multiple-depth-cameras-configuration?_ga=2.76722134.2123322888.1679832719-1041004708.1589641371#3-multi-camera-programming


As hardware sync of RGB will not be supported on these cameras, assume that only depth is being timestamp-synced.

@kameel2311
Author

kameel2311 commented Apr 12, 2023

Hello @MartyG-RealSense! Thanks a lot for your reply. Assuming that for now only the depth streams need to be synced between the D455 and D435, is HW sync or SW sync recommended given the different devices? One thing is still not clear about each sync method:

SW Sync:
Grouping the frames via timestamps seems challenging given the time compensation that has to be accounted for. Is there any already-implemented code for this use case? Moreover, does the idea of enforcing your own frameset of two depth frames, with a buffer, and feeding that into a syncer object seem logical, or will it not work?

HW Sync:
Just to be sure: after configuring the flags for master and slave, what is the general code structure for getting frames from both devices? Will it still be a pipeline per device?

Regarding FPS not being an indication of successful HW sync, why was FPS used as the success metric in this tutorial video: https://www.youtube.com/watch?v=DSm7NYG5bZE&ab_channel=IntelRealSense

Last but not least: how can I know which stream profile is best for a camera, and on what basis does the pipeline class choose the best-fitting configuration per device? Thanks!

@MartyG-RealSense
Collaborator

I would not personally recommend mixing camera models, but if you have to use these two different cameras then the performance of a D435 and D455 is likely to be closer to one another than if one of the cameras was a D415 (which has a slow rolling shutter on both its depth and RGB sensors). The advice not to mix camera models dates back to when the only models available were D415 and D435.


In regard to software sync, synchronization between RGB and depth sensors on the same camera should automatically kick in when both sensor types have the same FPS and the RGB sensor option Auto-Exposure Priority has also been disabled, as described at #11203 (comment)

When the wait_for_frames() instruction is used in a script, the SDK should also attempt to find the closest timestamp match between different streams.

You can use C++ code to define a custom frameset by either using a custom processing block or the SDK's software-device interface, as described at #5847

In regard to hardware sync, creating a separate pipeline for each camera is a recommendable strategy for a multiple-camera application but not compulsory. As long as you define a master and slave with Inter Cam Sync Mode and the physical sync equipment such as cabling is set up correctly then hardware sync should occur regardless of how the script is structured.

On the linked-to video, they are highlighting with the mouse cursor that each camera is running at the same FPS. Whilst it is important that the slave cameras are synchronized to the same FPS as the master camera, the true indicator of sync success is the behaviour of the timestamps.

@kameel2311
Author

@MartyG-RealSense gotcha. So now, in an attempt to capture only the depth frames from the two cameras, I have created a script that is very similar to the multicam example but with some modifications:

for (auto &&dev : ctx.query_devices())
{
    auto serial = dev.get_info(RS2_CAMERA_INFO_SERIAL_NUMBER);
    const char *str_detail = dev.get_info(info_type);
    serials.push_back(serial);

    rs2::pipeline pipe(ctx);
    rs2::config cfg;
    // cfg.enable_device(serial);
    if (strcmp(str_detail, str_D455) == 0)
    {
        cfg.enable_stream(RS2_STREAM_DEPTH, 0, 848, 480, RS2_FORMAT_Z16, 30);
    }
    else if (strcmp(str_detail, str_D435) == 0)
    {
        cfg.enable_stream(RS2_STREAM_DEPTH, 0, 848, 480, RS2_FORMAT_Z16, 30);
    }
    cfg.resolve(pipe);
    pipe.start(cfg);
    pipelines.emplace_back(pipe);
    colorizers[serial] = rs2::colorizer();
}

Where I have noticed several things that I don't quite understand:

  • Connecting a single device and running the script works and displays only the depth frame from the camera, but I suspect the provided parameters (resolution and FPS) are not being complied with.
  • Connecting the two cameras and running the script yields the following error:
 RealSense error calling rs2_pipeline_start_with_config(pipe:0x55b2a25f3a10, config:0x55b2a25f4630):
    xioctl(VIDIOC_S_FMT) failed, errno=16 Last Error: Device or resource busy
  • Using enable_device(serial) and commenting out the enable_stream calls works, but shows all sensors of all devices.

Any tips on what is happening? Thanks!

@MartyG-RealSense
Collaborator

Both the D435 and D455 will support 848x480 at 30 FPS if the camera is on a USB 3 connection (this configuration is not supported on USB 2), so having separate cfg.enable_stream instructions for the D435 and D455 is unnecessary and the 'if' and 'else' conditions could be removed.

The cfg instructions could be removed entirely if you do not mind enabling the color stream too, as the default profile that will be applied without cfg instructions present is 848x480 / 30 FPS depth and 1280x720 / 30 FPS color.

@kameel2311
Author

I only want to access the depth streams, but I was not sure how to select purely the depth streams, without RGB, so as not to use extra memory/computation. What I tried was enabling both the device and the selected streams, and now it works and only streams the depth frames. I am not sure how this got resolved.

Just to understand the SDK better: will cfg.enable_stream apply to all streams with a matching configuration? What if I need different resolutions from different devices?

@MartyG-RealSense
Collaborator

MartyG-RealSense commented Apr 12, 2023

If only the depth streams are defined in the cfg instructions then RGB should not be enabled; if no cfg instructions are included in the script, both depth and RGB will be enabled.

You can define separate sets of cfg instructions, each with a different camera serial number, and create a separate pipeline for each camera. An example of this method in the Python language is at #1735 (comment)

The main disadvantage of that method is having to manually define serial numbers for specific cameras instead of having the flexibility of the SDK auto-detecting the serial number of whatever cameras are attached.

@kameel2311
Author

kameel2311 commented Apr 12, 2023

Thanks very much. The only remaining unclear detail is how the Pipeline class chooses which stream profile to use. The sensor-control example shows lots of stream profiles for the same sensor that allow higher FPS/resolution; or are those fictitious streams that might not actually be pulled from the sensor?

@MartyG-RealSense
Collaborator

MartyG-RealSense commented Apr 12, 2023

In a multiple camera script (such as rs-multicam) where cfg instructions are not defined and the serial number is auto-detected and a pipeline auto-created for it, the default stream profile will be applied to that camera's assigned pipeline.

In a script with multiple pipelines and cfg configurations like #1735 (comment) the script will know which stream configuration to apply to which camera because each camera pipeline will have its own start() instruction with the name of the cfg set to apply inside its brackets. For example:

pipeline_1.start(config_1)
pipeline_2.start(config_2)

Each RealSense camera model has its own default profile. On D435 / D435i and D455, that will typically be 848x480 / 30 FPS depth and 1280x720 / 30 FPS color. The D415 model meanwhile has a default profile of 1280x720 / 30 FPS depth and 1920x1080 / 30 FPS color.

An easy way to work out the supported resolution / FPS for each camera is to choose a resolution in the RealSense Viewer and then choose an FPS. If the FPS is supported for that resolution then the FPS selection will not change. If the chosen FPS is not supported for that particular resolution then the Viewer automatically suggests a different resolution / FPS instead.

Not every stream format listed by a tool such as rs-sensor-control may be accessible by RealSense users. For example, if the RAW10 depth format is selected in the Viewer then when the stream is enabled there is no image and a message that says that RAW10 cannot be rendered in the Viewer.

@kameel2311
Author

Very informative, thanks a lot. I will be trying out hardware synchronization in the upcoming weeks! For now, thanks for your assistance.

@MartyG-RealSense
Collaborator

You are very welcome. I'm pleased that I could help :)

@kameel2311
Author

I am now digging into the RealSense2-ROS 1 packages but do not really understand the main package configuration, or how ROS uses the SDK (pipelines/configs) under the hood. As a result, I am not sure how to set the HW sync master/slave flags when launching different cameras.

Is there an article that summarizes the ROS package architecture in relation to the RealSense SDK? Or would it be advisable to wrap the code I have modified into a catkin workspace and use it as a node of some sort?

@MartyG-RealSense
Collaborator

The RealSense ROS wrapper acts as a compatibility layer between the RealSense SDK and ROS so that SDK sensor streams can be translated into a format that is compliant with ROS standards. A lot of the ROS wrapper's parameters are therefore using the RealSense SDK's functions in the background instead of writing completely new code just for the wrapper.

There is not an architecture guide that shows how the ROS wrapper interfaces with the SDK. In general, ROS parameters are used to access equivalent functions in the SDK in the background (such as the align_depth ROS parameter being equivalent to aligning depth to color with the align_to instruction in the SDK).

It is possible to create ROS node scripts that can interface with the SDK. Official ROS1 wrapper examples of such scripts are at the link below.

https://github.com/IntelRealSense/realsense-ros/tree/ros1-legacy/realsense2_camera/scripts

In the ROS wrapper, master and slave status is set with the parameter inter_cam_sync_mode. Setting this parameter can be problematic though, as described at IntelRealSense/realsense-ros#984 (comment) and the comments beneath it.

Another approach is to set Inter Cam Sync Mode in a json custom camera configuration file and then load that json into the ROS wrapper during launch - see IntelRealSense/realsense-ros#1540 (comment)

@kameel2311
Author

kameel2311 commented Apr 13, 2023

And I would guess the same approach would apply for disabling the "Auto Exposure Priority" option for intra-frame sync, no?

@MartyG-RealSense
Collaborator

MartyG-RealSense commented Apr 13, 2023

If Auto-Exposure is enabled and Auto-Exposure Priority (an rgb_camera setting) is disabled, then if both depth and color streams are enabled then a constant FPS should be enforced for both streams instead of allowing FPS to lag below its defined speed.

@kameel2311
Author

kameel2311 commented Apr 13, 2023

Yes, I was wondering how the JSON should look for both options, as trying it with the D455 as shown below gives me the error underneath:
{
"Inter Cam Sync Mode": 0.0
}

[ERROR] [1681392633.618785263]: An exception has been thrown: Inter Cam Sync Mode key is not supported by the connected device!
[ERROR] [1681392633.618824654]: Exception: Inter Cam Sync Mode key is not supported by the connected device!

Update: Managed via
rosrun dynamic_reconfigure dynparam set /camera/stereo_module inter_cam_sync_mode 1
rosrun dynamic_reconfigure dynparam set /camera/rgb_camera auto_exposure_priority False

@MartyG-RealSense
Collaborator

Thanks very much for the confirmation that setting the sync mode via rosrun worked for you!

@MartyG-RealSense
Collaborator

Hi @kameel2311 Do you require further assistance with this case, please? Thanks!

@kameel2311
Author

kameel2311 commented Apr 21, 2023

@MartyG-RealSense Actually yes! I have just connected both cameras, without the RC filter for now, to test whether HW sync works at all, as shown in the attached image.

hwsync

Now I have encountered rather weird behaviour when I set the D455 as master and the D435 as slave. As shown below, the D435 is acting very strangely.

image

On the other hand, setting the D435 as the master causes each camera to run at about 30 FPS, but I am not even sure whether HW sync is working.

So my questions are: what could explain these observations, and how is the time drift computed, as discussed in various issues, for the counter-intuitive HW sync check?

@MartyG-RealSense
Collaborator

Intel recommend that the Master camera is enabled first and the Slave camera is enabled second. Can you confirm whether you are always enabling in this order, please?

The paragraph above Now to the somewhat counter intuitive aspect of time stamps explains about drift. I have quoted the paragraph below.


Each frame from each camera will have a frame number and time stamp that can be queried. Each camera will however have its own very precise time clock, and while they in principle should be identical in nature, their times are derived from individual crystals and they will inevitably drift a few milliseconds over the course of 10s of minutes. This also means that time stamps from different cameras will have an offset. This offset can be subtracted in software, to “align” them all at a specific time.


#5989 (comment) describes how a RealSense user came up with code for calculating drift.
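As an illustration of the offset subtraction mentioned in the quoted paragraph, here is a minimal sketch (the function name and inputs are my own assumptions, not an SDK API): subtract the one-time initial offset between two cameras' per-frame timestamps and report the residual, which under working hardware sync is expected to grow slowly and steadily as the two crystals drift.

```cpp
#include <algorithm>
#include <cassert>
#include <cstddef>
#include <vector>

// Hypothetical helper: given per-frame hardware timestamps (same units) from
// a master and a slave camera, align them by subtracting the offset observed
// at the first frame pair, then return the residual offset per frame.
std::vector<double> residual_offsets(const std::vector<double>& master_ts,
                                     const std::vector<double>& slave_ts)
{
    const std::size_t n = std::min(master_ts.size(), slave_ts.size());
    std::vector<double> residual;
    if (n == 0) return residual;
    const double initial = slave_ts[0] - master_ts[0]; // one-time alignment offset
    residual.reserve(n);
    for (std::size_t i = 0; i < n; ++i)
        residual.push_back((slave_ts[i] - master_ts[i]) - initial);
    return residual;
}
```

A residual that increases roughly linearly over minutes of streaming would match the drift behaviour described in the quote; a residual that jumps around erratically would point instead to dropped frames or a wiring problem.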

@kameel2311
Author

So technically, if the offset is increasing in a linear and steady fashion, that confirms hardware sync?

Yes, that is the case. I hook up both cameras and change the stereo controls: D455 as master, the other as slave, and I turn on the master's stream visualization first. The D435 acts very weird when it is a slave, but not when it is a master.

Note that the wires I used are without the RC filter; could that be the issue? (Although the D455 as a slave does not exhibit that behaviour.)

@MartyG-RealSense
Collaborator

Yes, if the offset is increasing predictably over time then it confirms that hardware sync is working correctly.

Hardware sync cables do not usually require RC filters. However, if the cables are long they are at greater risk of electrostatic discharge (ESD), which resets the camera's frame counters, so components can optionally be built into the cable to protect against ESD.

@kameel2311
Author

kameel2311 commented Apr 24, 2023

Okay, in my case they are about 15 cm long, so I will skip the RC filter for now. I have updated the D435 firmware to 05.14.00.00 but the issue still persists. Any clue as to what is going on?

In this document, they state that Pin 1 of the cameras should be connected together: https://vcl3d.github.io/VolumetricCapture/docs/hardware/rs2_hardware/. Is that needed, or only Pin 9 and Pin 5 together?

@MartyG-RealSense
Collaborator

MartyG-RealSense commented Apr 24, 2023

In the official Intel documentation for hardware sync, only Pin 5 and Pin 9 need to be connected. RealSense users who have attached a wire to Pin 1 have found that it is very noisy, like in the report at #1212

You can see in the image below that Pin 1 has no wire attached to it.

https://dev.intelrealsense.com/docs/multiple-depth-cameras-configuration

image

image

@kameel2311
Author

kameel2311 commented Apr 24, 2023

So I am not sure whether HW sync is actually working. Here are the relevant snippets of the code. Note that due to the order in which the cameras are connected to the laptop, the D455 is accessed first.
Setting the inter-cam sync modes:

for (rs2::sensor sensor : dev.query_sensors())
{
    // std::cout << sensor.get_info(RS2_CAMERA_INFO_NAME) << std::endl;

    if (auto depth_sensor = sensor.as<rs2::depth_sensor>())
    {
        if (strcmp(str_detail, str_D455) == 0)
        {
            // std::cout << "Depth Sensor Sync Modified for " << str_detail << std::endl;
            sensor.set_option(RS2_OPTION_INTER_CAM_SYNC_MODE, 1);
        }
        else if (strcmp(str_detail, str_D435) == 0)
        {
            // std::cout << "Depth Sensor Sync Modified for " << str_detail << std::endl;
            sensor.set_option(RS2_OPTION_INTER_CAM_SYNC_MODE, 2);
        }
    }
}

Enabling Depth Streams:

rs2::pipeline pipe(ctx);
rs2::config cfg;
cfg.enable_device(serial);
if (strcmp(str_detail, str_D455) == 0)
{
    cfg.enable_stream(RS2_STREAM_DEPTH, 0, 848, 480, RS2_FORMAT_Z16, 30);
}
else if (strcmp(str_detail, str_D435) == 0)
{
    cfg.enable_stream(RS2_STREAM_DEPTH, 0, 848, 480, RS2_FORMAT_Z16, 30);
}
// cfg.enable_stream(RS2_STREAM_DEPTH, 0);
cfg.resolve(pipe);
pipe.start(cfg);
pipelines.emplace_back(pipe);
colorizers[serial] = rs2::colorizer();

Then the polled frameset timestamps are logged:

if (new_frames.size() == 2)
{
    std::cout << std::to_string(frame.get_timestamp()) << ",";
}

Note that the if statement is there to only log when both cameras have an input feed (without HW sync, sometimes the feed from only one is seen; I didn't check with HW sync).

Subtracting the Timestamps and plotting shows:
plot

Which is clearly not stable. So the following questions arise, and I would like your professional opinion on them:

  • Which timestamp should be read, the internal or the global clock?
  • Is there a way to check that the pins are actually connected to each other?
  • Is my way of measuring the drift correct?
  • Is polling for frames the correct approach, or should I be waiting for frames?
  • Lastly, I will be reverting to the ROS wrapper; is there a way to check the drift from the wrapper functions themselves?

Many thanks!

@MartyG-RealSense
Collaborator

MartyG-RealSense commented Apr 24, 2023

Is your section of code containing the INTER_CAM_SYNC_MODE instructions placed before the pipe.start(cfg) line, like the C++ hardware sync script at #2637?


I believe that FRAME_TIMESTAMP - which is on the hardware clock - will be an appropriate timestamp to read.


There is not a way to check the connectivity between the pins with software. The only clue that there might be a problem with the wiring is if the timestamps are not drifting apart, suggesting that hardware sync is not active.

The arrival of a trigger pulse from the master camera might be detectable on the slave camera's Pin 5 if an electrical measuring device such as a voltmeter is attached to it.


I cannot confirm if your method of checking drift is correct as there are very few examples of checking drift to compare it to.


If it is a multiple camera application then using poll_for_frames() instead of wait_for_frames() is recommended, as advised by a RealSense team member at #2422 (comment)


There is not a wrapper function to check for drift in the ROS wrapper.

ROS has some capability to compensate for drift, as described at IntelRealSense/realsense-ros#796 (comment) and IntelRealSense/realsense-ros#1906 (comment) by the RealSense team member who created the ROS1 wrapper.

@kameel2311
Author

kameel2311 commented Apr 24, 2023

Yes, it is actually placed just before it in the code.


So I will use the frame.get_frame_metadata(RS2_FRAME_METADATA_SENSOR_TIMESTAMP) function instead.


My understanding of drift was wrong, as I thought it was the difference in global host timestamps. Now I am measuring it using this method:

for (const auto &frame : new_frames)
{
    // Frame Manipulation
    const void *depth = frame.get_data();
    const int size = frame.get_data_size();
    // std::cout << size << " , " << depth << std::endl;

    // Get the serial number of the current frame's device
    auto serial = rs2::sensor_from_frame(frame)->get_info(RS2_CAMERA_INFO_SERIAL_NUMBER);
    // Apply the colorizer of the matching device and store the colorized frame
    render_frames[frame.get_profile().unique_id()] = colorizers[serial].process(frame);

    if (strcmp(serial, str_D455_serial) == 0)
    {
        D455_current = frame.get_frame_metadata(RS2_FRAME_METADATA_SENSOR_TIMESTAMP);
        D455_drift = D455_current - D455_last_time;
        D455_last_time = D455_current;
    }
    else if (strcmp(serial, str_D435_serial) == 0)
    {
        D435_current = frame.get_frame_metadata(RS2_FRAME_METADATA_SENSOR_TIMESTAMP);
        D435_drift = D435_current - D435_last_time;
        D435_last_time = D435_current;
    }
}

if (new_frames.size() == 2)
{
    std::cout << "Drift D455: " << D455_drift << "  |  "
              << "Drift D435: " << D435_drift << std::endl;
}

But the output is not constant when the HW sync cables are not connected:
Experiment 1: using poll_for_frames
Intel RealSense D455,Intel RealSense D435,
Drift D455: 1905086205 | Drift D435: 95442963
Drift D455: 33344 | Drift D435: 66688
Drift D455: 45205 | Drift D435: 33344
Drift D455: 33344 | Drift D435: 36961
Drift D455: 33344 | Drift D435: 33344
Drift D455: 33525 | Drift D435: 33344
Drift D455: 33344 | Drift D435: 67885
Drift D455: 33344 | Drift D435: 33655
Drift D455: 33344 | Drift D435: 33344
Drift D455: 33404 | Drift D435: 33344
Drift D455: 33344 | Drift D435: 33423
Drift D455: 66687 | Drift D435: 33344
Drift D455: 33344 | Drift D435: 33344
Drift D455: 66688 | Drift D435: 33344
Drift D455: 33344 | Drift D435: 66719
Drift D455: 33344 | Drift D435: 66687
Drift D455: 66688 | Drift D435: 33344
Drift D455: 66688 | Drift D435: 66688
Drift D455: 66688 | Drift D435: 33344
Drift D455: 33343 | Drift D435: 66688
Drift D455: 66688 | Drift D435: 33344
Drift D455: 66688 | Drift D435: 33343
Drift D455: 33344 | Drift D435: 66688
Drift D455: 33344 | Drift D435: 33344
Drift D455: 66688 | Drift D435: 33344
Drift D455: 33344 | Drift D435: 33344
Drift D455: 33343 | Drift D435: 33344
Drift D455: 66688 | Drift D435: 33344
Drift D455: 66688 | Drift D435: 33344
Drift D455: 33344 | Drift D435: 33343
Drift D455: 66688 | Drift D435: 33344
Drift D455: 33344 | Drift D435: 33344
Drift D455: 33344 | Drift D435: 66688
Drift D455: 66688 | Drift D435: 66688
Drift D455: 33343 | Drift D435: 66688
Drift D455: 66688 | Drift D435: 66688
Drift D455: 33344 | Drift D435: 66688
Drift D455: 66688 | Drift D435: 33343
Drift D455: 33344 | Drift D435: 33344
Drift D455: 33344 | Drift D435: 33344
Drift D455: 33344 | Drift D435: 33344
Drift D455: 33343 | Drift D435: 33344
Drift D455: 33344 | Drift D435: 66688
Drift D455: 33344 | Drift D435: 33344
Drift D455: 33344 | Drift D435: 33344
Drift D455: 33344 | Drift D435: 66687

And when the cables are connected:

Intel RealSense D455,Intel RealSense D435,
Drift D455: 1810129531 | Drift D435: 83449547
Drift D455: 78498 | Drift D435: 60044
Drift D455: 33344 | Drift D435: 40051
Drift D455: 33344 | Drift D435: 40107
Drift D455: 68859 | Drift D435: 60000
Drift D455: 33344 | Drift D435: 20112
Drift D455: 33344 | Drift D435: 19927
Drift D455: 33344 | Drift D435: 39987
Drift D455: 33474 | Drift D435: 40107
Drift D455: 33344 | Drift D435: 40020
Drift D455: 33344 | Drift D435: 40124
Drift D455: 33397 | Drift D435: 40098
Drift D455: 33344 | Drift D435: 20086
Drift D455: 66739 | Drift D435: 60014
Drift D455: 33344 | Drift D435: 40028
Drift D455: 33344 | Drift D435: 40024
Drift D455: 33344 | Drift D435: 20109
Drift D455: 33344 | Drift D435: 20060
Drift D455: 33343 | Drift D435: 19962
Drift D455: 33344 | Drift D435: 60137
Drift D455: 66688 | Drift D435: 40033
Drift D455: 33344 | Drift D435: 20049
Drift D455: 33344 | Drift D435: 20080
Drift D455: 33344 | Drift D435: 40064
Drift D455: 33344 | Drift D435: 19944
Drift D455: 66688 | Drift D435: 59979
Drift D455: 33344 | Drift D435: 40020
Drift D455: 33344 | Drift D435: 20107
Drift D455: 33344 | Drift D435: 40031
Drift D455: 33343 | Drift D435: 40042
Drift D455: 33344 | Drift D435: 19944
Drift D455: 33344 | Drift D435: 40042
Drift D455: 33344 | Drift D435: 20095
Drift D455: 33344 | Drift D435: 20079
Drift D455: 33344 | Drift D435: 40014
Drift D455: 33344 | Drift D435: 40034
Drift D455: 33344 | Drift D435: 19980
Drift D455: 33344 | Drift D435: 39998
Drift D455: 33344 | Drift D435: 40068
Drift D455: 33344 | Drift D435: 20056
Drift D455: 33344 | Drift D435: 60106
Drift D455: 33343 | Drift D435: 19927
Drift D455: 33344 | Drift D435: 40049
Drift D455: 33344 | Drift D435: 40033
Drift D455: 33344 | Drift D435: 20038
Drift D455: 33344 | Drift D435: 19997
Drift D455: 33344 | Drift D435: 40066
Drift D455: 33344 | Drift D435: 40037
Drift D455: 66688 | Drift D435: 40014
Drift D455: 33344 | Drift D435: 40060
Drift D455: 33344 | Drift D435: 40015
Drift D455: 33344 | Drift D435: 40022
Drift D455: 33343 | Drift D435: 40065
Drift D455: 66688 | Drift D435: 60137
Drift D455: 66688 | Drift D435: 40047
Drift D455: 33344 | Drift D435: 40041
Drift D455: 33344 | Drift D435: 20102
Drift D455: 33344 | Drift D435: 39985
Drift D455: 66688 | Drift D435: 60003
Drift D455: 33344 | Drift D435: 40020
Drift D455: 33344 | Drift D435: 40126
Drift D455: 66687 | Drift D435: 60181
Drift D455: 66688 | Drift D435: 59874
Drift D455: 33344 | Drift D435: 39960


Experiment 2: using wait_for_frames gave results closer to the expected ones in the no-cable case:
Intel RealSense D455,Intel RealSense D435,
Drift D455: 1905086205 | Drift D435: 95442963
Drift D455: 33344 | Drift D435: 66688
Drift D455: 45205 | Drift D435: 33344
Drift D455: 33344 | Drift D435: 36961
Drift D455: 33344 | Drift D435: 33344
Drift D455: 33525 | Drift D435: 33344
Drift D455: 33344 | Drift D435: 67885
Drift D455: 33344 | Drift D435: 33655
Drift D455: 33344 | Drift D435: 33344
Drift D455: 33404 | Drift D435: 33344
Drift D455: 33344 | Drift D435: 33423
Drift D455: 66687 | Drift D435: 33344
Drift D455: 33344 | Drift D435: 33344
Drift D455: 66688 | Drift D435: 33344
Drift D455: 33344 | Drift D435: 66719
Drift D455: 33344 | Drift D435: 66687
Drift D455: 66688 | Drift D435: 33344
Drift D455: 66688 | Drift D435: 66688
Drift D455: 66688 | Drift D435: 33344
Drift D455: 33343 | Drift D435: 66688
Drift D455: 66688 | Drift D435: 33344
Drift D455: 66688 | Drift D435: 33343
Drift D455: 33344 | Drift D435: 66688
Drift D455: 33344 | Drift D435: 33344
Drift D455: 66688 | Drift D435: 33344

And with the cable connected, that is no longer the case:
Drift D455: 33344 | Drift D435: 33344
Drift D455: 33343 | Drift D435: 33344
Drift D455: 33344 | Drift D435: 33344
Drift D455: 33344 | Drift D435: 33344
Drift D455: 33344 | Drift D435: 33344
Drift D455: 33344 | Drift D435: 33344
Drift D455: 33344 | Drift D435: 33344
Drift D455: 33344 | Drift D435: 33344
Drift D455: 33344 | Drift D435: 33343
Drift D455: 33344 | Drift D435: 36388
Drift D455: 33344 | Drift D435: 33344
Drift D455: 33344 | Drift D435: 33344
Drift D455: 33344 | Drift D435: 62771
Drift D455: 33344 | Drift D435: 40330
Drift D455: 33343 | Drift D435: 38797
Drift D455: 33344 | Drift D435: 31133
Drift D455: 66688 | Drift D435: 33715
Drift D455: 33344 | Drift D435: 39561
Drift D455: 33344 | Drift D435: 39725
Drift D455: 33344 | Drift D435: 19997
Drift D455: 33344 | Drift D435: 40026
Drift D455: 33344 | Drift D435: 39583
Drift D455: 33344 | Drift D435: 20134
Drift D455: 33344 | Drift D435: 40215
Drift D455: 33344 | Drift D435: 60100
Drift D455: 66688 | Drift D435: 40150
Drift D455: 33343 | Drift D435: 60438
Drift D455: 33344 | Drift D435: 19997
Drift D455: 33344 | Drift D435: 20385
Drift D455: 33344 | Drift D435: 41020
Drift D455: 33344 | Drift D435: 20068
Drift D455: 33344 | Drift D435: 39691
Drift D455: 33344 | Drift D435: 58991

Any tips? Also, can I assume hardware sync is having an influence, given that the D435 drift with the cable connected is no longer a multiple of 33344?

@MartyG-RealSense

MartyG-RealSense commented Apr 25, 2023

Regarding the code below:

D455_current = frame.get_frame_metadata(RS2_FRAME_METADATA_SENSOR_TIMESTAMP);
D455_drift = D455_current - D455_last_time;
D455_last_time = D455_current;

It may work better if you do it like this.

D455_previous = frame.get_frame_metadata(RS2_FRAME_METADATA_SENSOR_TIMESTAMP);
D455_next = frame.get_frame_metadata(RS2_FRAME_METADATA_SENSOR_TIMESTAMP);
D455_drift = D455_next - D455_previous;

@kameel2311

kameel2311 commented Apr 27, 2023

Not sure I follow your code; it is basically reading the same timestamp twice, so the drift will just be zero, no?
Also any inputs on the frames skipping ?

@MartyG-RealSense

The intention of my code was to read the timestamp in one instant and read it again in the next instant. But yes, inserting a small sleep time period in-between the two calls may be beneficial to ensure that the difference does not read as zero.

Usually when a timestamp doubles it is due to global time being true and setting global time to false resolves it, though I am not convinced that it is the cause in this particular case as I would expect the timestamps to always be double instead of occasionally.

@kameel2311

But the code that I have used is also doing the same thing, no? Any resources on how to set the global time domain to false? And should it be set to false in order to notice the drift, or does it not play a role?

@MartyG-RealSense

Your code works similarly, but I feel it would be better to have the current and next values as separately named variables. Also, the first time the code is run, the value of D455_last_time will be '0', because it is used in the second line but not set until the third line.

D455_current = frame.get_frame_metadata(RS2_FRAME_METADATA_SENSOR_TIMESTAMP);
D455_drift = D455_current - D455_last_time;
D455_last_time = D455_current;

In C++ code, global time should be able to be disabled with sensor.set_option(RS2_OPTION_GLOBAL_TIME_ENABLED, 0);

An example of using this instruction is at #7614 (comment)

Whilst it is simple to implement this instruction to test whether it makes a difference to the timestamps, I doubt that it is the cause of the doubled values in this particular case. But there is no harm in testing it with the above instruction to eliminate the possibility.

@kameel2311

kameel2311 commented Apr 27, 2023

Have updated the code as follows but it did not make a difference in terms of drift consistency:

D455_current = frame.get_frame_metadata(RS2_FRAME_METADATA_SENSOR_TIMESTAMP);
D455_drift = D455_current - D455_last_time;
D455_last_time = frame.get_frame_metadata(RS2_FRAME_METADATA_SENSOR_TIMESTAMP);

Regarding turning off global time, I am not sure it is working, as sensor.get_option(RS2_OPTION_GLOBAL_TIME_ENABLED) still returns 1. (I set the option before enabling the stream.)


What I am noticing when using the D435 as slave is that the drift is very unstable:
Drift D455: 33344 | Drift D435: 20097
Drift D455: 33344 | Drift D435: 19883
Drift D455: 33344 | Drift D435: 20150
Drift D455: 33344 | Drift D435: 19935
Drift D455: 33344 | Drift D435: 20046
Drift D455: 33344 | Drift D435: 20046
Drift D455: 33344 | Drift D435: 40050
Drift D455: 33344 | Drift D435: 39957
Drift D455: 33344 | Drift D435: 19935
Drift D455: 33343 | Drift D435: 40020
Drift D455: 33344 | Drift D435: 39973
Drift D455: 33344 | Drift D435: 20108
Drift D455: 33344 | Drift D435: 40041
Drift D455: 33344 | Drift D435: 19883
Drift D455: 33344 | Drift D435: 60109
Drift D455: 33344 | Drift D435: 60109

After checking in the RealSense Viewer and configuring the master/slave parameters for each device, the FPS of the D435 explains the inconsistency and the very unstable depth images:

[Screenshot: RealSense Viewer showing the D435 depth stream running at ~50 FPS instead of the configured 30 FPS]

Weird behaviour; both streams should be operating at 30 FPS, no? Could you try to reproduce it, to check that nothing is wrong on my side?

@MartyG-RealSense

MartyG-RealSense commented Apr 27, 2023

I do not have hardware sync cabling to test sync between two cameras to replicate your results, unfortunately.

It is okay for 30 FPS to show as 29.79 FPS for your D455. If your D435 camera is also set to 30 FPS then an actual FPS of 50.25 is clearly incorrect.

I also note that the D455 depth resolution shows as 424x240 whilst the D435 depth resolution shows as 848x480. This suggests that the Decimation Filter post-processing filter is enabled on the D455 but disabled on the D435, as the Decimation Filter halves the depth resolution when enabled.

Given how different the D435 image is, could you check please whether the entire list of post-processing filters is disabled on the D435. This would be indicated by a red (Off) icon beside 'Post-Processing' and the list of filters all having grey icons beside them.

[Screenshot: RealSense Viewer post-processing panel with the red (Off) icon and all filters greyed out]

In a script that you have written yourself, all post-processing filters are disabled by default as they have to be deliberately programmed into the script, whilst the Viewer has a range of post-processing filters enabled by default.

@kameel2311

Yes, the filters were off but the issue persisted, so I conducted a couple of tests to check whether the issue is the cable itself.
Turning on the D435 Depth Stream with Sync Cable NOT Connected:

  1. Inter Cam Sync set to 0/1 and Post Processing On: Resolution 424x240 and FPS 30
  2. Inter Cam Sync set to 2 and Post Processing On: Resolution 424x240 and FPS 30
  3. Inter Cam Sync set to 2 and Post Processing Off: Resolution 848x480 and FPS 30

Turning on the D435 Depth Stream with Sync Cable Connected but D455 Not connected to Laptop

  1. Inter Cam Sync set to 0/1 and Post Processing On: Resolution 424x240 and FPS 30
  2. Inter Cam Sync set to 2 and Post Processing On: Resolution 424x240 and FPS 50 (Although should be 30)
  3. Inter Cam Sync set to 2 and Post Processing Off: Resolution 848x480 and FPS 50 (Although should be 30)

Turning on the D435 Depth Stream with Sync Cable Connected but D455 connected to Laptop , D455 Stream OFF

  1. Inter Cam Sync set to 0/1 and Post Processing On: Resolution 424x240 and FPS 30
  2. Inter Cam Sync set to 2 and Post Processing On: Resolution 424x240 and FPS 50 (Although should be 30)
  3. Inter Cam Sync set to 2 and Post Processing Off: Resolution 848x480 and FPS 50 (Although should be 30)

Turning on the D435 Depth Stream with Sync Cable Connected but D455 connected to Laptop , D455 Stream ON First and Inter Cam Sync set to 1

  1. Inter Cam Sync set to 0/1 and Post Processing On: Resolution 424x240 and FPS 30
  2. Inter Cam Sync set to 2 and Post Processing On: Resolution 424x240 and FPS 50 (Although should be 30)
  3. Inter Cam Sync set to 2 and Post Processing Off: Resolution 848x480 and FPS 50 (Although should be 30)

Testing the D455 with the cable, under the same scenarios as above, yielded the same behaviour as seen on the D435.

After changing the cables, I noticed that setting Inter Cam Sync to 2 with the cable connected but no master present caused the issue, which makes sense as the sync pin is then floating. So this could be a sign of bad wiring to watch for in future issues.


Now the drifts of the two cameras are not changing much:
33344,33344
33344,33344
33344,33344
33344,33344
33344,33344
33344,33344
33344,33344
33344,33344
33344,33344
33343,33344
33343,33344
33344,33344
33344,33344
33344,33344
33344,33344
33344,33344
33344,33344
33344,33344
33344,33344

This is after 15 minutes. Is more time needed to notice the drift? (Global time was not disabled, as mentioned above, although set_option was used.)

@MartyG-RealSense

The multiple camera white-paper documentation states that when using mode 1 and 2, timestamps will "inevitably drift a few milliseconds over the course of 10s of minutes".

If it is 'tens of minutes' (multiples of 10 minutes, such as an hour) then it suggests that it may take longer than 15 minutes for drift to become apparent.

If hardware sync is not working then there will be no drift.

@kameel2311

kameel2311 commented Apr 28, 2023

Will try to leave it running for some hours then. When milliseconds are mentioned, what is the expected change from the nominal value of 33344? My understanding is that 33344 is in microseconds (about 33.3 ms, i.e. one 30 FPS frame interval), so a millisecond-scale drift would show up as values like 34344 or 35344. Does always getting exactly 33344, and never e.g. 33350, imply that hardware sync is failing?

Also, how should the drift be detected? As a single line showing a drifted value, with the next line back to no drift? After all, we are measuring the interval between consecutive frames.


Another test that I thought could be interesting was to set the D435 inter-cam mode to GenLock (4); I then only saw a feed from the D435 while the D455 was streaming, which confirms the wiring. But I noticed that the frame rate dropped from 30 FPS to 15 FPS. Is this the normal behaviour in GenLock mode?

But setting it to more than 4, for example 100 or so, made the D435 run at 30 FPS. I do not quite understand this behaviour.

[Screenshot: RealSense Viewer inter-cam sync mode setting on the D435]

@kameel2311

The multiple camera white-paper documentation states that when using mode 1 and 2, timestamps will "inevitably drift a few milliseconds over the course of 10s of minutes".

If it is 'tens of minutes' (multiples of 10 minutes, such as an hour) then it suggests that it may take longer than 15 minutes for drift to become apparent.

If hardware sync is not working then there will be no drift.

Now I am a bit more confused, as the drift you explained in the #3565 post is not the same drift idea explained here.

@MartyG-RealSense

MartyG-RealSense commented Apr 28, 2023

There is no information available about an expected rate of drift other than the general "10s of minutes" statement. So I do not have further guidance that I can offer on that subject, unfortunately.


If hardware sync is working then I would assume that drift will take the form of the D435 and D455 frame values initially being the same and then the slave camera's values very gradually differentiating from the master camera's values over a period of 10s of minutes.


Genlock mode works differently to modes 1 and 2 and the mathematics of FPS can be complicated. For example, if a Master and slave camera are used and the Master's FPS is 30 then the slave camera should be set to 90 FPS.

It can also be configured to release a 'burst' of frames with each trigger by using a 'burst count' value from 1 to 255 (like the 100 value that you tried).

A PDF documentation of Genlock is linked to below and I recommend reading that if you wish to use genlock because of the system's complexity. Support for genlock has been withdrawn by Intel because it was an experimental proof of concept and Intel recommend using modes 1 and 2 only now. Use of genlock is still supported in the SDK though.

External Synchronization.pdf


#3565 is discussing hardware sync with modes 1 and 2, not the genlock mode (4+). As mentioned above, modes 4+ have very different behaviour to modes 1 and 2. For example, a slave set to '2' will listen on each frame for a trigger for a certain time period and then initiate an unsynched capture if it does not hear a trigger within that time period. With modes 4+ though, the camera waits indefinitely for a trigger signal and does not capture until it receives one.

@kameel2311

kameel2311 commented Apr 28, 2023

I think there has been a misunderstanding about what the drift is: drift between consecutive timestamps of the same camera, versus the difference between the two cameras' timestamps.

So a final test was conducted, measuring the drift between the timestamps of the two cameras:

if (strcmp(serial, str_D455_serial) == 0)
{
    // std::cout << frame.get_frame_timestamp_domain() << std::endl;
    // D455_current = frame.get_timestamp();
    D455_current = frame.get_frame_metadata(RS2_FRAME_METADATA_SENSOR_TIMESTAMP);
    // D455_drift = D455_current - D455_last_time;
    // D455_last_time = frame.get_frame_metadata(RS2_FRAME_METADATA_SENSOR_TIMESTAMP);
}
else if (strcmp(serial, str_D435_serial) == 0)
{
    // std::cout << frame.get_frame_timestamp_domain() << std::endl;
    // D435_current = frame.get_timestamp();
    D435_current = frame.get_frame_metadata(RS2_FRAME_METADATA_SENSOR_TIMESTAMP);
    // D435_drift = D435_current - D435_last_time;
    // D435_last_time = frame.get_frame_metadata(RS2_FRAME_METADATA_SENSOR_TIMESTAMP);
}
}  // end of loop over the frames in the frameset
drift = abs(D455_current - D435_current);

The results are quite interesting and I think they showcase the drifting. Note that the code only ran for 6 minutes, but as the sensor timestamps are expressed in microseconds, that is sufficient to see drift at that scale.

The red plot shows the drift when both inter-cam modes are 0; it just jitters around a constant value (not zero, as I am not compensating for the initial offset between the two hardware clocks). The blue plot shows the drift (some outliers removed for visualization purposes, hence the artifacts in the middle of the plot) when the D455 is set to inter-cam mode 1 and the D435 to 2:

[Plot: inter-camera timestamp drift over time; red = both cameras in mode 0, blue = D455 as master (mode 1) and D435 as slave (mode 2)]

Do you think this is proof of working hardware sync? Moreover, is there a way to obtain the sensor timestamp for each camera's frame using the ROS wrapper, or another way via ROS that leads to the same results?

@MartyG-RealSense

If timestamps are constant then there is no sync. If timestamps drift apart then there is sync. So your description of the behaviour of the timestamps when Inter Cam Sync Mode = 2 does suggest that sync is occurring.

I believe that performing a rostopic echo on the info topic of a stream will show the timestamps. An example of such a stream in the ROS wrapper is /camera/depth/camera_info/

@MartyG-RealSense

Hi @kameel2311 Do you require further assistance with this case, please? Thanks!

@MartyG-RealSense

Case closed due to no further comments received.
