
Multi Camera HW Sync Questions #2637

Closed

VasuAgrawal opened this issue Oct 29, 2018 · 14 comments

@VasuAgrawal
Contributor

VasuAgrawal commented Oct 29, 2018

Required Info
Camera Model: D400 (4x D435)
Firmware Version: 5.10.6 (librealsense 2.16.2)
Operating System & Version: Ubuntu 16.04 LTS
Kernel Version (Linux Only): 4.4.0-138-generic
Platform: PC
SDK Version: 2
Language: C++ / ROS
Segment: Robot

Issue Description

I'm trying to develop a robot perception system which uses 4x hardware synced RealSense D435 cameras. However, I'm having some trouble actually getting the hardware sync to work properly. I've also stumbled across some (what I think are) discrepancies in the documentation that I'd like to get addressed if possible.

This is what my setup currently looks like.

[Image: img_20181028_225921]

There are currently only 3 cameras on the board. I figured I'd get those to work before adding a fourth. Each camera is configured to stream at 848x480 at 30 fps on both depth (Z16) and color (RGB8). All of the cameras have been wired such that their pins 5 (VSYNC) and 9 (GROUND) are connected. No other pins on the cameras have been connected. The 3 cameras are plugged into a powered USB 3.0 hub, which is then plugged into the USB 3.0 port on my laptop.

I read this version of the multi camera whitepaper, as well as a number of different GitHub issues and other forum posts, when trying to develop the following code. I also referred to the Doxygen documentation. This code should, in theory, do the following:

  1. Similar to this response by @dorodnic, spawn a new thread per RS device, spaced 3 seconds apart (to allow for initialization time).
  2. Identify the sensors available on the device (grabbing the depth and color sensors explicitly).
  3. Configure the depth and color sensors for manual exposure and gain, with auto exposure disabled (which seems to be a requirement for hardware sync). Also, set the frame queue size for each sensor to be 1, so only the most recent frame is used.
  4. Configure the first depth sensor as the master, and all other depth sensors (if any) as slaves. RGB cameras do not support this configuration option so they're left in the default state.
  5. Create a pipeline to read from the device corresponding to the current thread.
  6. Display the timestamp and frame number for the frameset and frames inside the frameset for each frame received in the pipeline. Also, display the drift (difference from the previous frame).

Assuming everything is configured correctly in hardware and in software, I should observe a drift in the timestamps (according to the whitepaper). However, that doesn't seem to be happening. As a result, I have a number of questions:

  1. Is there example source code provided on configuring and validating hardware sync? The whitepaper makes reference to a "test app (provided separately)", but I've been unable to locate it. There was some mention in a GitHub issue about this functionality being rolled into the realsense-viewer, but I've been unable to trigger sync with multiple cameras in the viewer (the option remains grayed out).
  2. Regarding validating hardware sync - the whitepaper mentions that "you can actually validate that your sync is working by looking at the drift in the difference of the time stamps". Exactly which time stamps do I need to look at the drift of? I'm currently looking at the drift between consecutive frames from individual devices, which doesn't seem to change. Should I instead be looking at the drift between timestamps on frames from different devices, or something else entirely?
  3. How do I set hardware sync on a rs2::device? The whitepaper (3.A HW sync command) says that the pointer can be either "Sensor" or the "Device". However, I've been unable to set RS2_OPTION_INTER_CAM_SYNC_MODE on anything other than the depth sensor. Furthermore, since I'm unable to set any flags on RGB, how does the RGB image sensor get hardware synced?
  4. What should the pulse length be for external sync? According to the whitepaper, a 100 microsecond pulse is required at the nominal frame rate. However, when looking at the signal generated by the master on an oscilloscope, I observe a 50 microsecond pulse instead. This is true regardless of which camera is acting as the master.
  5. The whitepaper says that "it is best to set the queue size (the “capacity” parameter) to 1 if enabling depth-only, and 2 if enabling both depth and color streams." Is this queue size the same as the RS2_OPTION_FRAMES_QUEUE_SIZE that I'm currently setting to 1 for each sensor? Along the same lines, I saw some references to using a frame queue to poll for frames, rather than the rs2::pipeline. However, I wasn't able to find any sample code using it in the library. Is there a simple example available?
  6. While I might not be hitting bandwidth limits with 2 or 3 cameras, I'm likely to brush up against them with 4. The whitepaper mentions that compression could help solve this issue. Is it available in the current firmware (5.10.6)? If so, how do I enable it? If not, is there an estimate on when it might be available?
  7. Should the frame number for RGB and depth always be the same (for a given device)? With the program I have written, I'm observing that the frame numbers differ, sometimes quite significantly (many hundreds of frames, in a total of 500 frames received). Would this be due to ESD events, or is this a software bug? Similarly, I observe occasionally when running the program that one of the sensors in a device (usually the second or third device to be initialized) resets quite a few times (setting frame counter to 0, and keeping timestamp the same) before continuing. I've observed this even with 2 devices connected directly to my laptop, rather than through a USB hub.
  8. What timestamp and frame number are accessed through a rs2::frameset containing both a depth image and a color image? Is it always the data corresponding to a particular type of frame, random, or a different timestamp and frame number altogether? My program seems to indicate that the frameset always has the depth image's data.
  9. Is there a way to do hardware sync with the ROS driver? (A cursory glance would indicate no, but I haven't taken a deeper look).

I do apologize for the long set of questions. I'm hoping this issue can become a reference for anyone else trying to set up a multi camera hardware sync.

Source code:

#include <iostream>
#include <map>
#include <chrono>
#include <thread>
#include <string>
#include <vector>
#include <limits>

#include <librealsense2/rs.hpp>
#include <librealsense2/rs_advanced_mode.hpp>

int main()
{
   
    // Create a single context for managing all the realsense devices.
    rs2::context ctx;
    std::vector<std::thread> threads;
   
    std::size_t dev_id = 0;
    for (auto&& dev : ctx.query_devices())
    {
        threads.emplace_back([dev, dev_id](){

            std::cout << "Device (Thread) ID: " << dev_id << std::endl;
            std::string serial = dev.get_info(RS2_CAMERA_INFO_SERIAL_NUMBER);
            std::cout << "Camera with serial number: " << serial << std::endl;

            auto advanced_dev = dev.as<rs400::advanced_mode>();
            auto advanced_sensors = advanced_dev.query_sensors();

            bool depth_found = false;
            bool color_found = false;
            rs2::sensor depth_sensor;
            rs2::sensor color_sensor;
            for (auto&& sensor : advanced_sensors) {
                std::string module_name = sensor.get_info(RS2_CAMERA_INFO_NAME);
                std::cout << module_name << std::endl;

                if (module_name == "Stereo Module") {
                    depth_sensor = sensor;
                    depth_found = true;
                } else if (module_name == "RGB Camera") {
                    color_sensor = sensor;
                    color_found = true;
                }
            }

            if (!(depth_found && color_found)) {
                std::cout << "Unable to find both stereo and color modules" <<
                    std::endl;
                return; // bail out; the sensor handles below would otherwise be invalid
            }
            
            depth_sensor.set_option(RS2_OPTION_ENABLE_AUTO_EXPOSURE, 0);
            depth_sensor.set_option(RS2_OPTION_EXPOSURE, 8500); // microseconds
            depth_sensor.set_option(RS2_OPTION_GAIN, 16);
            depth_sensor.set_option(RS2_OPTION_FRAMES_QUEUE_SIZE, 1);

            // RGB sync doesn't work, need to use depth as master.
            if (dev_id == 0) {
                std::cout << "Setting " << dev_id << " to master!" << std::endl;
                depth_sensor.set_option(RS2_OPTION_INTER_CAM_SYNC_MODE, 1);
            } else {
                std::cout << "Setting " << dev_id << " to slave!" << std::endl;
                depth_sensor.set_option(RS2_OPTION_INTER_CAM_SYNC_MODE, 2);
            }

            color_sensor.set_option(RS2_OPTION_ENABLE_AUTO_EXPOSURE, 0);
            color_sensor.set_option(RS2_OPTION_EXPOSURE, 100); // in units of 0.1 ms, i.e. 10 ms
            color_sensor.set_option(RS2_OPTION_GAIN, 64);
            color_sensor.set_option(RS2_OPTION_FRAMES_QUEUE_SIZE, 1);

            rs2::pipeline pipe;
            rs2::config cfg;
            cfg.enable_device(serial);
            cfg.enable_stream(RS2_STREAM_DEPTH, 848, 480, RS2_FORMAT_Z16, 30);
            cfg.enable_stream(RS2_STREAM_COLOR, 848, 480, RS2_FORMAT_RGB8, 30);

            rs2::pipeline_profile profile = pipe.start(cfg);

            double last_time = 0;

            for (int frame_count = 0; frame_count < 500; ++frame_count) {

                rs2::frameset frames = pipe.wait_for_frames();

                rs2::frame color_frame = frames.get_color_frame();
                rs2::frame depth_frame = frames.get_depth_frame();

                std::cout << frames.size() << " frames from " << serial << ": ";
                std::cout.precision(std::numeric_limits<double>::max_digits10);

                std::cout << "Drift: " << depth_frame.get_timestamp() - last_time << ", ";
                last_time = depth_frame.get_timestamp();

                for (const rs2::frame& f : {(rs2::frame)frames, depth_frame, color_frame}) {
                    switch(f.get_frame_timestamp_domain()) {
                        case (RS2_TIMESTAMP_DOMAIN_HARDWARE_CLOCK):
                            std::cout << "Hardware Clock ";
                            break;
                        case (RS2_TIMESTAMP_DOMAIN_SYSTEM_TIME):
                            std::cout << "System Time ";
                            break;
                        default:
                            std::cout << "Unknown ";
                            break;
                    }
                    std::cout << "TS: " << std::scientific << f.get_timestamp() 
                        << "(" << f.get_frame_number() << "), ";
                }

                std::cout << std::endl;
            }
        });

        dev_id++;
        std::this_thread::sleep_for(std::chrono::seconds(3));
    }

    for (auto& t : threads) t.join(); // Must join / detach all threads
}

P.S. If there are better ways of using the RealSense API than what I've demonstrated above, please let me know.

@dorodnic
Contributor

Quick question before I try to dive into all the individual items - are you getting the Hardware Clock printout?

@VasuAgrawal
Contributor Author

Thanks for the quick response! I should've included some example output.

The attached files are both with 3 cameras plugged in through a powered USB hub. I am getting the Hardware Clock output in both.

3_cameras_hub_multi.log
3_cameras_hub_multi_2.log

@dorodnic
Contributor

Is there example source code provided on configuring and validating hardware sync? The whitepaper makes reference to a "test app (provided separately)", but I've been unable to locate it. There was some mention in a GitHub issue about this functionality being rolled into the realsense-viewer, but I've been unable to trigger sync with multiple cameras in the viewer (the option remains grayed out).

What do you mean by "grayed out"? The Viewer should show either a combo-box (or sometimes a slider) named Inter-Camera Sync, starting from firmware 5.9.13.6 for the D435 and 5.9.15.1 for the D415.
What firmware version are you using? Did you try with newer ones?

Regarding validating hardware sync - the whitepaper mentions that "you can actually validate that your sync is working by looking at the drift in the difference of the time stamps". Exactly which time stamps do I need to look at the drift of? I'm currently looking at the drift between consecutive frames from individual devices, which doesn't seem to change. Should I instead be looking at the drift between timestamps on frames from different devices, or something else entirely?

Perhaps @agrunnet would be able to comment

How do I set hardware sync on a rs2::device? The whitepaper (3.A HW sync command) says that the pointer can be either "Sensor" or the "Device". However, I've been unable to set RS2_OPTION_INTER_CAM_SYNC_MODE on anything other than the depth sensor.

You are doing it correctly. I think that at the time the whitepaper was written there was a typo in the Doxygen documentation that leaked into the whitepaper.

Furthermore, since I'm unable to set any flags on RGB, how does the RGB image sensor get hardware synced?

As far as I know, the D415 allows Depth+Color sync to another Depth+Color. The D435 allows Depth to Color sync, and Depth to another Depth sync, but not Color to another Color, so at least one (or both) of the color sensors will not be synced. Perhaps the firmware / hardware people will be able to find a solution in a future update, but for now it is a limitation. For the D415, setting the option on the depth sensor will also propagate to color (assuming the regular checklist for sync is green: AE Priority off and same FPS).
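
For illustration, here is a minimal sketch of that checklist in code, assuming the depth and color sensors were already located as in the snippet posted above. Option support can vary by model and firmware, and the helper name is made up, so treat this as an outline rather than an official recipe:

#include <librealsense2/rs.hpp>

// Sketch: apply the multi-camera sync checklist to one device.
// `depth_sensor` / `color_sensor` are assumed to be the "Stereo Module"
// and "RGB Camera" sensors found as in the code posted above.
void configure_sync(rs2::sensor& depth_sensor, rs2::sensor& color_sensor, bool is_master)
{
    // Keep the RGB sensor at the requested FPS (no automatic FPS reduction in a dark room).
    if (color_sensor.supports(RS2_OPTION_AUTO_EXPOSURE_PRIORITY))
        color_sensor.set_option(RS2_OPTION_AUTO_EXPOSURE_PRIORITY, 0.f);

    // Inter-camera sync is exposed on the stereo (depth) sensor only:
    // 1 = master, 2 = slave (0 = default / no sync).
    if (depth_sensor.supports(RS2_OPTION_INTER_CAM_SYNC_MODE))
        depth_sensor.set_option(RS2_OPTION_INTER_CAM_SYNC_MODE, is_master ? 1.f : 2.f);
}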

What should the pulse length be for external sync? According to the whitepaper, a 100 microsecond pulse is required at the nominal frame rate. However, when looking at the signal generated by the master on an oscilloscope, I observe a 50 microsecond pulse instead. This is true regardless of which camera is acting as the master.

I don't know the answer to that.

The whitepaper says that "it is best to set the queue size (the “capacity” parameter) to 1 if enabling depth-only, and 2 if enabling both depth and color streams." Is this queue size the same as the RS2_OPTION_FRAMES_QUEUE_SIZE that I'm currently setting to 1 for each sensor? Along the same lines, I saw some references to using a frame queue to poll for frames, rather than the rs2::pipeline. However, I wasn't able to find any sample code using it in the library. Is there a simple example available?

I would get rid of set_option(RS2_OPTION_FRAMES_QUEUE_SIZE, 1);.
In the LabVIEW app, @agrunnet is configuring each sensor directly with an output queue. In that case, you want to make sure you don't allow any unnecessary queuing. A frame queue size of 1 is the default for C++ today (rs_processing.hpp:131).
I believe that using the pipeline does not introduce noticeable extra latency, so I think you would be better off with the default settings.
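
For anyone looking for the per-sensor output-queue pattern mentioned here, a minimal sketch (my own illustration, not an official sample; the profile selection and helper name are deliberately simplified):

#include <librealsense2/rs.hpp>
#include <iostream>

// Sketch: drive one sensor directly with an rs2::frame_queue instead of rs2::pipeline.
// `sensor` is assumed to be e.g. the "Stereo Module" found as in the code above.
void poll_sensor(rs2::sensor sensor)
{
    auto profiles = sensor.get_stream_profiles();
    if (profiles.empty()) return;

    rs2::frame_queue queue(1);         // capacity 1: keep only the most recent frame
    sensor.open(profiles.front());     // a real app would pick resolution/format/FPS explicitly
    sensor.start(queue);               // frames are enqueued instead of going to a callback

    for (int i = 0; i < 100; ++i)
    {
        rs2::frame f = queue.wait_for_frame();
        std::cout << f.get_frame_number() << " @ " << f.get_timestamp() << std::endl;
    }

    sensor.stop();
    sensor.close();
}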

While I might not be hitting bandwidth limits with 2 or 3 cameras, I'm likely to brush up against them with 4. The whitepaper mentions that compression could help solve this issue. Is it available in the current firmware (5.10.6)?

Not yet, unfortunately.

If so, how do I enable it? If not, is there an estimate on when it might be available?

I don't think we have a date we can communicate.

Should the frame number for RGB and depth always be the same (for a given device)? With the program I have written, I'm observing that the frame numbers differ, sometimes quite significantly (many hundreds of frames, in a total of 500 frames received). Would this be due to ESD events, or is this a software bug? Similarly, I observe occasionally when running the program that one of the sensors in a device (usually the second or third device to be initialized) resets quite a few times (setting frame counter to 0, and keeping timestamp the same) before continuing. I've observed this even with 2 devices connected directly to my laptop, rather than through a USB hub.

Perhaps @RealSense-Customer-Engineering can offer more answers, I'm not 100% sure.

What timestamp and frame number are accessed through a rs2::frameset containing both a depth image and a color image? Is it always the data corresponding to a particular type of frame, random, or a different timestamp and frame number altogether? My program seems to indicate that the frameset always has the depth image's data.

It is always that of the depth sensor, but I don't recommend relying on this.

Is there a way to do hardware sync with the ROS driver? (A cursory glance would indicate no, but I haven't taken a deeper look).

Not at the moment, but we are implementing it right now (together with more options), so it should land within a couple of weeks. If you really need it before then, we can help you "hack" the wrapper to add it.

I wish I could be more helpful; this is a complex multidisciplinary topic, but I hope my colleagues will be able to fill in the gaps.

@VasuAgrawal
Contributor Author

Is there example source code provided on configuring and validating hardware sync? The whitepaper makes reference to a "test app (provided separately)", but I've been unable to locate it. There was some mention in a GitHub issue about this functionality being rolled into the realsense-viewer, but I've been unable to trigger sync with multiple cameras in the viewer (the option remains grayed out).

What do you mean by "grayed out"? The Viewer should show either a combo-box (or sometimes a slider) named Inter-Camera Sync, starting from firmware 5.9.13.6 for the D435 and 5.9.15.1 for the D415.
What firmware version are you using? Did you try with newer ones?

As far as I know, I'm using the newest firmware version available. I did a source build from master last night. The firmware running on all of the devices is 5.10.6. I'm running librealsense 2.16.2. This is what the viewer looks like for me with 2 devices enabled:

[Image: no_sync]

How do I set hardware sync on a rs2::device? The whitepaper (3.A HW sync command) says that the pointer can be either "Sensor" or the "Device". However, I've been unable to set RS2_OPTION_INTER_CAM_SYNC_MODE on anything other than the depth sensor.

You are doing it correctly. I think that at the time the whitepaper was written there was a typo in the Doxygen documentation that leaked into the whitepaper.

Got it, thanks.

Furthermore, since I'm unable to set any flags on RGB, how does the RGB image sensor get hardware synced?

As far as I know, the D415 allows Depth+Color sync to another Depth+Color. The D435 allows Depth to Color sync, and Depth to another Depth sync, but not Color to another Color, so at least one (or both) of the color sensors will not be synced. Perhaps the firmware / hardware people will be able to find a solution in a future update, but for now it is a limitation. For the D415, setting the option on the depth sensor will also propagate to color (assuming the regular checklist for sync is green: AE Priority off and same FPS).

How does the Depth to Color sync work in D435? More specifically, what do I need to do to enable it? Also, if color can't be synced to depth, does that mean I should have a separate pipeline for each depth sensor, which are all synced to each other (one pipeline per thread), and a separate set of pipelines for the color images, which will arrive asynchronously? If there aren't separate pipelines (as now), does that mean the wait_for_frames() will return every time the depth frame arrives (with whatever the last color frame was), and similarly for color (returning whatever the last depth frame to arrive was)?

The whitepaper says that "it is best to set the queue size (the “capacity” parameter) to 1 if enabling depth-only, and 2 if enabling both depth and color streams." Is this queue size the same as the RS2_OPTION_FRAMES_QUEUE_SIZE that I'm currently setting to 1 for each sensor? Along the same lines, I saw some references to using a frame queue to poll for frames, rather than the rs2::pipeline. However, I wasn't able to find any sample code using it in the library. Is there a simple example available?

I would get rid of set_option(RS2_OPTION_FRAMES_QUEUE_SIZE, 1);.
In the LabVIEW app, @agrunnet is configuring each sensor directly with an output queue. In that case, you want to make sure you don't allow any unnecessary queuing. A frame queue size of 1 is the default for C++ today (rs_processing.hpp:131).
I believe that using the pipeline does not introduce noticeable extra latency, so I think you would be better off with the default settings.

Will do. I had set that originally because I noticed in the rs-sensor-control example that the options for both the stereo and color modules had queue sizes of 16 by default. If that value isn't what determines the queue size used for processing, then it doesn't make sense for me to set it.

While I might not be hitting bandwidth limits with 2 or 3 cameras, I'm likely to brush up against them with 4. The whitepaper mentions that compression could help solve this issue. Is it available in the current firmware (5.10.6)?

Not yet, unfortunately.

If so, how do I enable it? If not, is there an estimate on when it might be available?

I don't think we have a date we can communicate.

I understand. Glad to know that it's being worked on, though!

What timestamp and frame number are accessed through a rs2::frameset containing both a depth image and a color image? Is it always the data corresponding to a particular type of frame, random, or a different timestamp and frame number altogether? My program seems to indicate that the frameset always has the depth image's data.

It is always that of the depth sensor, but I don't recommend relying on this.

Got it. I guess the data on the frameset doesn't have any special meaning, then, and that I should just look at the metadata on the frames directly.

Is there a way to do hardware sync with the ROS driver? (A cursory glance would indicate no, but I haven't taken a deeper look).

Not at the moment, but we are implementing it right now (together with more options), so it should land within a couple of weeks. If you really need it before then, we can help you "hack" the wrapper to add it.

I was hoping to have this project completed for a demo on December 1st. Does that sound like a realistic timeline for the ROS wrapper, or should we investigate hacks?

I wish I could be more helpful; this is a complex multidisciplinary topic, but I hope my colleagues will be able to fill in the gaps.

You've been plenty helpful. Thank you so much!

@dorodnic
Contributor

How does the Depth to Color sync work in D435? More specifically, what do I need to do to enable it?

When working with a single camera, hardware sync between color and depth kicks in once they run at the same actual FPS. To ensure this, you can disable the Auto Exposure Priority control under RGB Camera > Controls (this way the camera won't reduce FPS in a dark room to get more light).
The way it works, as far as I know, is that the color sensor sends a sync signal to the depth sensor. This is why it is problematic to set up in a multi-camera scenario: the color sensor always wants to be the master.

Another hardware quirk is that color frames get their timestamp at a different time (mid-exposure, I think?), so the timestamps are not perfectly aligned. We try to work around that in the syncer class, which is also used inside each pipeline.

Also, if color can't be synced to depth, does that mean I should have a separate pipeline for each depth sensor, which are all synced to each other (one pipeline per thread), and a separate set of pipelines for the color images, which will arrive asynchronously? If there aren't separate pipelines (as now), does that mean the wait_for_frames() will return every time the depth frame arrives (with whatever the last color frame was), and similarly for color (returning whatever the last depth frame to arrive was)?

Yes, if you enable all depth streams (synced) and all color streams, the color streams might introduce some unwanted latency to the depth streams (since every wait_for_frames() call will try to match a depth frame to a color frame). This should not create delays greater than the frame interval, but if that is unacceptable, you can switch to directly controlling each sensor (see the sensor-control example).
The main question is what your application prioritizes: having some color together with each depth frame, or having all depth frames together?
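
To make the second option concrete, here is a rough sketch of directly controlling each sensor with callbacks, so that depth delivery is never gated on color matching. This is only my illustration of the idea, under the assumption that the two sensors of a device can be opened and started independently, and is not the sensor-control sample itself:

#include <librealsense2/rs.hpp>
#include <chrono>
#include <iostream>
#include <thread>

// Sketch: start the stereo and RGB sensors of one device independently.
// Each callback fires as soon as its own frame arrives; no depth/color matching is done.
void run_device_sensors(rs2::sensor depth_sensor, rs2::sensor color_sensor)
{
    // A real application would select the 848x480 @ 30 fps Z16 / RGB8 profiles explicitly.
    depth_sensor.open(depth_sensor.get_stream_profiles().front());
    color_sensor.open(color_sensor.get_stream_profiles().front());

    depth_sensor.start([](rs2::frame f) {
        std::cout << "depth " << f.get_frame_number() << " @ " << f.get_timestamp() << "\n";
    });
    color_sensor.start([](rs2::frame f) {
        std::cout << "color " << f.get_frame_number() << " @ " << f.get_timestamp() << "\n";
    });

    std::this_thread::sleep_for(std::chrono::seconds(10)); // stream for a while

    depth_sensor.stop();
    depth_sensor.close();
    color_sensor.stop();
    color_sensor.close();
}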

@dorodnic
Contributor

Also, the control for hardware sync is under Stereo Module > Controls > Inter Cam Sync Mode.
The Sync button is supposed to control the software sync, but it is not functional for now.

@VasuAgrawal
Contributor Author

All of that makes sense. Thanks again :).

I had a few more questions regarding the details of the synchronization and triggering. Specifically:

  1. What times do the reported sensor / optical timestamps correspond to? I've looked at a couple of threads about this (namely this GitHub issue asking about the metadata, linked to from this forum post and which links to this documentation). Specifically, the thread mentions "Sensor(optical)_timestamp is a Firmware-generated timestamp that marks middle of sensor exposure." Can you be more precise about when that timestamp is generated? Is it a fixed (consistent) offset from the start of sensor exposure? Does it correspond to the end of the trigger pulse? Something else? I'd also like to know if the answers are different for RGB and depth.
  2. When does depth exposure start relative to the hardware trigger (when devices are configured in slave mode)? Specifically, does the depth exposure start on one of the edges of the pulse, or does the pulse indicate that a frame should be taken whenever next available, or something else? I believe it's the first case, though I'd like to have some confirmation.
  3. Is it possible to run the sensors at lower framerates (e.g. 1 or 2 fps, instead of 15 / 30 / 60 / 90)?

To your earlier points - I'll definitely look at controlling sensors directly, as in that example. Based on our discussion, I believe our current approach for the application is to use a different RGB sensor which can be triggered externally in combination with the RealSense for depth images. If you'd like (or more appropriately, have the time), I'd be happy to discuss more specifics of our application with you offline.
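
As an aside for anyone digging into these timestamps: the per-frame metadata can be inspected roughly like the sketch below. Which fields are actually available depends on firmware and on metadata support in the kernel/driver, and the helper name is made up:

#include <librealsense2/rs.hpp>
#include <iostream>

// Sketch: print the timestamps attached to a single frame's metadata.
void print_frame_times(const rs2::frame& f)
{
    // Firmware "sensor (optical)" timestamp; per the metadata docs quoted above it marks mid-exposure.
    if (f.supports_frame_metadata(RS2_FRAME_METADATA_SENSOR_TIMESTAMP))
        std::cout << "sensor ts: " << f.get_frame_metadata(RS2_FRAME_METADATA_SENSOR_TIMESTAMP) << "\n";

    // Device clock timestamp set by the hardware.
    if (f.supports_frame_metadata(RS2_FRAME_METADATA_FRAME_TIMESTAMP))
        std::cout << "frame ts:  " << f.get_frame_metadata(RS2_FRAME_METADATA_FRAME_TIMESTAMP) << "\n";

    // Host time when the frame reached the driver.
    if (f.supports_frame_metadata(RS2_FRAME_METADATA_TIME_OF_ARRIVAL))
        std::cout << "arrival:   " << f.get_frame_metadata(RS2_FRAME_METADATA_TIME_OF_ARRIVAL) << "\n";
}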

@VasuAgrawal
Contributor Author

@RealSense-Customer-Engineering Are there any updates here?

@Desperad0

@RealSense-Customer-Engineering Are there any updates here?

All of that makes sense. Thanks again :).

I had a few more questions regarding the details of the synchronization and triggering. Specifically:

  1. What times do the reported sensor / optical timestamps correspond to? I've looked at a couple of threads about this (namely this GitHub issue asking about the metadata, linked to from this forum post and which links to this documentation). Specifically, the thread mentions "Sensor(optical)_timestamp is a Firmware-generated timestamp that marks middle of sensor exposure." Can you be more precise about when that timestamp is generated? Is it a fixed (consistent) offset from the start of sensor exposure? Does it correspond to the end of the trigger pulse? Something else? I'd also like to know if the answers are different for RGB and depth.
  2. When does depth exposure start relative to the hardware trigger (when devices are configured in slave mode)? Specifically, does the depth exposure start on one of the edges of the pulse, or does the pulse indicate that a frame should be taken whenever next available, or something else? I believe it's the first case, though I'd like to have some confirmation.
  3. Is it possible to run the sensors at lower framerates (e.g. 1 or 2 fps, instead of 15 / 30 / 60 / 90)?

To your earlier points - I'll definitely look at controlling sensors directly, as in that example. Based on our discussion, I believe our current approach for the application is to use a different RGB sensor which can be triggered externally in combination with the RealSense for depth images. If you'd like (or more appropriately, have the time), I'd be happy to discuss more specifics of our application with you offline.

Have you solved the hardware sync problem? I set up the hardware sync as the whitepaper describes and used the code you posted, but I did not see the same timestamps from the cameras.

@RealSense-Customer-Engineering
Collaborator

[Realsense Customer Engineering Team Comment]
The D400 series camera hardware triggering is different from a camera flash trigger that helps capture a single image at the same time. The hardware trigger pulse is the reference for the start point of each frame (SOF), hence all slave cameras stream at the same pace. Without the trigger, all slave cameras will just run freely on their individual clocks.

When different cameras are sync'd, you will see all cameras capture an image at the same time.

The depth sensor can run as low as 6 fps.

@Desperad0

[Realsense Customer Engineering Team Comment]
The D400 series camera hardware triggering is different from a camera flash trigger that helps capture a single image at the same time. The hardware trigger pulse is the reference for the start point of each frame (SOF), hence all slave cameras stream at the same pace. Without the trigger, all slave cameras will just run freely on their individual clocks.

When different cameras are sync'd, you will see all cameras capture an image at the same time.

The depth sensor can run as low as 6 fps.

Now I have the depth cameras synced, but I have had no luck with the color cameras. I am using the RealSense D415. I found that even for one camera, the depth and color timestamps are not the same. I tried to find out how to sync the internal sensors of the D415 but still have no luck.
The following is the output of one camera:
2,frames from ,839112062372:,System Time ,TS:,1.547171088126e+12(216),System Time ,TS:,1.547171088126e+12(216),System Time ,TS:,1.547171088130e+12(214),
2,frames from ,839112062372:,System Time ,TS:,1.547171088160e+12(217),System Time ,TS:,1.547171088160e+12(217),System Time ,TS:,1.547171088164e+12(215),
2,frames from ,839112062372:,System Time ,TS:,1.547171088193e+12(218),System Time ,TS:,1.547171088193e+12(218),System Time ,TS:,1.547171088197e+12(216),
2,frames from ,839112062372:,System Time ,TS:,1.547171088226e+12(219),System Time ,TS:,1.547171088226e+12(219),System Time ,TS:,1.547171088231e+12(217),
2,frames from ,839112062372:,System Time ,TS:,1.547171088260e+12(220),System Time ,TS:,1.547171088260e+12(220),System Time ,TS:,1.547171088264e+12(218),
2,frames from ,839112062372:,System Time ,TS:,1.547171088293e+12(221),System Time ,TS:,1.547171088293e+12(221),System Time ,TS:,1.547171088298e+12(219),
2,frames from ,839112062372:,System Time ,TS:,1.547171088326e+12(222),System Time ,TS:,1.547171088326e+12(222),System Time ,TS:,1.547171088332e+12(220),
2,frames from ,839112062372:,System Time ,TS:,1.547171088360e+12(223),System Time ,TS:,1.547171088360e+12(223),System Time ,TS:,1.547171088365e+12(221),
The following is the output of 3 D415 cameras:
2,frames from ,839112061702:,Drift:,3.326904296875e+01,System Time ,TS:,1.547177302739e+12(118),System Time ,TS:,1.547177302739e+12(118),System Time ,TS:,1.547177302727e+12(118),
2,frames from ,839112062372:,Drift:,3.332202148438e+01,System Time ,TS:,1.547177302738e+12(108),System Time ,TS:,1.547177302738e+12(108),System Time ,TS:,1.547177302735e+12(108),
2,frames from ,748512060635:,Drift:,3.334106445312e+01,System Time ,TS:,1.547177302739e+12(122),System Time ,TS:,1.547177302739e+12(122),System Time ,TS:,1.547177302740e+12(120),
2,frames from ,839112061702:,Drift:,3.330102539062e+01,System Time ,TS:,1.547177302772e+12(119),System Time ,TS:,1.547177302772e+12(119),System Time ,TS:,1.547177302761e+12(119),
2,frames from ,839112062372:,Drift:,3.332275390625e+01,System Time ,TS:,1.547177302771e+12(109),System Time ,TS:,1.547177302771e+12(109),System Time ,TS:,1.547177302769e+12(109),
2,frames from ,748512060635:,Drift:,3.330859375000e+01,System Time ,TS:,1.547177302772e+12(123),System Time ,TS:,1.547177302772e+12(123),System Time ,TS:,1.547177302773e+12(121),
2,frames from ,839112061702:,Drift:,3.318017578125e+01,System Time ,TS:,1.547177302805e+12(120),System Time ,TS:,1.547177302805e+12(120),System Time ,TS:,1.547177302794e+12(120),
2,frames from ,839112062372:,Drift:,3.329370117188e+01,System Time ,TS:,1.547177302805e+12(110),System Time ,TS:,1.547177302805e+12(110),System Time ,TS:,1.547177302802e+12(110),
2,frames from ,748512060635:,Drift:,3.288647460938e+01,System Time ,TS:,1.547177302805e+12(124),System Time ,TS:,1.547177302805e+12(124),System Time ,TS:,1.547177302807e+12(122),
2,frames from ,839112061702:,Drift:,3.284350585938e+01,System Time ,TS:,1.547177302838e+12(121),System Time ,TS:,1.547177302838e+12(121),System Time ,TS:,1.547177302828e+12(121),
2,frames from ,839112062372:,Drift:,3.332934570312e+01,System Time ,TS:,1.547177302838e+12(111),System Time ,TS:,1.547177302838e+12(111),System Time ,TS:,1.547177302836e+12(111),
2,frames from ,748512060635:,Drift:,3.329711914062e+01,System Time ,TS:,1.547177302838e+12(125),System Time ,TS:,1.547177302838e+12(125),System Time ,TS:,1.547177302840e+12(123),
2,frames from ,839112061702:,Drift:,3.361938476562e+01,System Time ,TS:,1.547177302872e+12(122),System Time ,TS:,1.547177302872e+12(122),System Time ,TS:,1.547177302862e+12(122),
2,frames from ,839112062372:,Drift:,3.365258789062e+01,System Time ,TS:,1.547177302872e+12(112),System Time ,TS:,1.547177302872e+12(112),System Time ,TS:,1.547177302870e+12(112),
2,frames from ,748512060635:,Drift:,3.333032226562e+01,System Time ,TS:,1.547177302871e+12(126),System Time ,TS:,1.547177302871e+12(126),System Time ,TS:,1.547177302874e+12(124),
2,frames from ,839112061702:,Drift:,3.273999023438e+01,System Time ,TS:,1.547177302905e+12(123),System Time ,TS:,1.547177302905e+12(123),System Time ,TS:,1.547177302895e+12(123),
2,frames from ,839112062372:,Drift:,3.299511718750e+01,System Time ,TS:,1.547177302905e+12(113),System Time ,TS:,1.547177302905e+12(113),System Time ,TS:,1.547177302903e+12(113),
2,frames from ,748512060635:,Drift:,3.327880859375e+01,System Time ,TS:,1.547177302905e+12(127),System Time ,TS:,1.547177302905e+12(127),System Time ,TS:,1.547177302908e+12(125),
2,frames from ,839112061702:,Drift:,3.349755859375e+01,System Time ,TS:,1.547177302938e+12(124),System Time ,TS:,1.547177302938e+12(124),System Time ,TS:,1.547177302929e+12(124),
2,frames from ,839112062372:,Drift:,3.328540039062e+01,System Time ,TS:,1.547177302938e+12(114),System Time ,TS:,1.547177302938e+12(114),System Time ,TS:,1.547177302937e+12(114),
2,frames from ,748512060635:,Drift:,3.331958007812e+01,System Time ,TS:,1.547177302938e+12(128),System Time ,TS:,1.547177302938e+12(128),System Time ,TS:,1.547177302941e+12(126),
2,frames from ,839112061702:,Drift:,3.321484375000e+01,System Time ,TS:,1.547177302971e+12(125),System Time ,TS:,1.547177302971e+12(125),System Time ,TS:,1.547177302962e+12(125),
2,frames from ,839112062372:,Drift:,3.333105468750e+01,System Time ,TS:,1.547177302971e+12(115),System Time ,TS:,1.547177302971e+12(115),System Time ,TS:,1.547177302970e+12(115),
2,frames from ,748512060635:,Drift:,3.334765625000e+01,System Time ,TS:,1.547177302971e+12(129),System Time ,TS:,1.547177302971e+12(129),System Time ,TS:,1.547177302975e+12(127),
2,frames from ,839112061702:,Drift:,3.369140625000e+01,System Time ,TS:,1.547177303005e+12(126),System Time ,TS:,1.547177303005e+12(126),System Time ,TS:,1.547177302996e+12(126),
2,frames from ,839112062372:,Drift:,3.331591796875e+01,System Time ,TS:,1.547177303005e+12(116),System Time ,TS:,1.547177303005e+12(116),System Time ,TS:,1.547177303004e+12(116),
2,frames from ,748512060635:,Drift:,3.414550781250e+01,System Time ,TS:,1.547177303005e+12(130),System Time ,TS:,1.547177303005e+12(130),System Time ,TS:,1.547177303008e+12(128),
2,frames from ,839112061702:,Drift:,3.287768554688e+01,System Time ,TS:,1.547177303038e+12(127),System Time ,TS:,1.547177303038e+12(127),System Time ,TS:,1.547177303030e+12(127),
2,frames from ,839112062372:,Drift:,3.332983398438e+01,System Time ,TS:,1.547177303038e+12(117),System Time ,TS:,1.547177303038e+12(117),System Time ,TS:,1.547177303038e+12(117),
2,frames from ,748512060635:,Drift:,3.246777343750e+01,System Time ,TS:,1.547177303038e+12(131),System Time ,TS:,1.547177303038e+12(131),System Time ,TS:,1.547177303042e+12(129),
2,frames from ,839112061702:,Drift:,3.388940429688e+01,System Time ,TS:,1.547177303072e+12(128),System Time ,TS:,1.547177303072e+12(128),System Time ,TS:,1.547177303063e+12(128),
2,frames from ,839112062372:,Drift:,3.341406250000e+01,System Time ,TS:,1.547177303071e+12(118),System Time ,TS:,1.547177303071e+12(118),System Time ,TS:,1.547177303072e+12(118),
2,frames from ,748512060635:,Drift:,3.398901367188e+01,System Time ,TS:,1.547177303072e+12(132),System Time ,TS:,1.547177303072e+12(132),System Time ,TS:,1.547177303076e+12(130),
2,frames from ,839112061702:,Drift:,3.291577148438e+01,System Time ,TS:,1.547177303105e+12(129),System Time ,TS:,1.547177303105e+12(129),System Time ,TS:,1.547177303097e+12(129),
2,frames from ,839112062372:,Drift:,3.327685546875e+01,System Time ,TS:,1.547177303105e+12(119),System Time ,TS:,1.547177303105e+12(119),System Time ,TS:,1.547177303105e+12(119),
2,frames from ,748512060635:,Drift:,3.271044921875e+01,System Time ,TS:,1.547177303105e+12(133),System Time ,TS:,1.547177303105e+12(133),System Time ,TS:,1.547177303109e+12(131),
2,frames from ,839112061702:,Drift:,3.313452148438e+01,System Time ,TS:,1.547177303138e+12(130),System Time ,TS:,1.547177303138e+12(130),System Time ,TS:,1.547177303130e+12(130

@Desperad0

@RealSense-Customer-Engineering Are there any updates here?

Have you fixed the sync problem?

@HiOtto

HiOtto commented Jul 9, 2019

[Realsense Customer Engineering Team Comment]
The D400 series camera hardware triggering is different from a camera flash trigger that helps capture a single image at the same time. The hardware trigger pulse is the reference for the start point of each frame (SOF), hence all slave cameras stream at the same pace. Without the trigger, all slave cameras will just run freely on their individual clocks.

When different cameras are sync'd, you will see all cameras capture an image at the same time.

The depth sensor can run as low as 6 fps.

Is it possible to run the sensors at other framerates while triggered by an external pulse (e.g. 20/40 fps, instead of 15/30/60/90)?
