From 22afefd487063de5e4b53d0904a1a0e5f7018e9e Mon Sep 17 00:00:00 2001
From: Erol444
Date: Mon, 17 Oct 2022 16:05:39 +0900
Subject: [PATCH 1/3] added software syncing msgs docs

---
 docs/source/index.rst                     |  1 +
 docs/source/tutorials/message_syncing.rst | 81 +++++++++++++++++++++++
 docs/source/tutorials/multiple.rst        | 55 +++++----------
 3 files changed, 100 insertions(+), 37 deletions(-)
 create mode 100644 docs/source/tutorials/message_syncing.rst

diff --git a/docs/source/index.rst b/docs/source/index.rst
index 79cb1edf7..fd203585f 100644
--- a/docs/source/index.rst
+++ b/docs/source/index.rst
@@ -72,6 +72,7 @@ node functionalities are presented with code.

   tutorials/hello_world.rst
   tutorials/standalone_mode.rst
+  tutorials/message_syncing.rst
   tutorials/multiple.rst
   tutorials/maximize_fov.rst
   tutorials/debugging.rst

diff --git a/docs/source/tutorials/message_syncing.rst b/docs/source/tutorials/message_syncing.rst
new file mode 100644
index 000000000..5b128a4fc
--- /dev/null
+++ b/docs/source/tutorials/message_syncing.rst
@@ -0,0 +1,81 @@
Message syncing
###############

There are two ways to synchronize messages from different sensors (frames, IMU packets, ToF, etc.):

- :ref:`Software syncing ` (based on timestamps/sequence numbers)
- `Hardware syncing `__ (multi-sensor sub-ms accuracy, hardware trigger)

Software syncing
****************

This documentation page focuses on software syncing. There are two approaches to it:

- :ref:`Sequence number syncing` - for streams set to the same FPS, where sub-ms accuracy can be achieved
- :ref:`Timestamp syncing` - for streams with different FPS, and for syncing with other sensors, either onboard (e.g. IMU) or connected to the host computer (e.g. a USB ToF sensor)

Sequence number syncing
=======================

If we want to synchronize multiple messages from the same OAK, such as:

- Camera frames from `ColorCamera `__ or `MonoCamera `__ (color, left and right frames)
- Messages generated from camera frames (NN results, disparity/depth, edge detections, tracklets, encoded frames, tracked features, etc.)

we can use sequence number syncing (`demos here `__).
Each frame from ColorCamera/MonoCamera gets assigned a sequence number, which is also copied to every message generated from that frame.

For sequence number syncing, the **FPS of all cameras needs to be the same**. On the host or inside a Script node, you can get a message's sequence number like this:

.. code-block:: depthai-python

    # Get the message from the queue
    message = queue.get()
    # message can be ImgFrame, NNData, Tracklets, ImgDetections, TrackedFeatures...
    seqNum = message.getSequenceNum()


Through firmware sync, we monitor for drift and align the capture timestamps of all cameras (left, right, color), which are taken
at the MIPI Start-of-Frame (SoF) event. The Left/Right global shutter cameras are driven by the same clock, started by a broadcast write
on I2C, so no drift will happen over time, even when running freely without a hardware sync.

The RGB rolling shutter camera has a slight difference in clocking/frame-time, so when we detect a small drift, we modify the
frame-time (number of lines) of the next frame by a small amount to compensate.

If sensors are set to the same FPS (default is 30), the above two approaches are **already integrated into depthai and enabled**
by default, which allows us to **achieve sub-ms delay between all frames** and the messages generated from these frames!
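On the host, pairing messages by sequence number can be done with a small dictionary keyed by the sequence number.
Below is a minimal sketch of this idea; the queue names ``rgb`` and ``disp`` (and the ``pipeline`` variable) are
illustrative assumptions, not part of the demos linked above:

.. code-block:: python

    import depthai as dai

    # `pipeline` is assumed to be a dai.Pipeline with XLinkOut streams "rgb" and "disp"
    with dai.Device(pipeline) as device:
        qRgb = device.getOutputQueue("rgb", maxSize=4, blocking=False)
        qDisp = device.getOutputQueue("disp", maxSize=4, blocking=False)
        pending = {}  # seqNum -> {"rgb": msg, "disp": msg}
        while True:
            for name, queue in (("rgb", qRgb), ("disp", qDisp)):
                if queue.has():
                    msg = queue.get()
                    seq = msg.getSequenceNum()
                    pending.setdefault(seq, {})[name] = msg
                    if len(pending[seq]) == 2:
                        synced = pending.pop(seq)
                        # synced["rgb"] and synced["disp"] were captured together;
                        # a real implementation would also drop stale entries

Logging the timestamps of each synced pair produces output along these lines: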
.. code-block:: bash

    [Seq 325] RGB timestamp: 0:02:33.549449
    [Seq 325] Disparity timestamp: 0:02:33.549402
    -----------
    [Seq 326] RGB timestamp: 0:02:33.582756
    [Seq 326] Disparity timestamp: 0:02:33.582715
    -----------
    [Seq 327] RGB timestamp: 0:02:33.616075
    [Seq 327] Disparity timestamp: 0:02:33.616031

Disparity and color frame timestamps show that the two streams are well under a millisecond apart.

Timestamp syncing
=================

As opposed to sequence number syncing, **timestamp syncing** can sync:

- **streams** with **different FPS**
- **IMU** results with other messages
- messages with **other devices connected to the computer**, as timestamps are synced to the host computer clock

Feel free to check the `demo here `__,
which uses timestamps to sync IMU, color and disparity frames together, with all of these streams producing messages at different FPS.

When **multiple streams have different FPS**, there are two options for syncing them:

#. **Removing some messages** from faster streams, so the synced FPS matches that of the slowest stream
#. **Duplicating some messages** from slower streams, so the synced FPS matches that of the fastest stream

**Timestamps are assigned** to frames at the **MIPI Start-of-Frame** (SoF) event,
`more details here `__.

.. include:: /includes/footer-short.rst

diff --git a/docs/source/tutorials/multiple.rst b/docs/source/tutorials/multiple.rst
index 6cdd5b521..6354b5606 100644
--- a/docs/source/tutorials/multiple.rst
+++ b/docs/source/tutorials/multiple.rst
@@ -1,35 +1,22 @@
Multiple DepthAI per Host
=========================

-Learn how to discover DepthAI devices connected to your system, and use them individually.
+You can find `Demo scripts here `__.
+Learn how to discover multiple OAK cameras connected to your system, and use them individually.

.. image:: /_static/images/tutorials/multiple/setup.jpg
  :alt: face

-Shown on the left is Luxonis `uAI (BW1093) `__ which is actually plugged into
-a `Raspberry Pi Compute Module Edition (BW1097) `__.
+Shown on the left is the Luxonis `OAK-1 `__, which is actually plugged into
+an `OAK-D-CM3 `__.

-So in this case, everything is running on the (single) Raspberry Pi 3B+ which is in the back of the BW1097.
+So in this case, everything is running on the (single) Raspberry Pi 3B+ host, which is in the back of the OAK-D-CM3.

-Demo code
-#########
+Discovering OAK cameras
+#######################

-You can find demo code `here `__. The demo will find all devices connected to the host and display an RGB preview from each of them.
-
-Dependencies
-############
-
-You have already set up the Python API on your system (if you have a Raspberry Pi Compute Module it came pre-setup).
-See :ref:`here ` if you have not yet installed the DepthAI Python API on your system.
-
-Discover DepthAI-USB Port Mapping
-#################################
-
-The DepthAI multi-device support is currently done by selecting the device mx_id (serial number) of a connected DepthAI
-device.
-
-If you'd like to associate a given DepthAI device with specific code (e.g. neural model) to be run on it, it is recommended
-to plug in one device at a time, and then use the following code to determine which device is on which port:
+You can use DepthAI to discover all connected OAK cameras, either via USB or through the LAN (OAK POE cameras).
+The code snippet below finds all OAK cameras and prints their MxIDs (unique identifiers) and their XLink state.

.. code-block:: python
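The body of that snippet sits just below this hunk and is therefore not part of the diff. For reference, a minimal
device-discovery sketch with the depthai v2 API looks roughly like this (illustrative, not the exact snippet):

.. code-block:: python

    import depthai

    # List all available devices (USB and POE) together with their XLink state
    for device_info in depthai.Device.getAllAvailableDevices():
        print(f"Found device: {device_info.getMxId()}, state: {device_info.state}")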
@@ -52,28 +39,22 @@ For example, if the first device is desirable from above use the following code:

.. code-block:: python

-    found, device_info = depthai.Device.getDeviceByMxId("14442C10D13EABCE00")
-
-    if not found:
-        raise RuntimeError("Device not found!")
-
-You can then use the `device_info` to specify on which device you want to run your pipeline:
-
-.. code-block:: python
-
+    # Specify MXID, IP Address or USB path
+    device_info = depthai.DeviceInfo("14442C108144F1D000") # MXID
+    #device_info = depthai.DeviceInfo("192.168.1.44") # IP Address
+    #device_info = depthai.DeviceInfo("3.3.3") # USB port name
    with depthai.Device(pipeline, device_info) as device:
+        # ...

And you can use this code as a basis for your own use cases, such that you can run differing neural models
-on different DepthAI/uAI models.
+on different OAK models.

Specifying POE device to be used
********************************

-You can specify the POE device to be used by the IP address as well. Here's the `code snippet `__.
-
-Now use as many DepthAI devices as you need!
+You can specify the POE device to be used by its IP address as well, as shown in the code snippet above.

-And since DepthAI does all the heavy lifting, you can usually use quite a
-few of them with very little burden to the host.
+Now use as many OAK cameras as you need!
+And since DepthAI does all the heavy lifting, you can usually use quite a few of them with very little burden to the host.

.. include:: /includes/footer-short.rst

From 0e067c22dff1ca431f85fa1f362da6c30675c7d2 Mon Sep 17 00:00:00 2001
From: Erol444
Date: Mon, 17 Oct 2022 16:11:16 +0900
Subject: [PATCH 2/3] linked syncing from rgb-depth alignment

---
 docs/source/samples/StereoDepth/rgb_depth_aligned.rst | 4 ++++
 1 file changed, 4 insertions(+)

diff --git a/docs/source/samples/StereoDepth/rgb_depth_aligned.rst b/docs/source/samples/StereoDepth/rgb_depth_aligned.rst
index a97aa5552..0370ac34a 100644
--- a/docs/source/samples/StereoDepth/rgb_depth_aligned.rst
+++ b/docs/source/samples/StereoDepth/rgb_depth_aligned.rst
@@ -4,6 +4,10 @@ RGB Depth alignment
This example shows usage of RGB depth alignment. Since OAK-D has a color and a pair of stereo cameras,
you can align depth map to the color frame on top of that to get RGB depth.

+In this example, RGB and depth aren't perfectly in sync. To fix that, you would need to add :ref:`Software syncing`, which
+has been added to the `demo here `__,
+where RGB and depth frames are synced with sub-ms delay.
+
Demo
####

From e433d471bb68570561f5340471d613a2cc18e52b Mon Sep 17 00:00:00 2001
From: Erol444
Date: Mon, 17 Oct 2022 22:38:53 +0900
Subject: [PATCH 3/3] fixed building errors

---
 docs/source/samples/mixed/frame_sync.rst  | 3 ++-
 docs/source/tutorials/message_syncing.rst | 4 ++--
 2 files changed, 4 insertions(+), 3 deletions(-)

diff --git a/docs/source/samples/mixed/frame_sync.rst b/docs/source/samples/mixed/frame_sync.rst
index 7c08141e7..cc8a49781 100644
--- a/docs/source/samples/mixed/frame_sync.rst
+++ b/docs/source/samples/mixed/frame_sync.rst
@@ -1,7 +1,8 @@
Frame syncing on OAK
====================

-This example showcases how you can use :ref:`Script` node to sync frames from multiple streams. It uses :ref:`ImgFrame`'s timestamps to achieve syncing precision.
+This example showcases how you can use the :ref:`Script` node to perform :ref:`Message syncing` of multiple streams.
+The example uses :ref:`ImgFrame`'s timestamps to achieve syncing precision.
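+The core of the approach is a tolerance check on capture timestamps. A simplified host-side sketch of the idea follows;
+the 10 ms threshold and the message names are illustrative assumptions, not values taken from the example itself:
+
+.. code-block:: python
+
+    import datetime
+
+    # Messages are considered synced when their capture timestamps are close enough
+    THRESHOLD = datetime.timedelta(milliseconds=10)
+
+    def is_synced(msg_a, msg_b):
+        # getTimestamp() returns a datetime.timedelta (time of capture)
+        return abs(msg_a.getTimestamp() - msg_b.getTimestamp()) < THRESHOLD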
Similar syncing demo scripts (python) can be found at our depthai-experiments repository in `gen2-syncing `__ folder.

diff --git a/docs/source/tutorials/message_syncing.rst b/docs/source/tutorials/message_syncing.rst
index 5b128a4fc..fc19da9c6 100644
--- a/docs/source/tutorials/message_syncing.rst
+++ b/docs/source/tutorials/message_syncing.rst
@@ -3,7 +3,7 @@ Message syncing

There are two ways to synchronize messages from different sensors (frames, IMU packets, ToF, etc.):

-- :ref:`Software syncing ` (based on timestamps/sequence numbers)
+- :ref:`Software syncing ` (based on timestamps/sequence numbers)
- `Hardware syncing `__ (multi-sensor sub-ms accuracy, hardware trigger)

Software syncing
****************
@@ -27,7 +27,7 @@ Each frame from ColorCamera/MonoCamera gets assigned a sequence number, which

For sequence number syncing, the **FPS of all cameras needs to be the same**. On the host or inside a Script node, you can get a message's sequence number like this:

-.. code-block:: depthai-python
+.. code-block:: python

    # Get the message from the queue
    message = queue.get()
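For completeness, the same sequence number API is also available on-device, inside the Script node. A minimal sketch
(the input name ``frames`` is an assumption for illustration, not part of the examples above):

.. code-block:: python

    # Inside a Script node: read messages from an input named "frames"
    while True:
        frame = node.io['frames'].get()
        # Sequence numbers can be read on-device, just like on the host
        node.warn(f"Got frame with sequence number {frame.getSequenceNum()}")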