1 change: 1 addition & 0 deletions docs/source/index.rst
@@ -72,6 +72,7 @@ node functionalities are presented with code.

tutorials/hello_world.rst
tutorials/standalone_mode.rst
tutorials/message_syncing.rst
tutorials/multiple.rst
tutorials/maximize_fov.rst
tutorials/debugging.rst
4 changes: 4 additions & 0 deletions docs/source/samples/StereoDepth/rgb_depth_aligned.rst
@@ -4,6 +4,10 @@ RGB Depth alignment
This example shows usage of RGB depth alignment. Since OAK-D has a color camera and a pair of stereo cameras,
you can align the depth map to the color frame to get RGB depth.

In this example, RGB and depth frames aren't perfectly in sync. For that, you would need to add :ref:`Software syncing`, which
has been added to the `demo here <https://github.com/luxonis/depthai-experiments/tree/master/gen2-syncing#host-rgb-depth-sync>`__,
where RGB and depth frames have sub-ms delay.

Demo
####

3 changes: 2 additions & 1 deletion docs/source/samples/mixed/frame_sync.rst
@@ -1,7 +1,8 @@
Frame syncing on OAK
====================

This example showcases how you can use the :ref:`Script` node to perform :ref:`Message syncing` of multiple streams.
The example uses :ref:`ImgFrame`'s timestamps to achieve syncing precision.

Similar syncing demo scripts (Python) can be found in our depthai-experiments repository, in the `gen2-syncing <https://github.com/luxonis/depthai-experiments/tree/master/gen2-syncing>`__
folder.
81 changes: 81 additions & 0 deletions docs/source/tutorials/message_syncing.rst
@@ -0,0 +1,81 @@
Message syncing
###############

There are two ways to synchronize messages from different sensors (frames, IMU packets, ToF, etc.):

- :ref:`Software syncing <Software syncing>` (based on timestamp/sequence numbers)
- `Hardware syncing <https://docs.luxonis.com/projects/hardware/en/latest/pages/guides/sync_frames.html>`__ (multi-sensor sub-ms accuracy, hardware trigger)

Software syncing
****************

This documentation page focuses on software syncing. There are two approaches for it:

- :ref:`Sequence number syncing` - for streams set to the same FPS; sub-ms accuracy can be achieved
- :ref:`Timestamp syncing` - for streams with different FPS, or for syncing with other sensors, either onboard (e.g. IMU) or connected to the host computer (e.g. a USB ToF sensor)

Sequence number syncing
=======================

If we want to synchronize multiple messages from the same OAK, such as:

- Camera frames from `ColorCamera <https://docs.luxonis.com/projects/api/en/latest/components/nodes/color_camera/#colorcamera>`__ or `MonoCamera <https://docs.luxonis.com/projects/api/en/latest/components/nodes/mono_camera/#monocamera>`__ (color, left and right frames)
- Messages generated from camera frames (NN results, disparity/depth, edge detections, tracklets, encoded frames, tracked features, etc.)

we can use sequence number syncing (`demos here <https://github.com/luxonis/depthai-experiments/tree/master/gen2-syncing#message-syncing>`__).
Each frame from ColorCamera/MonoCamera gets assigned a sequence number, which is then also copied to messages generated from that frame.

For sequence number syncing, the **FPS of all cameras needs to be the same**. On the host or inside a Script node, you can get a message's sequence number like this:

.. code-block:: python

    # Get the message from the queue
    message = queue.get()
    # message can be ImgFrame, NNData, Tracklets, ImgDetections, TrackedFeatures...
    seqNum = message.getSequenceNum()


Through firmware sync, we're monitoring for drift and aligning the capture timestamps of all cameras (left, right, color), which are taken
at the MIPI Start-of-Frame (SoF) event. The Left/Right global shutter cameras are driven by the same clock, started by broadcast write
on I2C, so no drift will happen over time, even when running freely without a hardware sync.

The RGB rolling shutter has a slight difference in clocking/frame-time, so when we detect a small drift, we're modifying the
frame-time (number of lines) for the next frame by a small amount to compensate.
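
A toy sketch of this compensation loop (purely illustrative; the real logic runs in firmware, and the function name and gain factor below are made up, not part of the DepthAI API):

.. code-block:: python

    def next_frame_time(target_frame_time, rgb_ts, stereo_ts, gain=0.5):
        """Toy model of the drift compensation described above.

        If the RGB capture timestamp drifts away from the stereo pair's,
        stretch or shrink the next RGB frame time by a fraction of the
        measured drift, so the two clocks converge gradually instead of
        overshooting.
        """
        drift = rgb_ts - stereo_ts  # positive: RGB frame was captured late
        return target_frame_time - gain * drift

With no drift the frame time stays at its target; a late RGB frame slightly shortens the next frame so the streams re-align over a few frames.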

If the sensors are set to the same FPS (default is 30), the two mechanisms above are **already integrated into DepthAI and enabled**
by default, which allows us to **achieve sub-ms delay between all frames** and the messages generated from these frames:

.. code-block:: bash

    [Seq 325] RGB timestamp: 0:02:33.549449
    [Seq 325] Disparity timestamp: 0:02:33.549402
    -----------
    [Seq 326] RGB timestamp: 0:02:33.582756
    [Seq 326] Disparity timestamp: 0:02:33.582715
    -----------
    [Seq 327] RGB timestamp: 0:02:33.616075
    [Seq 327] Disparity timestamp: 0:02:33.616031

Disparity and color frame timestamps show that the delay between matching frames is well below 1 ms.
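
On the host, pairing such messages by sequence number can be sketched in plain Python (the helper and its string inputs below are illustrative stand-ins; on a real device you would read messages from output queues and call ``getSequenceNum()`` on them):

.. code-block:: python

    from collections import defaultdict

    def pair_by_sequence(streams):
        """Group messages from multiple streams by sequence number,
        keeping only numbers for which every stream has a message."""
        grouped = defaultdict(dict)
        for name, messages in streams.items():
            for seq, msg in messages:
                grouped[seq][name] = msg
        return {seq: msgs for seq, msgs in grouped.items()
                if len(msgs) == len(streams)}

    # Illustrative stand-ins for (msg.getSequenceNum(), msg) tuples
    synced = pair_by_sequence({
        "rgb": [(325, "rgb325"), (326, "rgb326"), (327, "rgb327")],
        "disparity": [(326, "disp326"), (327, "disp327")],
    })
    # seq 325 is dropped: the disparity stream never produced it
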

Timestamp syncing
=================

As opposed to sequence number syncing, **timestamp syncing** can sync:

- **streams** with **different FPS**
- **IMU** results with other messages
- messages with **other devices connected to the computer**, as timestamps are synced to the host computer clock

Feel free to check the `demo here <https://github.com/luxonis/depthai-experiments/tree/master/gen2-syncing#imu--rgb--depth-timestamp-syncing>`__
which uses timestamps to sync IMU, color and disparity frames together, with all of these streams producing messages at different FPS.

In case of **multiple streams having different FPS**, there are two options for syncing them:

#. **Removing some messages** from faster streams to get the synced FPS of the slower stream
#. **Duplicating some messages** from slower streams to get the synced FPS of the fastest stream
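
The first option can be sketched on the host in plain Python (a minimal illustration, not the demo's actual code): for every message of the slower stream, keep only the message from the faster stream whose timestamp is closest, dropping the rest.

.. code-block:: python

    def match_by_timestamp(slow, fast):
        """Pair each (timestamp, message) of the slower stream with the
        closest-in-time message of the faster stream. Both lists must be
        sorted by timestamp. Extra fast-stream messages are dropped, so
        the paired output runs at the slower stream's FPS."""
        pairs = []
        i = 0
        for ts, msg in slow:
            # Advance while the next fast message is at least as close
            while i + 1 < len(fast) and \
                    abs(fast[i + 1][0] - ts) <= abs(fast[i][0] - ts):
                i += 1
            pairs.append((msg, fast[i][1]))
        return pairs

    # e.g. ~30 FPS color frames vs ~100 FPS IMU packets (toy timestamps)
    pairs = match_by_timestamp(
        slow=[(0.000, "rgb0"), (0.033, "rgb1")],
        fast=[(0.000, "imu0"), (0.010, "imu1"), (0.020, "imu2"),
              (0.030, "imu3"), (0.040, "imu4")],
    )

Because both lists are sorted, the index into the faster stream only moves forward, so the whole pass is linear in the number of messages.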

**Timestamps are assigned** to the frame at the **MIPI Start-of-Frame** (SoF) events,
`more details here <https://docs.luxonis.com/projects/hardware/en/latest/pages/guides/sync_frames.html#frame-capture-graphs>`__.

.. include:: /includes/footer-short.rst
55 changes: 18 additions & 37 deletions docs/source/tutorials/multiple.rst
@@ -1,35 +1,22 @@
Multiple DepthAI per Host
=========================

Learn how to discover multiple OAK cameras connected to your system, and use them individually.
You can find `Demo scripts here <https://github.com/luxonis/depthai-experiments/tree/master/gen2-multiple-devices>`__.

.. image:: /_static/images/tutorials/multiple/setup.jpg
:alt: face

Shown on the left is Luxonis `OAK-1 <https://shop.luxonis.com/products/bw1093>`__ which is actually plugged into
an `OAK-D-CM3 <https://shop.luxonis.com/products/depthai-rpi-compute-module-edition>`__.

So in this case, everything is running on the (single) Raspberry Pi 3B+ host which is in the back of the OAK-D-CM3.

Discovering OAK cameras
#######################

You can find demo code `here <https://github.com/luxonis/depthai-experiments/tree/master/gen2-multiple-devices>`__. The demo will find all devices connected to the host and display an RGB preview from each of them.

You can use DepthAI to discover all connected OAK cameras, either via USB or through the LAN (OAK POE cameras).
The code snippet below finds all OAK cameras and prints their MxIDs (unique identifiers) and their XLink states.

.. code-block:: python

@@ -52,28 +39,22 @@ For example, if the first device is desirable from above use the following code:

You can then use the `device_info` to specify on which device you want to run your pipeline:

.. code-block:: python

    # Specify MXID, IP Address or USB path
    device_info = depthai.DeviceInfo("14442C108144F1D000") # MXID
    #device_info = depthai.DeviceInfo("192.168.1.44") # IP Address
    #device_info = depthai.DeviceInfo("3.3.3") # USB port name
    with depthai.Device(pipeline, device_info) as device:
        # ...

And you can use this code as a basis for your own use cases, such that you can run differing neural models
on different OAK models.

Specifying POE device to be used
********************************

You can specify the POE device to be used by the IP address as well, as shown in the code snippet above.

Now use as many OAK cameras as you need!
And since DepthAI does all the heavy lifting, you can usually use quite a few of them with very little burden to the host.

.. include:: /includes/footer-short.rst