diff --git a/docs/source/components/device.rst b/docs/source/components/device.rst
index 8bfdfd104..517b24de5 100644
--- a/docs/source/components/device.rst
+++ b/docs/source/components/device.rst
@@ -3,9 +3,9 @@
Device
======
-Device represents an `OAK camera `__. On all of our devices there's a powerful vision processing unit
-(**VPU**), called `Myriad X `__.
-The VPU is optimized for performing AI inference algorithms and for processing sensory inputs (eg. calculating stereo disparity from two cameras).
+Device represents an `OAK camera `__. On all of our devices there's a powerful Robotics Vision Core
+(`RVC `__). The RVC is optimized for performing AI inference algorithms and
+for processing sensory inputs (e.g. calculating stereo disparity from two cameras).
Device API
##########
@@ -21,7 +21,7 @@ When you create the device in the code, firmware is uploaded together with the p
# Upload the pipeline to the device
with depthai.Device(pipeline) as device:
- # Print Myriad X Id (MxID), USB speed, and available cameras on the device
+ # Print MxID, USB speed, and available cameras on the device
print('MxId:',device.getDeviceInfo().getMxId())
print('USB speed:',device.getUsbSpeed())
print('Connected cameras:',device.getConnectedCameras())
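+
+For context, here is a minimal self-contained version of the snippet above (the empty pipeline is just a placeholder; a real pipeline would have nodes added to it first):
+
+.. code-block:: python
+
+    import depthai
+
+    # Build the pipeline on the host; nodes would normally be added here
+    pipeline = depthai.Pipeline()
+
+    # Creating the Device uploads the firmware together with the pipeline
+    with depthai.Device(pipeline) as device:
+        # Print MxID, USB speed, and available cameras on the device
+        print('MxId:', device.getDeviceInfo().getMxId())
+        print('USB speed:', device.getUsbSpeed())
+        print('Connected cameras:', device.getConnectedCameras())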
diff --git a/docs/source/components/nodes/color_camera.rst b/docs/source/components/nodes/color_camera.rst
index 1a51ad114..6323af090 100644
--- a/docs/source/components/nodes/color_camera.rst
+++ b/docs/source/components/nodes/color_camera.rst
@@ -105,7 +105,7 @@ Usage
Limitations
###########
-Here are known camera limitations for the Myriad X:
+Here are known camera limitations for the `RVC2 `__:
- **ISP can process about 600 MP/s**, and about **500 MP/s** when the pipeline is also running NNs and a video encoder in parallel (a quick way to estimate ISP load is sketched below)
- **3A algorithms** can process about **200-250 FPS overall** (for all camera streams). This is a current limitation of our implementation, and we have plans for a workaround to run 3A algorithms on every Xth frame, no ETA yet
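+
+As a quick capacity check, ISP load can be estimated as width × height × FPS. A minimal sketch (the helper below is illustrative, not part of the DepthAI API):
+
+.. code-block:: python
+
+    def isp_load_mpps(width: int, height: int, fps: float) -> float:
+        """Approximate ISP load in megapixels per second."""
+        return width * height * fps / 1e6
+
+    # Example: 4K (3840x2160) at 30 FPS
+    load = isp_load_mpps(3840, 2160, 30)
+    print(f"ISP load: {load:.0f} MP/s")  # ~249 MP/s, within the ~500 MP/s budget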
diff --git a/docs/source/components/nodes/imu.rst b/docs/source/components/nodes/imu.rst
index 69e51ea7f..d34e56257 100644
--- a/docs/source/components/nodes/imu.rst
+++ b/docs/source/components/nodes/imu.rst
@@ -1,10 +1,14 @@
IMU
===
-IMU (`intertial measurement unit `__) node can be used to receive data from the IMU chip on the device.
-Our DepthAI devices use `BNO085 `__ 9-axis sensor (`datasheet here `__)
-that supports sensor fusion on the (IMU) chip itself. The IMU chip is connected to the Myriad X (VPU) over SPI (we have integrated
-`this driver `__ to the DepthAI).
+IMU (`inertial measurement unit `__) node can be used to receive data
+from the IMU chip on the device. Our OAK devices use either:
+
+- `BNO085 `__ (`datasheet here `__) 9-axis sensor, combining an accelerometer, gyroscope, and magnetometer. It also performs sensor fusion on the chip itself. We have integrated `this driver `__ into DepthAI.
+- `BMI270 `__ 6-axis sensor, combining an accelerometer and a gyroscope.
+
+The IMU chip is connected to the `RVC `__ over SPI.
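+
+A minimal sketch of enabling the IMU node and reading a packet on the host (sensor types and report rates are illustrative and depend on the onboard chip):
+
+.. code-block:: python
+
+    import depthai as dai
+
+    pipeline = dai.Pipeline()
+
+    imu = pipeline.create(dai.node.IMU)
+    xout = pipeline.create(dai.node.XLinkOut)
+    xout.setStreamName("imu")
+
+    # Report raw accelerometer at 500 Hz and raw gyroscope at 400 Hz
+    imu.enableIMUSensor(dai.IMUSensor.ACCELEROMETER_RAW, 500)
+    imu.enableIMUSensor(dai.IMUSensor.GYROSCOPE_RAW, 400)
+    imu.setBatchReportThreshold(1)
+    imu.setMaxBatchReports(10)
+    imu.out.link(xout.input)
+
+    with dai.Device(pipeline) as device:
+        q = device.getOutputQueue(name="imu", maxSize=50, blocking=False)
+        for packet in q.get().packets:
+            acc = packet.acceleroMeter
+            print(f"Accelerometer [m/s^2]: x={acc.x:.3f} y={acc.y:.3f} z={acc.z:.3f}")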
How to place it
diff --git a/docs/source/index.rst b/docs/source/index.rst
index e0783a2a6..79cb1edf7 100644
--- a/docs/source/index.rst
+++ b/docs/source/index.rst
@@ -15,7 +15,7 @@ We support both :ref:`Python API ` and :ref:`C++ API `
-- **Device side** is the OAK device itself. If something is happening on the device side, it means that it's running on the `Myriad X `__. More :ref:`information here `.
+- **Device side** is the OAK device itself. If something is happening on the device side, it means that it's running on the Robotics Vision Core (`RVC `__). More :ref:`information here `.
- **Pipeline** is a complete workflow on the device side, consisting of :ref:`nodes ` and connections between them. More :ref:`information here `.
- **Node** is a single functionality of the DepthAI. :ref:`Nodes` have inputs or outputs, and have configurable properties (like resolution on the camera node).
- **Connection** is a link between one node's output and another one's input. To define the pipeline dataflow, the connections define where to send :ref:`messages ` in order to achieve an expected result.
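+
+To make these terms concrete, a minimal pipeline sketch (the node choice and stream name are arbitrary):
+
+.. code-block:: python
+
+    import depthai as dai
+
+    pipeline = dai.Pipeline()
+
+    # Node: a single functionality (here, the color camera)
+    camRgb = pipeline.create(dai.node.ColorCamera)
+    camRgb.setPreviewSize(300, 300)
+
+    xout = pipeline.create(dai.node.XLinkOut)
+    xout.setStreamName("rgb")
+
+    # Connection: link the camera's output to the XLinkOut node's input
+    camRgb.preview.link(xout.input)
+
+    # Device side: once uploaded, the pipeline above runs on the RVC
+    with dai.Device(pipeline) as device:
+        frame = device.getOutputQueue("rgb").get()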
diff --git a/docs/source/samples/Yolo/tiny_yolo.rst b/docs/source/samples/Yolo/tiny_yolo.rst
index bf6d831b0..bac9e595a 100644
--- a/docs/source/samples/Yolo/tiny_yolo.rst
+++ b/docs/source/samples/Yolo/tiny_yolo.rst
@@ -2,7 +2,8 @@ RGB & Tiny YOLO
===============
This example shows how to run Tiny YOLOv4 or YOLOv3 on the RGB input frame, and how to display both the RGB
-preview and the metadata results from the YOLO model on the preview. Decoding is done on the VPU (Myriad X) instead on the host.
+preview and the metadata results from the YOLO model on the preview. Decoding is done on the `RVC `__
+instead of on the host computer.
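+
+Device-side decoding is configured on the ``YoloDetectionNetwork`` node; a short sketch with example values (the blob path and all parameters are model-dependent and illustrative only):
+
+.. code-block:: python
+
+    import depthai as dai
+
+    pipeline = dai.Pipeline()
+
+    detectionNetwork = pipeline.create(dai.node.YoloDetectionNetwork)
+    detectionNetwork.setBlobPath("yolo-v4-tiny.blob")  # illustrative path
+
+    # Network-dependent decoding parameters (example values)
+    detectionNetwork.setConfidenceThreshold(0.5)
+    detectionNetwork.setNumClasses(80)
+    detectionNetwork.setCoordinateSize(4)
+    detectionNetwork.setAnchors([10, 14, 23, 27, 37, 58, 81, 82, 135, 169, 344, 319])
+    detectionNetwork.setAnchorMasks({"side26": [1, 2, 3], "side13": [3, 4, 5]})
+    detectionNetwork.setIouThreshold(0.5)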
Configurable, network-dependent parameters (as in the sketch above) are required for correct decoding: