This project implements a face tracking system using an Intel RealSense camera, YOLO object detection, and multiple pan-tilt servo motors controlled via a Raspberry Pi. The system detects faces in real-time, computes the necessary pan and tilt angles for each servo motor to point towards the detected face, and controls the pan-tilt units accordingly.
- Introduction
- Hardware Requirements
- Software Requirements
- Project Structure
- Installation and Setup
- Usage
- Code Overview
- Configuration Details
- Testing and Calibration
- Notes and Considerations
- Troubleshooting
- Acknowledgments
This project creates an interactive system where multiple pan-tilt units track a human face detected by an Intel RealSense camera. The system utilizes:
- Intel RealSense Camera: Captures RGB and depth data.
- YOLO Object Detection: Detects faces in real-time.
- Coordinate Transformations: Calculates transformations between the camera frame and pan-tilt units.
- MQTT Communication: Sends computed angles from the local computer to the Raspberry Pi.
- Raspberry Pi with PCA9685 Servo Drivers: Controls multiple servos based on received MQTT messages.
- Local Computer: An Ubuntu PC that runs the camera capture and face detection.
- Intel RealSense Depth Camera: e.g., D435i.
- Raspberry Pi: For controlling servos (Model 3B+ or later recommended).
- PCA9685 Servo Drivers: Up to three boards for controlling up to 48 servos.
- Servos: Standard hobby servos for pan-tilt units.
- External Power Supply: To power the servos (e.g., 5V DC supply capable of supplying sufficient current).
- Jumper Wires and Connectors: For hardware connections.
- Python: Version 3.8 or later.
- Intel RealSense SDK: Installed on the local computer.
- MQTT Broker: Mosquitto (installed on both the local computer and the Raspberry Pi).
- Python Libraries: Listed in the `requirements.txt` file.
```
├── config.yaml
├── main.py
├── models
│   └── best_ncnn_model
│       ├── metadata.yaml
│       ├── model.ncnn.bin
│       ├── model.ncnn.param
│       └── model_ncnn.py
├── modules
│   ├── camera.py
│   ├── detection.py
│   ├── pan_tilt.py
│   ├── processing.py
│   ├── __pycache__
│   │   ├── camera.cpython-38.pyc
│   │   ├── detection.cpython-38.pyc
│   │   ├── pan_tilt.cpython-38.pyc
│   │   ├── processing.cpython-38.pyc
│   │   ├── utils.cpython-38.pyc
│   │   └── visualization.cpython-38.pyc
│   ├── utils.py
│   └── visualization.py
├── pi
│   ├── mqtt_subscriber.py
│   └── servo_controller.py
├── README.md
├── requirements.txt
```
Clone or Download the Project Repository to your local computer.
Installation on the Local Computer
- Update the System
```bash
sudo apt-get update
sudo apt-get upgrade
```
- Install Dependencies
```bash
sudo apt-get install git cmake build-essential libssl-dev libusb-1.0-0-dev pkg-config libgtk-3-dev
```
- Clone the Repository
```bash
git clone https://github.com/IntelRealSense/librealsense.git
```
- Run the Installation Script
```bash
cd librealsense
sudo ./scripts/setup_udev_rules.sh
sudo ./scripts/patch-realsense-ubuntu-lts.sh
```
- Build and Install the SDK
```bash
mkdir build && cd build
cmake ../ -DCMAKE_BUILD_TYPE=Release
make -j4
sudo make install
```
- Verify Installation
Run the RealSense Viewer:
```bash
realsense-viewer
```
Ensure that your camera is detected and streaming.
- Install Virtual Environment
```bash
sudo apt-get install python3-venv
```
- Create and Activate Virtual Environment
```bash
python3 -m venv venv
source venv/bin/activate
```
Install the required Python packages:
```bash
pip install -r requirements.txt
```
Note: Ensure `requirements.txt` includes all necessary packages:
```
pyrealsense2
opencv-python
numpy
PyYAML
ultralytics
pyvista
matplotlib
paho-mqtt
```
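With the environment in place, a quick sanity check that Python can talk to the camera is to grab a single pair of frames with pyrealsense2. This is only a minimal test sketch; the resolution and frame rate below are arbitrary example values, not settings required by the project:

```python
# Minimal pyrealsense2 check: open color and depth streams and grab one set of frames.
import pyrealsense2 as rs

pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.color, 640, 480, rs.format.bgr8, 30)  # example settings
config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)

pipeline.start(config)
try:
    frames = pipeline.wait_for_frames()
    print("color:", bool(frames.get_color_frame()), "depth:", bool(frames.get_depth_frame()))
finally:
    pipeline.stop()
```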
- Edit config.yaml to match your setup (see Configuration Details).
- Ensure the Raspberry Pi's IP Address is correctly set in the code where MQTT communication is established.
Enable I2C on Raspberry Pi
```bash
sudo raspi-config
```
Navigate to Interface Options > I2C > Enable.
- VCC (3.3V Logic Power): Connect to Raspberry Pi's 3.3V pin.
- GND: Connect to Raspberry Pi's GND pin.
- SDA and SCL: Connect to Raspberry Pi's SDA (GPIO 2) and SCL (GPIO 3).
- V+ (Servo Power): Connect to external 5V power supply.
- Servos: Connect to PCA9685 channels.
Set Unique I2C Addresses: Configure each PCA9685 board to have a unique address (e.g., 0x40, 0x41, 0x42) by adjusting the A0–A5 address pins.
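For reference, the sketch below shows how several PCA9685 boards on the same I2C bus could be opened at distinct addresses with the Adafruit CircuitPython libraries installed in the next step. The addresses are examples and must match your solder-jumper settings:

```python
# Example: open two PCA9685 boards on one I2C bus, each at its own address.
import board
import busio
from adafruit_pca9685 import PCA9685

i2c = busio.I2C(board.SCL, board.SDA)

boards = {
    0: PCA9685(i2c, address=0x40),  # first board (all address jumpers open)
    1: PCA9685(i2c, address=0x41),  # second board (A0 bridged)
}

for pca in boards.values():
    pca.frequency = 50  # standard 50 Hz update rate for hobby servos
```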
Install necessary Python libraries:
```bash
sudo apt-get update
sudo apt-get install python3-pip
pip3 install paho-mqtt pyyaml adafruit-circuitpython-pca9685 adafruit-circuitpython-motor
```
Install MQTT Broker:
```bash
sudo apt-get install mosquitto mosquitto-clients
sudo systemctl enable mosquitto
sudo systemctl start mosquitto
```
Configure Mosquitto to Accept External Connections:
Edit the Mosquitto configuration file to allow external connections:
```bash
sudo nano /etc/mosquitto/mosquitto.conf
```
Add the following lines:
```
listener 1883
allow_anonymous true
```
Restart Mosquitto:
```bash
sudo systemctl restart mosquitto
```
Copy Raspberry Pi Code:
- Copy `pi/mqtt_subscriber.py` and `pi/servo_controller.py` to the Raspberry Pi.
Edit config.yaml on Raspberry Pi:
- Ensure that the `config.yaml` file on the Raspberry Pi matches the one on your local computer, especially the `i2c_id` and `motors_id` configurations.
On the Raspberry Pi, start the subscriber:
```bash
python3 mqtt_subscriber.py
```
This script will listen for incoming MQTT messages and control the servos accordingly.
On the local computer, run the main script:
```bash
python3 main.py
```
Note: `main.py` will fail if the Raspberry Pi is not detected on the network.
This script will:
- Initialize the RealSense camera and YOLO model.
- Detect faces and compute pan-tilt angles.
- Send computed angles to the Raspberry Pi via MQTT (see the sketch after this list).
- The pan-tilt units should move to track detected faces.
- The console will display logging information about detections and actions.
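As a rough picture of the MQTT step, the sketch below publishes a single pan/tilt command the way `main.py` might. The topic name and the "pan,tilt" payload format are assumptions for illustration, not the project's actual protocol; the IP address is the example used elsewhere in this README:

```python
# Hypothetical publishing sketch (topic and payload format are assumptions).
import paho.mqtt.client as mqtt

RPI_IP = "192.168.1.14"       # example Raspberry Pi address
TOPIC = "pan_tilt/PanTilt1"   # hypothetical topic name

client = mqtt.Client()        # paho-mqtt 1.x style constructor
client.connect(RPI_IP, 1883)
client.loop_start()

pan, tilt = 12.5, -4.0        # example angles in degrees
client.publish(TOPIC, f"{pan:.1f},{tilt:.1f}")

client.loop_stop()
client.disconnect()
```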
- main.py: The main script that orchestrates camera capture, face detection, angle computation, and MQTT communication.
- modules/:
- camera.py: Manages RealSense camera initialization and frame acquisition.
- detection.py: Contains the YOLODetector class for face detection.
- processing.py: Handles coordinate transformations and angle calculations.
- pan_tilt.py: Manages pan-tilt unit configurations and angle broadcasting over MQTT.
- utils.py: Utility functions.
- visualization.py: Functions for visualizing detections (if needed).
- mqtt_subscriber.py: Subscribes to MQTT messages, parses incoming angle data, and commands the servos (see the sketch after this list).
- servo_controller.py: Defines the ServoController class for initializing and controlling multiple PCA9685 boards and servos.
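A stripped-down view of the subscriber side is sketched below. It assumes the same hypothetical topic layout and "pan,tilt" payload as the publishing sketch above and only prints what it receives; the real `mqtt_subscriber.py` passes the parsed angles to the ServoController instead:

```python
# Hypothetical subscriber sketch: parse "pan,tilt" payloads and print them.
import paho.mqtt.client as mqtt

TOPIC = "pan_tilt/#"   # hypothetical: one sub-topic per pan-tilt unit

def on_message(client, userdata, msg):
    pan, tilt = (float(v) for v in msg.payload.decode().split(","))
    print(f"{msg.topic}: pan={pan:.1f} tilt={tilt:.1f}")
    # The real script would look up the unit for msg.topic and drive its servos here.

client = mqtt.Client()             # paho-mqtt 1.x style constructor
client.on_message = on_message
client.connect("localhost", 1883)  # broker runs on the Raspberry Pi itself
client.subscribe(TOPIC)
client.loop_forever()
```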
```yaml
camera:
  position: [0.0, 0.0, 0.0]       # Camera position (origin)
  orientation: [0.0, 0.0, 0.0]    # Camera orientation (no rotation)

pan_tilt_units:
  - name: "PanTilt1"
    position: [0.10, 0.0, 0.0]    # Position relative to camera
    orientation: [0.0, 0.0, 0.0]  # Orientation relative to camera
    i2c_id: 0                     # I2C bus ID
    motors_id: [0, 1]             # Servo channels on PCA9685
  - name: "PanTilt2"
    position: [-0.10, 0.0, 0.0]
    orientation: [0.0, 0.0, 0.0]
    i2c_id: 0
    motors_id: [2, 3]
  # Add additional pan-tilt units as needed
```
- Positions: In meters, relative to the camera's coordinate frame.
- Orientations: In degrees [roll, pitch, yaw], relative to the camera's frame.
- i2c_id: Identifies the PCA9685 board (adjust based on I2C addresses).
- motors_id: Servo channels on the PCA9685 board.
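To make the role of these positions concrete, here is a simplified version of the angle computation, assuming a pan-tilt unit whose orientation is zero (aligned with the camera frame) so only the position offset matters; `processing.py` implements the full transformation including orientations:

```python
# Simplified pan/tilt computation for a unit aligned with the camera frame.
# target: face position in the camera frame (meters, RealSense convention:
# x right, y down, z forward). unit_pos: the unit's position from config.yaml.
import math

def pan_tilt_angles(target, unit_pos):
    x = target[0] - unit_pos[0]
    y = target[1] - unit_pos[1]
    z = target[2] - unit_pos[2]
    pan = math.degrees(math.atan2(x, z))                   # left/right
    tilt = math.degrees(math.atan2(-y, math.hypot(x, z)))  # up/down
    return pan, tilt

# Face 1.5 m in front, 0.3 m right of, and 0.1 m above the camera; unit "PanTilt1".
print(pan_tilt_angles([0.3, -0.1, 1.5], [0.10, 0.0, 0.0]))
```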
- Test Individual Components:
  - Verify the RealSense camera is functioning using `realsense-viewer`.
  - Test the YOLO face detection separately.
  - On the Raspberry Pi, test servo movement with a simple script to ensure servos respond correctly.
- Calibrate Servos:
  - Adjust `min_pulse` and `max_pulse` in `servo_controller.py` if servos do not reach their full range (see the sketch after this list).
  - Ensure servos are mechanically centered.
- Verify MQTT Communication:
  - Use `mosquitto_sub` and `mosquitto_pub` to test MQTT messages between the local computer and the Raspberry Pi.
- Adjust Configurations:
  - Fine-tune positions and orientations in `config.yaml` for accurate tracking.
  - Check that `i2c_id` and `motors_id` correctly map to your hardware setup.
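For the servo calibration step, the adafruit-circuitpython-motor library exposes `min_pulse` and `max_pulse` directly; the sketch below tests one channel in isolation. The pulse widths and channel number are example values to tune for your servos, and `servo_controller.py` may wrap this differently:

```python
# Example of tuning one servo's pulse range with the Adafruit libraries.
import time
import board
import busio
from adafruit_pca9685 import PCA9685
from adafruit_motor import servo

i2c = busio.I2C(board.SCL, board.SDA)
pca = PCA9685(i2c, address=0x40)
pca.frequency = 50

# Widen or narrow these until 0° and 180° match the servo's mechanical limits.
test_servo = servo.Servo(pca.channels[0], min_pulse=500, max_pulse=2500)

for angle in (0, 90, 180, 90):
    test_servo.angle = angle
    time.sleep(1)

pca.deinit()
```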
- Power Supply: Use a sufficient power supply for servos to prevent brownouts.
- Servo Limits: Ensure that the computed angles do not exceed the physical limits of your servos.
- Latency: Network delays can affect real-time performance. Use a wired connection if possible.
- Error Handling: The code includes logging to help identify issues. Check logs if the system isn't behaving as expected.
- Safety: Be cautious when working with servos to prevent injury or damage.
- No MQTT Messages Received:
  - Ensure the MQTT broker is running on the Raspberry Pi.
  - Check network connectivity between the local computer and the Raspberry Pi.
  - Verify that the MQTT topic names match.
- Servos Not Moving:
  - Check power connections to the PCA9685 boards.
  - Verify that `i2c_id` and `motors_id` are correctly configured.
  - Use `i2cdetect -y 1` on the Raspberry Pi to ensure the PCA9685 boards are detected.
- Camera Not Detected:
  - Ensure the Intel RealSense SDK is properly installed.
  - Try different USB ports or cables.
- Incorrect Tracking:
  - Recalibrate positions and orientations in `config.yaml`.
  - Verify that coordinate transformations are correctly implemented.
- Intel RealSense SDK: For providing libraries to interface with RealSense cameras.
- Ultralytics YOLO: For the object detection framework.
- Adafruit: For the PCA9685 Python library and motor control libraries.
- paho-mqtt: For MQTT client libraries in Python.
- Set Up Hardware:
  - Connect the Intel RealSense camera to your local computer.
  - Wire the Raspberry Pi with PCA9685 boards and servos.
  - Ensure all devices are powered correctly.
- Install Software on Local Computer:
  - Install the Intel RealSense SDK.
  - Set up the Python environment and install dependencies.
  - Configure `config.yaml`.
- Install Software on Raspberry Pi:
  - Enable I2C and install required Python libraries.
  - Install and configure the MQTT broker.
  - Copy `mqtt_subscriber.py` and `servo_controller.py` to the Raspberry Pi.
- Run the System:
  - Start the MQTT subscriber on the Raspberry Pi.
  - Run `main.py` on your local computer.
  - Observe the pan-tilt units tracking detected faces.
Planned improvements:
- Allow scripted sequences to move the servos (not only face-tracking mode).
- Store the last command sent to each servo (to smooth movement).
- Limit the difference between the previous and the new angle to smooth movement (see the sketch below).
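One possible way to implement the last two items, remembering the previous command per servo and limiting how far a new command can move in one update, is sketched below; the maximum step size is an arbitrary example value:

```python
# Sketch of rate-limited smoothing: remember the last commanded angle per servo
# and clamp how far a new command may move it in a single update.
MAX_STEP_DEG = 5.0      # arbitrary example limit per update
last_command = {}       # (i2c_id, channel) -> last angle sent

def smooth(servo_key, new_angle):
    prev = last_command.get(servo_key)
    if prev is not None:
        step = max(-MAX_STEP_DEG, min(MAX_STEP_DEG, new_angle - prev))
        new_angle = prev + step
    last_command[servo_key] = new_angle
    return new_angle

print(smooth((0, 1), 90.0))  # first command passes through -> 90.0
print(smooth((0, 1), 40.0))  # large jump is limited        -> 85.0
```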
1. Find the Raspberry Pi on the Network
From the main computer, scan the local network for the Raspberry Pi:
```bash
sudo nmap -sP
```
(Append your local subnet to the command.)
2. Copy the Public Key to the Raspberry Pi
Ensure that the public key associated with your private key is added to the ~/.ssh/authorized_keys file on the Raspberry Pi.
Run the following command on your local machine:
```bash
ssh-copy-id -i ~/.ssh/id_rsa nico@192.168.1.14
```
This will copy your public key (~/.ssh/id_rsa.pub) to the Raspberry Pi and add it to the authorized_keys file. You will be prompted for your password once.