This is a Raspberry Pi car with a tiltable and pannable camera controlled by a phone using a live video feed and virtual joysticks on a web page. It's built with Python, Flask, Socket.IO, and React.
Hold the phone sideways and place both thumbs on the screen to activate the virtual joysticks.
The left joystick makes the car/camera go forward/up or reverse/down, and the right joystick makes the car/camera turn left or right. The top-right buttons toggle between moving the car and moving the camera. The top-left button takes an HD photo. Photos are accessible from an album button to the right of the photo button, and each photo in the album opens in a zoomable, pannable overlay controlled with multi-finger gestures.
- Low latency using websockets for both video streaming (20 FPS) and controls (25 ms responsiveness)
- Latency and video resolution settings are easily configurable
- Intuitive controls with virtual joysticks that allow variability in speed and turning radius
- Quickly toggle the UI between driving mode and camera platform position adjustment mode
- The virtual joysticks that control the car movement also control the tilt and pan of the camera
- Option to auto center the camera platform upon toggling back to driving mode
- Take HD photos and save them to an album whose photos can be viewed with phone gesture zooming and panning
- The entire thing, including the Raspberry Pi, is powered by rechargeable batteries
- Has the range of the entire WiFi network (or use ngrok to make the car controllable from anywhere, through the internet; see the section at the end of this README)
The Raspberry Pi runs a Flask server with Socket.IO integration, accepting websocket messages that control the movement of the car. GPIO is driven through the Python interface to set the steering servo, rear-wheel motors, and camera platform servo positions. The camera's live feed is served to the frontend through the websocket as a series of low-resolution JPEG images at 20 frames per second, optimized for low latency. These settings are configurable.
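The 20 FPS target amounts to a fixed per-frame time budget of 50 ms. A minimal sketch of that pacing loop, with stubs standing in for the camera capture and the Socket.IO emit (the event name, function names, and loop shape here are illustrative assumptions, not the project's actual code):

```python
import time

FPS = 20
FRAME_INTERVAL = 1.0 / FPS  # 0.05 s per frame at 20 FPS

def paced_frames(capture_jpeg, emit, frame_count, interval=FRAME_INTERVAL):
    """Capture and emit JPEG frames, sleeping out the remainder of each
    frame's time budget so the stream holds a steady rate."""
    for _ in range(frame_count):
        start = time.monotonic()
        emit("frame", capture_jpeg())        # hypothetical Socket.IO emit
        elapsed = time.monotonic() - start
        if elapsed < interval:
            time.sleep(interval - elapsed)   # spend the rest of the budget

# Stubs standing in for the camera and the websocket:
frames = []
paced_frames(lambda: b"\xff\xd8 jpeg-bytes", lambda event, data: frames.append(data), 3)
```

If capturing and encoding a frame takes longer than the budget, this loop simply runs at whatever rate the hardware allows rather than sleeping.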
The frontend is a React application with two virtual joysticks. The left joystick operates on the vertical plane and controls the forward and reverse motion of the car and the tilt of the camera. The right joystick operates on the horizontal plane and controls the steering of the car and the pan of the camera. The speed and steering radius of the car are directly correlated with the force applied to the virtual joysticks. Joystick control data is emitted to the server every 25 milliseconds to continuously sync the car's speed and servo positions, providing a responsive experience.
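The proportional control can be pictured as a direct scaling of joystick displacement into a control message. A sketch of that mapping (the function name, payload keys, and ranges are assumptions for illustration; the real logic lives in App.jsx and backend/app.py):

```python
def joystick_to_command(left_y, right_x, max_speed=100, max_angle=45):
    """Map normalized joystick displacements (-1.0..1.0) to a control
    message: throttle scales with how far the left stick is pushed,
    steering angle with the right stick's horizontal displacement."""
    left_y = max(-1.0, min(1.0, left_y))
    right_x = max(-1.0, min(1.0, right_x))
    return {
        "throttle": round(left_y * max_speed),   # sign = forward/reverse
        "steering": round(right_x * max_angle),  # sign = left/right
    }

# Half-force forward with a gentle right turn:
cmd = joystick_to_command(0.5, 0.25)
```

A message like this sent every 25 ms keeps the car tracking the thumbs closely without flooding the connection.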
The photo button, on the top left, takes an HD photo, which gets saved on the Raspberry Pi in a dedicated folder / album. The photos in this album can be viewed, zoomed, and panned through the UI. The camera always initializes two streams - one low resolution stream that handles the continuous video feed, and another high resolution stream to handle the snapshots that can be taken and saved in this album.
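Saving each snapshot under a timestamped name in a dedicated album folder could look like the following (the folder name, filename scheme, and helper name are assumptions, not necessarily what backend/app.py does):

```python
import tempfile
from datetime import datetime
from pathlib import Path

def save_photo(jpeg_bytes, album_dir):
    """Write a JPEG capture into the album folder using a timestamped
    filename, creating the folder on first use."""
    album = Path(album_dir)
    album.mkdir(parents=True, exist_ok=True)
    name = datetime.now().strftime("photo_%Y%m%d_%H%M%S_%f.jpg")
    path = album / name
    path.write_bytes(jpeg_bytes)
    return path

# Demo with fake JPEG bytes in a temporary album directory:
saved = save_photo(b"\xff\xd8\xff\xe0 fake jpeg", tempfile.mkdtemp())
```

The UI's album view then only needs to list the files in this folder in timestamp order.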
- Raspberry Pi Zero 2 WH (a Pi 3, 4, or 5 would probably also work)
- LK Cokoino 4WD robot hat shield
- LK Cokoino rear-wheel drive robot car chassis with servo
- Arducam camera module
- Arducam camera pan tilt platform
- 18650 2-battery pack with charger
- Dupont jumper wires
- Power jack adapter barrel connector
- 64 GB micro SD card (any size larger than 8 GB works)
- USB C Micro SD card reader
- Follow the instructions for the robot car chassis assembly using Demo2 (Robot HAT)
- Attach the camera to the Raspberry Pi with the ribbon cable
- Follow the instructions for the camera pan tilt platform assembly, attach it to the Raspberry Pi's GPIO, and mount the platform on the car
It's easiest to set the WiFi network name and password in the imaging software used to flash the micro SD card with the OS. That way you won't need a mouse, keyboard, and monitor for the Pi; you can instead SSH into it to execute the instructions below.
- Clone this repository on the Raspberry Pi

  ```sh
  git clone https://github.com/diracleo/pi-camera-car.git
  ```

- Go into the repository directory

  ```sh
  cd pi-camera-car
  ```

- Install (will reboot at the end)

  ```sh
  sudo make install
  ```

- Start the flask application

  ```sh
  make run
  ```

- On your phone, while on the same WiFi network as the Raspberry Pi, open the URL in a web browser

  ```
  http://<IP ADDRESS OF PI>:8000
  ```

You can now control the car using your phone.
Even though running the flask server is just a single command, it's inconvenient because it requires first SSH'ing into the Raspberry Pi (it's a pain to connect a mouse, keyboard, and monitor to a Raspberry Pi that's mounted on a robot car). I therefore recommend creating a systemctl service that starts the flask server automatically on boot. After these steps, every time you flip the switch to turn the car on, it will start the flask server and be ready for you to control the car with your phone within a minute.
- Create the service file

  ```sh
  sudo nano /etc/systemd/system/car.service
  ```

  Add these contents to the file:

  ```ini
  [Unit]
  Description=Pi Camera Car
  After=network.target

  [Service]
  WorkingDirectory=/home/<user>/pi-camera-car
  ExecStart=./start
  Type=idle
  User=<user>
  Restart=always
  RestartSec=1

  [Install]
  WantedBy=multi-user.target
  ```

- Restart the daemon

  ```sh
  sudo systemctl daemon-reload
  ```

- Enable the service

  ```sh
  sudo systemctl enable car.service
  ```
It's probably easiest to fork this repository and do development on a different machine than the Raspberry Pi. You can push to your forked repository from that machine, and then you can pull the changes on the Raspberry Pi using SSH.
The frontend is a React app bundled by Vite.
- Go into the frontend directory and start the development server

  ```sh
  cd frontend
  npm run dev
  ```

- Make changes to `App.jsx` and test them at the URL displayed in the console

- When you're done, press `ctrl + c`

- Build the production bundle

  ```sh
  npm run build
  ```

- Commit and push your changes to your forked repository

- Pull those changes onto the Raspberry Pi (the production build directory, `dist`, should be included in those changes, as it was intentionally excluded from the `.gitignore` file)
The backend is a Flask application in `backend/app.py`. You can modify this file as you need to, and then re-run `make run` to see the changes.
The software architecture supports a variety of hardware configurations, whether a different robot car chassis or motor driver; it may just be a matter of changing the GPIO interface in `backend/app.py` to use different pins. The basic premise is that the backend streams video from the camera to the frontend through the websocket, the frontend sends websocket messages containing throttle and steering values, and the backend processes them however it wants to.
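Adapting the backend to different hardware mostly means changing how incoming throttle/steering values translate into pin outputs. A hedged sketch of that translation layer as pure functions, with the GPIO writes left out (the ranges, pulse widths, and function names are assumptions, not the project's actual interface):

```python
def throttle_to_duty(throttle):
    """Map throttle in -100..100 to a PWM duty cycle 0..100 plus a
    direction flag for the motor driver's forward/reverse pins."""
    throttle = max(-100, min(100, throttle))
    return abs(throttle), throttle >= 0

def steering_to_pulse_ms(steering, center=1.5, half_range=0.5):
    """Map steering in -45..45 degrees to a hobby-servo pulse width
    in milliseconds (1.0 ms .. 2.0 ms around a 1.5 ms center)."""
    steering = max(-45, min(45, steering))
    return center + (steering / 45) * half_range

# 60% reverse throttle and a hard right turn:
duty, forward = throttle_to_duty(-60)
pulse = steering_to_pulse_ms(45)
```

Swapping in a different motor driver or servo then only requires changing these mappings and the pins they drive, not the websocket plumbing.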
You can use ngrok to open up control of the car through the internet.
- In one terminal, open up traffic to port 8000

  ```sh
  ngrok http 8000
  ```

- In another terminal, run the app

  ```sh
  make run
  ```