diff --git a/examples/mascot-jump-game/python/main.py b/examples/mascot-jump-game/python/main.py
index 0c26439..ea94914 100644
--- a/examples/mascot-jump-game/python/main.py
+++ b/examples/mascot-jump-game/python/main.py
@@ -1,4 +1,4 @@
-# SPDX-FileCopyrightText: Copyright (C) 2025 ARDUINO SA
+# SPDX-FileCopyrightText: Copyright (C) ARDUINO SRL (http://www.arduino.cc)
 #
 # SPDX-License-Identifier: MPL-2.0
 
diff --git a/examples/mascot-jump-game/sketch/game_frames.h b/examples/mascot-jump-game/sketch/game_frames.h
index f155c76..e5f0027 100644
--- a/examples/mascot-jump-game/sketch/game_frames.h
+++ b/examples/mascot-jump-game/sketch/game_frames.h
@@ -1,5 +1,5 @@
 /*
- * SPDX-FileCopyrightText: Copyright (C) 2025 ARDUINO SA
+ * SPDX-FileCopyrightText: Copyright (C) ARDUINO SRL (http://www.arduino.cc)
  *
  * SPDX-License-Identifier: MPL-2.0
  */
diff --git a/examples/mascot-jump-game/sketch/sketch.ino b/examples/mascot-jump-game/sketch/sketch.ino
index 80a729e..7f42f8e 100644
--- a/examples/mascot-jump-game/sketch/sketch.ino
+++ b/examples/mascot-jump-game/sketch/sketch.ino
@@ -1,4 +1,4 @@
-// SPDX-FileCopyrightText: Copyright (C) 2025 ARDUINO SA
+// SPDX-FileCopyrightText: Copyright (C) ARDUINO SRL (http://www.arduino.cc)
 //
 // SPDX-License-Identifier: MPL-2.0
 
diff --git a/examples/object-hunting/README.md b/examples/object-hunting/README.md
new file mode 100644
index 0000000..583a9fb
--- /dev/null
+++ b/examples/object-hunting/README.md
@@ -0,0 +1,179 @@
+# Object Hunting
+
+The **Object Hunting Game** is an interactive scavenger hunt that uses real-time object detection. To win, players must locate specific physical objects in their environment using a USB camera connected to the Arduino UNO Q.
+
+**Note:** This example must be run in **Network Mode** or **Single-Board Computer (SBC) Mode**, since it requires a **USB-C® hub** and a **USB webcam**.
+
+![Object Hunting Game Example](assets/docs_assets/thumbnail.png)
+
+## Description
+
+This App creates an interactive game that recognizes real-world objects. It uses the `video_objectdetection` Brick to stream video from a USB webcam and perform continuous inference using the **YoloX Nano** model. The web interface challenges the user to find five specific items: **Book, Bottle, Chair, Cup, and Cell Phone**.
+
+Key features include:
+
+- Real-time video streaming and object recognition
+- Interactive checklist that updates automatically when items are found
+- Confidence threshold adjustment to tune detection sensitivity
+- A "win" state that triggers once all target objects have been located
+
+## Bricks Used
+
+The object hunting game example uses the following Bricks:
+
+- `web_ui`: Brick to create the interactive game interface and handle WebSocket communication.
+- `video_objectdetection`: Brick that manages the USB camera stream, runs the machine learning model, and provides real-time detection results.
+
+## Hardware and Software Requirements
+
+### Hardware
+
+- Arduino UNO Q (x1)
+- **USB-C® hub with external power (x1)**
+- A power supply (5 V, 3 A) for the USB hub (x1)
+- **USB Webcam** (x1)
+
+### Software
+
+- Arduino App Lab
+
+**Important:** A **USB-C® hub is mandatory** for this example to connect the USB Webcam.
+
+**Note:** You must connect the USB camera **before** running the App. If the camera is not connected or not detected, the App will fail to start.
+
+## How to Use the Example
+
+1. **Hardware Setup**
+   Connect your **USB Webcam** to a powered **USB-C® hub** attached to the UNO Q. Ensure the hub is powered to support the camera.
+   ![Hardware setup](assets/docs_assets/hardware-setup.png)
+
+2. **Run the App**
+   Launch the App from Arduino App Lab.
+   *Note: If the App stops immediately after clicking Run, check your USB camera connection.*
+   ![Arduino App Lab - Run App](assets/docs_assets/launch-app.png)
+
+3. **Access the Web Interface**
+   Open the App in your browser at the board's address on port `7000`. The interface will load, showing the game introduction and the video feed placeholder.
+
+4. **Start the Game**
+   Click the **Start Game** button. The interface will switch to the gameplay view, displaying the live video feed and the list of objects to find.
+
+5. **Hunt for Objects**
+   Point the camera at the required items (Book, Bottle, Chair, Cup, Cell Phone). When the system detects an object with sufficient confidence, it will automatically mark it as "Found" in the UI.
+
+6. **Adjust Sensitivity**
+   If the camera is not detecting objects easily, or is detecting them incorrectly, use the **Confidence Level** slider on the right.
+   - **Lower value:** Detects objects more easily but may produce false positives.
+   - **Higher value:** Requires a clearer view of the object to trigger a match.
+
+7. **Win the Game**
+   Once all five objects are checked off the list, a "You found them all!" screen appears. You can click **Play Again** to reset the list and restart.
+
+## How it Works
+
+The application relies on a continuous data pipeline between the hardware, the inference engine, and the web browser.
+
+**High-level data flow:**
+
+```
+USB Camera ──► VideoObjectDetection ──► Inference Model (YoloX)
+                       │                          │
+                       │ (MJPEG Stream)           │ (Detection Events)
+                       ▼                          ▼
+               Frontend (Browser)  ◄──  WebUI Brick
+                       │
+                       └──► WebSocket (Threshold Control)
+```
+
+- **Video Streaming**: The `video_objectdetection` Brick captures video from the USB camera and hosts a low-latency stream on port `4912`. The frontend embeds this stream via an `
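To make the detection-to-checklist step concrete, the sketch below shows how incoming detections could be matched against the target list and filtered by the confidence threshold set from the slider. The detection-event format, class, and method names are illustrative assumptions for this README, not the Bricks' actual API; the App's `main.py` implements the equivalent logic through the `video_objectdetection` and `web_ui` Bricks.

```python
# Illustrative sketch only: the detection-event format and names below are
# assumptions, not the actual Bricks API. main.py implements the equivalent
# behaviour via the video_objectdetection and web_ui Bricks.

TARGET_OBJECTS = {"book", "bottle", "chair", "cup", "cell phone"}


class HuntState:
    """Tracks which target objects have been found during a round."""

    def __init__(self, confidence_threshold=0.5):
        self.confidence_threshold = confidence_threshold  # updated by the UI slider
        self.found = set()

    def on_detections(self, detections):
        """Process one batch of results, e.g. [{"label": "cup", "confidence": 0.73}, ...].

        Returns the labels newly marked as found, so the web UI can be updated.
        """
        newly_found = []
        for det in detections:
            label = det["label"].lower()
            if (label in TARGET_OBJECTS
                    and label not in self.found
                    and det["confidence"] >= self.confidence_threshold):
                self.found.add(label)
                newly_found.append(label)
        return newly_found

    def on_threshold_change(self, value):
        """Called when the Confidence Level slider is moved in the browser."""
        self.confidence_threshold = float(value)

    def is_complete(self):
        return self.found == TARGET_OBJECTS

    def reset(self):
        self.found.clear()


# Example: a confident cup detection is marked as found, a weak chair detection is not.
state = HuntState(confidence_threshold=0.6)
print(state.on_detections([{"label": "Cup", "confidence": 0.73},
                           {"label": "Chair", "confidence": 0.41}]))  # -> ['cup']
print(state.is_complete())  # -> False
```

This also mirrors the behaviour of the **Confidence Level** slider described above: lowering the threshold lets weaker detections pass the check (more false positives), while raising it requires a clearer view of the object before it is ticked off the list.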