QuestCameraKit is a collection of template and reference projects demonstrating how to use Meta Quest’s new Passthrough Camera API (PCA)
for advanced AR/VR vision, tracking, and shader effects.
- Purpose: Convert a 3D point in space to its corresponding 2D image pixel.
- Description: This sample shows the mapping between 3D space and 2D image coordinates using the Passthrough Camera API. We use MRUK's `EnvironmentRaycastManager` to determine a 3D point in our environment and map it to a location on our `WebCamTexture`. We then extract the pixel at that point to determine the color of a real-world object (a minimal sketch of the projection follows below).
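To make the 3D-to-2D step concrete, here is a minimal sketch of the underlying pinhole projection. The parameter names (`cameraPose`, `focalLength`, `principalPoint`) are illustrative placeholders you would fill with values from the Passthrough Camera API, not identifiers from this repo:

```csharp
using UnityEngine;

public static class PointProjection
{
    // Projects a world-space point into pixel coordinates of the camera image
    // using the standard pinhole model: u = fx * x/z + cx, v = fy * y/z + cy.
    public static Vector2 WorldToPixel(
        Vector3 worldPoint,
        Pose cameraPose,        // camera position/rotation in world space
        Vector2 focalLength,    // (fx, fy) in pixels
        Vector2 principalPoint) // (cx, cy) in pixels
    {
        // Transform the point into the camera's local space.
        Vector3 local = Quaternion.Inverse(cameraPose.rotation)
                        * (worldPoint - cameraPose.position);

        // local.z must be positive, i.e. the point lies in front of the camera.
        float u = focalLength.x * (local.x / local.z) + principalPoint.x;
        float v = focalLength.y * (local.y / local.z) + principalPoint.y;
        return new Vector2(u, v);
    }
}
```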
- Purpose: Convert 2D screen coordinates into their corresponding 3D points in space.
- Description: Use the Unity Sentis framework to run different ML models for detecting and tracking objects. Learn how to convert detected image coordinates (e.g. bounding boxes) back into 3D points for dynamic interaction within your scenes (see the raycast sketch below). This sample also shows how to filter labels, so you can, for example, detect only humans and pets to create a safer play area for your VR game. The sample video below is filtered to monitor, person, and laptop. The sample runs at around `60 fps`.
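The inverse direction can be sketched with MRUK's `EnvironmentRaycastManager` (the component mentioned in the Color Picker sample): build a world-space ray through the bounding-box center by inverting the projection above, then raycast it against the environment. The namespace and exact signature should be verified against your SDK version, and the label whitelist is a made-up example:

```csharp
using System.Collections.Generic;
using Meta.XR; // assumed namespace for EnvironmentRaycastManager; verify in your SDK
using UnityEngine;

public class DetectionToWorld : MonoBehaviour
{
    [SerializeField] private EnvironmentRaycastManager raycastManager;

    // Example whitelist; an empty set means "track everything".
    private static readonly HashSet<string> allowedLabels =
        new() { "person", "laptop", "tvmonitor" };

    public static bool IsAllowed(string label) =>
        allowedLabels.Count == 0 || allowedLabels.Contains(label);

    // rayFromPixel is a world-space ray through the detected bounding-box center.
    public bool TryGetWorldPoint(Ray rayFromPixel, out Vector3 worldPoint)
    {
        if (raycastManager.Raycast(rayFromPixel, out var hit))
        {
            worldPoint = hit.point; // anchor the 3D bounding box here
            return true;
        }
        worldPoint = default;
        return false;
    }
}
```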
| 1. 🎨 Color Picker | 2. 🍎 Object Detection |
|---|---|
| ![]() | ![]() |
- Purpose: Detect and track QR codes in real time. Open webviews or log in to 3rd-party services with ease.
- Description: Similar to the object detection sample, this one gets QR code coordinates and projects them into 3D space. Detect QR codes and call their URLs. You can choose between a multiple and a single QR code mode. The sample runs at around `70 fps` for multiple QR codes and a stable `72 fps` for a single code.
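The decoding step itself can be sketched with ZXing.Net (installed via NuGet in the setup section below). Converting the `WebCamTexture` frame into an `RGBLuminanceSource` is one way to feed the library; depending on orientation you may need to flip the pixel rows, since Unity stores them bottom-up:

```csharp
using UnityEngine;
using ZXing;

public class QrDecoder
{
    private readonly BarcodeReaderGeneric reader = new BarcodeReaderGeneric();

    // Returns the decoded text (e.g. a URL) or null if no QR code was found.
    public string TryDecode(WebCamTexture camTexture)
    {
        Color32[] pixels = camTexture.GetPixels32();
        byte[] raw = new byte[pixels.Length * 4];
        for (int i = 0; i < pixels.Length; i++)
        {
            raw[i * 4 + 0] = pixels[i].r;
            raw[i * 4 + 1] = pixels[i].g;
            raw[i * 4 + 2] = pixels[i].b;
            raw[i * 4 + 3] = pixels[i].a;
        }

        var source = new RGBLuminanceSource(
            raw, camTexture.width, camTexture.height,
            RGBLuminanceSource.BitmapFormat.RGBA32);
        Result result = reader.Decode(source);
        return result?.Text;
    }
}
```

Decoding runs on the CPU, so consider running it on a background thread or at a reduced rate in a real app.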
- Purpose: Apply a custom frosted glass shader effect to virtual surfaces.
- Description: A shader that takes our camera feed as input to blur the content behind it.

**Todo**: We have a shader that correctly maps the camera texture onto a quad, plus one vertical blur shader and one horizontal blur shader. Ideally we would combine all of these into one shader effect so it can easily be applied to meshes or UI elements (a C# stopgap that chains the passes is sketched below).
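Until the shaders are merged, one possible stopgap (an assumption on my part, not the repo's current implementation) is to chain the three passes from C#:

```csharp
using UnityEngine;

public class FrostedGlassCompositor : MonoBehaviour
{
    [SerializeField] private Material cameraMapMaterial; // maps the camera feed onto a quad
    [SerializeField] private Material horizontalBlur;    // horizontal blur shader
    [SerializeField] private Material verticalBlur;      // vertical blur shader
    [SerializeField] private RenderTexture output;       // final frosted texture

    public void Compose(Texture cameraFeed)
    {
        var a = RenderTexture.GetTemporary(output.width, output.height);
        var b = RenderTexture.GetTemporary(output.width, output.height);

        Graphics.Blit(cameraFeed, a, cameraMapMaterial); // pass 1: remap camera image
        Graphics.Blit(a, b, horizontalBlur);             // pass 2: blur horizontally
        Graphics.Blit(b, output, verticalBlur);          // pass 3: blur vertically

        RenderTexture.ReleaseTemporary(a);
        RenderTexture.ReleaseTemporary(b);
    }
}
```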
| 3. 📱 QR Code Tracking | 4. 🪟 Frosted Glass |
|---|---|
| ![]() | ![]() |
- Purpose: Ask OpenAI's vision model (or any other multi-modal LLM) for context about your current scene.
- Description: We use the OpenAI speech-to-text API to create a command. We then send this command together with a screenshot to the vision model. Lastly, we receive the response and use the text-to-speech API to turn the response text into an audio file in Unity and speak it. The user can select different speakers, models, and speeds. For the command we can add additional instructions for the model, as well as select an image, image & text, or text-only mode. The whole loop takes anywhere from `2-6 seconds`, depending on the internet connection (a condensed sketch of the vision call follows below).
*Sample video: `OpenAI.vision.whisper.model.mp4`*
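For orientation, here is a condensed, illustrative sketch of the vision step only; the actual sample wraps this in the OpenAI Manager and adds speech-to-text and text-to-speech around it. The endpoint and JSON layout follow OpenAI's chat completions API, with `gpt-4o` as one possible model choice:

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

public class VisionRequest : MonoBehaviour
{
    [SerializeField] private string apiKey; // set via the OpenAI Manager prefab

    public IEnumerator Ask(Texture2D screenshot, string command)
    {
        string base64 = System.Convert.ToBase64String(screenshot.EncodeToJPG());
        // A real implementation should JSON-escape the command text.
        string json =
            "{\"model\":\"gpt-4o\",\"messages\":[{\"role\":\"user\",\"content\":["
            + "{\"type\":\"text\",\"text\":\"" + command + "\"},"
            + "{\"type\":\"image_url\",\"image_url\":{\"url\":\"data:image/jpeg;base64,"
            + base64 + "\"}}]}]}";

        using (var request = new UnityWebRequest(
                   "https://api.openai.com/v1/chat/completions", "POST"))
        {
            request.uploadHandler = new UploadHandlerRaw(
                System.Text.Encoding.UTF8.GetBytes(json));
            request.downloadHandler = new DownloadHandlerBuffer();
            request.SetRequestHeader("Content-Type", "application/json");
            request.SetRequestHeader("Authorization", "Bearer " + apiKey);

            yield return request.SendWebRequest();
            Debug.Log(request.downloadHandler.text); // response text to feed into text-to-speech
        }
    }
}
```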
| Information | Details |
|---|---|
| Device Requirements | - Only for Meta Quest 3 and 3s<br>- HorizonOS v74 or later |
| Unity `WebCamTexture` | - Access through Unity's `WebCamTexture`<br>- Only one camera at a time (left or right), a Unity limitation |
| Android Camera2 API | - Unobstructed forward-facing RGB cameras<br>- Provides camera intrinsics (`camera ID`, `height`, `width`, lens translation & rotation)<br>- Android Manifest: `horizonos.permission.HEADSET_CAMERA` |
| Public Experimental | Apps using PCA are not allowed to be submitted to the Meta Horizon Store yet. |
| Specifications | - Frame rate: 30 fps<br>- Image latency: 40-60 ms<br>- Available resolutions per eye: `320x240`, `640x480`, `800x600`, `1280x960` |
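Opening the camera through `WebCamTexture` at one of the supported resolutions might look like the following sketch (device selection simplified; the requested values are hints that Unity matches to the closest supported mode):

```csharp
using UnityEngine;

public class PassthroughFeed : MonoBehaviour
{
    private WebCamTexture camTexture;

    private void Start()
    {
        // On the headset this lists the passthrough cameras exposed by HorizonOS.
        foreach (var device in WebCamTexture.devices)
            Debug.Log("Camera: " + device.name);

        // Request the maximum PCA resolution at 30 fps.
        camTexture = new WebCamTexture(WebCamTexture.devices[0].name, 1280, 960, 30);
        camTexture.Play();
    }

    private void OnDestroy()
    {
        // Stop the camera as soon as it is no longer needed (see the caution below).
        if (camTexture != null) camTexture.Stop();
    }
}
```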
- Meta Quest Device: Ensure you are running on a `Quest 3` or `Quest 3s` and your device is updated to `HorizonOS v74` or later.
- Unity: `Unity 6` is recommended; the project also runs on Unity `2022.3 LTS`.
- The Passthrough Camera API does not work in the Editor or the XR Simulator.
- Get more information from the Meta Quest Developer Documentation.
> [!CAUTION]
> Every feature that accesses the camera has a significant impact on your application's performance. Be aware of this, and ask yourself whether the feature you are trying to implement can be achieved any other way besides using the cameras.
- **Clone the Repository:**

  ```bash
  git clone https://github.com/yourusername/QuestVisionKit.git
  ```

- **Open the Project in Unity:** Launch Unity and open the cloned project folder.
- **Configure Dependencies:** Follow the instructions in the section below to run one of the samples.
- Open the `ColorPicker` scene and run the application.
- Build the scene and run the APK on your headset.
- Aim the ray at a surface in your real space and press the A button or pinch your fingers to observe the cube changing its color to the color in your real environment (see the sketch below).
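The color read itself is a single `GetPixel` call; a hypothetical `ColorSampler` component might look like this, with `pixel` coming from a projection like the one sketched in the Color Picker description above:

```csharp
using UnityEngine;

public class ColorSampler : MonoBehaviour
{
    [SerializeField] private Renderer cubeRenderer;

    // pixel is the image coordinate under the aim ray's hit point.
    public void ApplyColorAt(WebCamTexture camTexture, Vector2 pixel)
    {
        Color sampled = camTexture.GetPixel((int)pixel.x, (int)pixel.y);
        cubeRenderer.material.color = sampled; // tint the cube with the real-world color
    }
}
```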
- Open the `ObjectDetection` scene.
- You will need Unity Sentis for this project to run (`com.unity.sentis@2.1.2`).
- Select the labels you would like to track. No label means all objects will be tracked.
- Build the scene and run the APK on your headset. Look around your room and see how tracked objects receive a bounding box in accurate 3D space.
Below you can see all the labels that are provided:
| person | bicycle | car | motorbike | aeroplane | bus | train | truck |
|---|---|---|---|---|---|---|---|
| boat | traffic light | fire hydrant | stop sign | parking meter | bench | bird | cat |
| dog | horse | sheep | cow | elephant | bear | zebra | giraffe |
| backpack | umbrella | handbag | tie | suitcase | frisbee | skis | snowboard |
| sports ball | kite | baseball bat | baseball glove | skateboard | surfboard | tennis racket | bottle |
| wine glass | cup | fork | knife | spoon | bowl | banana | apple |
| sandwich | orange | broccoli | carrot | hot dog | pizza | donut | cake |
| chair | sofa | pottedplant | bed | diningtable | toilet | tvmonitor | laptop |
| mouse | remote | keyboard | cell phone | microwave | oven | toaster | sink |
| refrigerator | book | clock | vase | scissors | teddy bear | hair drier | toothbrush |
- Open the `QRCodeTracking` scene to test real-time QR code detection and tracking.
- You will need to install NuGet for Unity.
- After installing NuGet for Unity you will have a new `NuGet` menu. Click on it, then on `Manage NuGet Packages`, search for the ZXing.Net package from Michael Jahn, and install it.
- Build the scene and run the APK on your headset. Look at a QR code to see the marker in 3D space and the URL of the QR code.
Troubleshooting: If you ever get the error below, make sure `ZXING_ENABLED` is listed under `Scripting Define Symbols` in your Player Settings:

```
The type or namespace name 'ZXing' could not be found (are you missing a using directive or an assembly reference?)
```
- Open the `FrostedGlass` scene.
- Build the scene and run the APK on your headset.
- Look at the panel from different angles and observe how objects behind it are blurred.

Troubleshooting: If you cannot see the blur effect, make sure the `Opaque Texture` checkbox is checked in your render pipeline asset.
> [!WARNING]
> The Meta Project Setup Tool (PST) will show a warning and tell you to uncheck it, so do not apply that particular fix.
- Open the `ImageLLM` scene.
- Make sure to create an API key and enter it in the `OpenAI Manager` prefab.
- Select your desired model and optionally give the LLM some instructions.
- Make sure your headset is connected to the internet (the faster, the better).
- Build the scene and run the APK on your headset.
> [!NOTE]
> File uploads are currently limited to `25 MB`, and the following input file types are supported: `mp3`, `mp4`, `mpeg`, `mpga`, `m4a`, `wav`, and `webm`.
Below you can see all supported languages. You can send commands and receive results in any of these languages:
| Afrikaans | Arabic | Armenian | Azerbaijani | Belarusian | Bosnian | Bulgarian | Catalan | Chinese |
|---|---|---|---|---|---|---|---|---|
| Croatian | Czech | Danish | Dutch | English | Estonian | Finnish | French | Galician |
| German | Greek | Hebrew | Hindi | Hungarian | Icelandic | Indonesian | Italian | Japanese |
| Kannada | Kazakh | Korean | Latvian | Lithuanian | Macedonian | Malay | Marathi | Maori |
| Nepali | Norwegian | Persian | Polish | Portuguese | Romanian | Russian | Serbian | Slovak |
| Slovenian | Spanish | Swahili | Swedish | Tagalog | Tamil | Thai | Turkish | Ukrainian |
| Urdu | Vietnamese | Welsh | | | | | | |
This project is licensed under the MIT License. See the LICENSE file for details. Feel free to use the samples for your own projects, though I would appreciate it if you credited this repo in your work ❤️
For questions, suggestions, or feedback, please open an issue in the repository or contact me on X, LinkedIn, or at roberto@blackwhale.dev. Find all my info here or join our growing XR developer community on Discord.
- Meta, for the Passthrough Camera API and the Passthrough Camera API Samples.
- Thanks to shader wizard Daniel Ilett for helping me set up the `FrostedGlass` sample.
- Thanks to Michael Jahn for the ZXing.Net library used in the QR code tracking sample.
- Thanks to Julian Triveri for constantly pushing the boundaries of what is possible with Meta Quest hardware and software.