IMPORTANT: This project is sourced from the official Qualcomm AI Engine Direct Helper repository: https://github.com/quic/ai-engine-direct-helper/tree/main/samples/android/SuperResolution2
This is a sample application demonstrating the use of the QAI AppBuilder SDK for on-device AI inference.
It is an Android app that performs AI-powered image super-resolution using Qualcomm's QNN (Qualcomm Neural Network) SDK. The app takes low-resolution images (128x128) and produces 4x upscaled high-resolution images (512x512) using deep learning models running on Snapdragon NPUs.
Key Technologies:
- Android (Java) with JNI for native code integration
- C++17 with CMake for native build
- Qualcomm QAI AppBuilder SDK for model inference
- OpenCV for image preprocessing/postprocessing
- NDK targeting arm64-v8a architecture
QAI AppBuilder is designed to help developers easily execute models on Windows on Snapdragon (WoS) and Linux platforms. It encapsulates the Qualcomm® AI Runtime SDK APIs into a set of simplified interfaces for running models on the NPU/HTP (Hexagon Tensor Processor).
Official Repository: https://github.com/quic/ai-engine-direct-helper
- Android Studio Arctic Fox or later
- Android SDK with API level 24 (min) to 36 (target)
- Android NDK (r21 or later recommended)
- Gradle 8.13
- A Snapdragon device with NPU support (tested on Qualcomm Snapdragon platforms)
- OpenCV Android SDK 4.9.0
# Clone the repository
git clone <repository-url>
cd SuperResolution2
# Build debug APK
./gradlew assembleDebug
# Build release APK
./gradlew assembleRelease
# Install on connected device
./gradlew installDebug
# Clean build
./gradlew clean
app/
├── src/main/
│ ├── java/com/example/superresolution/
│ │ └── MainActivity.java # Main Android UI & image processing
│ ├── cpp/
│ │ ├── native-lib.cpp # JNI bridge to QAI AppBuilder
│ │ ├── CMakeLists.txt # Native build configuration
│ │ └── External/
│ │ ├── QAIAppBuilder/ # Qualcomm QAI AppBuilder SDK
│ │ │ ├── include/
│ │ │ │ ├── LibAppBuilder.hpp # Main API header
│ │ │ │ └── Lora.hpp # LoRA adapter support
│ │ │ └── libappbuilder.so # Pre-built library
│ │ └── xtensor/ # Header-only tensor library
│ ├── res/
│ │ └── layout/
│ │ └── activity_main.xml # UI layout
│ └── libs/arm64-v8a/ # QNN runtime libraries
│ ├── libQnnHtp.so # HTP backend
│ ├── libQnnSystem.so # QNN system library
│ └── libQnnCPU.so # CPU fallback
├── build.gradle.kts # App-level Gradle config
└── proguard-rules.pro # ProGuard configuration
Place your super-resolution model .bin files on the device:
/sdcard/AIModels/SuperResolution/
├── real_esrgan_x4plus.bin
├── model2.bin
└── ...
The app automatically scans this directory and populates a dropdown with available models.
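The directory scan can be sketched in plain Java. Note that `ModelScanner` and `filterBinModels` are hypothetical names for illustration, not classes in the app's actual source:

```java
import java.util.ArrayList;
import java.util.List;

public class ModelScanner {
    // Keep only ".bin" files and strip the extension, mirroring how the
    // dropdown would show model names (a sketch; names are hypothetical).
    public static List<String> filterBinModels(String[] fileNames) {
        List<String> models = new ArrayList<>();
        if (fileNames == null) return models;
        for (String name : fileNames) {
            if (name.endsWith(".bin")) {
                models.add(name.substring(0, name.length() - ".bin".length()));
            }
        }
        return models;
    }

    public static void main(String[] args) {
        // In the app, the names would come from
        // new File("/sdcard/AIModels/SuperResolution/").list()
        String[] names = {"real_esrgan_x4plus.bin", "notes.txt", "model2.bin"};
        System.out.println(filterBinModels(names)); // [real_esrgan_x4plus, model2]
    }
}
```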
- Launch the app on your Snapdragon device
- Grant storage permissions when prompted
- Select a model from the dropdown
- Tap "Select Image" to choose an input image
- Tap "Convert" to run super-resolution
- The output image will be displayed and saved to
/sdcard/AIModels/SuperResolution/
- Input: 128x128 RGB image, resized from any source image
- Output: 512x512 RGB image (4x upscaling)
- Data Format: Float32, normalized [0,1], HWC layout
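The [0,1] normalization and its inverse can be sketched as pure Java (helper names are illustrative, not the app's actual methods):

```java
public class PixelMath {
    // Forward step: 8-bit channel values [0,255] -> float32 [0,1].
    public static float[] normalize(int[] channels) {
        float[] out = new float[channels.length];
        for (int i = 0; i < channels.length; i++) {
            out[i] = channels[i] / 255.0f;
        }
        return out;
    }

    // Inverse step applied to model output: clamp to [0,1], scale to [0,255].
    public static int[] denormalize(float[] values) {
        int[] out = new int[values.length];
        for (int i = 0; i < values.length; i++) {
            float v = Math.min(1.0f, Math.max(0.0f, values[i]));
            out[i] = Math.round(v * 255.0f);
        }
        return out;
    }

    public static void main(String[] args) {
        int[] px = {0, 128, 255};
        int[] roundTrip = denormalize(normalize(px));
        System.out.println(java.util.Arrays.toString(roundTrip)); // [0, 128, 255]
    }
}
```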
1. User selects image
↓
2. Java: Resize to 128x128, BGR→RGB conversion
↓
3. Java: Normalize to [0,1] float32
↓
4. Pass via Direct ByteBuffer to native code
↓
5. C++: Initialize QAI AppBuilder with model
↓
6. C++: Execute inference on NPU via QNN
↓
7. C++: Return output via Direct ByteBuffer
↓
8. Java: Denormalize to [0,255], RGB→BGR
↓
9. Display and save result
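Step 4 above relies on direct ByteBuffers so the native side can read pixel data without an extra copy. A minimal sketch of the buffer sizing for a 128x128 RGB float32 input and 512x512 output (class and constant names are assumptions for illustration):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class BufferSetup {
    // 128 * 128 * 3 channels * 4 bytes per float
    static final int IN_SIZE  = 128 * 128 * 3 * Float.BYTES;
    // 512 * 512 * 3 channels * 4 bytes per float
    static final int OUT_SIZE = 512 * 512 * 3 * Float.BYTES;

    public static ByteBuffer allocate(int bytes) {
        // Direct buffers live outside the Java heap, so JNI code can reach
        // them via GetDirectBufferAddress without copying.
        return ByteBuffer.allocateDirect(bytes).order(ByteOrder.nativeOrder());
    }

    public static void main(String[] args) {
        ByteBuffer input = allocate(IN_SIZE);
        ByteBuffer output = allocate(OUT_SIZE);
        System.out.println(input.isDirect() + " " + input.capacity());  // true 196608
        System.out.println(output.capacity());                          // 3145728
    }
}
```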
MainActivity.java (app/src/main/java/com/example/superresolution/MainActivity.java)
- UI controller for image selection and model management
- Image preprocessing: resize, color conversion, normalization
- Postprocessing: denormalization, color conversion
- JNI bridge to native inference code
native-lib.cpp (app/src/main/cpp/native-lib.cpp)
- JNI implementation linking Java to QAI AppBuilder
- Model loading and initialization
- Inference execution on QNN backend
- Memory management for input/output buffers
LibAppBuilder.hpp (app/src/main/cpp/External/QAIAppBuilder/include/LibAppBuilder.hpp)
- Qualcomm QAI AppBuilder API
- Simplified interface to QNN SDK
- Model lifecycle: Initialize, Inference, Destroy
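The Initialize → Inference → Destroy ordering can be illustrated with a small state-checked wrapper. This is only a sketch of the lifecycle contract: the real work happens in native code behind JNI, and the class and method names here are hypothetical:

```java
public class ModelSession {
    private boolean initialized = false;

    public void initialize(String modelPath) {
        // In the app this would call into the QAI AppBuilder library via JNI.
        initialized = true;
    }

    public float[] inference(float[] input) {
        if (!initialized) {
            throw new IllegalStateException("Call initialize() before inference()");
        }
        // Placeholder output: 4x upscaling in each spatial dimension
        // means 16x as many elements.
        return new float[input.length * 16];
    }

    public void destroy() {
        // In the app this would release native model resources.
        initialized = false;
    }

    public static void main(String[] args) {
        ModelSession s = new ModelSession();
        s.initialize("/sdcard/AIModels/SuperResolution/real_esrgan_x4plus.bin");
        float[] out = s.inference(new float[128 * 128 * 3]);
        System.out.println(out.length == 512 * 512 * 3); // true
        s.destroy();
    }
}
```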
The app expects models in Qualcomm's .bin format compatible with QAI AppBuilder. The default model name is real_esrgan_x4plus.
For information on obtaining or converting models, see the QAI AppBuilder documentation.
- Target Architecture: arm64-v8a only
- Backend: QNN HTP (Hexagon Tensor Processor)
- Performance Mode: "burst" mode for optimized inference
- Memory: Uses direct ByteBuffers for zero-copy JNI transfer
- Device Requirements: This app requires a Snapdragon device with NPU support. It will not work on devices without Qualcomm hardware.
- Architecture Lock: The app is configured to build only for arm64-v8a (64-bit ARM). Other architectures are not supported.
- Runtime Libraries: QNN backend libraries (libQnnHtp.so, libQnnSystem.so) must be present in the app's native library directory.
- Model Storage: Models must be placed in /sdcard/AIModels/SuperResolution/ on the device.
- Permissions: The app requires "Manage All Files" permission on Android 11+ to access models and save outputs.
- QAI AppBuilder Official Repository
- QAI AppBuilder Build Instructions
- Qualcomm AI Hub
- Qualcomm Developer Network
Copyright (c) 2025, Qualcomm Innovation Center, Inc. All rights reserved.
SPDX-License-Identifier: BSD-3-Clause
This project is a sample application from the Qualcomm AI Engine Direct Helper repository. See the official repository for the original source code and licensing information.
For issues related to:
- QAI AppBuilder SDK: Report to ai-engine-direct-helper issues
- QNN SDK: Contact Qualcomm Developer Support
- This sample: Check the original repository for updates