Minimal Android R&D project that streams camera frames, offloads edge detection to native C++ (OpenCV), and renders the processed texture with OpenGL ES 2.0. The repo also ships a TypeScript-based viewer for sharing or debugging processed frames on the web.
- Real-time camera preview using `TextureView` (Camera2 API, 1280×720 target).
- JNI bridge to an OpenCV-backed C++ pipeline (NV21 → RGBA conversion + optional Canny edges).
- OpenGL ES renderer (`GLSurfaceView`) that uploads RGBA textures and displays them with a lightweight shader.
- UI toggle between raw and edge-processed output, with an FPS and resolution overlay.
- Frame coalescing on the Kotlin side to drop stale frames and keep latency low (see the sketch after this list).
- TypeScript debug viewer that displays a sample processed frame with live stats controls.
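A minimal sketch of that coalescing, assuming a generic frame payload (the `FrameCoalescer` name is illustrative, not from the repo): a single atomic slot that the camera thread always overwrites and the consumer always drains, so no backlog of stale frames can build up.

```kotlin
import java.util.concurrent.atomic.AtomicReference

// Latest-wins frame slot: the camera thread overwrites any pending
// frame, so the consumer never works through a backlog of stale data.
class FrameCoalescer<T> {
    private val pending = AtomicReference<T?>(null)

    // Camera callback thread: queue the newest frame; the one it
    // replaces is simply dropped and garbage-collected.
    fun offer(frame: T) {
        pending.set(frame)
    }

    // Render/processing thread: take the newest frame, or null if none arrived.
    fun poll(): T? = pending.getAndSet(null)
}
```

Using `AtomicReference` keeps the handoff lock-free, which matters when the camera callback and the render loop run on different threads.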
```
├── app/ # Android application (Kotlin UI + JNI bridge)
│   └── src/main/cpp/ # CMake project for edgeproc shared library
├── gl/ # Reusable OpenGL renderer module
├── jni/ # C++ EdgeProcessor (OpenCV) sources/headers
└── web/ # TypeScript viewer (tsc build)
```
- Install tooling: Android Studio (or command-line SDK), NDK r26+, CMake 3.22+, and OpenCV-for-Android (>=4.8).
- Export the OpenCV package path so CMake can find it (adjust to your filesystem):
```bash
export OpenCV_DIR="/path/to/OpenCV-android-sdk/sdk/native/jni"
```
- Build and install with Gradle:
```bash
./gradlew assembleDebug
./gradlew installDebug
```
- Grant camera permission on first launch. Use the toggle to switch between raw preview and edge rendering while monitoring FPS.
- Kotlin hands NV21 buffers to `NativeBridge`, which forwards them to the `edgeproc` shared library (a sketch of the bridge follows this list).
- `EdgeProcessor` (C++) converts NV21 → RGBA via OpenCV and optionally runs Canny edge detection (5×5 Gaussian pre-filter).
- Processed RGBA frames are returned to Kotlin, uploaded to an OpenGL texture, and drawn by `EdgeRenderer` (the upload step is also sketched below).
- The `gl` Android library module encapsulates shader compilation (`ShaderProgram`) and texture rendering (`EdgeRenderer`).
- `EdgeSurfaceView` wraps `GLSurfaceView`, handles lifecycle, and forwards frames/rotation to the renderer.
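A minimal sketch of the Kotlin side of that bridge. The `processFrame` name and signature are assumptions for illustration; the real declarations live alongside the CMake project under `app/src/main/cpp`:

```kotlin
// Illustrative Kotlin side of the JNI boundary; processFrame and its
// signature are assumed, not copied from the repo.
object NativeBridge {
    init {
        // Loads libedgeproc.so built by the CMake project in app/src/main/cpp.
        System.loadLibrary("edgeproc")
    }

    // Hands an NV21 buffer to C++ and receives the processed RGBA frame.
    external fun processFrame(
        nv21: ByteArray,
        width: Int,
        height: Int,
        applyCanny: Boolean
    ): ByteArray
}
```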
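And a sketch of the texture-upload step `EdgeRenderer` performs. The helper function is illustrative; the real renderer also manages shader state and rotation:

```kotlin
import android.opengl.GLES20
import java.nio.ByteBuffer

// Uploads one RGBA frame into an existing GL texture; must run on the GL thread.
fun uploadRgbaFrame(textureId: Int, rgba: ByteBuffer, width: Int, height: Int) {
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureId)
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR)
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR)
    // Reallocates texture storage every frame for clarity; see the
    // production note at the end of this README for the glTexSubImage2D pattern.
    GLES20.glTexImage2D(
        GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA, width, height, 0,
        GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, rgba
    )
}
```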
```bash
cd web
npm install
npm run build
```
Open `web/public/index.html` in a browser; it loads the compiled `dist/main.js`, renders a sample processed frame (embedded PNG), and exposes a mode selector with FPS/resolution text overlays.
- The pipeline targets practicality over polish; additional shaders (grayscale/invert) or GPU-side filters can be dropped into `gl` without touching the app module.
- Consider wiring a real transport (WebSocket/HTTP) to stream live frames to the TypeScript viewer (see the sketch after this list).
- For production, cache `ByteBuffer`s and reuse OpenGL textures to minimize allocations; the current implementation favors clarity (the reuse pattern is also sketched below).
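One way the suggested transport could look, assuming OkHttp as the WebSocket client (not currently a dependency) and a hypothetical viewer endpoint:

```kotlin
import okhttp3.OkHttpClient
import okhttp3.Request
import okhttp3.WebSocket
import okhttp3.WebSocketListener
import okio.ByteString.Companion.toByteString

// Pushes processed frames to the TypeScript viewer over a WebSocket.
// Class name, endpoint, and framing are illustrative assumptions.
class FrameStreamer(url: String) {
    private val client = OkHttpClient()
    private val socket: WebSocket = client.newWebSocket(
        Request.Builder().url(url).build(),
        object : WebSocketListener() {} // no-op listener for the sketch
    )

    // Sends one RGBA frame as a binary message; the viewer would need a
    // small agreed-upon header to recover width/height.
    fun send(rgba: ByteArray) {
        socket.send(rgba.toByteString())
    }
}
```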
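And a sketch of the allocation-free pattern from the last bullet (class and method names are illustrative): allocate the direct `ByteBuffer` and the texture storage once, then only copy and re-upload each frame.

```kotlin
import android.opengl.GLES20
import java.nio.ByteBuffer
import java.nio.ByteOrder

// One reusable slot per stream: a cached direct buffer plus texture
// storage that is allocated once and overwritten per frame.
class ReusableFrameSlot(private val width: Int, private val height: Int) {
    val buffer: ByteBuffer = ByteBuffer
        .allocateDirect(width * height * 4)
        .order(ByteOrder.nativeOrder())

    // Call once on the GL thread with the texture bound: reserves
    // storage without uploading any pixels.
    fun allocateTexture() {
        GLES20.glTexImage2D(
            GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA, width, height, 0,
            GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, null
        )
    }

    // Per frame: overwrite the existing storage instead of reallocating.
    fun upload() {
        buffer.rewind()
        GLES20.glTexSubImage2D(
            GLES20.GL_TEXTURE_2D, 0, 0, 0, width, height,
            GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, buffer
        )
    }
}
```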