"Are you sure it's edible?"
Checkit is an iOS foraging assistant: it identifies plants and objects through your camera with on-device AI, assesses whether they're safe to eat, and answers questions about your surroundings through a voice-driven assistant.
- Live object detection — YOLO11 draws bounding boxes around plants and objects in real time.
- Plant identification — Tap any box to identify the plant with a 300k-species classifier and get a streaming, Gemma-powered narrative on edibility, toxicity, and preparation.
- Voice assistant — Press and hold to speak. Whisper transcribes your speech on-device; Gemini replies in natural language and builds a local region pack of species knowledge used offline.
- Fully offline inference — All local models (YOLO, the plant classifier, Whisper, and Gemma) run on-device via the Zetic SDK. Only the chat assistant requires a network connection.
| Model | Provider | Role |
|---|---|---|
| YOLO11 | Zetic (on-device) | Real-time object detection |
| Plant classifier (300k species) | Zetic (on-device) | Plant species identification |
| Whisper tiny encoder + decoder | Zetic (on-device) | On-device speech transcription |
| Gemma 2 2B | Zetic (on-device) | Streaming plant/object narratives |
| Gemini 2.5 Flash | Google Cloud | Voice assistant replies + region pack updates |
| Vision framework | Apple (on-device) | Plant/non-plant gate before classification |
- Point your camera at a plant or object.
- YOLO detects it and draws a bounding box.
- Tap the box — the app runs Apple Vision to confirm it looks like a plant (see the gate sketch after these steps), then runs the PlantNet-300k classifier to identify the species.
- Gemma streams a narrative about edibility, toxicity, and any relevant context from your region pack.
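The plant/non-plant gate builds on Apple's stock image classifier. Below is a minimal sketch of how such a gate can be written with Vision's `VNClassifyImageRequest`; the label set and the 0.3 confidence threshold are illustrative assumptions, not the app's actual values.

```swift
import Vision

/// Returns true when Apple's built-in classifier sees a plant-like
/// label in the image with reasonable confidence.
/// Illustrative sketch; labels and threshold are assumptions.
func isLikelyPlant(cgImage: CGImage) throws -> Bool {
    let request = VNClassifyImageRequest()
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try handler.perform([request])

    let plantLabels: Set<String> = ["plant", "tree", "flower", "vegetation"]
    return request.results?.contains { observation in
        plantLabels.contains(observation.identifier.lowercased())
            && observation.confidence > 0.3
    } ?? false
}
```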
- Press and hold to speak into the mic.
- Whisper transcribes your speech entirely on-device.
- Your message is sent to Gemini 2.5 Flash, which replies in the chat and updates a local region pack (a structured list of species relevant to your area); a sketch of the underlying request follows these steps.
- The region pack is used by the Identify tab to provide richer, context-aware results offline.
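The region-pack exchange rides on an ordinary HTTPS call to the Gemini API. A hedged sketch of that request, assuming the public `generateContent` REST endpoint; the prompt handling and response parsing are simplified placeholders for what `GeminiPackService` actually does.

```swift
import Foundation

/// Minimal sketch of a Gemini generateContent call (assumed shape;
/// the app's real GeminiPackService adds region-pack structure).
func askGemini(_ prompt: String, apiKey: String) async throws -> String {
    let url = URL(string: "https://generativelanguage.googleapis.com/v1beta/models/gemini-2.5-flash:generateContent?key=\(apiKey)")!
    var request = URLRequest(url: url)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try JSONSerialization.data(withJSONObject: [
        "contents": [["parts": [["text": prompt]]]]
    ])

    let (data, _) = try await URLSession.shared.data(for: request)
    // Pull the first candidate's text out of the response JSON.
    let json = try JSONSerialization.jsonObject(with: data) as? [String: Any]
    let candidates = json?["candidates"] as? [[String: Any]]
    let content = candidates?.first?["content"] as? [String: Any]
    let parts = content?["parts"] as? [[String: Any]]
    return parts?.first?["text"] as? String ?? ""
}
```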
- Xcode 15+
- iOS 17+ device (on-device models require a physical device)
- ZeticMLangeiOS Swift package (v1.6.0, added via Xcode Package Dependencies)
Create `ios/CheckIt/Secrets.xcconfig` with:

    GEMINI_API_KEY = your_gemini_api_key
    ZETIC_PERSONAL_KEY = your_zetic_personal_key

These are injected into Info.plist at build time and read via `SecretsLoader`. Do not commit this file.
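For the injection to work, Info.plist needs one entry per key whose value is the matching build setting (e.g. `$(GEMINI_API_KEY)`). A minimal sketch of what a loader in the spirit of `SecretsLoader` might look like; the type shape and error case are illustrative.

```swift
import Foundation

/// Illustrative sketch: reads build-time secrets that Secrets.xcconfig
/// injected into Info.plist via $(VARIABLE) substitution.
enum SecretsLoader {
    enum SecretsError: Error { case missing(String) }

    static func value(for key: String) throws -> String {
        guard let value = Bundle.main.object(forInfoDictionaryKey: key) as? String,
              !value.isEmpty else {
            throw SecretsError.missing(key)
        }
        return value
    }
}

// Usage:
// let apiKey = try SecretsLoader.value(for: "GEMINI_API_KEY")
```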
- Open `ios/CheckIt.xcodeproj` in Xcode.
- Add the Zetic package: File → Add Package Dependencies → `https://github.com/zetic-ai/ZeticMLangeiOS.git` → exact version `1.6.0`.
- Add your `Secrets.xcconfig`.
- Build and run on a physical iOS device.
On first launch, the app downloads all five Zetic models (several hundred MB in total). Progress is shown on the loading screen.
ContentView
├── LiveView (Identify tab)
│ ├── CameraService — AVCaptureSession, frame delivery
│ ├── InferenceWorker — YOLO inference per frame
│ ├── BoxTracker — Stabilizes bounding boxes across frames
│ ├── OverlayCanvas — Draws boxes; tap triggers DetailSheet
│ └── DetailSheet
│ ├── VisionPlantGateService — Plant/non-plant gate
│ ├── PlantClassificationService — Species classification
│ ├── OfflinePlantKnowledgeService — Pack-based edibility resolution
│ └── GemmaReasoningService — Streaming narrative
└── AssistantView (Prepare tab)
├── SpeechInputService — AVAudioEngine mic capture
├── WhisperTranscriberService — On-device transcription
└── GeminiPackService — Chat reply + region pack update
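`BoxTracker`'s exact algorithm isn't documented here; one plausible approach to stabilizing boxes across frames is IoU matching followed by exponential smoothing, sketched below. The threshold and smoothing factor are invented values, not the app's.

```swift
import CoreGraphics

/// Plausible sketch of frame-to-frame box stabilization:
/// match each new detection to the best previous box by IoU,
/// then exponentially smooth the matched rectangle.
struct BoxSmoother {
    private var previous: [CGRect] = []
    let iouThreshold: CGFloat = 0.5   // invented value
    let alpha: CGFloat = 0.4          // smoothing factor, invented

    private func iou(_ a: CGRect, _ b: CGRect) -> CGFloat {
        let inter = a.intersection(b)
        guard !inter.isNull else { return 0 }
        let interArea = inter.width * inter.height
        let unionArea = a.width * a.height + b.width * b.height - interArea
        return unionArea > 0 ? interArea / unionArea : 0
    }

    mutating func update(with detections: [CGRect]) -> [CGRect] {
        let smoothed = detections.map { box -> CGRect in
            // Reuse the best-overlapping box from the last frame, if any.
            guard let match = previous.max(by: { iou($0, box) < iou($1, box) }),
                  iou(match, box) > iouThreshold else { return box }
            // Blend old and new rects to damp jitter.
            return CGRect(
                x: alpha * box.minX + (1 - alpha) * match.minX,
                y: alpha * box.minY + (1 - alpha) * match.minY,
                width: alpha * box.width + (1 - alpha) * match.width,
                height: alpha * box.height + (1 - alpha) * match.height)
        }
        previous = smoothed
        return smoothed
    }
}
```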
All five Zetic models are loaded concurrently after permissions are granted (`ModelLoader`). Secrets, model names, and preprocessing constants are centralized in `AppConfig` and `ModelConfig`.
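The Zetic SDK's loading API isn't reproduced here; the sketch below shows only the concurrency shape, with a hypothetical `loadModel(named:)` and illustrative model names standing in for the real per-model calls.

```swift
import Foundation

/// Hypothetical stand-in for a Zetic SDK model load; not the real API.
func loadModel(named name: String) async throws {
    // In the real app this would call into ZeticMLange and report progress.
    print("loaded \(name)")
}

/// Concurrency sketch: load all five models in parallel with a task group.
func loadAllModels() async throws {
    let names = ["yolo11", "plant-classifier", "whisper-encoder",
                 "whisper-decoder", "gemma"]  // illustrative names
    try await withThrowingTaskGroup(of: Void.self) { group in
        for name in names {
            group.addTask { try await loadModel(named: name) }
        }
        try await group.waitForAll()
    }
}
```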
- The app is portrait-only and targets light mode.
- The region pack and transcript are persisted to Application Support and survive restarts (a persistence sketch follows these notes). Triple-tap the divider in the Prepare tab to clear them.
- A debug HUD (2-second long-press on the live view) shows FPS, detection class, and model latencies.
- This is a beta — plant identification should not be used as the sole basis for foraging decisions.
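For the persistence note above, a minimal sketch of round-tripping a Codable region pack through Application Support; the `RegionPack` shape and file name are placeholders, not the app's actual schema.

```swift
import Foundation

struct RegionPack: Codable {          // placeholder shape
    var region: String
    var species: [String]
}

/// Sketch: persist the region pack to Application Support so it
/// survives restarts. File name and type are illustrative.
func saveRegionPack(_ pack: RegionPack) throws {
    let dir = try FileManager.default.url(
        for: .applicationSupportDirectory, in: .userDomainMask,
        appropriateFor: nil, create: true)
    let url = dir.appendingPathComponent("region-pack.json")
    try JSONEncoder().encode(pack).write(to: url, options: .atomic)
}

func loadRegionPack() throws -> RegionPack? {
    let dir = try FileManager.default.url(
        for: .applicationSupportDirectory, in: .userDomainMask,
        appropriateFor: nil, create: true)
    let url = dir.appendingPathComponent("region-pack.json")
    guard FileManager.default.fileExists(atPath: url.path) else { return nil }
    return try JSONDecoder().decode(RegionPack.self, from: Data(contentsOf: url))
}
```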