An iOS app experiment that uses on-device LLMs to detect and neutralize dark patterns in websites. Browse the web with hidden "decline" buttons revealed, manipulative UI exposed, and deceptive design patterns removed.
- Dark Pattern Detection - Identify manipulative UI patterns like hidden decline buttons, fake urgency timers, and confusing opt-out flows
- On-Device Processing - All analysis runs locally using Apple Foundation Models or MLX - no data leaves your device
- Sanitization - Extract the page HTML, analyze it with the LLM, let the user select what to fix, and generate JavaScript that removes the dark patterns
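The extraction step can be sketched with WKWebView's standard JavaScript bridge (the helper name `extractHTML` is illustrative, not from the project):

```swift
import WebKit

// Illustrative helper: pull the full serialized DOM out of a WKWebView
// so it can be handed to the on-device LLM for classification.
func extractHTML(from webView: WKWebView, completion: @escaping (String?) -> Void) {
    webView.evaluateJavaScript("document.documentElement.outerHTML") { result, _ in
        completion(result as? String)
    }
}
```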
- Context Window - the biggest limitation. Processing real websites becomes feasible with optimizations that strip away HTML bloat and irrelevant markup.
- Server-side blindness - is "only 3 left" a dark pattern, or are there really only 3 items left? The page alone can't answer that.
- Processing speed - a first step would be a caching system that auto-applies fixes on subsequent visits to a site
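One way to shrink pages toward the model's context window is to strip non-semantic markup before classification. A rough, lossy sketch of such a pre-pass (regex-based, function name illustrative), written as the kind of script the WebView could run before extraction:

```javascript
// Illustrative, lossy HTML slimming: drop content the user never sees
// (scripts, styles, comments, inline SVG) and presentational attributes,
// then collapse whitespace so more of the page fits in the context window.
function stripHtmlBloat(html) {
  return html
    .replace(/<script[\s\S]*?<\/script>/gi, "")        // inline/external scripts
    .replace(/<style[\s\S]*?<\/style>/gi, "")          // stylesheets
    .replace(/<!--[\s\S]*?-->/g, "")                   // comments
    .replace(/<svg[\s\S]*?<\/svg>/gi, "")              // vector graphics
    .replace(/\s(?:data-[\w-]+|style)="[^"]*"/gi, "")  // presentational attributes
    .replace(/\s+/g, " ")                              // collapse whitespace
    .trim();
}
```

A regex pass like this is crude (it can mangle edge cases a real parser would handle), but it keeps the sanitizer dependency-free inside the page context.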
- iOS 26.2+
- Xcode 26.0+ (required for the iOS 26 SDK and Foundation Models)
- Device with Apple Intelligence or sufficient RAM for local models
- Clone the repository
- Open `fairplay.xcodeproj` in Xcode
- Add Swift packages via File → Add Package Dependencies
- Build and run on simulator or device
This project supports hot reloading via Inject. To enable it:
- Download InjectionIII and place it in `/Applications` (use the GitHub release instead of the App Store version)
- Build and run your app in the simulator
The injection bundle loads automatically. Save any Swift file and changes appear instantly without rebuilding!
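With InjectionIII running, a SwiftUI view opts in through the standard Inject pattern; a minimal sketch (the view name is illustrative):

```swift
import SwiftUI
import Inject

struct BrowserView: View {
    // Triggers a re-render whenever InjectionIII swaps in recompiled code
    @ObserveInjection var inject

    var body: some View {
        Text("fairplay")
            .enableInjection() // hooks this view into the injection bundle
    }
}
```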
WebView loads page
↓
Extract HTML via JavaScript
↓
Send to on-device LLM for classification
↓
User selects what to fix
↓
Send to on-device LLM to generate fix for specific pattern
↓
Run JavaScript that applies the changes
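The last step of the pipeline is just evaluating the generated script in the page context. A sketch, assuming `fixScript` is the JavaScript produced by the LLM step above:

```swift
import WebKit

// Illustrative: run the LLM-generated JavaScript in the loaded page.
// A fix script might, for example, reveal a hidden decline button:
//   document.querySelector('.decline-btn').style.display = 'block'
func apply(fixScript: String, in webView: WKWebView) {
    webView.evaluateJavaScript(fixScript) { _, error in
        if let error {
            print("Fix failed to apply: \(error)")
        }
    }
}
```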
LLM Backend: LocalLLMClient provides a unified interface for:
- Apple Foundation Models (on-device)
- llama.cpp with GGUF models (tried, but incompatible with MLX and slower)
- MLX models
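For a sense of the Apple Foundation Models path, here is a direct FoundationModels call; note this is a sketch with an illustrative function name and prompt, and the project actually routes calls through LocalLLMClient, so real call sites differ:

```swift
import FoundationModels

// Illustrative classification call against Apple's on-device model.
func classifyPatterns(in html: String) async throws -> String {
    let session = LanguageModelSession(instructions:
        "List any dark patterns (hidden decline buttons, fake urgency, forced opt-ins) in the given HTML."
    )
    let response = try await session.respond(to: html)
    return response.content
}
```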
For larger models, add `com.apple.developer.kernel.increased-memory-limit` to your entitlements.
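In the app's `.entitlements` file, that looks like:

```xml
<key>com.apple.developer.kernel.increased-memory-limit</key>
<true/>
```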
MIT