mobile-ai-orchestrator is a platform-agnostic Rust library for intelligent AI routing on constrained devices. It decides where and how to run AI inference: locally, remotely, or as a hybrid of the two.
> [!IMPORTANT]
> This is a library/framework, NOT an application. For a complete Android application using this library, see neurophone.
```
┌─────────────────────────────────────────────────────────────────┐
│                     mobile-ai-orchestrator                      │
│                          (THIS LIBRARY)                         │
├─────────────────────────────────────────────────────────────────┤
│                                                                 │
│   Query ──► Expert ──► Router ──► Context ──► Inference         │
│             System     Decision   Manager     Dispatch          │
│            (Safety)   (Where?)   (History)   (Execute)          │
│                                                                 │
│   Decides:  "Should this run locally or in the cloud?"          │
│   Handles:  Safety rules, privacy filtering, context tracking   │
│   Provides: Routing decisions, NOT the actual inference         │
│                                                                 │
└─────────────────────────────────────────────────────────────────┘
                                 │
                                 ▼
                 ┌───────────────┴───────────────┐
                 │                               │
         ┌───────▼───────┐             ┌─────────▼─────────┐
         │   Local SLM   │             │     Cloud API     │
         │  (llama.cpp)  │             │   (Claude, etc.)  │
         │  Your choice  │             │    Your choice    │
         └───────────────┘             └───────────────────┘
```

| Feature | This Library | Typical AI SDKs |
|---|---|---|
| Platform | Platform-agnostic (Linux, Android, iOS, embedded) | Platform-specific |
| Decision Layer | Built-in routing intelligence | Direct API calls only |
| Safety | Zero `unsafe` | Varies |
| Offline | Offline-first by design | Network required |
| Privacy | Automatic sensitive data detection | Manual handling |
- **Mobile apps** - Let the library decide local vs cloud based on query complexity
- **Edge devices** - Intelligent fallback when connectivity is limited
- **Privacy-sensitive** - Automatic blocking of sensitive data from cloud
- **Resource-constrained** - Battery- and memory-aware routing (see the sketch after this list)
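To make the resource-constrained case concrete, here is a minimal heuristic sketch. `DeviceState`, its fields, and the thresholds are assumptions chosen for illustration, not types or values exported by this library.

```rust
/// Illustrative device snapshot; not part of mobile-ai-orchestrator's API.
struct DeviceState {
    battery_pct: f32, // remaining battery, 0.0..=1.0
    free_mem_mb: u64, // free RAM in megabytes
    online: bool,     // is a network route currently available?
}

/// Toy battery- and memory-aware routing heuristic.
fn prefer_local(dev: &DeviceState, query_complexity: f32) -> bool {
    // With no connectivity the only option is to stay local.
    if !dev.online {
        return true;
    }
    // Otherwise keep simple queries on-device, unless battery or memory
    // pressure makes local inference a bad idea.
    let battery_ok = dev.battery_pct > 0.2;
    let memory_ok = dev.free_mem_mb > 512;
    query_complexity < 0.6 && battery_ok && memory_ok
}
```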
Add to your Cargo.toml:
```toml
[dependencies]
mobile-ai-orchestrator = "0.1"

# Optional: enable network features
mobile-ai-orchestrator = { version = "0.1", features = ["network"] }
```

```rust
use mobile_ai_orchestrator::{Orchestrator, OrchestratorConfig, Route};

// Create an orchestrator with the default config
let config = OrchestratorConfig::default();
let orchestrator = Orchestrator::new(config);

// Process a query - the library decides the routing
let decision = orchestrator.process("How do I iterate a HashMap?")?;

match decision.route {
    Route::Local => {
        // Run with your local SLM (llama.cpp, etc.)
        let response = your_local_llm.generate(&decision.context);
    }
    Route::Remote => {
        // Send to your cloud API (Claude, GPT, etc.)
        let response = your_cloud_api.query(&decision.context);
    }
    Route::Blocked => {
        // Query blocked by safety rules
        println!("Blocked: {}", decision.reason);
    }
}
```

| Component | Purpose | File |
|---|---|---|
| Expert System | Rule-based safety layer (block dangerous queries) | |
| Router | Heuristic + neural routing decisions | |
| Context Manager | Conversation history and project state | |
| Orchestrator | Main coordinator, pipeline execution | |
| Reservoir | Echo State Network for temporal compression | |
| MLP | Multi-layer perceptron for learned routing | |
| SNN | Spiking neural network for wake detection | |
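Read top to bottom, the table mirrors the pipeline in the architecture diagram. The following is a conceptual sketch of that flow only; the checks and thresholds are placeholders, not the library's internal code.

```rust
// Conceptual flow: each step corresponds to a component in the table above.
fn route_conceptually(query: &str) -> &'static str {
    // 1. Expert System: the rule-based safety gate runs first and can veto.
    if query.contains("rm -rf") {
        return "blocked";
    }
    // 2. Router: a cheap heuristic (word count as a complexity proxy).
    let complexity = query.split_whitespace().count() as f32 / 50.0;
    // 3. Context Manager: conversation history would be attached here.
    // 4. Orchestrator: hands back the decision; inference itself stays with
    //    the caller, exactly as in the Quick Start example above.
    if complexity < 0.6 {
        "local"
    } else {
        "remote"
    }
}
```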
The library includes three neural computing approaches:

- **Reservoir Computing (ESN)** - Compress conversation history into a fixed-size state
- **Multi-Layer Perceptron** - Learn routing decisions from data
- **Spiking Neural Network** - Ultra-low-power wake-word detection

These are optional enhancements over the heuristic baseline; a minimal reservoir sketch follows below.
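As a rough illustration of the reservoir idea (not the crate's Reservoir API), an Echo State Network keeps a fixed-size state vector and folds each new input into it through fixed random weights; only a small readout trained on top of the state ever learns. One update step:

```rust
// One ESN step: x_t = tanh(W_in * u_t + W_res * x_{t-1}).
// Shapes: state = N, input = M, w_in = N x M, w_res = N x N.
fn esn_step(state: &mut [f32], input: &[f32], w_in: &[Vec<f32>], w_res: &[Vec<f32>]) {
    let prev = state.to_vec(); // snapshot of x_{t-1}
    for i in 0..state.len() {
        let mut acc = 0.0f32;
        for (j, &u) in input.iter().enumerate() {
            acc += w_in[i][j] * u; // project the new input into the reservoir
        }
        for (j, &x) in prev.iter().enumerate() {
            acc += w_res[i][j] * x; // recurrent "echo" of earlier inputs
        }
        state[i] = acc.tanh(); // squashing keeps the state bounded
    }
}
```

Calling a step like this once per turn compresses an arbitrarily long conversation into the fixed-size `state`, which is the temporal compression the table above refers to.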
- Zero `unsafe` blocks in the entire codebase
- Type-safe by design (Rust ownership model)
- Memory-safe (compile-time guarantees)
- Formal rule-based safety layer
- Core functionality works without internet
- Network features behind a feature flag (see the sketch below)
- Graceful degradation
- Local decision-making always available
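The feature flag works through ordinary Cargo feature gating; the function below is illustrative, not part of the crate's API, and only shows the compile-time pattern.

```rust
// With `--features network` the real probe is compiled in; without it,
// the stub keeps every decision local-only.
#[cfg(feature = "network")]
fn remote_available() -> bool {
    // A real connectivity check would live here, behind the flag.
    true
}

#[cfg(not(feature = "network"))]
fn remote_available() -> bool {
    // Offline build: routing degrades gracefully to local-only.
    false
}
```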
```rust
let config = OrchestratorConfig {
    // Safety thresholds
    safety_threshold: 0.8,

    // Routing preferences
    prefer_local: true,
    max_local_complexity: 0.6,

    // Context settings
    max_history_items: 100,
    project_aware: true,

    // Privacy rules
    block_patterns: vec![
        r"password|secret|api.?key".to_string(),
    ],
};
```
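To see what a `block_patterns` entry matches in isolation, the same expression can be exercised directly with the `regex` crate. This is a standalone check, not a call into the orchestrator, and the case-insensitive `(?i)` flag is added here for the illustration.

```rust
use regex::Regex;

fn is_blocked(query: &str) -> bool {
    // `(?i)` makes the alternation case-insensitive for this example.
    let pattern = Regex::new(r"(?i)password|secret|api.?key").expect("valid pattern");
    pattern.is_match(query)
}

fn main() {
    assert!(is_blocked("what is my API key?"));          // matches `api.?key`
    assert!(!is_blocked("How do I iterate a HashMap?")); // nothing sensitive
}
```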
The library includes a CLI for testing:

```bash
# Build CLI
cargo build --release

# Interactive mode
./target/release/mobile-ai --interactive

# Single query
./target/release/mobile-ai "Explain ownership in Rust"

# With project context
./target/release/mobile-ai --project myproject "What's the architecture?"
```

neurophone uses this library for AI routing:
```rust
// In neurophone-core
use mobile_ai_orchestrator::{Orchestrator, Route};

let orchestrator = Orchestrator::new(config);
let decision = orchestrator.process(&user_query)?;

match decision.route {
    Route::Local => llama_client.generate(decision.context),
    Route::Remote => claude_client.query(decision.context),
    Route::Blocked => { /* handle blocked query */ }
}
```

Run performance benchmarks:
```bash
cargo bench
```

| Operation | Time | Memory |
|---|---|---|
| Route decision (heuristic) | ~50 μs | ~1 KB |
| Route decision (MLP) | ~200 μs | ~10 KB |
| Context update | ~10 μs | ~100 B |
| Reservoir step | ~500 μs | ~50 KB |
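For reference, a Criterion benchmark of the shape `cargo bench` runs looks roughly like the following; `route_heuristically` is a toy stand-in, not the function the crate's own suite measures.

```rust
use criterion::{criterion_group, criterion_main, Criterion};

// Toy stand-in for a heuristic route decision, purely to show the harness.
fn route_heuristically(query: &str) -> bool {
    query.split_whitespace().count() < 30
}

fn bench_route(c: &mut Criterion) {
    c.bench_function("route decision (heuristic)", |b| {
        b.iter(|| route_heuristically("How do I iterate a HashMap?"))
    });
}

criterion_group!(benches, bench_route);
criterion_main!(benches);
```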
This project achieves Bronze-level RSR (Rhodium Standard Repository) compliance:
- Type safety (Rust type system)
- Memory safety (ownership model, zero `unsafe`)
- Offline-first (network is optional)
- Comprehensive documentation
- Test coverage (>90%)
- Build system (`justfile`, `flake.nix`)
- CI/CD automation
- Security policy
| Project | Relationship | Description |
|---|---|---|
| neurophone | Consumer | Android app that uses this library for on-device AI |
| | Complementary | Conversation context preservation across sessions |
| | Inspiration | Safety-critical programming language concepts |
```bash
# Run tests
cargo test

# Run benchmarks
cargo bench

# Generate docs
cargo doc --open

# Run examples
cargo run --example basic_usage
cargo run --example reservoir_demo
cargo run --example mlp_router
```

Contributions welcome! This project operates under TPCF Perimeter 3 (Community Sandbox).
See CONTRIBUTING.md for guidelines.
```bibtex
@software{mobile_ai_orchestrator_2025,
  author = {Jewell, Jonathan D.A.},
  title  = {Mobile AI Orchestrator: Platform-Agnostic AI Routing Library},
  year   = {2025},
  url    = {https://github.com/hyperpolymath/heterogenous-mobile-computing},
  note   = {RSR Bronze-compliant}
}
```

- Author: Jonathan D.A. Jewell
- Email: hyperpolymath@protonmail.com
- Matrix: See MAINTAINERS.md
Platform-Agnostic • Offline-First • Zero Unsafe • RSR Bronze Compliant