## Summary

Add spoken turn-by-turn prompts during in-app navigation, similar to Apple Maps voice guidance, rather than only silent on-screen cues.
## Motivation
- Safer walks: glance at the phone less often.
- Clearer reassurance at decision points (turns, replans, nearing destination).
## Scope (to refine)
- Integrate the iOS APIs appropriate for localized speech and route guidance (for example `AVSpeechSynthesizer`, MapKit / `MKDirections`-style cues if aligned with how routes are modeled, or `CLLocation` / step-geometry triggers).
- Respect silent mode and spoken-content accessibility settings where applicable, or offer an in-app toggle.
- Pause or duck other audio via `AVAudioSession` if we mix with music or podcasts.
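A minimal sketch of how the speech and audio-ducking pieces could fit together. `VoicePromptPlayer` is a hypothetical helper name, and the assumption that route steps expose plain instruction strings is ours, not the app's confirmed model:

```swift
import AVFoundation

// Hypothetical helper (name and shape are illustrative, not the app's real API).
final class VoicePromptPlayer {
    private let synthesizer = AVSpeechSynthesizer()

    // Configure the shared audio session so prompts duck music/podcasts
    // instead of stopping them outright.
    func activateSession() throws {
        let session = AVAudioSession.sharedInstance()
        try session.setCategory(.playback, mode: .voicePrompt, options: [.duckOthers])
        try session.setActive(true)
    }

    // Speak one instruction in the user's current locale.
    func speak(_ instruction: String) {
        let utterance = AVSpeechUtterance(string: instruction)
        utterance.voice = AVSpeechSynthesisVoice(language: Locale.current.identifier)
        synthesizer.speak(utterance)
    }
}
```

The `.voicePrompt` mode and `.duckOthers` option are what Apple provides for exactly this kind of short spoken guidance over background audio; whether we want `.playback` versus another category depends on how the app already manages its session.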
## Acceptance (draft)

- Docs updated (`docs/in-app-navigation.md` or successor).
## Notes
- Depends on the existing plan/nav step model (`planLoop`, steps, reroutes); implementation should attach prompts to stable events rather than ad-hoc strings everywhere.
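One way to keep prompts attached to stable events is a small event type that the plan loop emits, with the spoken text derived in one place. `NavEvent` and its cases are hypothetical names for illustration, not the app's actual model:

```swift
// Hypothetical event type (names are illustrative, not the real nav model).
enum NavEvent {
    case turn(direction: String, street: String)
    case reroute
    case approachingDestination(meters: Int)

    // Single source of truth for prompt wording, instead of scattered strings.
    var spokenPrompt: String {
        switch self {
        case .turn(let direction, let street):
            return "Turn \(direction) onto \(street)."
        case .reroute:
            return "Recalculating your route."
        case .approachingDestination(let meters):
            return "Your destination is \(meters) meters ahead."
        }
    }
}
```

Wiring the synthesizer to `spokenPrompt` would also make the strings straightforward to localize and unit-test without any audio involved.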