There’s no shortage of rumors about Apple’s plans to release camera-equipped wearables. And while it’s easy to get fatigued by yet another wave of upcoming AI-powered hardware, one powerful use case often gets lost in the shuffle: accessibility.
SceneScout, a new research prototype from Apple and Columbia University, isn’t a wearable. Yet. But it hints at what AI could eventually unlock for blind and low-vision users. As the Apple and Columbia University researchers explain:
People who are blind or have low vision (BLV) may hesitate to travel independently in unfamiliar environments due to uncertainty about the physical landscape. While most tools focus on in-situ navigation, those exploring pre-travel assistance typically provide only landmarks and turn-by-turn instructions, lacking detailed visual context. Street view imagery, which contains rich visual information and has the potential to reveal numerous environmental details, remains inaccessible to BLV people.
Source: Apple’s newest AI study unlocks street view for blind users – 9to5Mac