On May 15, 2024, Apple announced new accessibility features, including Eye Tracking, Music Haptics, and Vocal Shortcuts, designed for iPhone and iPad users with physical disabilities, users who are deaf or hard of hearing, and users with atypical speech.
Apple’s Eye Tracking feature, powered by AI, allows users to navigate their iPhone or iPad using only their eyes. Designed for individuals with physical disabilities, it uses the front-facing camera for quick setup and calibration, with all data kept securely on the device.
Eye Tracking works across iPadOS and iOS apps without additional hardware or accessories; users can navigate the elements of an app and use Dwell Control to activate buttons, swipes, and other gestures with their eyes alone.
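Apple has not published a developer API for the system-level Eye Tracking feature. As a conceptual illustration only, the sketch below uses ARKit's face tracking, which also relies on the front-facing camera and processes data on device, to read a gaze estimate; the `GazeTracker` class name and the logging are hypothetical.

```swift
import ARKit

// Conceptual sketch, not Apple's Eye Tracking implementation: ARKit's
// face tracking exposes an on-device gaze estimate via the front camera.
final class GazeTracker: NSObject, ARSessionDelegate {
    private let session = ARSession()

    func start() {
        // Face tracking requires a TrueDepth-capable front camera.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // lookAtPoint is the estimated gaze target in face coordinate
            // space; all processing stays on the device.
            let gaze = face.lookAtPoint
            print("Gaze estimate:", gaze.x, gaze.y, gaze.z)
        }
    }
}
```

A real assistive implementation would map the gaze estimate to screen coordinates and add dwell timing before activating controls.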
Music Haptics enables users who are deaf or hard of hearing to experience music on iPhone through tactile feedback. Using the Taptic Engine, it delivers taps, textures, and vibrations in sync with the music. This feature works with millions of songs in the Apple Music catalog and will be available as an API for developers to enhance music accessibility in their apps.
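The Music Haptics API itself had not shipped at the time of the announcement. As a stand-in, the sketch below uses Core Haptics, which drives the same Taptic Engine, to show how taps can be scheduled in sync with audio; the 120 BPM beat grid and intensity values are hypothetical.

```swift
import CoreHaptics

// Conceptual sketch, not the Music Haptics API: schedule one sharp
// transient tap per beat at 120 BPM (every 0.5 s) for four seconds.
func playBeatHaptics() throws {
    let engine = try CHHapticEngine()
    try engine.start()

    let events = stride(from: 0.0, to: 4.0, by: 0.5).map { time in
        CHHapticEvent(
            eventType: .hapticTransient,
            parameters: [
                CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0),
                CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.8),
            ],
            relativeTime: time
        )
    }

    let pattern = try CHHapticPattern(events: events, parameters: [])
    let player = try engine.makePlayer(with: pattern)
    try player.start(atTime: CHHapticTimeImmediate)
}
```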
With Vocal Shortcuts, iPhone and iPad users can assign custom utterances that Siri understands to launch shortcuts and complete tasks. Listen for Atypical Speech enhances speech recognition across a wider range of speech patterns using on-device machine learning. Both features are designed for users with conditions that affect speech, offering greater customization and control.
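Vocal Shortcuts and Listen for Atypical Speech are system features without a public API at announcement time. As a conceptual illustration, the Speech framework sketch below recognizes speech on device and fires an action when a custom phrase is heard; the `UtteranceListener` class and the trigger phrase are hypothetical.

```swift
import Speech
import AVFoundation

// Conceptual sketch, not Apple's Vocal Shortcuts implementation: listen
// for a custom utterance and run an action when it is recognized.
// Real apps must first call SFSpeechRecognizer.requestAuthorization.
final class UtteranceListener {
    private let recognizer = SFSpeechRecognizer()
    private let audioEngine = AVAudioEngine()
    private let request = SFSpeechAudioBufferRecognitionRequest()

    func start(trigger phrase: String, action: @escaping () -> Void) throws {
        // Prefer on-device recognition when the device supports it.
        if recognizer?.supportsOnDeviceRecognition == true {
            request.requiresOnDeviceRecognition = true
        }

        let input = audioEngine.inputNode
        input.installTap(onBus: 0, bufferSize: 1024,
                         format: input.outputFormat(forBus: 0)) { buffer, _ in
            self.request.append(buffer)
        }
        audioEngine.prepare()
        try audioEngine.start()

        recognizer?.recognitionTask(with: request) { result, _ in
            guard let text = result?.bestTranscription.formattedString else { return }
            if text.lowercased().contains(phrase.lowercased()) {
                action()  // e.g., run a shortcut
            }
        }
    }
}
```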
Vehicle Motion Cues helps reduce motion sickness for iPhone and iPad users in moving vehicles. It displays animated dots on the screen edges to mitigate sensory conflict caused by motion. Using built-in sensors, it detects when a user is in a moving vehicle and adjusts accordingly.
This feature can be set to activate automatically or be toggled on and off in Control Center.
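Apple has not documented how Vehicle Motion Cues detects vehicle travel. The sketch below shows the kind of built-in sensor data such a feature can draw on, using Core Motion's device-motion updates; the 0.3 g threshold and the detection logic are hypothetical simplifications.

```swift
import CoreMotion

// Conceptual sketch, not Apple's detection logic: watch user acceleration
// from the built-in sensors for a signal of vehicle motion.
let motionManager = CMMotionManager()

func startVehicleMotionMonitoring() {
    guard motionManager.isDeviceMotionAvailable else { return }
    motionManager.deviceMotionUpdateInterval = 1.0 / 30.0
    motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let motion = motion else { return }
        // userAcceleration excludes gravity; sustained acceleration is
        // one possible signal of travel in a moving vehicle.
        let a = motion.userAcceleration
        let magnitude = (a.x * a.x + a.y * a.y + a.z * a.z).squareRoot()
        if magnitude > 0.3 {
            // A real implementation would debounce and combine signals
            // before showing on-screen motion cues.
            print("Possible vehicle motion detected")
        }
    }
}
```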
New accessibility features for CarPlay include Voice Control for navigating apps by voice; Sound Recognition, which alerts drivers and passengers who are deaf or hard of hearing to car horns and sirens; and Color Filters to aid users who are colorblind. Additional visual accessibility features, such as Bold Text, are also included.