
Reaffirming its commitment to inclusivity, Cupertino giant Apple is introducing a range of accessibility tools later this year, tailored specifically for individuals with physical disabilities and those who are deaf or hard of hearing.
The new features include Eye Tracking, Music Haptics, Vocal Shortcuts, Vehicle Motion Cues, and more. These features combine Apple's hardware and software, harnessing Apple silicon, artificial intelligence, and machine learning.
“We believe deeply in the transformative power of innovation to enrich lives,” said Tim Cook, Apple’s CEO. “That’s why for nearly 40 years, Apple has championed inclusive design by embedding accessibility at the core of our hardware and software. We’re continuously pushing the boundaries of technology, and these new features reflect our long-standing commitment to delivering the best possible experience to all of our users.”
Eye Tracking on iPhone, iPad
Eye Tracking gives users with physical disabilities a built-in, AI-powered way to control iPad or iPhone with just their eyes. It uses the front-facing camera to set up and calibrate in seconds, and because it relies on on-device machine learning, all data used to set up and control the feature is kept securely on device and isn't shared with Apple. Eye Tracking works across iPadOS and iOS apps and requires no additional hardware or accessories. Users can navigate through the elements of an app and use Dwell Control to activate each one, accessing functions such as physical buttons, swipes, and other gestures solely with their eyes.
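Apple hasn't said how Eye Tracking is built, but the underlying capability, estimating gaze from the front-facing camera with on-device models, is already visible to developers through ARKit's face tracking. A minimal, illustrative Swift sketch (this is not Apple's Eye Tracking implementation):

```swift
import ARKit

// Illustrative sketch only: reads ARKit's gaze estimate from the
// front-facing TrueDepth camera. Not Apple's system Eye Tracking feature.
final class GazeReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // lookAtPoint is the estimated gaze target in face-anchor space;
            // a real dwell-control system would map this to screen coordinates.
            print("Gaze estimate:", face.lookAtPoint)
        }
    }
}
```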
Music Haptics
With Music Haptics, Apple offers a new way for users who are deaf or hard of hearing to experience music using the Taptic Engine in iPhone. With the feature turned on, the Taptic Engine plays taps, textures, and refined vibrations to the audio of the music. Music Haptics works across millions of songs in the Apple Music catalog, and will be available as an API for developers to make music more accessible in their apps.
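Apple has not yet published details of the Music Haptics API. As an illustrative sketch, here is how an app can already drive the Taptic Engine with taps and textures using the documented Core Haptics framework; the specific pattern values are hypothetical:

```swift
import CoreHaptics

// Hypothetical sketch: plays a sharp "tap" followed by a softer sustained
// "texture" on the Taptic Engine, the kind of haptic vocabulary Music
// Haptics describes. Not the Music Haptics API itself.
func playBeatHaptic() throws {
    guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }
    let engine = try CHHapticEngine()
    try engine.start()

    let tap = CHHapticEvent(
        eventType: .hapticTransient,
        parameters: [
            CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0),
            CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.8)
        ],
        relativeTime: 0)
    let texture = CHHapticEvent(
        eventType: .hapticContinuous,
        parameters: [CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.4)],
        relativeTime: 0.1,
        duration: 0.5)

    let pattern = try CHHapticPattern(events: [tap, texture], parameters: [])
    try engine.makePlayer(with: pattern).start(atTime: 0)
}
```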
Vocal Shortcuts
Vocal Shortcuts lets users perform tasks by making a custom sound. With this feature, iPhone and iPad users can assign custom utterances that Siri can understand to launch shortcuts and complete complex tasks. Listen for Atypical Speech, another new feature, gives users an option for enhancing speech recognition for a wider range of speech, using on-device machine learning to recognise a user's speech patterns. Designed for users with acquired or progressive conditions that affect speech, such as cerebral palsy, amyotrophic lateral sclerosis (ALS), or stroke, these features provide a new level of customisation and control, building on features introduced in iOS 17 for users who are nonspeaking or at risk of losing their ability to speak.
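Because Vocal Shortcuts launches the same actions apps already expose to Siri and the Shortcuts app, developers can make their features reachable this way through the App Intents framework. A minimal sketch; LogWaterIntent is a hypothetical example action, not one Apple has announced:

```swift
import AppIntents

// Hypothetical example: an app action that a user could assign to a custom
// utterance via Vocal Shortcuts, or run from Siri and the Shortcuts app.
struct LogWaterIntent: AppIntent {
    static var title: LocalizedStringResource = "Log a Glass of Water"

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // App-specific work (e.g. saving to a health log) would go here.
        return .result(dialog: "Logged one glass of water.")
    }
}
```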
Reduce motion sickness with Vehicle Motion Cues
Vehicle Motion Cues is a new experience for iPhone and iPad that can help reduce motion sickness for passengers in moving vehicles. Apple says research shows motion sickness is commonly caused by a sensory conflict between what a person sees and what they feel, which can prevent some users from comfortably using iPhone or iPad while riding in a moving vehicle. With Vehicle Motion Cues, animated dots on the edges of the screen represent changes in vehicle motion, helping reduce that sensory conflict without interfering with the main content. Using built-in sensors, the feature recognises when a user is in a moving vehicle and responds accordingly; it can be set to show automatically on iPhone, or turned on and off in Control Center.
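Vehicle Motion Cues itself is a system feature, but the "is the user in a moving vehicle?" signal it depends on resembles what the documented Core Motion framework already exposes to apps. A hedged sketch of detecting automotive motion:

```swift
import CoreMotion

// Illustrative sketch: uses Core Motion's activity classification to notice
// when the device appears to be in a moving vehicle. Apple's actual
// Vehicle Motion Cues implementation is not public.
let activityManager = CMMotionActivityManager()

func watchForDriving() {
    guard CMMotionActivityManager.isActivityAvailable() else { return }
    activityManager.startActivityUpdates(to: .main) { activity in
        if activity?.automotive == true {
            print("Passenger appears to be in a moving vehicle")
        }
    }
}
```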
In addition, accessibility features are coming to visionOS this year. These will include systemwide Live Captions to help everyone, including users who are deaf or hard of hearing, follow along with spoken dialogue in live conversations and in audio from apps. With Live Captions for FaceTime in visionOS, more users can easily enjoy the unique experience of connecting and collaborating using their Persona.
Other features for users who are blind or have low vision include updates to VoiceOver, which will add new voices, a flexible Voice Rotor, custom volume control, and the ability to customise VoiceOver keyboard shortcuts on Mac. Magnifier will offer a new Reader Mode and the option to launch Detection Mode with the Action button. Braille users will get a new way to start and stay in Braille Screen Input for faster control and text editing. For users with low vision, Hover Typing will show larger text when typing in a text field, in a user's preferred font and colour.
For users at risk of losing their ability to speak, Personal Voice will be available in Mandarin Chinese, and users who have difficulty pronouncing or reading full sentences will be able to create a Personal Voice using shortened phrases. For users who are nonspeaking, Live Speech will include categories and simultaneous compatibility with Live Captions.
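Personal Voice is also reachable from third-party apps through the documented AVSpeechSynthesis APIs, once a user grants access. A brief sketch, assuming the user has already created a Personal Voice and authorises the app:

```swift
import AVFoundation

// Sketch under assumptions: speaks text with the user's Personal Voice,
// if one exists and the user has authorised this app to use it.
let synthesizer = AVSpeechSynthesizer()

func speakWithPersonalVoice(_ text: String) {
    AVSpeechSynthesizer.requestPersonalVoiceAuthorization { status in
        guard status == .authorized else { return }
        let personal = AVSpeechSynthesisVoice.speechVoices()
            .first { $0.voiceTraits.contains(.isPersonalVoice) }
        let utterance = AVSpeechUtterance(string: text)
        utterance.voice = personal
        synthesizer.speak(utterance)
    }
}
```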