Apple announces new accessibility features for iPad and iPhone

16th May 2024
Harry Fowle

Apple has announced new accessibility features coming later this year, including Eye Tracking, a way for users with physical disabilities to control their iPad or iPhone with their eyes.

Additionally, Music Haptics will offer a new way for users who are deaf or hard of hearing to experience music using the Taptic Engine in iPhone; Vocal Shortcuts will allow users to perform tasks by making a custom sound; Vehicle Motion Cues can help reduce motion sickness when using iPhone or iPad in a moving vehicle; and more accessibility features will come to visionOS. These features combine the power of Apple hardware and software, harnessing Apple silicon, artificial intelligence, and machine learning to further Apple’s decades-long commitment to designing products for everyone.

“We believe deeply in the transformative power of innovation to enrich lives,” said Tim Cook, Apple’s CEO. “That’s why for nearly 40 years, Apple has championed inclusive design by embedding accessibility at the core of our hardware and software. We’re continuously pushing the boundaries of technology, and these new features reflect our long-standing commitment to delivering the best possible experience to all of our users.”

“Each year, we break new ground when it comes to accessibility,” said Sarah Herrlinger, Apple’s senior director of Global Accessibility Policy and Initiatives. “These new features will make an impact in the lives of a wide range of users, providing new ways to communicate, control their devices, and move through the world.”

Bringing eye tracking to iPhone and iPad

Powered by AI, Eye Tracking gives users a built-in option for navigating iPad and iPhone using only their eyes. Designed for users with physical disabilities, Eye Tracking uses the front-facing camera to set up and calibrate in seconds. With on-device machine learning, all data used to set up and control this feature is kept securely on the device and is not shared with Apple.

Eye Tracking works across iPadOS and iOS apps and does not require additional hardware or accessories. With Eye Tracking, users can navigate through the elements of an app and use Dwell Control to activate each element, accessing additional functions such as physical buttons, swipes, and other gestures solely with their eyes.

Accessible music with haptic features

Music Haptics offers a new way for users who are deaf or hard of hearing to experience music on iPhone. With this accessibility feature enabled, the Taptic Engine in iPhone produces taps, textures, and refined vibrations in sync with the audio of the music. Music Haptics works across millions of songs in the Apple Music catalogue and will be available as an API for developers to make music more accessible in their apps.
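Apple has not yet documented the Music Haptics developer API, so any code here is necessarily speculative. As a rough illustration of the underlying capability, the sketch below uses the existing Core Haptics framework, which already drives the Taptic Engine; the tap-plus-rumble pattern is hypothetical and simply stands in for whatever the real API will synchronise with audio.

```swift
import CoreHaptics

// Illustrative only: the Music Haptics API announced above is not yet
// documented. This uses the existing Core Haptics framework to play a
// short pattern on the Taptic Engine, the same actuator Music Haptics uses.
func playSampleHapticPattern() throws {
    guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }

    let engine = try CHHapticEngine()
    try engine.start()

    // A sharp transient tap followed by a softer continuous rumble,
    // roughly the kind of texture the feature pairs with music.
    let tap = CHHapticEvent(
        eventType: .hapticTransient,
        parameters: [
            CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0),
            CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.8)
        ],
        relativeTime: 0)
    let rumble = CHHapticEvent(
        eventType: .hapticContinuous,
        parameters: [
            CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.4),
            CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.2)
        ],
        relativeTime: 0.1,
        duration: 0.5)

    let pattern = try CHHapticPattern(events: [tap, rumble], parameters: [])
    let player = try engine.makePlayer(with: pattern)
    try player.start(atTime: CHHapticTimeImmediate)
}
```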

Features for a wide range of speech

With Vocal Shortcuts, iPhone and iPad users can assign custom utterances that Siri can understand to launch shortcuts and complete complex tasks. Listen for Atypical Speech, another new feature, uses on-device machine learning to enhance speech recognition for a wider range of speech patterns. Designed for users with acquired or progressive conditions that affect speech, such as cerebral palsy, amyotrophic lateral sclerosis (ALS), or stroke, these features provide a new level of customisation and control, building on features introduced in iOS 17 for users who are nonspeaking or at risk of losing their ability to speak.
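The announcement does not say whether developers must do anything to support Vocal Shortcuts, but the feature plugs into the shortcut actions apps already expose to the system. A minimal sketch, assuming the existing App Intents framework (iOS 16 and later) and using a hypothetical intent name and phrase:

```swift
import AppIntents

// Hypothetical example: an app-defined action that a user could bind to a
// custom utterance via Vocal Shortcuts (or invoke through Siri/Shortcuts).
struct StartReadingSessionIntent: AppIntent {
    static var title: LocalizedStringResource = "Start Reading Session"

    func perform() async throws -> some IntentResult {
        // App-specific work would go here, e.g. opening the reader view.
        return .result()
    }
}

// Publishing the intent as an App Shortcut makes it discoverable by the
// system with no setup on the user's part.
struct ReaderShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: StartReadingSessionIntent(),
            phrases: ["Start a reading session in \(.applicationName)"],
            shortTitle: "Start Reading",
            systemImageName: "book")
    }
}
```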

Mark Hasegawa-Johnson, the Principal Investigator of the Speech Accessibility Project at the Beckman Institute for Advanced Science and Technology at the University of Illinois Urbana-Champaign, stated: “Artificial intelligence has the potential to improve speech recognition for millions of people with atypical speech, so we are thrilled that Apple is bringing these new accessibility features to consumers. The Speech Accessibility Project was designed as a broad-based, community-supported effort to help companies and universities make speech recognition more robust and effective, and Apple is among the accessibility advocates who made the Speech Accessibility Project possible.”

Aiding motion sickness with vehicle motion cues

Vehicle Motion Cues is a new feature for iPhone and iPad designed to help reduce motion sickness for passengers in moving vehicles. Research indicates that motion sickness is often caused by a sensory conflict between what a person sees and what they feel, which can make it uncomfortable for some users to use iPhone or iPad while riding in a vehicle. Vehicle Motion Cues addresses this by displaying animated dots on the edges of the screen to represent changes in vehicle motion, reducing sensory conflict without interfering with the main content. Using built-in sensors, Vehicle Motion Cues recognises when a user is in a moving vehicle and responds accordingly. The feature can be set to activate automatically on iPhone or can be turned on and off in Control Centre.
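Vehicle Motion Cues is a built-in system feature with no developer surface announced, but the detection step described above can be approximated with public API. A purely illustrative sketch using Core Motion's activity classifier (requires the NSMotionUsageDescription key in the app's Info.plist):

```swift
import CoreMotion

// Illustrative only: approximates the "recognises when a user is in a
// moving vehicle" step using Core Motion's activity classification.
// Vehicle Motion Cues itself is a built-in system feature.
let activityManager = CMMotionActivityManager()

func startVehicleDetection(onChange: @escaping (Bool) -> Void) {
    guard CMMotionActivityManager.isActivityAvailable() else { return }
    activityManager.startActivityUpdates(to: .main) { activity in
        // `automotive` is set when the classifier believes the device
        // is travelling in a car, bus, or similar vehicle.
        onChange(activity?.automotive == true)
    }
}
```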

Voice control and improved accessibility for CarPlay

New accessibility features coming to CarPlay include Voice Control, Colour Filters, and Sound Recognition. With Voice Control, users can navigate CarPlay and control apps using only their voice. Sound Recognition allows drivers or passengers who are deaf or hard of hearing to receive alerts for car horns and sirens. Colour Filters make the CarPlay interface easier to use for users who are colourblind, alongside additional visual accessibility features such as Bold Text and Large Text.

More accessibility options for visionOS

This year, accessibility features coming to visionOS will include systemwide Live Captions to assist everyone, including users who are deaf or hard of hearing, in following spoken dialogue in live conversations and audio from apps. Live Captions for FaceTime in visionOS will enable more users to easily enjoy connecting and collaborating using their Persona. Apple Vision Pro will add the capability to move captions using the window bar during Apple Immersive Video, and will support additional Made for iPhone hearing devices and cochlear hearing processors. Updates for vision accessibility will include Reduce Transparency, Smart Invert, and Dim Flashing Lights for users with low vision or those who want to avoid bright lights and frequent flashing.

These features will join the existing accessibility features in Apple Vision Pro, which offers a flexible input system and an intuitive interface designed for a wide range of users. Features such as VoiceOver, Zoom, and Colour Filters can provide users who are blind or have low vision access to spatial computing, while Guided Access can support users with cognitive disabilities. Users can control Vision Pro with any combination of their eyes, hands, or voice, with accessibility features including Switch Control, Sound Actions, and Dwell Control to assist those with physical disabilities.

“Apple Vision Pro is without a doubt the most accessible technology I’ve ever used,” said Ryan Hudson-Peralta, Product Designer, Accessibility Consultant, and Co-Founder of Equal Accessibility LLC. “As someone born without hands and unable to walk, I know the world was not designed with me in mind, so it’s been incredible to see that visionOS just works. It’s a testament to the power and importance of accessible and inclusive design.”
