Ahead of Global Accessibility Awareness Day on May 18, Apple on Tuesday previewed software features for cognitive, vision, hearing, and mobility accessibility, along with tools for individuals who are non-speaking or at risk of losing their ability to speak.
"Today, we're excited to share incredible new features that build on our long history of making technology accessible, so that everyone has the opportunity to create, communicate, and do what they love," said Tim Cook, chief executive officer (CEO) at Apple.
"These groundbreaking features were designed with feedback from members of disability communities every step of the way, to support a diverse set of users and help people connect in new ways," said Sarah Herrlinger, senior director of Global Accessibility Policy and Initiatives at Apple.
Below are the details of the new features:
Live Speech
With Live Speech on iPhone, iPad, and Mac, users can type what they want to say and have it spoken out loud during phone and FaceTime calls as well as in-person conversations. Users can also save commonly used phrases. The feature is primarily aimed at those who are unable to speak or who have lost their speech over time.
Personal Voice
Users can create a Personal Voice by reading along with randomly generated text prompts to record 15 minutes of audio on iPhone or iPad. Apple said the feature uses on-device machine learning to keep users' information private and secure, and it integrates with Live Speech. It is designed for users with conditions that can progressively affect speaking ability, such as those with a recent diagnosis of amyotrophic lateral sclerosis (ALS).
Point and Speak in Magnifier
Point and Speak makes it easier for users with vision disabilities to interact with physical objects that carry several text labels. For example, while using a household appliance such as a microwave, the feature combines input from the Camera app, the LiDAR Scanner, and on-device machine learning to announce the text on each button as users move their finger across the keypad.
Point and Speak is built into the Magnifier app on iPhone and iPad and works with VoiceOver, the built-in screen reader. It can also be used alongside other Magnifier features such as People Detection, Door Detection, and Image Descriptions to help users navigate their physical environment.
Assistive Access
Assistive Access uses innovations in design to distil apps and experiences to their essential features, lightening the cognitive load. It offers a distinct interface with high-contrast buttons and large text labels, as well as tools that help trusted supporters tailor the experience for the individual they support.
This includes a customised experience for the Phone and FaceTime apps, which have been combined into a single Calls app, as well as for Messages, Camera, Photos, and Music.
For example, for users who prefer communicating visually, the Messages app includes an emoji-only keyboard and the option to record a video message to share with others. Users can also choose between a more visual, grid-based layout for their Home Screen and apps, or a row-based layout for those who prefer text.
Additional features
Deaf or hard-of-hearing users can pair Made for iPhone hearing devices directly to a Mac and customise them for their hearing comfort.
Moreover, Voice Control adds phonetic suggestions for text editing so that users who type with their voice can choose the right word out of several that might sound alike.
Users with physical and motor disabilities can use Switch Control to turn any switch into a virtual game controller and play their favourite games on iPhone and iPad.