Apple is previewing new features for cognitive, speech, and vision accessibility, including Assistive Access on iPhone and iPad, along with Live Speech, Personal Voice, and Point and Speak in Magnifier, all coming later this year. These updates draw on advances in hardware and software, including on-device machine learning to protect user privacy, and were developed in collaboration with community groups representing people with disabilities. Assistive Access supports users with cognitive disabilities by paring their devices back to what matters most to them.

It focuses on the activities they enjoy, such as connecting with loved ones, capturing photos, and listening to music. The feature was designed with feedback from members of disability communities and their trusted supporters, guided by Apple's Global Accessibility Policy and Initiatives team. Assistive Access distills the Camera, Photos, Music, Calls, and Messages apps to their essential features to lighten cognitive load, with a distinct interface built around high contrast buttons and large text labels. It includes a customized experience for Phone and FaceTime, which are combined into a single Calls app, and Messages offers an emoji-only keyboard and the option to record video messages to share with loved ones. With Assistive Access enabled, iPhone and iPad present a streamlined Home Screen.
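As a rough illustration of the interface ideas described above (a handful of activities, high contrast buttons, large text labels), here is a minimal SwiftUI sketch of a simplified launcher screen. It is a hypothetical mock-up, not Apple's Assistive Access implementation; the view name and activity list are assumptions for demonstration.

```swift
import SwiftUI

// Hypothetical sketch of a simplified, high-contrast launcher screen.
// This is NOT Apple's Assistive Access implementation; it only illustrates
// the design ideas described above: few choices, large text, bold contrast.
struct SimplifiedHomeView: View {
    // A reduced set of activities, mirroring the apps Assistive Access focuses on.
    private let activities = ["Calls", "Messages", "Camera", "Photos", "Music"]

    private let columns = [GridItem(.adaptive(minimum: 150), spacing: 16)]

    var body: some View {
        LazyVGrid(columns: columns, spacing: 16) {
            ForEach(activities, id: \.self) { name in
                Button {
                    // Launch the corresponding simplified experience here.
                } label: {
                    Text(name)
                        .font(.largeTitle.bold())          // large text labels
                        .frame(maxWidth: .infinity, minHeight: 120)
                }
                .foregroundColor(.white)                    // high contrast
                .background(Color.black)
                .cornerRadius(16)
            }
        }
        .padding()
    }
}
```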

With Live Speech on iPhone, iPad, and Mac, users can type what they want to say and have it spoken out loud during phone and FaceTime calls as well as in-person conversations. They can also save commonly used phrases to chime in quickly during lively conversations with family, friends, and colleagues. Live Speech has been designed to support the millions of people globally who are unable to speak or who have lost their speech over time.
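Apple has not said how Live Speech is built, but the public AVSpeechSynthesizer API gives a sense of the type-to-speak idea: text typed by the user is synthesized into speech on-device. The sketch below is illustrative only; the TypeToSpeak class and its sample phrases are hypothetical.

```swift
import AVFoundation

// Hypothetical type-to-speak helper (not Apple's Live Speech implementation).
// Typed text and saved phrases are spoken aloud using the system synthesizer.
final class TypeToSpeak {
    private let synthesizer = AVSpeechSynthesizer()

    // Commonly used phrases a user might want to replay quickly in conversation.
    var savedPhrases: [String] = ["On my way!", "Thank you so much."]

    func speak(_ text: String, rate: Float = AVSpeechUtteranceDefaultSpeechRate) {
        let utterance = AVSpeechUtterance(string: text)
        utterance.rate = rate
        utterance.voice = AVSpeechSynthesisVoice(language: "en-US")
        synthesizer.speak(utterance)
    }
}

// Usage: speak typed text, or replay a saved phrase.
// let speaker = TypeToSpeak()
// speaker.speak("I'll join the call in five minutes.")
// speaker.speak(speaker.savedPhrases[0])
```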

Personal Voice gives users at risk of losing their ability to speak a secure way to create a voice that sounds like them, by reading a set of text prompts to record 15 minutes of audio on iPhone or iPad. It uses on-device machine learning to keep users' information private and secure, and it integrates seamlessly with Live Speech so users can speak with their Personal Voice. Point and Speak, built into detection mode in the Magnifier app on iPhone and iPad, helps users with vision disabilities interact with physical objects that have several text labels. It combines input from the Camera app, the LiDAR Scanner, and on-device machine learning to announce the text on each button as users move their finger across a keypad.
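Apple has not detailed how Point and Speak works internally, but the Vision framework's public text-recognition API illustrates the on-device building block of reading text seen by the camera. The sketch below is an assumption-laden illustration: the recognizeLabels function is hypothetical, and the real feature additionally fuses LiDAR depth data and finger tracking, which are not shown.

```swift
import Foundation
import CoreGraphics
import Vision

// Hypothetical helper (not Apple's Point and Speak code): recognizes text in a
// still image entirely on-device using the Vision framework, and hands back the
// strings so they could be spoken aloud, e.g. with AVSpeechSynthesizer.
func recognizeLabels(in image: CGImage, completion: @escaping ([String]) -> Void) {
    let request = VNRecognizeTextRequest { request, error in
        guard error == nil,
              let observations = request.results as? [VNRecognizedTextObservation] else {
            completion([])
            return
        }
        // Keep the highest-confidence candidate for each detected text region.
        completion(observations.compactMap { $0.topCandidates(1).first?.string })
    }
    request.recognitionLevel = .accurate   // favor accuracy over speed for labels

    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}
```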

Point and Speak also works with VoiceOver and can be used with other Magnifier features such as People Detection, Door Detection, and Image Descriptions. Additional updates are coming across Voice Control, Switch Control, Text Size, and VoiceOver, and users who are deaf or hard of hearing will be able to pair their Made for iPhone hearing devices directly to Mac and customize them for their hearing comfort. Voice Control adds phonetic suggestions for text editing, and a new Voice Control Guide offers tips and tricks about using voice commands as an alternative to touch and typing. Users with physical and motor disabilities who use Switch Control can turn any switch into a virtual game controller to play their favorite games on iPhone and iPad.

Text Size is now easier to adjust across Mac apps such as Finder, Messages, Mail, Calendar, and Notes. For VoiceOver users, Siri voices sound natural and expressive even at high rates of speech feedback, and users can customize the rate at which Siri speaks to them.

To celebrate Global Accessibility Awareness Day, Apple is launching SignTime in Germany, Italy, Spain, and South Korea on May 18, connecting Apple Store and Apple Support customers with on-demand sign language interpreters. Select Apple Store locations are offering informative sessions to help customers discover accessibility features. Apple Podcasts, the Apple TV app, and Apple Music will feature content curated by and about the disability community, including movies and series from notable storytellers, and Shortcuts will add a "Remember This" action that helps users with cognitive disabilities create a visual diary in Notes. And on the App Store, three disability community leaders will share their experiences as nonspeaking individuals and the transformative effects of augmentative and alternative communication (AAC) apps in their lives.