May 16, 2023
PRESS RELEASE
Apple introduces new features for cognitive accessibility, along with Live Speech, Personal Voice, and Point and Speak in Magnifier
New software features for cognitive, speech and vision accessibility are coming later this year
CUPERTINO, CALIFORNIA – Apple today previewed software features for cognitive, vision, hearing and mobility accessibility, as well as innovative tools for people who cannot speak or are at risk of speech loss. These updates leverage advances in hardware and software, include on-device machine learning to ensure user privacy, and extend Apple’s long-standing commitment to making products for everyone.
Apple works closely with community groups representing a wide range of users with disabilities to develop accessibility features that make a real impact on people’s lives. Later this year, users with cognitive disabilities can use iPhone and iPad more easily and independently with Assistive Access; people who cannot speak can type to speak during calls and conversations with Live Speech; and those at risk of losing the ability to speak can use Personal Voice to create a synthesized voice that sounds like them to communicate with family and friends. For users who are blind or have low vision, Detection Mode in Magnifier offers Point and Speak, which identifies the text users point toward and reads it aloud to help them interact with physical objects such as household appliances.
“At Apple, we’ve always believed that the best technology is technology built for everyone,” said Apple CEO Tim Cook. “Today, we’re excited to share incredible new features that build on our long history of making technology accessible so that everyone can create, communicate and do what they love.”
Sarah Herrlinger, Apple’s senior director of Global Accessibility Policy and Initiatives, said: “Accessibility is part of everything we do at Apple. These groundbreaking features were designed with input from members of the disability community every step of the way to support a diverse set of users and help people connect in new ways.”
Assistive Access Supports Users with Cognitive Disabilities
Assistive Access uses innovations in design to distill apps and experiences to their essential features in order to lighten cognitive load. The feature reflects feedback from people with cognitive disabilities and their trusted supporters – focusing on the activities they enjoy, and that are foundational to iPhone and iPad: connecting with loved ones, capturing and enjoying photos, and listening to music.
Assistive Access includes a customized experience for Phone and FaceTime, which have been combined into a single Calls app, as well as Messages, Camera, Photos and Music. The feature offers a distinct interface with high-contrast buttons and large text labels, as well as tools to help trusted supporters tailor the experience for the individual they support. For example, for users who prefer communicating visually, Messages includes an emoji-only keyboard and the option to record a video message to share with loved ones. Users and trusted supporters can also choose between a more visual, grid-based layout for their Home Screen and apps, or a row-based layout for users who prefer text.
“The intellectual and developmental disability community is full of creativity, but technology often creates physical, visual or cognitive barriers for these individuals,” said Katy Schmid, senior director of National Program Initiatives at The Arc of the United States. “To have a feature on iPhone or iPad that provides a cognitively accessible experience means more open doors to education, employment, safety and autonomy. It means broadening worlds and expanding potential.”
Live Speech and Personal Voice Advance Speech Accessibility
With Live Speech on iPhone, iPad, and Mac, users can type what they want to say and have it spoken aloud during phone and FaceTime calls, as well as in-person conversations. Users can also save commonly used phrases to chime in quickly during lively conversation with family, friends and colleagues. Live Speech is designed to support the millions of people globally who cannot speak or who have lost their speech over time.
For users at risk of losing the ability to speak – for example, those with a recent diagnosis of ALS (amyotrophic lateral sclerosis) or other conditions that can progressively impact speaking ability – Personal Voice is a simple and secure way to create a voice that sounds like them.
Users can create a Personal Voice by reading along with a randomized set of text prompts to record 15 minutes of audio on iPhone or iPad. This speech accessibility feature uses on-device machine learning to keep users’ information private and secure, and integrates seamlessly with Live Speech so users can speak with their Personal Voice when connecting with loved ones.1
“At the end of the day, the most important thing is to be able to communicate with friends and family,” said Philip Green, board member of the nonprofit Team Gleason and ALS advocate, who experienced significant changes in his voice after receiving an ALS diagnosis in 2018. “If you can tell them you love them with a voice that sounds like you, it makes all the difference in the world – and being able to create your synthetic voice on an iPhone in just 15 minutes is extraordinary.”
Detection Mode in Magnifier Introduces Point and Speak for Users Who Are Blind or Have Low Vision
Point and Speak in Magnifier makes it easier for users with vision disabilities to interact with physical objects that have several text labels. For example, while using a household appliance such as a microwave oven, Point and Speak combines input from the Camera app, the LiDAR Scanner, and on-device machine learning to announce the text on each button as users move their finger across the keypad.2 Point and Speak is built into the Magnifier app on iPhone and iPad, works great with VoiceOver, and can be used with other Magnifier features such as People Detection, Door Detection, and Image Descriptions to help users navigate their physical environment.
Additional Features
- Deaf or hard-of-hearing users can pair Made for iPhone hearing devices directly to Mac and customize them for their hearing comfort.3
- Voice Control adds phonetic suggestions for text editing so users who type with their voice can choose the right word out of several that might sound alike, like “do,” “due,” and “dew.”4 Additionally, with Voice Control Guide, users can learn tips and tricks about using voice commands as an alternative to touch and typing across iPhone, iPad and Mac.
- Users with physical and motor disabilities who use Switch Control can turn any switch into a virtual game controller to play their favorite games on iPhone and iPad.
- For users with low vision, Text Size is now easier to adjust across Mac apps such as Finder, Messages, Mail, Calendar and Notes.
- Users who are sensitive to rapid animations can automatically pause images with moving elements, such as GIFs, in Messages and Safari.
- For VoiceOver users, Siri voices sound natural and expressive even at high rates of speech feedback; users can also customize the rate at which Siri speaks to them, with options ranging from 0.8x to 2x.
Celebrating Global Accessibility Awareness Day around the world
To celebrate Global Accessibility Awareness Day, this week Apple is introducing new features, curated collections and more:
- SignTime will launch in Germany, Italy, Spain and South Korea on May 18 to connect Apple Store and Apple Support customers with on-demand sign language interpreters. The service is already available to customers in the US, Canada, UK, France, Australia and Japan.5
- Select Apple Store locations around the world are offering informative sessions throughout the week to help customers discover accessibility features, and Apple Carnegie Library will host a Today at Apple session with sign language performer and interpreter Justina Miles. With group reservations – available year-round – Apple Store locations are a place where community groups can learn about accessibility features together.
- Shortcuts adds “Remember This,” which helps users with cognitive disabilities create a visual diary in Notes for easy reference and reflection.
- This week, Apple Podcasts will offer a collection of shows about the impact of accessible technology; the Apple TV app will feature movies and series curated by prominent storytellers from the disability community; Apple Books will spotlight Being Heumann: An Unrepentant Memoir of a Disability Rights Activist, the memoir by disability rights pioneer Judith Heumann; and Apple Music will highlight American Sign Language (ASL) music videos across genres.
- This week on Apple Fitness+, trainer Jamie-Ray Hartshorne incorporates ASL while highlighting features available to users, part of an ongoing effort to make fitness more accessible to all. Features include Audio Hints, which provide additional short descriptive verbal cues to support users who are blind or have low vision, and Time to Walk and Time to Run episodes become “Time to Walk or Push” and “Time to Run or Push” for wheelchair users. Additionally, Fitness+ trainers incorporate ASL into every workout and meditation, all videos include closed captioning in six languages, and trainers demonstrate modifications in every workout so users at all levels can join in.
- The App Store will spotlight three disability community leaders – Aloysius Gan, Jordyn Zimmerman and Bradley Heaven – each of whom will share their experiences as nonspeaking individuals and the transformative effects of augmentative and alternative communication (AAC) apps in their lives.
About Apple
Apple revolutionized personal technology in 1984 with the introduction of the Macintosh. Today, Apple leads the world in innovation with the iPhone, iPad, Mac, Apple Watch and Apple TV. Apple’s five software platforms – iOS, iPadOS, macOS, watchOS and tvOS – provide a seamless experience across all Apple devices and empower people with cutting-edge services including the App Store, Apple Music, Apple Pay and iCloud. Apple’s more than 100,000 employees are committed to making the best products on earth and leaving the world better than we found it.
1. Personal Voice can be created using iPhone, iPad and Mac with Apple silicon, and will be available in English.
2. Point and Speak will be available on iPhone and iPad devices with the LiDAR Scanner, in English, French, Italian, German, Spanish, Portuguese, Chinese, Cantonese, Korean, Japanese and Ukrainian.
3. Users will be able to pair Made for iPhone hearing devices with select Mac devices with the M1 chip and all Mac devices with the M2 chip.
4. Voice Control phonetic suggestions will be available in English, Spanish, French and German.
5. SignTime sessions are available using American Sign Language (ASL) in the US and Canada, British Sign Language (BSL) in the UK, French Sign Language (LSF) in France, Japanese Sign Language (JSL) in Japan, and Australian Sign Language (Auslan) in Australia. Beginning May 18, SignTime will also be available using German Sign Language (DGS) in Germany, Italian Sign Language (LIS) in Italy, Spanish Sign Language (LSE) in Spain, and Korean Sign Language (KSL) in South Korea.
Press Contacts
Will Butler
Apple
willbutler@apple.com
Eric Hollister Williams
Apple
e_hollisterwillia@apple.com
Apple Media Helpline
media.help@apple.com