May 16, 2023
Apple introduces new cognitive accessibility features, along with Live Speech, Personal Voice, and Point and Speak in Magnifier
New cognitive, speech, and vision accessibility software features are coming later this year
Cupertino, California — Apple today previewed software features for cognitive, vision, hearing, and mobility accessibility, along with innovative tools for individuals who are nonspeaking or at risk of losing their ability to speak. These updates draw on advances in hardware and software, include on-device machine learning to ensure user privacy, and expand on Apple’s longstanding commitment to making products for everyone.
Apple works in deep collaboration with community groups representing a broad spectrum of users with disabilities to develop accessibility features that have a real impact on people’s lives. Coming later this year, users with cognitive disabilities can use iPhone and iPad with greater ease and independence with Assistive Access; nonspeaking individuals can type to speak during calls and conversations with Live Speech; and people at risk of losing their ability to speak can use Personal Voice to create a synthesized voice that sounds like them for connecting with family and friends. For users who are blind or have low vision, Detection Mode in Magnifier offers Point and Speak, which identifies text that users point toward and reads it out loud to help them interact with physical objects such as household appliances.
“At Apple, we’ve always believed that the best technology is technology designed for everyone,” said Apple CEO Tim Cook. “Today, we’re excited to share incredible new features that build on our long history of making technology accessible, so everyone has the opportunity to create, connect, and do what they love.”
“Accessibility is part of everything we do at Apple,” said Sarah Herrlinger, Apple’s senior director of Global Accessibility Policy and Initiatives. “These groundbreaking features were designed with feedback from members of disability communities every step of the way, to support a diverse set of users and help people connect in new ways.”
Assistive Access supports users with cognitive disabilities
Assistive Access uses design innovations to distill apps and experiences to their essential features in order to lighten cognitive load. The feature reflects feedback from people with cognitive disabilities and their trusted supporters — focusing on the activities they enjoy, and that are foundational to iPhone and iPad: connecting with loved ones, capturing and enjoying photos, and listening to music.
Assistive Access includes a customized experience for Phone and FaceTime, which are combined into a single calling app, along with Messages, Camera, Photos, and Music. The feature offers a distinct interface with high-contrast buttons and large text labels, as well as tools to help trusted supporters tailor the experience for the individual they support. For example, for users who prefer communicating visually, Messages includes an emoji-only keyboard and the option to record a video message to share with loved ones. Users and trusted supporters can also choose between a more visual, grid-based layout for their Home Screen and apps, or a row-based layout for users who prefer text.
“The intellectual and developmental disability community is full of creativity, but technology often poses physical, visual, or cognitive barriers for these individuals,” said Katie Schmid, senior director of National Program Initiatives at The Arc of the United States. “To have a feature that provides a cognitively accessible experience on iPhone or iPad means more open doors to education, employment, safety, and autonomy. It means broadening worlds and expanding potential.”
Live Speech and Personal Voice advance speech accessibility
With Live Speech on iPhone, iPad, and Mac, users can type what they want to say and have it spoken out loud during phone calls, FaceTime calls, and in-person conversations. Users can also save commonly used phrases to chime in quickly during lively conversation with family, friends, and colleagues. Live Speech is designed to support the millions of people around the world who are unable to speak or who have lost their speech over time.
For users at risk of losing their ability to speak — such as those with a recent diagnosis of ALS (amyotrophic lateral sclerosis) or other conditions that can progressively impact speaking ability — Personal Voice is a simple and secure way to create a voice that sounds like them.
Users can create a Personal Voice by reading along with a randomized set of text prompts to record 15 minutes of audio on iPhone or iPad. This speech accessibility feature uses on-device machine learning to keep users’ information private and secure, and integrates seamlessly with Live Speech so users can speak with their Personal Voice when connecting with loved ones.1
“At the end of the day, the most important thing is being able to communicate with friends and family,” said Philip Green, board member and ALS advocate at the nonprofit Team Gleason, who has experienced significant changes to his voice since receiving his ALS diagnosis in 2018. “If you can tell them you love them, in a voice that sounds like you, it makes all the difference in the world — and being able to create your own synthetic voice on your iPhone in just 15 minutes is extraordinary.”
Detection Mode in Magnifier offers Point and Speak for users who are blind or have low vision
Point and Speak in Magnifier makes it easier for users with vision disabilities to interact with physical objects that have several text labels. For example, while using a household appliance — such as a microwave — Point and Speak combines input from the Camera app, the LiDAR Scanner, and on-device machine learning to announce the text on each button as users move their finger across the keypad.2 Point and Speak is built into the Magnifier app on iPhone and iPad, works great with VoiceOver, and can be used with other Magnifier features such as People Detection, Door Detection, and Image Descriptions to help users navigate their physical environment.
Additional features

- Deaf or hard-of-hearing users can pair Made for iPhone hearing devices directly to Mac and customize them for their hearing comfort.3
- Voice Control adds phonetic suggestions for text editing so users who type with their voice can choose the right word out of several that might sound alike, such as “do,” “due,” and “dew.”4 Additionally, with Voice Control Guide, users can learn tips and tricks about using voice commands as an alternative to touch and typing across iPhone, iPad, and Mac.
- Users with physical and motor disabilities who use Switch Control can turn any switch into a virtual game controller to play their favorite games on iPhone and iPad.
- For users with low vision, Text Size is now easier to adjust across Mac apps such as Finder, Messages, Mail, Calendar, and Notes.
- Users who are sensitive to rapid animations can automatically pause images with moving elements, such as GIFs, in Messages and Safari.
- For VoiceOver users, Siri voices sound natural and expressive even at high rates of speech feedback; users can also customize the rate at which Siri speaks to them, with options ranging from 0.8x to 2x.
Celebrating Global Accessibility Awareness Day around the world
To celebrate Global Accessibility Awareness Day, this week Apple is introducing new features, curated collections, and more:
- SignTime will launch in Germany, Italy, Spain, and South Korea on May 18 to connect Apple Store and Apple Support customers with on-demand sign language interpreters. The service is already available for customers in the US, Canada, UK, France, Australia, and Japan.5
- Select Apple Store locations around the world are offering informative sessions throughout the week to help customers discover accessibility features, and Apple Carnegie Library will feature a Today at Apple session with sign language performer and interpreter Justina Miles. And with group reservations — available year-round — Apple Store locations are a place where community groups can learn about accessibility features together.
- Shortcuts adds Remember This, which helps users with cognitive disabilities create a visual diary in Notes for easy reference and reflection.
- This week, Apple Podcasts will offer a collection of shows about the impact of accessible technology; the Apple TV app will feature movies and series curated by notable storytellers from the disability community; Apple Books will spotlight Being Heumann: An Unrepentant Memoir of a Disability Rights Activist, a memoir by disability rights pioneer Judith Heumann; and Apple Music will showcase American Sign Language (ASL) music videos across genres.
- This week in Apple Fitness+, trainer Jamie-Ray Hartshorne incorporates ASL while highlighting features available to users that are part of an ongoing effort to make fitness more accessible to everyone. Features include Audio Hints, which provide additional short descriptive verbal cues to support users who are blind or have low vision, and the Time to Walk and Time to Run episodes become “Time to Walk or Push” and “Time to Run or Push” for wheelchair users. Additionally, Fitness+ trainers incorporate ASL into every workout and meditation, all videos include closed captioning in six languages, and trainers demonstrate modifications in workouts so users of all levels can join in.
- The App Store will spotlight three disability community leaders – Aloysius Gan, Jordyn Zimmerman, and Bradley Heaven – each of whom will share their experiences as nonspeaking individuals and the transformative effects of augmentative and alternative communication (AAC) apps in their lives.
Apple revolutionized personal technology with the introduction of the Macintosh in 1984. Today, Apple leads the world in innovation with iPhone, iPad, Mac, Apple Watch, and Apple TV. Apple’s five software platforms — iOS, iPadOS, macOS, watchOS, and tvOS — provide seamless experiences across all Apple devices and empower people with breakthrough services including the App Store, Apple Music, Apple Pay, and iCloud. Apple’s more than 100,000 employees are dedicated to making the best products on earth, and to leaving the world better than we found it.
1. Personal Voice can be created using iPhone, iPad, and Mac with Apple silicon, and will be available in English.
2. Point and Speak will be available on iPhone and iPad devices with the LiDAR Scanner in English, French, Italian, German, Spanish, Portuguese, Chinese, Cantonese, Korean, Japanese, and Ukrainian.
3. Users will be able to pair Made for iPhone hearing devices with select Macs with the M1 chip and all Macs with the M2 chip.
4. Phonetic suggestions for Voice Control will be available in English, Spanish, French, and German.
5. SignTime sessions are available in the US and Canada using American Sign Language (ASL), in the UK using British Sign Language (BSL), in France using French Sign Language (LSF), in Japan using Japanese Sign Language (JSL), and in Australia using Australian Sign Language (Auslan). On May 18, SignTime will be available in Germany using German Sign Language (DGS), in Italy using Italian Sign Language (LIS), in Spain using Spanish Sign Language (LSE), and in South Korea using Korean Sign Language (KSL).
Press Contacts
Eric Hollister Williams
Apple Media Helpline