Apple adds system-wide Live Captions as part of larger accessibility update

Global Accessibility Awareness Day is next Thursday (May 19), and Apple, like many other companies, is announcing related updates in honor of the occasion. The company is bringing new features to iPhone, iPad, Mac, and Apple Watch, and the most intriguing is system-wide Live Captions.

Similar to Google’s implementation on Android, Apple’s Live Captions will transcribe audio playing on your iPhone, iPad, or Mac in real time and display captions on the screen. It will also caption sound around you, so you can use it to follow conversations in the real world. You can adjust the size and position of the caption box, and also choose different font sizes for the words. Transcription is generated on-device. Unlike Android’s version, however, Live Captions on FaceTime calls will differentiate between speakers, using icons and names to attribute what’s being said. In addition, Mac users can type a response and have it spoken aloud to others in the conversation in real time. Live Captions will be available in beta in English for people in the US and Canada.

Apple is also updating its existing Sound Recognition tool, which allows iPhones to continuously listen for sounds such as alarms, sirens, doorbells, or crying babies. With an upcoming update, users will be able to train their iPhones or iPads to listen for custom sounds, such as your washing machine’s “I’m done” chime or perhaps your pet duck’s quacking. A new feature called Siri Pause Time also lets you extend how long the assistant waits for you to finish speaking, so you can take the time you need to say what you want.

[Image: Two screenshots showing Apple’s new accessibility features.]

The company is also updating its Magnifier app, which helps blind and low-vision people better interact with the people and objects around them. Expanding on an earlier People Detection tool that told users how far away others were, Apple is adding a new Door Detection feature. It will use the iPhone’s LiDAR scanner and camera not only to locate and identify doors, but also to read aloud text or symbols on them, such as opening hours and signs for restrooms or accessible entrances. It will also describe a door’s attributes: whether it is open or closed, whether it has to be pushed, pulled, or turned to open, and its color, shape, and material. Together, People Detection and Door Detection will be part of a new Detection Mode in Magnifier.

Updates are also coming to Apple Watch. Last year the company introduced AssistiveTouch, which lets people interact with the wearable without touching the screen. The watch senses whether the hand it sits on is making a fist, or whether the wearer is touching their index finger and thumb together for a “pinch” action. With an upcoming software update, it should be faster and easier to enable Quick Actions with AssistiveTouch, which let you use gestures like a double-pinch to answer or end calls, take photos, start a workout, or play and pause media.

But AssistiveTouch is not a method everyone can use. For those with physical or motor disabilities that make hand gestures impossible, the company is bringing a form of Voice Control and Switch Control to its smartwatch. Called Apple Watch Mirroring, the feature uses hardware and software, including AirPlay, to relay a user’s preset Voice Control or Switch Control preferences from their iPhone to the wearable. This would allow them to use head tracking, sound actions, and Made for iPhone switches to interact with their Apple Watch.

Apple is adding more customization options to the Books app, allowing users to apply new themes and adjust line heights, word and character spacing, and more. The VoiceOver screen reader will also soon be available in more than 20 new languages and locales, including Bengali, Bulgarian, Catalan, Ukrainian, and Vietnamese. Dozens of new voices will be added as well, along with a Voice Control spelling mode that lets you dictate custom spellings using letter-by-letter input.

Finally, the company is launching a new feature called Buddy Controller that allows two controllers to drive a single player, which would be useful for users with disabilities who want to play games with the help of their caregivers. Buddy Controller works with supported game controllers for iPhone, iPad, Mac, and Apple TV. Many more updates are coming across the Apple ecosystem, including on-demand American Sign Language interpreters expanding to Apple Store locations and support in Canada, as well as a new Maps guide, curated playlists in Apple TV and Music, and the addition of the Accessibility Assistant to the Shortcuts app on Mac and Apple Watch. The features previewed today will be available later this year.
