Apple Reveals New Accessibility Upgrades For Apple Devices In Celebration Of Global Accessibility Awareness Day
Apple announced a new batch of accessibility upgrades for the iPhone, iPad, Mac, and Apple Watch that are set to roll out as software updates later this year.
One feature the company will beta test is Live Captions, which can transcribe any audio content in English on the iPhone, iPad, and Mac. This includes FaceTime calls, video conferencing apps (with automatic attribution to identify the speaker), streaming video, and in-person conversations.
Google developed a similar audio-transcribing feature, Live Caption, around the release of Android 10. It is already available in English on the Pixel 2 and later devices, as well as "select" other Android phones, with additional languages supported on the Pixel 6 and Pixel 6 Pro.
Additionally, the Apple Watch will receive a boost to its AssistiveTouch gesture recognition controls through Quick Actions, which detect the pinching motion of your fingers. A double pinch can end a call, dismiss notifications, take a picture, pause or play media, or start a workout.
The Apple Watch will also become more accessible to people with physical and motor disabilities through Apple Watch Mirroring, a new feature that adds remote control from a paired iPhone. Built on technology pulled from AirPlay, it makes the Watch's unique features manageable without relying on your ability to tap its tiny screen or use voice controls.
Another function, Sound Recognition, rolled out with iOS 14. It works by picking up on specific sounds, like a smoke alarm or running water, and can alert users who are deaf or hard of hearing. Sound Recognition may soon allow customized sounds to be added to its radar, such as a door alert or a chime from an appliance.
Apple's VoiceOver screen reader, along with the Speak Selection and Speak Screen features, will gain support for 20 new "locales and languages." On the Mac, VoiceOver's new Text Checker will scan for formatting issues like extra spaces or stray capital letters. In Apple Maps, new sound and haptic feedback will indicate the starting point for walking directions.
There will also be an assistive feature called Door Detection, which uses the camera of an iPhone or iPad to announce entryways at an unfamiliar location and to describe each door (whether it has a knob or handle, and whether it is open or closed). This is part of the Detection Mode being added to the Magnifier in iOS, which lets the camera zoom in and identify nearby objects, or recognize nearby people and alert the user with sounds, speech, or haptic feedback.
Other accessibility functions include Buddy Controller, which combines two game controllers into one so a friend can help play a game; tweaks to the letter-by-letter input of Voice Control Spelling Mode; and further visual options for Apple Books that can bold text, change themes, or adjust line, character, and word spacing to make pages more readable.
All the revealed features are part of Apple's recognition of Global Accessibility Awareness Day on May 19th. Apple Store locations will offer live sessions to help people learn more about existing features, and an Accessibility Assistant shortcut will arrive on the Mac and Apple Watch this week to recommend specific features based on a user's selected preferences.