Apple Previews New Door Detection, Apple Watch Mirroring, and Live Captions Accessibility Features - MacRumors

Apple today previewed a range of new accessibility features, including Door Detection, Apple Watch Mirroring, Live Captions, and more.

Door Detection will allow individuals who are blind or have low vision to use their iPhone or iPad to locate a door upon arriving at a new destination, understand how far away it is, and hear a description of the door's attributes, including how it can be opened and any nearby signs or symbols. The feature will be part of a new "Detection Mode" in Magnifier, alongside People Detection and Image Descriptions. Door Detection will only be available on iPhones and iPads with a LiDAR scanner.

Users with physical disabilities will be able to fully control their Apple Watch Series 6 or Apple Watch Series 7 from their iPhone using Apple Watch Mirroring, which works via AirPlay. This lets them operate the watch with assistive features like Voice Control and Switch Control, and with inputs such as voice commands, sound actions, head tracking, and more.

New Quick Actions on the Apple Watch will allow users to use a double-pinch gesture to answer or end a phone call, dismiss a notification, take a photo, play or pause media in the Now Playing app, and start, pause, or resume a workout.

Deaf users and those who are hard of hearing will be able to follow Live Captions across the iPhone, iPad, and Mac, making it easier to follow any audio content, such as during a phone call or when watching video content. Users can adjust the font size, see Live Captions for all participants in a group FaceTime call, and type responses that are spoken aloud. English Live Captions will be available in beta later this year on the iPhone 11 and later, iPad models with the A12 Bionic and later, and Macs with Apple silicon.

Apple will expand support for VoiceOver, its screen reader for blind and low vision users, with 20 new languages and locales, including Bengali, Bulgarian, Catalan, Ukrainian, and Vietnamese. In addition, users will be able to select from dozens of new optimized voices across languages, and a new Text Checker tool will find formatting issues in text.

There will also be Sound Recognition for a home's unique doorbells and appliances, adjustable response times for Siri, new themes and customization options in Apple Books, and sound and haptic feedback in Apple Maps to help VoiceOver users find the starting point for walking directions.

The new accessibility features will be released later this year via software updates. For more information, see Apple's full press release.

To celebrate Global Accessibility Awareness Day, Apple also announced several related initiatives. It plans to launch SignTime in Canada on May 19 to support customers with American Sign Language (ASL) interpreters, hold live sessions in Apple Stores and on social media to help users discover accessibility features, and expand the Accessibility Assistant shortcut to the Mac and Apple Watch. Apple will also highlight accessibility features in Apple Fitness+ such as Audio Hints, release a Park Access for All guide in Apple Maps, and flag accessibility-focused content in the App Store, Apple Books, the TV app, and Apple Music.

Top Rated Comments

50 months ago

> Apple again leads in accessibility. Love the Live captions and door detection.

To be fair, Android already has this Live Captions feature, as does Google Chrome. I have had to rely on it on all platforms.

Microsoft announced Live Captions and has been testing it in Windows 11 Insider builds for a few months now.

Apple is late as usual, but I'm sure theirs will be the best-implemented version, as that's just them.

Regardless, everyone wins here. We need more accessibility support across the industry.
Score: 8 Votes (Like | Disagree)
50 months ago

> I think the difference is that Google does all processing on their servers, while Apple's implementation is on-device only and works offline. (Not to mention your conversation stays private.)

Actually, Google's Live Caption is all done on-device and does not require an internet connection to function. They have been moving more and more voice request processing on-device over the past few years.
Score: 6 Votes (Like | Disagree)
50 months ago

> Actually, Google's Live Caption is all done on-device and does not require an internet connection to function. They have been moving more and more voice request processing on-device over the past few years.

This is correct. Taken from the Android Accessibility Help page ('https://support.google.com/accessibility/android/answer/9350862?hl=en'): "All captions are processed locally, never stored, and never leave your device."

When it comes to accessibility, users need anything that can help them now. They can't sit around and wait for something else, so I would say Apple is late to the game here. I know a co-worker who switched to Android several years ago so he could use the Live Caption feature for meetings. Previously, he was using a captioning service over the phone, but he was not a fan of having another live person listening in on meetings.
Score: 5 Votes (Like | Disagree)
surfzen21
50 months ago

> Apple again leads in accessibility. Love the Live captions and door detection.

Agreed. A lot of their accessibility features seem to get overlooked, but they actually are life-changing for folks in need.
Score: 4 Votes (Like | Disagree)
Apple$
50 months ago
Better late than never, Apple. As a CI Android user, I love the Live Captions feature so much! It's just so handy when you are watching a YouTube video that doesn't have captions. Instead of skipping it as I did in the past, I just turn on Live Captions.
Score: 3 Votes (Like | Disagree)
50 months ago

> To be fair, Android already has this Live Captions feature, as does Google Chrome. I have had to rely on it on all platforms.
>
> Microsoft announced Live Captions and has been testing it in Windows 11 Insider builds for a few months now.
>
> Apple is late as usual, but I'm sure theirs will be the best-implemented version, as that's just them.
>
> Regardless, everyone wins here. We need more accessibility support across the industry.

I think the difference is that Google does all processing on their servers, while Apple's implementation is on-device only and works offline. (Not to mention your conversation stays private.)
Score: 3 Votes (Like | Disagree)