While it’s not usually the first thing most people check after installing a new iOS software update, the new accessibility features deserve priority attention because they include some highly valuable tools that even users without disabilities can enjoy.
Of course, Apple created iOS 16’s new accessibility features with disabled users in mind, but more and more people are adopting them because they can vastly improve the overall experience. What started as a project to make the iPhone accessible to all users has become one of its biggest advantages over other smartphones.
Accessibility features allow you to customize your iPhone and get the most out of it. You can force your iPhone to read for you, detect sounds, respond to voice commands, or take photos hands-free. And now, you can do even more with iOS 16’s new assistive features.
While we’ve already seen the iOS 16.1 and 16.2 updates, neither added new accessibility features beyond a small icon change and one rumored feature that has yet to ship. However, iOS 16.0 still gave us plenty to take advantage of.
With iOS 16, you can now control nearby devices that are turned on and signed into the same iCloud account. Go to Settings –> Accessibility –> Control Nearby Devices, then tap the “Control Nearby Devices” button to start a search. The next screen lists all the devices you can control, and tapping one gives you several options.
For example, when controlling my iPad (running iPadOS 16) with my iPhone, I can open the Home Screen, App Switcher, Notification Center, Control Center, and Siri. When playing audio or video on the device, I can also play and pause it, go to the previous or next track, and adjust the volume.
Magnifier has a new Door Detection option on iOS 16, which helps blind and low-vision users locate entryways when they arrive at their destination. The tool can tell you how far away the door is, if the door is open or closed, how to open it (push it, turn the knob, pull the handle, etc.), what any signs say (like room numbers), what any symbols mean (like people icons for restrooms), and more.
Door Detection relies on the lidar (light detection and ranging) scanner, which is only available on the following models:
- iPhone 14 Pro and 14 Pro Max
- iPhone 13 Pro and 13 Pro Max
- iPhone 12 Pro and 12 Pro Max
- iPad Pro 11-inch (2nd, 3rd, and 4th generations)
- iPad Pro 12.9-inch (4th, 5th, and 6th generations)
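Lidar works by timing how long emitted light takes to bounce off a surface and return; the distance is half the round-trip time multiplied by the speed of light. A minimal sketch of that calculation (the function name and sample values are illustrative, not Apple’s implementation):

```swift
// Speed of light in meters per second.
let speedOfLight = 299_792_458.0

// Distance to a surface given the round-trip time (in seconds) of a lidar pulse.
// The pulse travels to the surface and back, so we halve the total path length.
func lidarDistance(roundTripTime: Double) -> Double {
    return speedOfLight * roundTripTime / 2.0
}

// A pulse returning after roughly 13.3 nanoseconds corresponds to a
// door about two meters away.
let meters = lidarDistance(roundTripTime: 13.34e-9)
```

The tiny time scales involved are why this requires dedicated lidar hardware rather than the regular camera.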
Another new feature in the Magnifier app is Image Descriptions. When you point your camera at something, it will show (or read) you detailed descriptions of what it sees. Unlike Door Detection, this feature is available for all iOS 16 users. It’s not always accurate, but it should improve as development continues.
Now that there are two more detection tools in Magnifier, a new Detection Mode menu is available that houses Door Detection, People Detection, and Image Descriptions.
If you don’t have one of the iPhone or iPad models that supports Door Detection (see above), which are also the only models that support People Detection, you can add Image Descriptions only to your controls, not to the Detection Mode menu.
Magnifier also supports activities, which lets you save your current Magnifier configuration, including the controls panel, camera, brightness, contrast, filters, and detection modes. That way, you can use specialized setups for a particular recurring task or situation. To save your current layout, use “Save New Activity” from the Settings cog. You can switch between layouts via the cog, too. In the Activities settings, you can delete or duplicate customized options.
One of the most significant new accessibility features is Live Captions, which is helpful for people with hearing loss and anyone who can’t hear their iPhone’s audio for any reason. It works in phone and FaceTime calls, video calls in social media apps, streaming shows and other media, and even teleconferencing apps.
It’s also possible to customize the font size, color, and background color for easier reading. You can even move the captions like you can with the Picture in Picture player and set its idle opacity. And if you use a Mac for calls, you can reply to the conversation by typing and having your words read out loud in real time.
For now, Live Captions is available in the U.S. and Canada for iPhone 11 and later, iPads with the A12 Bionic and later, and Macs with Apple silicon. If you’re worried about privacy, Apple promises that user information stays private since Live Captions are generated directly on the device. Note that the captions won’t show up when you take a screenshot.
If you have an Apple Watch, you can use most of your paired iPhone’s accessibility features to control it remotely, thanks to Apple Watch Mirroring.
With Apple Watch Mirroring, users can control Apple Watch using iPhone’s assistive features like Voice Control and Switch Control, and use inputs including voice commands, sound actions, head tracking, or external Made for iPhone switches as alternatives to tapping the Apple Watch display. Apple Watch Mirroring uses hardware and software integration, including advances built on AirPlay, to help ensure users who rely on these mobility features can benefit from unique Apple Watch apps like Blood Oxygen, Heart Rate, Mindfulness, and more.
Apple Watch Mirroring is available on Apple Watch Series 6 and later. To enable it, go to Settings –> Accessibility –> Apple Watch Mirroring, then toggle on the switch. Once connected, you can control your Watch completely through your iPhone.
You can now enable even more languages for VoiceOver, Speak Selection, and Speak Screen. The supported languages include the following. Asterisks (*) indicate languages that are new on iOS 16.
- Arabic (World) *
- Bangla aka Bengali (India) *
- Basque (Spain) *
- Bhojpuri (India) *
- Bulgarian (Bulgaria) *
- Catalan (Spain) *
- Chinese (China mainland)
- Chinese (Hong Kong)
- Chinese (Liaoning, China mainland) *
- Chinese (Shaanxi, China mainland) *
- Chinese (Sichuan, China mainland) *
- Chinese (Taiwan)
- Croatian (Croatia) *
- Czech (Czechia)
- Danish (Denmark)
- Dutch (Belgium)
- Dutch (Netherlands)
- English (Australia)
- English (India)
- English (Ireland)
- English (Scotland, UK) *
- English (South Africa)
- English (UK)
- English (US)
- Finnish (Finland)
- French (Belgium) *
- French (Canada)
- French (France)
- Galician (Spain) *
- German (Germany)
- Greek (Greece)
- Hebrew (Israel)
- Hindi (India)
- Hungarian (Hungary)
- Indonesian (Indonesia)
- Italian (Italy)
- Japanese (Japan)
- Kannada (India) *
- Korean (South Korea)
- Malay (Malaysia) *
- Marathi (India) *
- Norwegian Bokmål (Norway)
- Persian aka Farsi (Iran) *
- Polish (Poland)
- Portuguese (Brazil)
- Portuguese (Portugal)
- Romanian (Romania)
- Russian (Russia)
- Shanghainese (China mainland) *
- Slovak (Slovakia)
- Slovenian (Slovenia) *
- Spanish (Argentina)
- Spanish (Chile) *
- Spanish (Colombia)
- Spanish (Mexico)
- Spanish (Spain)
- Swedish (Sweden)
- Tamil (India) *
- Telugu (India) *
- Thai (Thailand)
- Turkish (Turkey)
- Ukrainian (Ukraine) *
- Valencian (Spain) *
- Vietnamese (Vietnam) *
Dozens of new voices, all optimized for assistive features, are also available for VoiceOver, Speak Selection, and Speak Screen across the supported languages. For English, new voices include Agnes, Bruce, Eloquence, Evan, Joelle, Junior, Kathy, Nathan, Noelle, Ralph, Vicki, and Zoe.
There are also novelty voices, including Albert, Bad News, Bahh, Bells, Boing, Bubbles, Cellos, Good News, Jester, Organ, Superstar, Trinoids, Whisper, Wobble, and Zarvox.
There are a few new options to work with in Settings –> Accessibility –> VoiceOver –> Activities –> Programming, the menu that lets you create groups of preferences for specific uses.
First is Typing Style, which lets you choose from Default, Standard, Touch, and Direct Touch. Second is Navigation Style, with Default, Flat, and Grouped choices. And third is Braille Alert Messages, where you can pick Default, On, or Off. These options were available before, just not for programming activities.
When using VoiceOver in Apple Maps, you’ll get automatic sound and haptic feedback to help you identify the starting point for walking directions.
If you have difficulty using a game controller, the new Buddy Controller feature lets a friend or care provider help you play a game. It works by combining two game controllers into one, so you can effectively play together as a single player. If this sounds familiar, that’s because Xbox consoles offer a similar feature called Co-pilot.
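Conceptually, combining two controllers means merging both players’ input streams into one logical controller. Here’s a simplified sketch of that idea; the types and merge rules below are my own illustration, not Apple’s Buddy Controller implementation:

```swift
// A simplified snapshot of one controller's state.
struct ControllerState {
    var buttonA = false
    var buttonB = false
    var leftStickX = 0.0  // -1.0 ... 1.0
    var leftStickY = 0.0
}

// Merge a primary player's input with a helper's input into one logical
// controller: buttons fire if either player presses them, and the helper's
// stick wins only when it's deflected further than the primary's.
func mergeInputs(primary: ControllerState, helper: ControllerState) -> ControllerState {
    var merged = ControllerState()
    merged.buttonA = primary.buttonA || helper.buttonA
    merged.buttonB = primary.buttonB || helper.buttonB
    merged.leftStickX = abs(helper.leftStickX) > abs(primary.leftStickX)
        ? helper.leftStickX : primary.leftStickX
    merged.leftStickY = abs(helper.leftStickY) > abs(primary.leftStickY)
        ? helper.leftStickY : primary.leftStickY
    return merged
}
```

The game itself only ever sees the merged state, which is why it behaves as if a single player is in control.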
In Settings –> Accessibility –> Siri, you’ll find a new section called Siri Pause Time, which lets you set how long Siri waits for you to finish speaking. You can leave the default setting or choose Longer or Longest. It’s perfect if Siri always seems to cut you off before you finish.
Sound Recognition has been available since iOS 14, but in iOS 16, you can train your iPhone to recognize specific sounds from your environment. Go to Settings –> Accessibility –> Sound Recognition –> Sounds, and choose “Custom Alarm” or “Custom Appliance or Doorbell.”
To delete custom alarms and sounds, swipe left on them from the Sounds menu. You can also tap “Edit,” then the delete icon (red circle with a white line in the middle), and confirm with “Delete.”
You’re probably already used to the iPhone’s dictation feature, but now you can use Spelling Mode in Voice Control to spell out a word letter by letter so there are no misunderstandings. Use it to dictate names, addresses, acronyms, and more. The feature is currently only available in US English.
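Under the hood, spelling out a word amounts to mapping each spoken letter name to a character and concatenating the results. As a rough illustration, here’s a toy decoder (my own sketch, not Apple’s recognizer) using a handful of military-alphabet words:

```swift
// Toy decoder: map spoken letter names to characters and concatenate them,
// roughly how a spelled-out word gets assembled. Only a few letters are
// included here for brevity.
let letterNames: [String: Character] = [
    "alpha": "a", "bravo": "b", "charlie": "c", "delta": "d",
    "echo": "e", "foxtrot": "f", "golf": "g", "hotel": "h",
]

func decodeSpelling(_ tokens: [String]) -> String {
    var result = ""
    for token in tokens {
        // Unrecognized tokens are simply skipped in this sketch.
        if let letter = letterNames[token.lowercased()] {
            result.append(letter)
        }
    }
    return result
}
```

Saying “bravo, echo, delta” would come out as “bed”, with no chance of the recognizer hearing a similar-sounding word instead.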
Aside from Spelling Mode, Voice Control also has new commands for:
- Open App Library
- Show Keyboard
- Hide Keyboard
- Press <key name> key
- Hang up
- Turn on Apple Watch Mirroring
- Turn off Apple Watch Mirroring
- Turn on Apple Watch Remote Control
- Turn off Apple Watch Remote Control
- Turn on Full Keyboard Access
- Turn off Full Keyboard Access
- VoiceOver activate
- VoiceOver Magic Tap
- VoiceOver select <number>
- VoiceOver read all
- VoiceOver select <item name>
- VoiceOver select first item
- VoiceOver select last item
- VoiceOver select next app
- VoiceOver select previous app
- VoiceOver select next item
- VoiceOver select previous item
- VoiceOver select next rotor
- VoiceOver select previous rotor
- VoiceOver select next rotor option
- VoiceOver select previous rotor option
- VoiceOver select status bar
- VoiceOver Item Chooser
- VoiceOver speak summary
- VoiceOver stop speaking
- VoiceOver screen curtain
- Increase zoom
- Decrease zoom
- Maximize zoom
- Minimize zoom
- Zoom down
- Zoom up
- Zoom left
- Zoom right
In iOS 16, the Apple Books app comes with new themes and accessibility options. The app has been redesigned with a simplified interface, which also makes it more accessible. You can bold text and customize spacing for easier reading, and the new themes make the app easier on the eyes.
On iOS 16.2 and later, there are new Books actions in the Shortcuts app, including a Change Book Appearance action. That means you can create shortcuts for individual books, each applying a different appearance, so you don’t have to keep changing settings manually.
If you’ve been known to accidentally hang up on people during phone or FaceTime calls by pressing the Side button, which locks the screen and unintentionally ends the call, iOS 16 can help. In Settings –> Accessibility –> Touch, there’s a new “Prevent Lock to End Call” switch. Toggle it on, and locking your iPhone’s screen will no longer hang up the call.
Previously, you couldn’t ask Siri to end a phone or FaceTime call for you, but now you can by saying, “Hey Siri, hang up” while talking to someone. The downside is that the person you’re talking to will hear the command, but it’s great for ending a call hands-free. You can enable it in the Siri & Search settings or the Siri accessibility settings.
This feature is available on iPhones with an A13 Bionic chip or later, meaning the iPhone 11 and newer. However, it’s also supported on models with an A12 Bionic chip, the iPhone XS, XS Max, and XR, when using AirPods or Siri-enabled Beats headphones.
The auto-answer calls option is a great help to some users with disabilities, but there was one catch: it had to be turned on manually via Settings –> Accessibility –> Touch –> Call Audio Routing –> Auto-Answer Calls. Now, you can just say, “Hey Siri, turn on auto-answer,” or “Hey Siri, turn off auto-answer.” Besides iOS 16, it’s also available in watchOS 9.
Your iPhone can read incoming messages and notifications aloud, but the feature previously only worked with AirPods or Beats headphones. On iOS 16, it also works through your iPhone’s speaker and with Made for iPhone hearing aids. It’s an essential tool for anyone who can’t pick up their iPhone to read the latest text or notification.
When announcing notifications, Siri avoids interrupting you and listens after reading them so you can respond or take action without saying “Hey Siri.” By default, Siri announces notifications from new apps that send Time Sensitive notifications or direct messages.
You can also set Siri to send replies in supported apps without first asking you to confirm that you want to send them.
Back Tap lets you double- or triple-tap the Apple logo on the back of your iPhone to trigger an action, such as taking a screenshot without the thumbnail preview or opening Spotlight. On the accessibility front, there are two new Back Tap options: Control Nearby Devices and Live Captions.
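Detecting a double or triple tap is essentially a matter of counting taps that land within a short window of each other. A hedged sketch of that logic (the 0.3-second window is my guess for illustration, not Apple’s actual tuning):

```swift
// Count consecutive taps: a tap extends the current streak if it arrives
// within `window` seconds of the previous one; otherwise it starts a new
// streak. Returns the longest streak seen.
func longestTapStreak(timestamps: [Double], window: Double = 0.3) -> Int {
    var best = 0
    var streak = 0
    var previous = -Double.infinity
    for t in timestamps {
        streak = (t - previous <= window) ? streak + 1 : 1
        previous = t
        best = max(best, streak)
    }
    return best
}
```

When the streak reaches two or three, the system would fire the action you assigned to double or triple tap.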
If you activate a specific accessibility tool using the Accessibility Shortcut, where you triple-click the Home or Side button on your iPhone, you’ll be happy to know that it now also supports the new Control Nearby Devices and Live Captions features, just like Back Tap does.
In Settings –> Accessibility –> Zoom, you’ll see a new switch for “Show while Mirroring.” It’s disabled by default, but when on, it will show the zoomed appearance whenever sharing your screen or during screen recordings. However, it doesn’t seem to work when using QuickTime or other screen recorders on Mac that use QuickTime’s protocol.
If you use the Health app, you can now import your audiograms into it on your iPhone. Go to Browse –> Hearing –> Audiogram, then tap “Add Data.” You can use your camera to take a picture of your audiogram, choose an audiogram image from your Photos app, or upload an audiogram document from Files.
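An audiogram is essentially a table of hearing thresholds, in decibels of hearing level (dB HL), keyed by frequency. Audiologists commonly summarize one as a pure-tone average (PTA) over a few key frequencies. Here’s a small sketch of that calculation; the data layout is my own, not HealthKit’s internal format:

```swift
// Hearing thresholds in dB HL, keyed by test frequency in Hz.
let audiogram: [Int: Double] = [500: 20, 1000: 25, 2000: 35, 4000: 50]

// Four-frequency pure-tone average (500, 1k, 2k, 4k Hz), a common summary
// of an audiogram. Returns nil if any required frequency is missing.
func pureToneAverage(_ thresholds: [Int: Double],
                     frequencies: [Int] = [500, 1000, 2000, 4000]) -> Double? {
    var sum = 0.0
    for f in frequencies {
        guard let level = thresholds[f] else { return nil }
        sum += level
    }
    return sum / Double(frequencies.count)
}
```

For the sample data above, the PTA works out to 32.5 dB HL, which would generally be read as mild hearing loss.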
While not in iOS 16.0 or 16.1, Apple began testing a new Custom Accessibility Mode during the iOS 16.2 beta cycle, though it was never live for anyone to try and didn’t make the final stable release. Codenamed Clarity, the mode “creates a streamlined iPhone experience,” to quote Apple’s description.
The mode creates a new user interface that replaces the regular one, letting you change specific elements to suit your needs. You can view apps in lists instead of grids, use giant app icons, make the Lock Screen easier to unlock, and more. For detailed information, check out our guide to iOS 16.2’s new features.