iOS Accessibility Features Most People Never Touch
From Back Tap shortcuts to Voice Control macros — the iOS accessibility settings that work as power-user tools even if you don't need them for accessibility.

Here's a thing I noticed as an app developer. The features my users love most are often the ones buried deepest in Settings. Nobody finds them. Nobody talks about them. They just sit there.
iOS Accessibility is exactly that. Most people think it's a section for people with vision problems or motor disabilities. That's not wrong. But it's incomplete. Apple's engineering team built some of the most genuinely useful tools in the entire OS and labeled them "Accessibility." So everyone skips them.
I've been building iOS apps for a few years. I test every build with accessibility features turned on. That's how I started actually using them. A few of them I kept on permanently.
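If you build apps yourself, most of these settings are visible to your code too. Here's a minimal sketch using real UIKit `UIAccessibility` properties (the helper function name is mine) to check what's enabled and react when a setting changes mid-session:

```swift
import UIKit

// Log which accessibility settings are currently on.
// All of these are real UIKit properties; the function name is mine.
func logAccessibilityEnvironment() {
    print("VoiceOver running: \(UIAccessibility.isVoiceOverRunning)")
    print("Bold Text on: \(UIAccessibility.isBoldTextEnabled)")
    print("Grayscale on: \(UIAccessibility.isGrayscaleEnabled)")
    print("Reduce Transparency on: \(UIAccessibility.isReduceTransparencyEnabled)")
    print("Increase Contrast on: \(UIAccessibility.isDarkerSystemColorsEnabled)")
    print("Speak Screen on: \(UIAccessibility.isSpeakScreenEnabled)")
}

// Settings can change while the app is open; observe the matching notification.
NotificationCenter.default.addObserver(
    forName: UIAccessibility.boldTextStatusDidChangeNotification,
    object: nil, queue: .main
) { _ in
    logAccessibilityEnvironment()
}
```

Testing a build with a few of these toggled on takes minutes and catches layout bugs regular QA misses.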
This guide covers the ones worth knowing. No fluff. Go to Settings → Accessibility and follow along.
1. Back Tap — The Shortcut Nobody Knows About
Double-tap or triple-tap the back of your iPhone. That's it. That's the feature.
You can assign any action to it. Take a screenshot. Open the camera. Scroll to the top of a page. Run a Shortcut. Apple called it "Back Tap" and buried it in Accessibility → Touch → Back Tap.
I set double-tap to Screenshot and triple-tap to a Shortcut that opens my notes app. No swipes, no button combos. Just tap the back of the glass.
The list of assignable actions is long. You can trigger Accessibility Shortcut, Reachability, Spotlight, Shake, Siri, and any Shortcut you've created. That last one is where it gets powerful.
If you use Shortcuts at all, Back Tap turns your phone into a physical button for your most-used automation. I use it to start a timer. One developer I know uses it to log a water intake entry. Simple, fast, no unlock required.
Pros:
- Fastest way to trigger any action
- Works while the screen is on
- Connects to Shortcuts
- Uses no screen real estate

Cons:
- Thick cases can block detection
- Occasionally misfires while typing
- Not supported on iPhone 7 and earlier
2. Speak Screen — Your Phone Reads to You
Swipe down from the top of the screen with two fingers. iOS starts reading everything on the screen out loud.
That's Speak Screen. It's in Accessibility → Spoken Content → Speak Screen. Turn it on. Then try it on a long article or a news page.
The voice quality has improved a lot over the years. On iOS 17 and 18, the Enhanced voices sound close to a real person. I use this on long read-later articles while doing dishes or walking.
“I have a backlog of 200+ saved articles. Speak Screen is how I actually get through them without staring at a screen for three hours.”
There's a floating controller that appears. You can pause, rewind 15 seconds, adjust the reading speed, and highlight words as they're spoken. The word-highlight feature is underrated — useful when learning a new language too.
There's also "Speak Selection" in the same menu. Enable it and you can tap any selected text to have just that part read aloud. Useful for proofreading your own writing or checking how a foreign word sounds.
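For the curious: the speech engine behind these features is available to developers through AVFoundation. A minimal sketch (the voice and rate values are my illustrative choices, not system defaults):

```swift
import AVFoundation

// Keep the synthesizer alive for the duration of speech.
let synthesizer = AVSpeechSynthesizer()

// Speak a string out loud, slightly slower than the default rate.
func speak(_ text: String) {
    let utterance = AVSpeechUtterance(string: text)
    utterance.voice = AVSpeechSynthesisVoice(language: "en-US")
    utterance.rate = AVSpeechUtteranceDefaultSpeechRate * 0.9
    synthesizer.speak(utterance)
}

speak("Swipe down with two fingers to hear this page.")
```

Same voices, same pipeline, which is why well-built reading apps sound identical to Speak Screen.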
3. Reachability — One-Handed Phone Use, Done Right
iPhones are large. Your thumb is not. Reaching the top-right corner while holding your phone with one hand is awkward and leads to dropped phones.
Reachability fixes this. Enable it in Accessibility → Touch → Reachability. Then swipe down on the bottom edge of the screen — just below the gesture bar. The entire UI drops to the lower half of the screen.
Tap what you need. The screen bounces back. Takes two seconds.
This one feels obvious once you know it. I don't know how I used a Pro Max without it. If you have large hands and never had this problem — ignore it. If you use your phone one-handed regularly, turn it on now.
4. AssistiveTouch — A Software Button That Does Everything
A small circle appears on screen. Tap it. A menu opens with shortcuts. That's AssistiveTouch.
Find it at Accessibility → Touch → AssistiveTouch. The default menu has Home, Siri, Device, and Custom. You can replace all of those. Add Screenshot, Lock Screen, Rotate Screen, Volume controls — whatever you actually use.
I know what you're thinking. "That floating circle looks annoying." It does at first. But you can fade the opacity down to about 20% when idle. It barely shows. Tap it and it snaps back to full opacity.
There's a specific use case where this is essential: broken buttons. If your volume button or side button stopped working, AssistiveTouch keeps your phone fully functional. I've used it as a bridge before getting a repair. It works.
5. Sound Recognition — Your Phone Listens So You Don't Have To
Sound Recognition sends a notification when your iPhone detects a specific sound. Smoke alarm. Baby crying. Doorbell. Knock at the door. Dog barking.
Go to Accessibility → Sound Recognition. Toggle it on. Then pick which sounds to listen for.
All processing happens on-device. No audio leaves your phone. Apple was specific about this when they launched the feature in iOS 14.
“I use this when I'm in the basement working. If someone rings the doorbell, I get a notification. It's saved a few deliveries from being returned.”
The smoke alarm detection is the one I'd recommend turning on for everyone. If you're wearing headphones and there's a problem, you'll know. The notification shows up even when your phone is face-down on a desk.
False positives do happen. A high-pitched kettle once triggered my smoke alarm detection. But for something that could matter in an emergency, some false positives are acceptable.
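Apple also exposes a similar on-device sound classifier to developers through the SoundAnalysis framework (iOS 15+). This is a simplified sketch, with audio-session setup and error handling omitted; the class name and confidence threshold are mine:

```swift
import AVFoundation
import SoundAnalysis

// Stream microphone audio through Apple's built-in sound classifier.
final class SoundListener: NSObject, SNResultsObserving {
    private let engine = AVAudioEngine()
    private var analyzer: SNAudioStreamAnalyzer?

    func start() throws {
        let input = engine.inputNode
        let format = input.outputFormat(forBus: 0)
        let analyzer = SNAudioStreamAnalyzer(format: format)
        // .version1 is Apple's built-in ~300-class sound taxonomy.
        let request = try SNClassifySoundRequest(classifierIdentifier: .version1)
        try analyzer.add(request, withObserver: self)
        self.analyzer = analyzer

        input.installTap(onBus: 0, bufferSize: 8192, format: format) { buffer, time in
            analyzer.analyze(buffer, atAudioFramePosition: time.sampleTime)
        }
        try engine.start()
    }

    // Called with classification results as audio streams in.
    func request(_ request: SNRequest, didProduce result: SNResult) {
        guard let result = result as? SNClassificationResult,
              let top = result.classifications.first else { return }
        if top.confidence > 0.8 {
            print("Heard: \(top.identifier)")  // a label from Apple's taxonomy
        }
    }
}
```

Everything runs on-device here too, which matches how Apple describes the Settings feature.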
6. Display & Text Size Settings — The Comfortable Screen
This menu has more useful settings than most people realize. Open Accessibility → Display & Text Size.
A few worth knowing:
Reduce White Point dims the very brightest whites beyond what the brightness slider allows. Useful at night when minimum brightness is still too bright. I set mine to about 25% reduction.
Color Filters were designed for color blindness, but one mode, Grayscale, gets used intentionally: a black-and-white phone is significantly less addictive, so some people turn it on deliberately to cut down screen time. The novelty wears off in a day; the habit reduction stays.
Bold Text and Larger Text are obvious but often overlooked. They work across all system apps and most third-party apps too. If you ever squint at your phone, just turn these on.
- Bold Text
- Reduce White Point (night use)
- Increase Contrast
- Reduce Transparency
- Grayscale (via Color Filters)
- Color Filters
- Smart Invert (can break apps)
7. Magnifier — The Camera App Nobody Downloads
Your iPhone has a dedicated magnifying glass app built in. Not the Camera app. A separate tool designed specifically for zooming in on small text and objects.
Enable it at Accessibility → Magnifier. Then add it to your Control Center for fast access (Settings → Control Center).
It has a brightness slider, contrast controls, filters, and freeze-frame. Zoom in, freeze the image, read the text at your own pace. The Camera app isn't built for close-up reading; Magnifier is.
Use case: medication labels, restaurant menus, serial numbers on the back of routers, fine print on contracts. Anything too small to read comfortably.
Apps That Extend What Accessibility Starts
The built-in features go far. But a few third-party apps are worth adding to the mix.
Seeing AI, Microsoft's AI camera tool, reads text out loud, describes scenes, identifies products by barcode, reads handwriting, and even describes people's expressions. It's genuinely impressive and completely free. Point it at a document and it reads every line. Point it at a scene and it describes what's in the frame. Pairs well with Speak Screen for a full audio-first phone experience.
A text-to-speech reader that handles PDFs, ePubs, web articles, and Word docs. The voice quality is a step above iOS's built-in Speak Screen. You can adjust speed, pitch, and voice with fine-grained control. Import documents, listen while doing other things. The $19.99 price is one-time and worth it if you do any serious reading. I use this for technical PDFs and research papers I'd otherwise skip.
Be My Eyes was originally built to connect visually impaired users with sighted volunteers over video call. Now it has an AI mode (powered by GPT-4o) that describes any photo instantly. No waiting for a volunteer. Point the camera at anything and ask a question. "What does this label say?" "Is this shirt blue or green?" "Are these the same size?" Faster and more flexible than Seeing AI for some tasks. The AI response time is around 2-3 seconds.
Lookout is Google's answer to Seeing AI. Modes include Explore (real-time scene description), Document (read any document), Currency (identify bills), and Food Labels. The document reading is particularly accurate on printed text with unusual fonts. Worth having both Seeing AI and Lookout — they handle different inputs differently. Google's currency detection works for more countries than Seeing AI.
Side-by-Side: Seeing AI vs Be My Eyes vs Lookout
| Feature | Seeing AI | Be My Eyes | Lookout |
|---|---|---|---|
| Price | Free | Free | Free |
| Text reading | Excellent | Good | Excellent |
| Scene description | Good | Excellent (AI) | Good |
| Barcode / product ID | Yes | Via AI | Yes |
| Currency detection | Limited countries | Via AI | More countries |
| Handwriting | Yes | Yes (AI) | Limited |
| Live volunteer help | No | Yes | No |
| Best for | Documents, handwriting | Complex questions | Everyday navigation |
The Accessibility Shortcut — Tie It All Together
There's one more setting worth knowing. Accessibility → Accessibility Shortcut (at the very bottom of the menu).
Triple-click the side button to toggle any accessibility feature on or off. Assign Magnifier, Color Filters, Reduce White Point, or anything else here.
I have Reduce White Point assigned here. Triple-click when getting into bed, screen dims further. Triple-click again in the morning, back to normal. No digging through Settings.
Setup Checklist (Do This Now)
Here's the five-minute version. Do these in order.
- Settings → Accessibility → Touch → Back Tap → assign Double Tap to Screenshot, Triple Tap to your most-used Shortcut
- Settings → Accessibility → Spoken Content → turn on Speak Screen → download an Enhanced voice
- Settings → Accessibility → Touch → Reachability → toggle on
- Settings → Accessibility → Sound Recognition → toggle on → enable Smoke Alarm and Doorbell
- Settings → Accessibility → Magnifier → toggle on → add to Control Center
- Settings → Accessibility → Accessibility Shortcut → assign Reduce White Point
- Download Seeing AI (free). Open it once. You'll use it again.
FAQ
Will these features slow down my phone?
Sound Recognition runs continuously in the background and uses some processing power, but it's minimal. Apple designed it for older hardware too. The others (Back Tap, Reachability, Magnifier) only activate when you use them. No measurable battery impact from having them enabled.
Does Speak Screen work in third-party apps?
Yes, it works in almost all apps — Safari, Reeder, Pocket, Chrome, Notes, Mail. The two-finger swipe down works anywhere on screen. The only apps it struggles with are ones that render text as images or use unusual drawing layers. Most well-built apps support it fine.
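(Developer footnote: if your app draws text as part of an image, say a chart or a scanned page, the fix is an accessibility label so Speak Screen and VoiceOver have something to read. A tiny SwiftUI sketch; the view, asset name, and label text are made up for illustration:

```swift
import SwiftUI

// Text baked into an image is invisible to the speech features.
// An accessibility label gives them a readable stand-in.
struct ReceiptView: View {
    var body: some View {
        Image("receipt-scan")  // hypothetical image asset
            .accessibilityLabel("Receipt from the coffee shop, total $4.50")
    }
}
```

One modifier, and your image-heavy screen stops being a silent wall.)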
Does Back Tap work with a case on?
Depends on the case. Thin TPU cases work fine. Thick rugged cases (OtterBox Defender level) can block it. The iPhone's accelerometer detects the tap vibration through the chassis, so some padding is fine. If it's not working with your case, try tapping slightly firmer or lower on the back. You'll find the right spot.
Is Seeing AI private? Does it send photos to a server?
Microsoft processes images on their servers for the AI features. That's how it works. If you're reading something confidential, use iOS's built-in Live Text instead (long-press any text in Photos or Camera). Seeing AI is fine for menus, labels, product info — things that aren't sensitive. Same applies to Be My Eyes' AI mode and Lookout.
Wrap Up
The Accessibility menu is one of the most underused parts of iOS. Apple built real tools in there. They labeled them "accessibility" and most people assumed they didn't apply.
Back Tap gives you a physical button for any action. Speak Screen reads anything. Reachability solves one-handed use. Sound Recognition keeps you informed. Magnifier replaces reading glasses for small print. That's five things you can set up in under five minutes and keep using permanently.
Start with Back Tap and Speak Screen. Those two alone are worth the five minutes.