Accessibility features focus on improving an interface for people with a particular disability, but they often improve the experience for all users. The new Live Captions feature, currently in beta on select devices and in select countries, is a prime example. Apple aims to convert all audio produced by your device into accurately transcribed, readable text, much as Live Text extracts text from bitmap images.
To enable the feature, you must have an iPhone 11 or later with iOS 16 installed, a fairly recent iPad with iPadOS 16 (see this list), or an Apple silicon Mac (M1 or M2) with macOS Ventura installed. For iPhones and iPads, Apple says that Live Captions only works when the device language is set to English (US) or English (Canada). The macOS description says more broadly that the beta is “not available in all languages, countries, or regions.”
To use Live Captions (or to check whether you can), go to Settings (iOS/iPadOS) or System Settings (Ventura) > Accessibility. If you see a Live Captions (Beta) item, you can use the feature. Tap or click Live Captions to enable it. You can then tap Appearance on iOS/iPadOS, or use the top-level options on macOS, to modify how captions appear. You can also separately enable or disable Live Captions in FaceTime so that captions appear in that app.
Live Captions appears as an overlay showing an English-language transcription of any sound produced by your device. A live audio waveform matches the sound that Live Captions “hears.” On iOS and iPadOS, you can tap the overlay to access additional controls: minimize, pause, microphone, and full screen; on macOS, the pause and microphone buttons are available. By tapping or clicking the microphone button, you can speak and have what you say appear on screen. This could be useful if you are trying to show someone the text of what you are saying.
The text produced by Live Captions is ephemeral: it cannot be copied or pasted. It also resists screenshots: apparently, the overlay is generated in such a way that iOS and iPadOS can’t capture it.
Live Captions shows great promise and is something to watch as it improves and expands. I tested it with podcasts, YouTube, and Instagram audio. It wasn’t as good as some AI-based transcriptions I’ve seen, such as in video conferencing apps, but it did a solid job and was far better than no captions at all.
Apple could tie Live Captions into its built-in translation features. You might then be able to speak in your own language and show a translated version to someone who speaks another, or get live translated transcripts of video streams, podcasts, and other audio in a language other than the one you speak.
This Mac 911 article is a response to a question posted by Macworld reader Kevin.
Ask Mac 911
We’ve compiled a list of the most frequently asked questions, along with answers and links to columns: read our super FAQ to see if your question is covered. If not, we’re always looking for new problems to solve! Email yours to firstname.lastname@example.org, including screenshots as appropriate and whether you want your full name used. Not every question will be answered, we don’t reply to every email, and we can’t provide direct troubleshooting advice.