Apple has introduced a new camera feature that automatically recognizes text in your photos, transcribing anything from a phone number on a business card to a whiteboard full of notes. Live Text, as the feature is called, doesn’t require any prompts or special actions from the user – just tap the icon and the text is ready to use.
Announced by Craig Federighi during the virtual WWDC keynote, Live Text will come to the iPhone with iOS 15. He demonstrated it with two kinds of pictures: one of a whiteboard after a meeting, and some snapshots that included restaurant signs in the background.
Tapping the Live Text button in the lower right subtly underlined the recognized text, and a swipe then allowed it to be selected and copied. From the whiteboard it collected several sentences of notes, bullet points included, and from the restaurant sign it grabbed a phone number that could be called or saved.
The feature is reminiscent of capabilities found in Google’s long-developing Lens app, and the Pixel 4 added more robust scanning capability in 2019. The difference is that the text in every photo taken by an iPhone running the new system is captured more or less passively – you don’t need to enter a scanner mode or launch a separate app.
This is convenient for anyone, but it could be especially helpful for people who are visually impaired. A snapshot or two makes any text that would otherwise be difficult to read available to be read aloud or saved.
The process appears to happen entirely on the phone itself, so there’s no need to worry that this information is being sent to a datacenter somewhere. That should also make it fairly fast, although until we test it for ourselves, we can’t say whether recognition is instantaneous or, like some other machine learning features, something that occurs in the seconds or minutes after you take the shot.
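Apple didn’t detail the underlying mechanism on stage, but on-device text recognition has been available to third-party developers since iOS 13 through the Vision framework’s `VNRecognizeTextRequest`. As a rough sketch of what that existing, public API looks like (this is not Live Text itself, just the closest developer-facing equivalent):

```swift
import UIKit
import Vision

/// Runs on-device OCR over a UIImage and passes the recognized lines
/// of text to the completion handler. No network access is involved.
func recognizeText(in image: UIImage, completion: @escaping ([String]) -> Void) {
    guard let cgImage = image.cgImage else {
        completion([])
        return
    }

    // The request's callback receives one observation per detected text region.
    let request = VNRecognizeTextRequest { request, error in
        guard error == nil,
              let observations = request.results as? [VNRecognizedTextObservation] else {
            completion([])
            return
        }
        // Take the top candidate string for each region.
        let lines = observations.compactMap { $0.topCandidates(1).first?.string }
        completion(lines)
    }
    // .accurate trades speed for quality; .fast is also available.
    request.recognitionLevel = .accurate

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```

Whether Live Text uses this exact pipeline or a newer model is Apple’s to say, but the fact that the Vision version already runs locally lends weight to the on-device claim.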