Your iPhone will soon get Apple’s answer to Google Lens


Google Lens has slowly become one of the most useful augmented reality apps out there, so Apple has decided to create its own rival in iOS 15.

At WWDC 2021, Apple announced that ‘Live Text’ and ‘Visual Look Up’ would be coming to the iPhone’s Camera and Photos apps as part of iOS 15. Both are direct rivals to Google Lens, which has become an increasingly powerful way to discover the real world through your smartphone camera on both Android and iOS.

While we’ve already seen something similar on Android phones, ‘Live Text’ looks like it will be an easy way for iPhone users to convert handwritten or printed text from the real world into digital text. Apple says it is based on a ‘deep neural network’ and uses on-device processing rather than a cloud-based approach.
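Apple hasn’t published the internals behind ‘Live Text’, but its existing Vision framework already does on-device text recognition of exactly this kind, which gives a feel for what’s involved. A minimal sketch, using the public VNRecognizeTextRequest API as a stand-in rather than Live Text itself:

```swift
import UIKit
import Vision

// On-device text recognition via Apple's public Vision framework. This is a
// stand-in to illustrate the approach; it is NOT the (unpublished) Live Text
// implementation itself.
func recognizeText(in image: UIImage, completion: @escaping ([String]) -> Void) {
    guard let cgImage = image.cgImage else {
        completion([])
        return
    }

    let request = VNRecognizeTextRequest { request, error in
        guard error == nil,
              let observations = request.results as? [VNRecognizedTextObservation] else {
            completion([])
            return
        }
        // Keep the highest-confidence candidate string for each detected line.
        completion(observations.compactMap { $0.topCandidates(1).first?.string })
    }
    request.recognitionLevel = .accurate      // favor accuracy over speed
    request.usesLanguageCorrection = true
    request.recognitionLanguages = ["en-US"]  // Live Text launches with seven languages

    // Vision runs the whole pipeline locally; no image data leaves the device.
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}
```

Because the request runs entirely on-device, this matches the privacy posture Apple describes, at the cost of the broader language coverage a cloud service can offer.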

  • Google Lens: How to master Google’s super-useful AI camera app
  • How to use Google Lens on your iPhone or iPad
  • iOS 15 release date, features, supported iPhones and everything you need to know

The example Apple showed was notes on an office whiteboard: you’ll be able to tap a new icon in the lower-right corner of the Camera app’s viewfinder, then simply use Apple’s normal text selection gesture (dragging your finger over the text) to copy the handwritten text into an email or the Notes app.
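In code terms, that copy step boils down to putting the recognized string on the system pasteboard. A hypothetical wiring of the recognizeText sketch above, where whiteboardPhoto is a placeholder image rather than any real API:

```swift
import UIKit

// Hypothetical glue code: recognize the whiteboard text, then place it on
// the system pasteboard so it can be pasted into Mail or the Notes app.
// `whiteboardPhoto` is a placeholder UIImage, and `recognizeText` is the
// Vision-based sketch above, not Apple's actual Live Text implementation.
recognizeText(in: whiteboardPhoto) { lines in
    UIPasteboard.general.string = lines.joined(separator: "\n")
}
```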

You’ll also be able to use ‘Live Text’ on existing photos in your library, although these use cases seem a bit less useful. Apple’s example was copying a restaurant’s name and phone number from the background of an old photo, but perhaps some of the more interesting uses for the technology will materialize when it’s out in the real world.

Apple’s ‘Live Text’ is naturally a lot more limited than Google Lens, as the latter has been out since 2017. Right now, Apple says it only understands seven languages (English, Chinese, French, Italian, German, Spanish, and Portuguese), far short of Google Lens’ ability to translate words in over 100 languages.

That means another of Google Lens’ useful tricks, live-translating restaurant menus or signs when you’re traveling, won’t have an exact equivalent in Apple’s ‘Live Text’. But it’s still a useful new trick for Apple’s Camera app, and it works on all kinds of images, including screenshots and photos on the web.

Real-world exploration

Likewise, Apple’s new ‘Visual Look Up’ is a more direct challenge to some of the core features of Google Lens.

While it wasn’t shown in great depth during WWDC 2021, the feature will apparently let you automatically pull up information about what’s in your photos, such as the breed of a dog or the type of flower you’ve snapped.

According to Apple, this will work for books, art, nature, pets and landmarks, though how exhaustive its knowledge is remains to be seen. It will certainly be difficult to compete with Google on this front, given the mountains of data the search giant has been able to glean from its other services.
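Apple hasn’t said what model powers ‘Visual Look Up’, but Vision’s built-in on-device image classifier illustrates the basic shape of the task. A sketch under that assumption, using the public VNClassifyImageRequest as a stand-in, with an arbitrary 0.3 confidence cutoff:

```swift
import UIKit
import Vision

// On-device image classification using Vision's built-in taxonomy. Again a
// public stand-in to illustrate the idea, not Visual Look Up itself.
func classify(_ image: UIImage,
              completion: @escaping ([(label: String, confidence: Float)]) -> Void) {
    guard let cgImage = image.cgImage else {
        completion([])
        return
    }

    let request = VNClassifyImageRequest { request, error in
        guard error == nil,
              let observations = request.results as? [VNClassificationObservation] else {
            completion([])
            return
        }
        // Drop low-confidence labels; the 0.3 threshold is an arbitrary choice.
        completion(observations
            .filter { $0.confidence > 0.3 }
            .map { (label: $0.identifier, confidence: $0.confidence) })
    }

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}
```

The interesting gap is on the data side: Vision’s generic labels are a long way from the knowledge-graph lookups Google can back its results with, which is exactly the challenge described above.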

But while the feature will probably be a bit limited initially, it looks like the move is tied to Apple’s augmented reality ambitions and, perhaps eventually, the rumored Apple Glass. Automatically identifying visual information, such as landmarks, is likely to be an important component of any smart glasses, so ‘Visual Look Up’ can be considered another important step towards some Apple face furniture.

‘Visual Look Up’ will apparently also work across iPhone, iPad, and Mac, so as long as you’ve updated to the latest software, it’ll be baked into whichever Apple device you’re using. You can expect the full release of iOS 15 to arrive in mid-September.

  • Google Photos’ unlimited free storage just ran out—here’s what to do
