Apple’s new iOS 15 features would be a perfect fit for AR glasses


Apple doesn’t have a headset yet, but the latest software for iPhone, iPad, and Mac continues to lay the groundwork

How long will this be true? Apple’s WWDC introduction leaned into telepresence. It wasn’t a joke, though.

Another year, another no-show for Apple’s AR glasses at the annual WWDC conference. At the company’s second all-virtual edition of its developer conference, there was no word about Apple’s long-expected VR and AR headsets, and no big new AR push either. You could come away from the WWDC keynote thinking Apple isn’t pushing AR at all. At least for now: 2021 isn’t over yet.

Look more closely, though, and the puzzle pieces are already scattered all over the place. Apple’s ARKit and RealityKit developer tools added deeper features for managing more objects and larger virtual scenes in the real world. Core apps started getting AR hooks too: Apple Maps is adding AR directions, as Google Maps already does. And, much like Google Lens, Apple is introducing ways to read and search text in Photos or through the Camera app.

Spatial audio, sharing and FaceTime: the beginning of telepresence?

Tim Cook took to the virtual stage at WWDC facing an audience of Memoji, Apple’s AR avatars that have already been around for three years. It was probably meant to capture the feeling we all have working from home. I kept watching it and thinking about the future of telepresence.

Apple doesn’t have its own social AR communication apps yet, but others do. Spatial, a company that has its own VR and AR apps on headsets and phones, is one example. Many of these rely on spatial audio to create a sense of presence and direct attention. Facebook considers spatial audio a cornerstone of how people will communicate with AR glasses.

And in iOS 15, Apple is adding spatial audio to FaceTime calls. If you haven’t played around with social VR or AR apps, spatial audio in FaceTime can seem like overkill. I haven’t tried it in iOS 15 yet, but I think it matters a lot more than it seems. In large groups, it can help create a map of who is where. On the FaceTime grid, it doesn’t matter much. But in a virtual room of hovering FaceTime holograms, like Microsoft is already playing with via Mesh on the HoloLens, it could be really important. Spatial audio is becoming more woven into Apple’s ARKit capabilities, and it makes me wonder what’s next.
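FaceTime’s spatial audio isn’t a public API, but the underlying technique is something developers can already reach through AVAudioEngine’s 3D mixing. Here’s a minimal sketch, assuming one mono stream per call participant; the node setup, positions and sample rates are illustrative, not how FaceTime actually works:

```swift
import AVFoundation

// A sketch of positional voice audio using AVAudioEngine's built-in 3D
// mixing. FaceTime's implementation isn't a public API; this only shows the
// general technique social VR/AR apps use to place voices in space.
let engine = AVAudioEngine()
let environment = AVAudioEnvironmentNode()
engine.attach(environment)
engine.connect(environment, to: engine.mainMixerNode,
               format: AVAudioFormat(standardFormatWithSampleRate: 48_000, channels: 2))

// One player node per participant; mono sources are required for 3D panning.
let voice = AVAudioPlayerNode()
engine.attach(voice)
engine.connect(voice, to: environment,
               format: AVAudioFormat(standardFormatWithSampleRate: 48_000, channels: 1))

// Binaural rendering for headphones, and a position a meter to the
// listener's left: the voice now arrives from that direction.
voice.renderingAlgorithm = .HRTFHQ
voice.position = AVAudio3DPoint(x: -1.0, y: 0.0, z: -1.0)
environment.listenerPosition = AVAudio3DPoint(x: 0.0, y: 0.0, z: 0.0)

do {
    try engine.start()
    // voice.scheduleBuffer(...) would then feed this participant's audio.
} catch {
    print("Audio engine failed to start: \(error)")
}
```

The key piece is the position property: once each voice has coordinates relative to the listener, a grid of callers becomes a room of callers.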

The addition of screen sharing in FaceTime, while it feels like late catch-up to Zoom, seems pretty important. If Apple is developing an OS for glasses that will let people share a world together, it will need to figure out how people can instantly connect and show apps, content and more to each other. Building that into its own FaceTime tools seems like the first step.


Apple’s Live Text scans text with your iPhone camera, like Google Lens. But it’s the kind of tool AR glasses could take advantage of.

Live Text and Maps: AR as a Supporting Tool

Stop me if you’ve heard this before: augmented reality can be used to assist people. Google has made assistive AR a focus for years, and both Google Maps and Google Lens use AR to overlay information on the real world, whether popping up directions or analyzing text and objects in various ways.

That’s been the dream for AR glasses since Google Glass eight years ago. Apple’s introduction of these kinds of features in iOS 15 suggests it’s ready to treat AR as more than a magical experience or a way to shop for things. There are already plenty of useful AR-enabled apps on the App Store, but Apple’s own OS hasn’t integrated many of them. Maps and Live Text both look like the beginning of that integration.
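Live Text itself is a system feature, but developers have been able to get at the same kind of text recognition through Apple’s Vision framework since iOS 13. A minimal sketch, with a hypothetical helper name of my own:

```swift
import UIKit
import Vision

// A sketch of camera-text recognition, roughly what Live Text does at the
// system level: pull the recognized lines of text out of a photo.
func recognizeText(in image: UIImage, completion: @escaping ([String]) -> Void) {
    guard let cgImage = image.cgImage else { return completion([]) }

    let request = VNRecognizeTextRequest { request, _ in
        let observations = request.results as? [VNRecognizedTextObservation] ?? []
        // Keep the top candidate for each detected run of text.
        completion(observations.compactMap { $0.topCandidates(1).first?.string })
    }
    request.recognitionLevel = .accurate // favor quality over speed

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```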

Object Capture: a preview of how Apple will handle 3D scanning?

A pro tool announced at WWDC lets developers create high-resolution 3D files from real-world objects. iPhones and iPads are already amazingly capable of 3D scanning through apps and hardware features like lidar, but scan quality can be unreliable. Apple has never made its own 3D capture software before; Object Capture is a start.

Unlike many existing 3D scanning tools that map image data onto 3D depth maps, Object Capture converts a batch of photos (captured with an iPhone or iPad, or otherwise) into high-resolution 3D files. The processing happens on a Mac, which feels like a disconnect at first: Apple’s iOS hardware, the M1 iPad Pro in particular, would seem to have plenty of processing power for these kinds of tasks.
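The developer-facing side of this is RealityKit’s new PhotogrammetrySession. A minimal sketch of the flow on macOS Monterey, with placeholder file paths:

```swift
import Foundation
import RealityKit

// A sketch of Object Capture on macOS Monterey: turn a folder of iPhone
// photos into a textured USDZ model. Paths below are placeholders.
let photosFolder = URL(fileURLWithPath: "/Users/me/Captures/chair")
let outputModel = URL(fileURLWithPath: "/Users/me/Models/chair.usdz")

let session = try PhotogrammetrySession(input: photosFolder)

// Results arrive asynchronously as the reconstruction runs.
Task {
    for try await output in session.outputs {
        switch output {
        case .requestProgress(_, let fraction):
            print("Progress: \(Int(fraction * 100))%")
        case .processingComplete:
            print("Done; model written to \(outputModel.path)")
        case .requestError(_, let error):
            print("Reconstruction failed: \(error)")
        default:
            break
        }
    }
}

// .reduced keeps the output web-friendly; .full and .raw need far more memory.
try session.process(requests: [.modelFile(url: outputModel, detail: .reduced)])
```

The detail levels hint at why the Mac matters: the higher-quality reconstructions lean on memory and GPU headroom that phones don’t offer yet.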

The Mac is being touted by Apple as a 3D processing tool, but it could also be a stepping-stone to figuring out how Apple will do 3D object capture on more powerful iPhones and iPads in the future.

The Object Capture tool is being put to an extremely practical purpose right now: getting AR-enabled e-commerce on its feet. Virtual shopping has already proven an effective experiment through the pandemic, and Apple is pointing to companies planning Object Capture libraries of 3D things: Etsy, which will launch its 3D shopping in the fall, and Wayfair, which is building its own scanning app using Apple’s toolkit for manufacturers selling through its store. The list seems likely to expand.

But at some point, 3D capture is going to be there for ordinary people too: not just to share things, but to build objects and worlds that can live in AR. Apple may not be ready to put all those pieces on its hardware just yet, but Object Capture brings the Mac into the AR development fold.


Apple’s App Clip Codes, announced last year, are part of Apple’s ongoing effort to layer AR over real things.

Apple’s real-world AR layer is slowly evolving

For AR glasses to work in the real world, you need a real world that’s been mapped for AR. Apple has been slowly remaking its world map over the past few years using lidar-equipped cars, and more cities are becoming capable of hosting real-world AR tagged to physical locations. For Apple, all of those cities are US-based for now, with London set to be the first outside the US this fall. Apple’s latest ARKit tools need that location-based layer of data to display virtual art in multiperson experiences, or for things like the AR directions that pop up in the next version of Maps.
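ARKit exposes that layer to developers as location anchors. A minimal sketch, assuming a standard ARKit view; the coordinate (Apple Park) is purely illustrative:

```swift
import ARKit
import CoreLocation

// A sketch of ARKit's location anchors, which pin content to geographic
// coordinates in areas where Apple's AR map data exists.
func startGeoTracking(in arView: ARSCNView) {
    ARGeoTrackingConfiguration.checkAvailability { available, error in
        guard available else {
            print("Geo tracking unavailable here: \(String(describing: error))")
            return
        }
        DispatchQueue.main.async {
            arView.session.run(ARGeoTrackingConfiguration())

            // Anchor to latitude/longitude; ARKit resolves the altitude itself.
            let coordinate = CLLocationCoordinate2D(latitude: 37.3349,
                                                    longitude: -122.0090)
            arView.session.add(anchor: ARGeoAnchor(coordinate: coordinate))
            // A renderer(_:didAdd:for:) delegate callback would then place
            // geometry, such as AR directions or virtual art, at that anchor.
        }
    }
}
```

That availability check is the tell: this only works in cities Apple has already driven and mapped.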

Apple also wants to tag real objects with its own take on QR codes, called App Clip Codes, which when scanned can trigger an AR effect mapped to the scanned object or nearby things. The tags can launch Apple’s mini-apps, App Clips, announced last year with iOS 14, in AR-enabled form. Apple started working on the idea last year, but progress on tagging objects in the real world seems to have been slow. Maybe we’ll see products (Apple’s own, or HomeKit accessories) start to get these App Clip Codes.
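The scanning side is already in ARKit for developers. A minimal sketch of picking up App Clip Codes in a session, assuming iOS 14.3 or later; the class name is my own:

```swift
import ARKit

// A sketch of detecting App Clip Codes in an AR session. A scanned code
// surfaces as an ARAppClipCodeAnchor whose decoded URL can drive an AR
// experience tied to the tagged object.
class AppClipCodeScanner: NSObject, ARSessionDelegate {
    func start(session: ARSession) {
        guard ARWorldTrackingConfiguration.supportsAppClipCodeTracking else { return }
        let configuration = ARWorldTrackingConfiguration()
        configuration.appClipCodeTrackingEnabled = true
        session.delegate = self
        session.run(configuration)
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let codeAnchor as ARAppClipCodeAnchor in anchors {
            // The URL decodes asynchronously as tracking improves.
            if codeAnchor.urlDecodingState == .decoded, let url = codeAnchor.url {
                print("App Clip Code points to \(url)")
            }
        }
    }
}
```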

Several other companies are also pursuing real-world, multiperson AR: Snapchat, Niantic, Google, Microsoft and Facebook, for starters. How Apple’s progress compares with those competitors could determine how quickly it releases an advanced pair of AR glasses designed to be worn at all times. Until then, Apple’s expected VR/AR hybrid headset could bridge the gap for developers by relying less on real-world outdoor locations.

Is there a Pro headset coming next?

Apple could have its own AR/VR hardware next year. But chances are strong that the company will need to start discussing the new software and its vastly different OS well in advance, perhaps a year ahead, judging by how Apple has announced new platforms in the past. These new AR tools are building out the sharing, capture and assistive dimensions that could lead to Apple’s headsets, which will likely emphasize communication, collaboration and showing off virtual things in the real world.

Apple’s late arrival to the AR/VR headset scene would be nothing new; Apple often shows up late to new technologies (the Apple Watch, the iPhone, AirPods). While companies like Facebook, Snapchat and Microsoft share their emerging ideas in more experimental states, Apple is waiting to fully polish its first headset effort. Or it will keep doing what it’s already doing: developing AR software in the open, feature by feature.
