In iOS 16, apps can trigger real-world actions hands-free

New features coming in iOS 16 will allow apps to perform actions in the real world hands-free. That means users could do things like start playing music simply by entering a room, or turn on an e-bike for a workout simply by getting on it. Apple told developers at a session during its Worldwide Developers Conference (WWDC) today that these hands-free actions can be initiated even when the user isn't actively using the app at the time.

The update, which uses Apple's Nearby Interaction framework, could lead to some interesting use cases in which the iPhone becomes a way of interacting with objects in the real world, if developers and accessory makers choose to adopt the technology.

During the session, Apple explained that apps can already connect to and exchange data with Bluetooth LE accessories while running in the background. In iOS 16, however, apps will also be able to start a Nearby Interaction session with a Bluetooth LE accessory that supports Ultra Wideband while running in the background.

To support these new background sessions, Apple has updated its specification for accessory manufacturers.
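
For developers, a rough Swift sketch of what starting one of these background sessions could look like is below. It is illustrative only: the accessoryData payload and Bluetooth peer identifier are placeholders for values an app would receive from the accessory over its Bluetooth LE connection, and the one-meter trigger distance is just an example.

```swift
import NearbyInteraction

// Minimal sketch of an iOS 16 background accessory session. `accessoryData`
// and `peripheralID` are placeholders for values a real app would obtain
// from the accessory over its Bluetooth LE connection.
final class AccessoryInteractionManager: NSObject, NISessionDelegate {
    private var session: NISession?

    func startBackgroundSession(accessoryData: Data, peripheralID: UUID) {
        // Only proceed on hardware with a UWB chip.
        guard NISession.deviceCapabilities.supportsPreciseDistanceMeasurement else { return }

        do {
            // New in iOS 16: pairing the UWB configuration with the accessory's
            // Bluetooth peer identifier is what allows the session to run while
            // the app is in the background.
            let config = try NINearbyAccessoryConfiguration(
                accessoryData: accessoryData,
                bluetoothPeerIdentifier: peripheralID
            )
            let session = NISession()
            session.delegate = self
            session.run(config)
            self.session = session
        } catch {
            print("Could not create accessory configuration: \(error)")
        }
    }

    // The session hands back configuration data that must be forwarded to the
    // accessory over Bluetooth LE so both sides can start ranging.
    func session(_ session: NISession,
                 didGenerateShareableConfigurationData shareableConfigurationData: Data,
                 for object: NINearbyObject) {
        // send shareableConfigurationData to the accessory over GATT
    }

    // Ranging updates arrive here; the one-meter threshold is purely illustrative.
    func session(_ session: NISession, didUpdate nearbyObjects: [NINearbyObject]) {
        guard let distance = nearbyObjects.first?.distance else { return }
        if distance < 1.0 {
            // e.g., start playback or power on the e-bike
        }
    }
}
```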

This paves the way for a future where the line between apps and the physical world blurs, but it remains to be seen whether third-party app and device makers will take advantage of the functionality.

The new feature is part of a broader update to the Apple Nearby Interaction platform that was the focus of the developer session.

Introduced at WWDC 2020 with iOS 14, this platform allows third-party app developers to tap into the U1, or Ultra Wideband (UWB), chip in iPhone 11 and later, Apple Watch, and other third-party accessories. It's what today powers the Precision Finding capabilities offered by Apple's AirTag, which let iPhone users open the Find My app and be pointed to their AirTag's exact location with on-screen directional arrows, along with other cues that tell them how far away they are from the AirTag or whether it might be on a different floor.

With iOS 16, third-party developers will be able to build apps that do much the same, thanks to a new capability that lets them integrate ARKit, Apple's toolkit for augmented reality developers, with the Nearby Interaction platform.

This will allow developers to take advantage of the device's trajectory as computed by ARKit, so their apps can more intelligently guide a user toward a misplaced item or another object the user may want to interact with, depending on the app's functionality. By leveraging ARKit, developers get more consistent distance and direction information than if they were using Nearby Interaction alone.
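
In code, that looks roughly like the Swift sketch below; the function name is made up for illustration, and the accessory configuration is assumed to have been created as in the earlier example.

```swift
import NearbyInteraction

// Sketch of enabling iOS 16's camera assistance, which fuses ARKit's device
// trajectory with UWB measurements. Assumes `config` was created as shown earlier.
func runCameraAssistedSession(with config: NINearbyAccessoryConfiguration,
                              delegate: NISessionDelegate) -> NISession? {
    // Camera assistance is only available on supported hardware.
    guard NISession.deviceCapabilities.supportsCameraAssistance else { return nil }

    config.isCameraAssistanceEnabled = true

    let session = NISession()
    session.delegate = delegate
    session.run(config)
    return session
}

// With the assisted session running, the familiar delegate callback delivers
// both distance and a direction vector toward the accessory:
//
// func session(_ session: NISession, didUpdate nearbyObjects: [NINearbyObject]) {
//     guard let object = nearbyObjects.first else { return }
//     let meters = object.distance      // Float?, distance to the accessory
//     let toward = object.direction     // simd_float3?, unit vector toward it
//     // drive arrows, haptics, or audio cues from these values
// }
```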

However, this functionality isn't limited to third-party, AirTag-like accessories. Apple showed another use case in which a museum could use Ultra Wideband accessories to guide visitors through its exhibits, for example.

In addition, the feature can be used to overlay directional arrows or other augmented reality objects on top of the camera's view of the real world to help guide users to an Ultra Wideband object or accessory. Continuing the demo, Apple briefly showed how red AR bubbles can appear on the app's screen over the camera view to point the way.
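
A hypothetical sketch of placing such a marker with RealityKit follows; the GuidanceOverlay type is invented for illustration and assumes a camera-assisted session is already running as above.

```swift
import ARKit
import NearbyInteraction
import RealityKit
import UIKit

// Illustrative only: places a red marker in the AR scene at the accessory's
// estimated position, using iOS 16's bridge from Nearby Interaction into
// ARKit's coordinate space.
final class GuidanceOverlay {
    private let arView: ARView
    private let niSession: NISession
    private var markerAnchor: AnchorEntity?

    init(arView: ARView, niSession: NISession) {
        self.arView = arView
        self.niSession = niSession
    }

    // Call from the NISessionDelegate's didUpdate callback.
    func updateMarker(for object: NINearbyObject) {
        // worldTransform(for:) returns the accessory's position in ARKit world
        // coordinates once the session has converged, and nil before then.
        guard let transform = niSession.worldTransform(for: object) else { return }

        if let anchor = markerAnchor {
            anchor.reanchor(.world(transform: transform))
        } else {
            // A red sphere stands in for the "AR bubble" shown in Apple's demo.
            let sphere = ModelEntity(
                mesh: .generateSphere(radius: 0.05),
                materials: [SimpleMaterial(color: .red, isMetallic: false)]
            )
            let anchor = AnchorEntity(world: transform)
            anchor.addChild(sphere)
            arView.scene.addAnchor(anchor)
            markerAnchor = anchor
        }
    }
}
```

One wrinkle worth noting: when an app renders its own camera feed like this, the Nearby Interaction session is expected to share the app's existing ARSession (via setARSession(_:)) rather than spinning up its own; that step is omitted above for brevity.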

In the long term, this functionality lays the groundwork for Apple's rumored mixed reality headset and smart glasses, where AR-based apps are expected to play a key role.

The updated functionality is available now to beta testers of the iOS 16 software update, which will roll out to the general public later this year.

Learn more about WWDC 2022 at TechCrunch


Credit: techcrunch.com
