iOS 16 introduces several new APIs that let developers expand the capabilities of their apps, including lock screen widgets, walkie-talkie-style communication via the new Push to Talk framework, interactive maps, weather data, and more. Interestingly, Apple has also updated the Nearby Interaction API to integrate the U1 chip with ARKit, amid rumors of a new mixed reality headset.
Introduced with iOS 14, the Nearby Interaction API lets developers take advantage of the ultra-wideband U1 chip found in the iPhone 11 and later. The U1 chip gives devices precise location and spatial awareness and can, for example, measure the distance between one iPhone and another. With iOS 15 and watchOS 8, Apple extended these capabilities to the Apple Watch, since the Apple Watch Series 6 and later also include the U1 chip. This year, iOS 16 adds an interesting new option for developers working with the Nearby Interaction API: the ability to combine the U1 chip with augmented reality via ARKit.
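For context, here is a minimal sketch of what a basic Nearby Interaction session between two iPhones can look like in Swift. The DistanceFinder class and the way the peer's discovery token is obtained (for example over MultipeerConnectivity or the app's own networking) are illustrative assumptions, and the capability check shown is the iOS 16 form; earlier releases used NISession.isSupported.

```swift
import NearbyInteraction

// Minimal sketch: range against a peer iPhone using the U1 chip.
// How the peer's discovery token reaches this device is up to the app.
final class DistanceFinder: NSObject, NISessionDelegate {
    private var session: NISession?

    func start(with peerToken: NIDiscoveryToken) {
        // Devices without a U1 chip cannot measure precise distances.
        guard NISession.deviceCapabilities.supportsPreciseDistanceMeasurement else { return }

        let session = NISession()
        session.delegate = self
        self.session = session

        // Run the session against the peer identified by its discovery token.
        session.run(NINearbyPeerConfiguration(peerToken: peerToken))
    }

    // Called whenever the framework has fresh measurements for the peer.
    func session(_ session: NISession, didUpdate nearbyObjects: [NINearbyObject]) {
        guard let peer = nearbyObjects.first, let distance = peer.distance else { return }
        print("Peer is \(distance) meters away")
    }
}
```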
U1 chip gets more attention
As the company explained at WWDC 2022, iOS already uses the U1 chip in combination with ARKit to locate AirTags via Precision Finding. Using the data provided by the U1 chip and the iPhone camera, the Find My app can guide the user precisely to their AirTag. Now developers can combine the U1 chip and ARKit to create similar experiences in their own apps, with distance and direction information that is more consistent and accurate. According to Apple, the best use cases for this API are experiences that guide users to a specific nearby object, such as a misplaced item, an object of interest, or an object the user wants to interact with. For example, an app can tell users whether the object they're looking for is in front of or behind them.
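As a rough illustration of how an app might opt into this, here is a hedged sketch based on the additions Apple described at WWDC 2022; the helper function is hypothetical, while isCameraAssistanceEnabled and the capability check are the new iOS 16 pieces.

```swift
import NearbyInteraction

// Hedged sketch: opt a peer session into the iOS 16 camera assistance,
// which fuses U1 ranging with ARKit camera tracking for steadier distance
// and direction estimates. The function name is illustrative.
func makeCameraAssistedConfiguration(peerToken: NIDiscoveryToken) -> NINearbyPeerConfiguration? {
    // Camera assistance needs both a U1 chip and ARKit support on the device.
    guard NISession.deviceCapabilities.supportsCameraAssistance else { return nil }

    let configuration = NINearbyPeerConfiguration(peerToken: peerToken)
    // With this flag set, the framework runs ARKit internally and combines
    // camera data with UWB measurements; pass the result to NISession.run(_:).
    configuration.isCameraAssistanceEnabled = true
    return configuration
}
```

Once the session's delegate reports that the underlying algorithm has converged (via session(_:didUpdateAlgorithmConvergence:for:)), the nearby object's direction and horizontal angle should be stable enough to tell the user, for instance, whether the item they are looking for is in front of or behind them.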
U1, ARKit and Apple's AR/VR headset
Several recent rumors suggest that Apple will launch a new mixed reality headset in late 2022 or early 2023. Although the product was not announced at WWDC 2022 and the company did not say a word about augmented or mixed reality during the opening keynote, there was plenty of talk about AR and VR in the WWDC sessions. For a device that is expected to feature multiple cameras and advanced sensors, including an ultra-wideband chip, precise spatial awareness seems like a given.
Apple prepares developers for AR/VR headset
And while there is no SDK for the new headset, since it hasn't been officially announced, Apple seems keen for developers to prepare their apps for this type of interaction ahead of time. When the company first announced the U1 chip with the iPhone 11, it said that experiences like faster AirDrop would be just the beginning. The U1 chip is now used for features such as car keys in the Wallet app and finding AirTags, and it will surely play a major role in Apple's mixed reality headset. Additionally, ARKit was updated in iOS 16 to support 4K HDR video and advanced indoor scanning, another important step toward an AR/VR device. You can find out here whether your device is compatible with the new software. (Photo by Unsplash / Penfer)