In addition to iOS 13.4, iPadOS 13.4, tvOS 13.4, watchOS 6.3 and macOS Catalina 10.15.4, Apple has also released ARKit 3.5.
With ARKit 3.5, developers can take advantage of the new LiDAR scanner in the new iPad Pro. Apple's developer website describes it as follows:
ARKit 3.5 leverages the new LiDAR scanner and depth-sensing system on iPad Pro to make AR experiences more realistic than ever before. The new Scene Geometry API lets you capture a 3D representation of the world in real time, enabling object occlusion and real-world physics for virtual objects. All experiences built with ARKit automatically benefit from the new instant AR placement and improved motion capture and people occlusion.
The rest of the website describes the three most important changes as follows:
Scene Geometry – Scene Geometry allows you to create a topological map of your space with labels identifying floors, walls, ceilings, windows, doors, and seats. This deep understanding of the real world unlocks object occlusion and real-world physics for virtual objects, and also gives you more information for your AR workflows.
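A minimal sketch of how an app could opt in to Scene Geometry on a LiDAR-equipped iPad Pro (class and variable names here are illustrative, not from the article):

```swift
import UIKit
import ARKit
import RealityKit

class ViewController: UIViewController, ARSessionDelegate {
    let arView = ARView(frame: .zero)

    override func viewDidLoad() {
        super.viewDidLoad()
        let config = ARWorldTrackingConfiguration()
        // Scene reconstruction with per-face classification is only
        // available on devices with a LiDAR scanner, so check support first.
        if ARWorldTrackingConfiguration.supportsSceneReconstruction(.meshWithClassification) {
            config.sceneReconstruction = .meshWithClassification
        }
        arView.session.delegate = self
        arView.session.run(config)
    }

    // ARKit delivers the reconstructed mesh as ARMeshAnchor updates; each
    // face can carry a classification such as wall, floor, ceiling or seat.
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for case let meshAnchor as ARMeshAnchor in anchors {
            print("Mesh anchor with \(meshAnchor.geometry.faces.count) faces")
        }
    }
}
```

The mesh anchors can then feed occlusion and physics, for example via RealityKit's scene-understanding options.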
Instant AR – The LiDAR scanner on iPad Pro enables incredibly fast plane detection, so AR objects can be placed in the real world instantly, without scanning. Instant AR placement is automatically enabled on iPad Pro for all apps built with ARKit, without any code changes.
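Although the fast plane detection itself needs no code changes, apps still typically raycast to decide where to anchor content; on LiDAR devices the result arrives almost immediately. A sketch (the function and parameter names are assumptions, not part of any Apple sample):

```swift
import ARKit
import RealityKit

// Place an anchor at the real-world point under a screen tap.
// On iPad Pro the estimated plane is usually available instantly,
// so this succeeds without a scanning phase.
func placeObject(at point: CGPoint, in arView: ARView) {
    guard let query = arView.makeRaycastQuery(from: point,
                                              allowing: .estimatedPlane,
                                              alignment: .horizontal),
          let result = arView.session.raycast(query).first else {
        return // no surface found yet
    }
    let anchor = ARAnchor(name: "placed-object",
                          transform: result.worldTransform)
    arView.session.add(anchor: anchor)
}
```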
Improved Motion Capture and People Occlusion – With ARKit 3.5 on iPad Pro, depth estimation in people occlusion and height estimation in motion capture are more accurate. Both features improve on iPad Pro in all apps built with ARKit, without any code changes.
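The improvements apply automatically; people occlusion itself is still the standard opt-in from ARKit 3. A brief sketch of that opt-in, assuming an existing session:

```swift
import ARKit

let config = ARWorldTrackingConfiguration()
// People occlusion with depth is not supported on every device,
// so guard the opt-in with a capability check.
if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
    config.frameSemantics.insert(.personSegmentationWithDepth)
}
// session.run(config)  // run on your existing ARSession
```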
Apple has already given some examples of how the LiDAR scanner can be used on the iPad Pro, but it remains exciting to see which apps will surprise us with new AR experiences in the future. (Image: Apple)