While Apple didn’t talk much about Augmented Reality during the WWDC keynote, this week the company did release a new version of the software development kit (SDK) that developers use to create AR apps. ARKit 4 brings new capabilities to iOS 14 that developers can leverage to create experiences on all current iOS devices. It also adds important new depth-sensing capabilities on devices that have Apple’s LiDAR Scanner (currently shipping only on the latest iPad Pro products). And, perhaps most importantly, ARKit 4 introduces Location Anchors, which let developers place a persistent virtual object at a specific spot in the real world.
Leveraging LiDAR; Improved Face Tracking
Apple introduced the Scene Geometry API in ARKit 3.5, after the launch of the latest iPad Pro products with LiDAR scanners. I expect Apple to add LiDAR scanners to the next generation of iPhones shipping later this year, so the feature is likely to get a fair amount of discussion during the next launch event.
The rear-facing LiDAR scanner works by emitting light into the surrounding area and measuring the light reflected back. The device uses this data to build a depth map of the environment. This information lets developers create a richer AR experience by driving more realistic occlusion, which occurs when one object, real or virtual, passes in front of another and partially blocks the user’s view of it. Good occlusion is key to creating a more immersive AR experience. The LiDAR scanner also enables more realistic physics-based interactions between real and virtual objects, as well as improved virtual lighting on real-world surfaces.
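For developers, opting into these behaviors on a LiDAR-equipped device comes down to enabling scene reconstruction and RealityKit’s scene-understanding options. The sketch below assumes a RealityKit ARView; the surrounding setup is illustrative rather than a complete app:

```swift
import ARKit
import RealityKit

func configureSceneUnderstanding(for arView: ARView) {
    let configuration = ARWorldTrackingConfiguration()

    // Scene reconstruction is available only on LiDAR-equipped devices.
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        configuration.sceneReconstruction = .mesh
    }

    // Let the reconstructed mesh occlude virtual content, take part in
    // physics simulations, and receive lighting from virtual light sources.
    arView.environment.sceneUnderstanding.options.insert([.occlusion, .physics, .receivesLighting])

    arView.session.run(configuration)
}
```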
In iOS 14, Apple further expands the capabilities of LiDAR-equipped devices to measure the distance between the iOS device and objects in the environment more precisely. On-device machine learning merges the color RGB image captured by the device’s wide-angle camera with the depth readings from the LiDAR scanner to create a dense depth map. This depth data refreshes at 60 Hz, so as the iOS device moves, the depth data moves with it.
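Developers access this depth data through ARKit’s scene-depth frame semantics; each frame then carries a per-pixel depth map alongside a confidence map. A minimal sketch, with the delegate wiring shown only for illustration:

```swift
import ARKit
import CoreVideo

final class DepthReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        let configuration = ARWorldTrackingConfiguration()
        // Scene depth requires the LiDAR scanner, so check for support first.
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
            configuration.frameSemantics.insert(.sceneDepth)
        }
        session.delegate = self
        session.run(configuration)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let sceneDepth = frame.sceneDepth else { return }
        // depthMap holds per-pixel distances in meters; confidenceMap rates each reading.
        let depthMap = sceneDepth.depthMap
        print("Depth map: \(CVPixelBufferGetWidth(depthMap)) x \(CVPixelBufferGetHeight(depthMap))")
    }
}
```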
LiDAR also improves a feature called ray casting, which projects a ray from a point on the two-dimensional screen into the three-dimensional scene to find where it intersects real-world surfaces. In ARKit 4, ray casting can leverage the LiDAR scanner’s understanding of the scene to place virtual objects into the real world more quickly and precisely.
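In code, that amounts to casting a ray from a screen point (for instance, where the user tapped), taking the first hit against real-world geometry, and anchoring content at the resulting transform. A sketch using RealityKit’s ARView; the tap handling around it is assumed:

```swift
import ARKit
import RealityKit
import UIKit

// Place an anchor where a ray cast from the given screen point hits real-world geometry.
func placeAnchor(at screenPoint: CGPoint, in arView: ARView) {
    let results = arView.raycast(from: screenPoint,
                                 allowing: .estimatedPlane,
                                 alignment: .any)
    guard let firstHit = results.first else { return }

    // Anchor an entity at the hit location; attach virtual content to it as needed.
    let anchor = AnchorEntity(world: firstHit.worldTransform)
    arView.scene.addAnchor(anchor)
}
```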
Finally, Apple introduced face tracking in a previous version of ARKit, but the capability was limited to devices with a front-facing TrueDepth camera. ARKit 4 expands face tracking to all devices with an A12 Bionic processor or later, including the recently launched iPhone SE. Face tracking lets developers create apps that anchor virtual content to your face and track your expressions in real time.
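Getting started with face tracking is mostly a matter of checking device support and reading the face anchors ARKit delivers; expression data arrives as blend-shape coefficients. A minimal sketch:

```swift
import ARKit

final class FaceTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // In ARKit 4, face tracking is supported on devices with an A12 Bionic or later.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            // Blend shapes describe the current expression, e.g. how far the jaw is open.
            if let jawOpen = faceAnchor.blendShapes[.jawOpen]?.floatValue {
                print("jawOpen: \(jawOpen)")
            }
        }
    }
}
```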
Location Anchors
While the new capabilities enabled by the LiDAR scanner are exciting, perhaps the most notable new feature Apple announced with ARKit 4 is Location Anchors. This new technology brings higher-quality AR content to the outdoors. Location Anchors let developers specify latitude, longitude, and altitude. ARKit 4 then leverages these coordinates, plus high-resolution Apple Maps data, to place experiences at a specific location in the real world.
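In ARKit terms, this is the new geo-tracking configuration plus a geo anchor created from real-world coordinates. A sketch of placing one (the coordinates below are placeholders, not a real deployment):

```swift
import ARKit
import CoreLocation

func placeGeoAnchor(in session: ARSession) {
    // Run the session with geographic tracking rather than plain world tracking.
    session.run(ARGeoTrackingConfiguration())

    // Placeholder coordinates; a real app would supply the locations it cares about.
    let coordinate = CLLocationCoordinate2D(latitude: 37.7749, longitude: -122.4194)
    let geoAnchor = ARGeoAnchor(coordinate: coordinate, altitude: 15.0) // altitude in meters
    session.add(anchor: geoAnchor)

    // Virtual content attached to this anchor appears at the same real-world spot
    // for anyone viewing it on a capable device.
}
```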
The process for driving this next-generation AR experience is called visual localization, and it accurately places your device in relation to the surrounding environment. Apple says this is notably more accurate than can be done with GPS alone. Advanced machine learning techniques drive this process and run locally on the device.
The result is that when a developer places a virtual object in the real world (for example, a digital sculpture at the intersection of two streets), that object persists in that location and appears in the same place, in precisely the same way, to anyone viewing it with a capable device. Apple says Location Anchors will first roll out in major cities such as Los Angeles, San Francisco, Chicago, Miami, and New York, with more cities coming later this summer. To leverage Location Anchors, apps must be running on devices with GPS and Apple’s A12 Bionic chip or later.
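Because Location Anchors depend on both the device’s hardware and Apple’s map coverage for the user’s area, ARKit exposes checks for each before you run a geo-tracking session. A sketch, with error handling kept to the minimum:

```swift
import ARKit
import Foundation

func startGeoTrackingIfAvailable(in session: ARSession) {
    // Requires GPS and an A12 Bionic chip or later.
    guard ARGeoTrackingConfiguration.isSupported else { return }

    // Geo tracking also needs high-resolution map data for the user's current location.
    ARGeoTrackingConfiguration.checkAvailability { available, error in
        guard available, error == nil else { return }
        DispatchQueue.main.async {
            session.run(ARGeoTrackingConfiguration())
        }
    }
}
```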
The importance of Location Anchors cannot be overstated, and it speaks to the fact that, as usual, Apple is playing a very long game here. Entire startups and market segments are focused on the technology that underlies a capability Apple quietly launched with zero fanfare this week. Because Apple owns its own map data, and because it has hundreds of millions of devices constantly capturing location data, it is positioned to bring location-based AR to the masses. These new features will enable developers to create next-generation apps that will eventually make AR a mainstream technology.
Slow, Steady AR Progress
Those of us closely monitoring the AR space sometimes lament the seemingly slow pace of advancement. While many of us would love to have our Apple Glasses now, the fact is this is a complicated technology, and doing it right is more important than doing it fast. In addition to the real-world device challenges associated with optics, battery life, wireless connectivity, and more, great AR content will require a deep understanding of the real, ever-changing physical world. Few companies have the resources to acquire that understanding on their own (see Microsoft’s Spatial Anchors and Niantic’s recent acquisition of 6D.AI). Fewer still own both the hardware and software platforms upon which that AR content will run. With ARKit 4 and iOS 14, Apple fortifies its position as the world’s largest AR platform, and it gives developers new tools to create the types of AR apps we’ve all been waiting to experience.