ARKit uses Visual Inertial Odometry (VIO) to accurately track the world around it. VIO fuses camera sensor data with CoreMotion data, and those two inputs let the device sense how it moves within a room with a high degree of accuracy, without any additional calibration. And here’s a tweet with a GIF, because this is hard to grasp if you don’t see it:
Wondering how Apple does AR on mobile? Like this: #WWDC #WWDC17 #WWDC2017 pic.twitter.com/dGgEMvjNiP

— Nate Swanner (@NateSwanner) June 6, 2017

Let’s examine what’s going on in that GIF.
As you can tell, ARKit uses camera data to ‘see’ the scene, and an ARFrame class to start a tracking session. It knows the camera’s position because the ARCamera class tracks the device’s position relative to things in the scene. The lines representing an object at the center of the table could be anything, like a vase, and they keep their position relative to the camera via the ARAnchor class. The ‘vase’ is anchored to the table, which ARKit knows is a flat surface thanks to the ARPlaneAnchor class. Because of those anchors, the ‘vase’ is stuck to the table, just as it would be if it were a real object you could touch; the table is a solid surface ARKit knows it can anchor to, and the ‘vase’ can be any sprite you like.
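Those pieces map onto a fairly small amount of code. Here’s a rough sketch, assuming the API as it shipped in iOS 11 (the TrackingController class name and the 10 cm offset are illustrative, not part of ARKit): it runs a world-tracking session with horizontal plane detection, reads the camera pose from each ARFrame, and pins a new ARAnchor onto any ARPlaneAnchor the session discovers, roughly the ‘vase on the table’ setup above.

```swift
import ARKit

// Illustrative sketch: run a world-tracking session, watch the camera pose,
// and anchor a hypothetical 'vase' to any flat surface ARKit detects.
class TrackingController: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // World tracking is the VIO described above: camera frames fused with motion data.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal   // ask ARKit to look for flat surfaces
        session.delegate = self
        session.run(configuration)
    }

    // Every ARFrame carries an ARCamera with the device's current position and orientation.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        let cameraTransform = frame.camera.transform
        _ = cameraTransform // use this to place content relative to the device
    }

    // Detected surfaces arrive as ARPlaneAnchor instances.
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for plane in anchors.compactMap({ $0 as? ARPlaneAnchor }) {
            // Pin the 'vase' ten centimeters above the plane's center.
            var transform = plane.transform
            transform.columns.3.y += 0.1
            session.add(anchor: ARAnchor(transform: transform))
        }
    }
}
```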
Now if the lines really were a vase, lighting would play a role. ARKit has what Apple is calling ‘light estimation,’ which will “estimate the total amount of light available in a scene and applies the correct amount of lighting to virtual objects.” Place a white vase in a dimly lit environment and the object will render greyer, just as it would in real life. That’s done via a single class, ARLightEstimate. The details tell us why Apple saddled the class with an ‘estimate’ tag: the estimate is tied to lighting “associated with a captured video frame in an AR session,” so if the lighting changes, your scene may not immediately reflect them. It can report the scene’s overall luminosity via an ambientIntensity property, though.
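A minimal sketch of how an app might consume that estimate, assuming a SceneKit light already sits in the scene (the updateLighting function and ambientLight parameter are illustrative, not part of ARKit):

```swift
import ARKit
import SceneKit

// Pull the per-frame light estimate and feed it to a SceneKit light.
// The estimate belongs to the latest captured frame, so it can lag behind
// sudden real-world lighting changes, as noted above.
func updateLighting(for session: ARSession, ambientLight: SCNLight) {
    guard let estimate = session.currentFrame?.lightEstimate else { return }
    // ambientIntensity is expressed in lumens; roughly 1000 means neutral lighting.
    ambientLight.intensity = estimate.ambientIntensity
}
```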
Interestingly enough, much of the heavy lifting for ARKit is offloaded onto the GPU. Though it needs an A9 or A10 SoC to function, ARKit uses Metal and SceneKit to do a lot of the detailed work. Apple says Unity and Unreal Engine will also work with ARKit.
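For apps that take the SceneKit route, the setup might look like the sketch below (the ARViewController class and the cylinder standing in for the ‘vase’ are illustrative): ARSCNView renders SceneKit content over the camera feed on the GPU and keeps its nodes in sync with ARKit’s anchors.

```swift
import ARKit
import SceneKit
import UIKit

// Illustrative sketch of the SceneKit-backed path: ARSCNView owns its own ARSession
// and asks the delegate for a node to attach to each anchor ARKit adds.
class ARViewController: UIViewController, ARSCNViewDelegate {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        sceneView.delegate = self
        view.addSubview(sceneView)

        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        sceneView.session.run(configuration)
    }

    // For every anchor ARKit adds, attach some SceneKit geometry -- here, a stand-in 'vase'.
    func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
        return SCNNode(geometry: SCNCylinder(radius: 0.05, height: 0.2))
    }
}
```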
Apple showed off a still-unreleased IKEA app using ARKit that helps you place furniture in your home’s space (the better to see if you really want those pieces). Previews of an updated "Pokemon Go" that leverages ARKit show just how much better AR can be with the platform working under the hood. LEGO even showed off a neat app that provides exploded views of models so you can see what goes where (something like that could be transformative for the company if it wants to move away from printed instructions in the future). But ARKit is nascent, and we still haven’t seen its full utility beyond a few demo apps. It’s also quite limited. Relying on the iPhone’s camera(s) is a smart move, but those cameras may not have the power to track multiple flat planes in a large room, for instance. All of Apple’s demos use a confined space, like a table or a corner of a room. That’s fine for gaming, but you likely won’t be able to see what redesigning your entire home with IKEA furniture might look like.

My bike ride in AR. (Unity + ARKit + Mapbox + Strava) pic.twitter.com/g2uVwVlM3h
— Adam Debreczeni (@heyadam) June 7, 2017