We just saw the first fruits of Apple's labors to turn your phone into a window to other worlds.
In the corner of a real room, we placed an image of an IKEA armchair so convincing, the room’s layout looked more realistic with it there than without.
We decorated a cupcake that doesn’t really exist, and watched a Very Hungry Caterpillar turn into a beautiful butterfly after following us across the floor.
We walked through a world full of hilarious GIFs — and “replied” to them with a remix of our own. And we saw how, someday soon, we’ll be able to watch zombies from “The Walking Dead” intrude upon our reality, and take them down with katana and crossbow.
It’s been less than three months since Apple announced its ARKit software to let anyone make an augmented reality app that could run on millions upon millions of iPhones and iPads when iOS 11 arrives this fall. But ARKit already has a ready-for-primetime feel about it.
I don’t say that because of the particular apps I saw — most of which felt like unfinished experiments — or the lavish praise each developer heaped on Apple when I asked. (We weren’t allowed to touch many demos ourselves, and while IKEA wants to eventually offer a substantial portion of its famous flat-pack furniture catalog, only a handful of objects were ready.)
No, it’s the raw excitement that led these developers, and many other independent ones, to build a whole host of groundbreaking ARKit experiments in mere weeks. Have you seen the find-your-friends-at-a-festival concept yet?
The excitement is catching, and it’s easy to imagine “There’s an AR app for that” becoming a common phrase.
It’s no wonder Google announced its own ARKit competitor earlier today.
But excitement wasn’t the only thing we gathered from Apple’s first ARKit wave.
It only does tables and floors
Part of the not-so-secret sauce behind ARKit is that it can detect flat surfaces, like tables, floors and the tops of chairs, and let virtual objects look like they’re realistically sitting on top. (Characters, like the Very Hungry Caterpillar, will know not to fall off!) We discovered it even works on glossy, reflective floors.
But ARKit can’t currently detect complex surfaces like curved couch cushions, or even vertical surfaces like walls. You might be able to place a virtual IKEA futon in your room, but you won’t be able to hang a virtual flat-screen TV. Not yet, anyhow.
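That horizontal-only limitation shows up directly in ARKit’s configuration API. Here’s a minimal sketch of how plane detection is typically set up, assuming a standard `ARSCNView`-based app (names follow the shipping iOS 11 API; `sceneView` is assumed to be wired up elsewhere):

```swift
import ARKit

// Configure world tracking with plane detection. As of iOS 11,
// only .horizontal is offered; there is no option for walls.
let configuration = ARWorldTrackingConfiguration()
configuration.planeDetection = .horizontal

// sceneView is an ARSCNView created elsewhere (storyboard or code).
sceneView.session.run(configuration)

// ARKit reports each detected surface as an ARPlaneAnchor, which a
// delegate can use to size and place virtual objects on top of it.
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    guard let planeAnchor = anchor as? ARPlaneAnchor else { return }
    print("Found a flat surface: \(planeAnchor.extent.x) x \(planeAnchor.extent.z) meters")
}
```

The `planeDetection` property is what developers would have to change if Apple ever adds vertical-surface support; apps themselves can’t work around it without rolling their own computer vision.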
It’s only for the rear cameras (for now)
ARKit makes use of the rear cameras only on currently supported devices. What about front-facing cameras to enable Snapchat-like advanced AR effects, or better tracking for apps that use the front camera? Apple won’t say, but maybe that’s what the iPhone 8’s hardware will be all about.
The impressive room-tracking that ARKit allows could be used for things beyond AR
The really impressive thing Apple’s augmented reality pulls off is using camera and motion sensor data to achieve the type of seamless spatial awareness that used to require more advanced AR and VR hardware. It’s already used a bit in the iOS 11 public beta for a VR-like mode in Apple Maps, and we may see similar uses in 3D apps. There’s no reason it couldn’t be used for some games, too, that aren’t technically “AR” at all.
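That spatial awareness is exposed independently of any rendered AR content: an app can run an ARKit session purely to read the device’s position and orientation every frame. A hypothetical sketch of that non-AR use, assuming the standard `ARSessionDelegate` callbacks:

```swift
import ARKit

// Tracks the device's pose in space without drawing any AR content.
final class PoseTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        session.delegate = self
        session.run(ARWorldTrackingConfiguration())
    }

    // Called with every new camera frame (roughly 60 times per second).
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // A 4x4 matrix giving the device's position and orientation in
        // world space; the fourth column holds the translation (x, y, z).
        let transform = frame.camera.transform
        let position = transform.columns.3
        // Feed position/orientation into a game camera, 3D viewer, etc.
        _ = position
    }
}
```

Nothing in that loop requires compositing virtual objects over the camera feed, which is why the same tracking could drive a “VR-like” map mode or an ordinary 3D game.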
There’s no secret sauce, just a lot of testing and knowledge
One reason why Google may have a hard time catching up: using its precise knowledge of the exact size, position and orientation of the iPhone’s camera and inertial sensors (which vary from device to device on Google’s Android), Apple went out and captured thousands of real-world scenarios to make AR seem realistic when you’re using an iPhone or iPad.
Nothing technically restricts ARKit to late-model iPhones
There’s actually nothing special in Apple’s recent processors that allows ARKit to function. While the dedicated image processor in later iPhones helps, the main reason to limit ARKit to newer phones and tablets is so they’ll have enough spare processing power for developers to build immersive experiences around it.
Don’t expect phones with dual cameras to have an ARKit advantage anytime soon
Single-lens iPhones seem to work just as well with ARKit as dual-lens Plus models, even though the latter could theoretically sense depth more easily. Apple seems more concerned with scale, letting developers address the most people possible instead of fragmenting them with different technologies.
It’s not designed for hands-free glasses… but could it be used for that?
Apple doesn’t seem to intend ARKit for hands-free visors that the iPhone could pop into. But again, I wouldn’t be surprised to see such use cases emerge. ARKit seems pretty flexible, and combining its sensing with a hands-free iPhone visor could open up some interesting ideas.
iOS 11 will arrive this fall, with more details most likely being announced at Apple’s next iPhone event.