Google engineers raise the flag on AR depth, seek builders
A virtual cat with occlusion off and with occlusion on. Credit: Google

So, a smartphone camera is just good for taking snapshots? Don't bring such talk to Google's augmented reality engineers. Blending the real and the virtual, and winning special props for doing it convincingly, is what Google has in mind as a motivator for developers wanting to go deep into AR.

"We're used to AR at our fingertips, giving us useful and fun experiences. Your content, though, often looks like it's pasted on the screen rather than in the world." That is Konstantine Tsotsos, software engineer, Google, talking in a video. He tantalizingly asks (as Google's team did), "what if ARCore could supercharge your camera? Bring the world into your phone?" So, how? By giving you color. And depth.

Google continues to turn heads with its ARCore platform, and the latest advancement is a Depth feature. ARCore enables you to build augmented reality experiences. "Using different APIs, ARCore enables your phone to sense its environment, understand the world and interact with information."

The video in which Tsotsos appeared introduced the ARCore Depth API. It turned out to be just as much a call for collaboration as a platform feature announcement.

The engineers believe this Depth advance is just scratching the surface of what is possible with the new feature. "We need developers like you to help us build the future." Tsotsos said those with ideas they would like to build using the Depth API could fill out a form.

He said they were looking for developers in 2020 to help build out a new wave of AR experiences, and that they looked forward to seeing what developers build with the Depth API. Collaborators could help them "explore and test the Depth API in real world apps at scale, with a plan to make the feature broadly available."

Nicole Lee in Engadget said that Google was incorporating a "Depth API" that would introduce occlusion, 3-D understanding, and a new level of realism. Lee was at Google's San Francisco office, where she visited developers using the new ARCore Depth API to create a depth map with a regular smartphone camera. She said a "depth understanding of the world" allowed developers to play with "real-world physics, surface interaction and more."

The Depth API is a new offering in the platform that lets you create a depth map via depth-from-motion algorithms and a single camera. When used, ARCore does two things: (1) tracks the position of the mobile device as it moves, and (2) builds its own understanding of the real world.
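In practice, opting in amounts to a small configuration step. Here is a minimal Kotlin sketch, assuming the standard Session and Config classes from the com.google.ar.core SDK; the Android activity scaffolding around it is omitted.

```kotlin
import com.google.ar.core.Config
import com.google.ar.core.Session

// Minimal sketch: opt an ARCore session into depth-from-motion support.
// Assumes `session` was created with a valid Android context elsewhere.
fun enableDepthIfSupported(session: Session) {
    val config = Config(session)
    // Not every ARCore-capable phone supports the Depth API, so check first.
    if (session.isDepthModeSupported(Config.DepthMode.AUTOMATIC)) {
        config.depthMode = Config.DepthMode.AUTOMATIC
    }
    session.configure(config)
}
```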

A demo experience we created where you have to dodge and throw food at a robot chef. Credit: Google

Shahram Izadi, Director of Research and Engineering, writing in the Google Developers blog, explained what is going on to give developers their realistic results:

"The ARCore Depth API allows developers to use our depth-from-motion algorithms to create a depth map using a single RGB camera. The depth map is created by taking multiple images from different angles and comparing them as you move your phone to estimate the distance to every pixel."

An update to Google's ARCore technology can sense depth in a room and hide virtual objects behind real ones using just a single phone camera, said CNET, "and the early results look promising."

Expect a pixel-by-pixel representation of the distance to physical surfaces in the camera's view.
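On Android, that depth map arrives as a DEPTH16 image whose samples are measured in millimeters. A per-pixel lookup, adapted in spirit from Google's developer documentation (the function name here is illustrative), looks roughly like this:

```kotlin
import android.media.Image
import java.nio.ByteOrder

// Sketch: read the estimated distance (in millimeters) at pixel (x, y)
// from a DEPTH16 image such as one returned by Frame.acquireDepthImage().
fun depthMillimetersAt(depthImage: Image, x: Int, y: Int): Int {
    val plane = depthImage.planes[0]
    val byteIndex = x * plane.pixelStride + y * plane.rowStride
    val buffer = plane.buffer.order(ByteOrder.nativeOrder())
    // Each sample is a 16-bit value; mask to treat it as unsigned.
    return buffer.getShort(byteIndex).toInt() and 0xFFFF
}
```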

Occlusion is the capability presently being showcased among Google's "immersive" AR opportunities for developers. "We will begin making occlusion available in Scene Viewer," said Izadi, "the tool that powers AR in Search, to an initial set of over 200 million ARCore-enabled Android devices today [the blog was posted on Dec. 9]."

Over 200 million Android devices out there were said to be ARCore-compatible. "With just a single moving camera," Tsotsos said in the video, "we can give you a 3-D understanding of the world in over 200 million Android phones." He said they can start occluding objects (like the animal lurking in the bushes on his phone display screen).
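Conceptually, occlusion is a per-pixel depth test: if the real surface the camera sees at a pixel is closer than the virtual object there, the virtual pixel is hidden. In production this comparison runs on the GPU against the depth map, but a minimal CPU-side sketch of the decision (names are illustrative) would be:

```kotlin
// Conceptual sketch of per-pixel occlusion: a virtual fragment is drawn
// only if no real-world surface sits in front of it at that pixel.
fun isVirtualPixelVisible(
    realDepthMm: Int,     // distance to the real surface at this pixel
    virtualDepthMm: Int   // distance to the virtual object at this pixel
): Boolean {
    // A depth of 0 typically means "no estimate"; draw the virtual pixel.
    if (realDepthMm == 0) return true
    return virtualDepthMm <= realDepthMm
}
```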

David Kim, another Google software engineer, took it from there. He said that when a camera understands 3-D space, virtual content has some new options: it can collide with the world, or even stick to it, instead of just bouncing around.

So, characters move around in your space naturally. Then there is the other plus of realistic-looking particles in your scene: snow piling up, rain splashing. Now, said Kim, if your phone has an active depth sensor, all of these depth effects can get even better.
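Collision and "sticking" effects follow from turning depth pixels back into 3-D points: given the camera's intrinsics (focal lengths and principal point, as exposed by ARCore's CameraIntrinsics), each depth sample unprojects to a point in camera space, giving particles and characters real geometry to land on. A sketch using the standard pinhole model, with illustrative names:

```kotlin
// Sketch: unproject a depth-map pixel into a 3-D point in camera space
// using the pinhole model. fx, fy, cx, cy come from camera intrinsics;
// depth is in meters here.
data class Point3(val x: Float, val y: Float, val z: Float)

fun unprojectPixel(
    u: Int, v: Int, depthMeters: Float,
    fx: Float, fy: Float, cx: Float, cy: Float
): Point3 {
    val x = (u - cx) * depthMeters / fx
    val y = (v - cy) * depthMeters / fy
    return Point3(x, y, depthMeters)
}
```

A particle system can then test each particle against these points, which is how effects like snow settling on real furniture become possible.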

Izadi repeated the recruiter invite: "If you are interested in trying the new Depth API, please fill out our call for collaborators form."



© 2019 Science X Network

Citation: Google engineers raise the flag on AR depth, seek builders (2019, December 12) retrieved 12 December 2019 from https://techxplore.com/news/2019-12-google-flag-ar-depth-builders.html

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without the written permission. The content is provided for information purposes only.