ARCore has three basic components. Motion tracking combines camera footage with the phone's internal sensors to estimate the device's position and orientation, so users can pin digital objects in one real-world spot and move around them. Environmental understanding uses the camera to detect flat surfaces, such as tabletops and floors, where objects can be placed. Light estimation measures the ambient lighting in a scene so that digital objects blend in with their real-world surroundings (by casting accurate shadows, for example).
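At its core, the environmental-understanding step amounts to finding flat surfaces in a cloud of 3D feature points. The snippet below is a toy illustration of that idea, a least-squares plane fit using NumPy, not ARCore's actual algorithm; the point cloud and all names are invented for illustration.

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane fit: returns (centroid, unit normal).

    The fitted plane passes through the centroid of the cloud;
    its normal is the direction of least variance, i.e. the
    smallest right-singular vector of the centered points.
    """
    centroid = points.mean(axis=0)
    centered = points - centroid
    # SVD of the centered cloud: the last row of vt is the
    # direction of least variance, which is the plane normal.
    _, _, vt = np.linalg.svd(centered)
    return centroid, vt[-1]

# Toy "feature points" scattered on a tabletop at height y = 0.7 m,
# with a little measurement noise on the height.
rng = np.random.default_rng(0)
pts = np.column_stack([
    rng.uniform(-0.5, 0.5, 200),       # x
    0.7 + rng.normal(0, 0.005, 200),   # y (height, slightly noisy)
    rng.uniform(-0.5, 0.5, 200),       # z
])
center, n = fit_plane(pts)
print(center, n)  # normal comes out close to (0, ±1, 0): a horizontal plane
```

A real tracker works on live feature points and must also reject points that belong to other objects, but the same geometric idea, fitting planes to clusters of points, underlies surface detection.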

Eventually, ARCore might be combined with a visual search tool like Google Lens to change the way we get information. Instead of Googling instructions for how to use an espresso machine, for example, you could take a picture of the machine, which the visual search feature would identify automatically. The AR feature would then overlay instructions on your view, so you could see exactly which levers to pull on the machine right in front of you, rather than consulting something like a labeled photograph or a video demonstration.