philipturner / scene-color-reconstruction

Adding color to real-time scene reconstruction

Adding Color to 3D Scene Reconstruction for Augmented Reality Headset Experiences Using an iPhone

Philip Turner

Published: June 3, 2021

Abstract

In October 2020, Apple released the iPhone 12 Pro, which included a LiDAR scanner that enabled it to reconstruct the 3D shape of its surrounding scene. Coupling the Google Cardboard VR viewer with scene reconstruction allows the real world to be rendered in VR, replicating the experience of using an AR headset such as the Microsoft HoloLens. To enhance this AR headset experience, I created software that mapped color from the video stream to areas on the reconstructed scene mesh, and retained that color as the mesh expanded and changed shape while the LiDAR scanner gathered new data. Adding color to scene reconstruction made the experience more realistic, allowing the user to view the color of their surroundings in their peripheral vision and in areas occluded from the camera's view. The software runs on an iPhone, providing users with an accessible way to experience an AR headset.
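The core step described above, sampling per-vertex color from a camera frame, can be illustrated with a minimal pinhole-projection sketch. This is not ARHeadsetKit's actual implementation (which is written in Swift/Metal); the function name, camera convention (camera-space vertices, camera looking down +Z), and intrinsics layout are assumptions made for illustration:

```python
import numpy as np

def project_vertex_colors(vertices, intrinsics, image):
    """Sample a per-vertex color for each camera-space mesh vertex
    by projecting it into the video frame (illustrative sketch only).

    vertices   -- iterable of (x, y, z) points in camera space
    intrinsics -- (fx, fy, cx, cy) pinhole camera parameters
    image      -- H x W x 3 array holding one video frame
    """
    h, w, _ = image.shape
    fx, fy, cx, cy = intrinsics
    colors = np.zeros((len(vertices), 3), dtype=image.dtype)
    for i, (x, y, z) in enumerate(vertices):
        if z <= 0:
            continue  # behind the camera; leave the vertex uncolored
        # Pinhole projection: u = fx * x / z + cx,  v = fy * y / z + cy
        u = int(fx * x / z + cx)
        v = int(fy * y / z + cy)
        if 0 <= u < w and 0 <= v < h:
            colors[i] = image[v, u]  # sample the frame at the projection
    return colors
```

Vertices that fall outside the frame, or that are occluded from the camera, keep their previous color; retaining those samples as the mesh grows is what lets color persist in areas the camera can no longer see.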

Download the full text here.

Source code from this research is open-sourced as part of ARHeadsetKit.

About

License: Other