Microsoft Research, home to some very smart interactive development types, is widely credited with stealing the show at last week’s Siggraph 2011 conference with KinectFusion. Fusion is a system that takes live depth data from a moving depth camera and reconstructs high-quality 3D models in real time. The system lets the user scan a whole room and its contents within seconds. As the space is explored, new views of the scene and objects are revealed, and these are fused into a single 3D model.
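At the heart of this kind of fusion is a volumetric representation: each voxel accumulates a running weighted average of a truncated signed distance to the nearest observed surface, so noisy per-frame depth measurements converge toward a clean model. The sketch below illustrates that idea only; the grid size, truncation distance, function names and the way signed-distance samples arrive per frame are all assumptions for illustration, not details from the KinectFusion paper (which also performs camera tracking and projects each voxel into the depth image).

```python
import numpy as np

# Illustrative sketch of volumetric depth fusion in the spirit of
# KinectFusion. All names and constants here are assumptions, not
# the paper's actual implementation.

TRUNC = 0.1  # truncation distance in metres (assumed value)

def make_volume(n):
    """An n*n*n voxel grid of (tsdf, weight) pairs, initially empty."""
    return np.zeros((n, n, n)), np.zeros((n, n, n))

def integrate(tsdf, weight, sdf_sample, w_new=1.0):
    """Fuse one frame's signed-distance samples into the volume.

    sdf_sample: per-voxel signed distance to the surface observed in
    this frame (in the real system, obtained by projecting each voxel
    into the depth image). NaN marks voxels the camera did not see.
    """
    seen = ~np.isnan(sdf_sample)
    d = np.clip(sdf_sample[seen], -TRUNC, TRUNC)
    w = weight[seen]
    # Running weighted average: blend old evidence with the new frame.
    tsdf[seen] = (tsdf[seen] * w + d * w_new) / (w + w_new)
    weight[seen] = w + w_new
    return tsdf, weight
```

Each new frame simply nudges the stored distances toward its own measurements, which is why the model sharpens as the camera keeps moving: two frames reporting 0.08 m and 0.02 m for the same voxel average out to 0.05 m.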
The video, although not of especially high resolution, shows the potential for real-time 3D reconstruction of a physical space. While it’s obviously exciting for the games development market, other applications such as architectural monitoring and telepresence also spring to mind. As processors get more powerful and depth cameras increase in resolution, this kind of technology will find its way into many more uses.