As an intern at the Chiba Institute of Technology, he was tasked with coupling ARToolKit with OpenHaptics (a toolkit for adding haptic feedback to 3D renderings), and seized the chance to add a physics engine (Newton Game Dynamics) and OpenGL to the mix. This enabled him to produce some fancy renderings, like the cool mirror reflection of virtual objects below.
In his own words: "Subsequently, I momentarily wandered from haptics in order to solve problems I had already noticed during my previous augmented reality projects. Although OpenGL's depth buffer does a great job, it is still not straightforward to accurately estimate an augmented object's location in the real environment (especially its position in depth). That is why I worked on object textures, shadows and reflections, which, consciously or not, help the user accurately analyze a 3D configuration."
Two weeks ago he published his results as a series of clips on YouTube; the clip above is only one of six. You can see them all here, and read his written report, which contains much more detail, over here.