Playing in an Augmented World from YDreams on Vimeo.
Usually I refrain from re-posting old news, but Antão from YDreams (the guy in the red shirt) contacted me with some technical info, so I'll make an exception for this one in order to encourage others to send me news and other AR tidbits. My address is right there on the right panel. Anyway, so says Antão:
The applications use face and blob detection. Although the virtual objects move on a 2D plane, they are 3D objects and react using 3D physics. The interaction can be extended to full 3D. The applications take advantage of modern multi-core CPUs (computer vision, behaviors, and physics run on separate cores).
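To make the blob-detection part concrete, here is a minimal sketch of what "blob detection" means: finding connected groups of foreground pixels in a thresholded camera frame. This is not YDreams' actual code (they use OpenCV's detectors plus their own additions); it's a toy connected-component labeler in plain Python, with the frame stubbed as a small 0/1 grid.

```python
from collections import deque

def find_blobs(image):
    """Label 4-connected foreground blobs in a binary image.

    `image` is a list of rows of 0/1 ints. Returns a list of blobs,
    each a list of (row, col) pixel coordinates.
    """
    rows, cols = len(image), len(image[0])
    seen = [[False] * cols for _ in range(rows)]
    blobs = []
    for r in range(rows):
        for c in range(cols):
            if image[r][c] and not seen[r][c]:
                # BFS flood fill from this seed pixel
                blob, frontier = [], deque([(r, c)])
                seen[r][c] = True
                while frontier:
                    y, x = frontier.popleft()
                    blob.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and image[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            frontier.append((ny, nx))
                blobs.append(blob)
    return blobs

# Two separate blobs in a tiny 4x5 "frame"
frame = [
    [1, 1, 0, 0, 0],
    [1, 0, 0, 1, 1],
    [0, 0, 0, 1, 0],
    [0, 0, 0, 0, 0],
]
print(len(find_blobs(frame)))  # → 2
```

Each detected blob (or face rectangle) can then be fed into the physics engine as a collider, which is how the person on screen ends up pushing the virtual 3D objects around.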
We are using a proprietary development platform, codenamed YVision. It is built on .NET 3.5 and uses open-source libraries like OpenCV, Ogre and ODE. The computer vision algorithms are the ones supplied by OpenCV, with a little magic added by us. ;-)
The applications were running on an Intel(R) Core(TM)2 Duo E7200 @ 2.53GHz with an NVIDIA GeForce 9400 GT (a regular off-the-shelf PC). Computer vision ran on one core, while behaviors and physics ran on the other.
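The two-core split described above is a classic producer/consumer pipeline: the vision thread hands its latest detections to the physics/behavior thread through a queue, so neither blocks the other. Here is a hedged structural sketch in Python (their real implementation is .NET 3.5, where threads map onto separate cores; the frame data and the "physics step" here are stand-ins I invented for illustration).

```python
import threading
import queue

detections = queue.Queue()  # hands vision results to the physics thread

def vision_loop(frames):
    # "Core 1": detect blobs in each frame (stubbed as precomputed lists)
    for frame_id, blobs in enumerate(frames):
        detections.put((frame_id, blobs))
    detections.put(None)  # sentinel: no more frames

def physics_loop(results):
    # "Core 2": step behaviors/physics against the incoming detections
    while True:
        item = detections.get()
        if item is None:
            break
        frame_id, blobs = item
        # A real step would apply forces to the 3D objects;
        # here we just record how many colliders each frame produced.
        results.append((frame_id, len(blobs)))

frames = [[(10, 12)], [(11, 12), (40, 7)], []]  # fake blob centroids
results = []
t1 = threading.Thread(target=vision_loop, args=(frames,))
t2 = threading.Thread(target=physics_loop, args=(results,))
t1.start(); t2.start()
t1.join(); t2.join()
print(results)  # → [(0, 1), (1, 2), (2, 0)]
```

Decoupling the two loops like this also lets each run at its own rate: physics can simulate at a steady timestep even when the camera or detector hiccups.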
They have submitted this video to the MIX09 ShowOff contest, so feel free to head over there and vote for them.