We installed models of optical and touch sensation into our VCs. This
enables the VCs to react to users' actions in varied and appropriate
ways. For example, they express attention by looking back when they
are hit from behind, they express happiness when they are stroked
gently, and they step away after a strong hit.
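As a rough illustration of how such a touch-sensation model could map stimuli to reactions, here is a minimal sketch; the function name, thresholds, and reaction labels are our own illustrative assumptions, not the actual implementation.

```python
# Hypothetical sketch of a touch-sensation model: map a touch event
# (location, intensity, whether it is a stroke) to a reaction.
# All names and thresholds here are illustrative assumptions.

def react(location: str, intensity: float, stroking: bool) -> str:
    """Choose a reaction for a touch event on the character."""
    if stroking and intensity < 0.3:
        return "express happiness"   # gentle stroke
    if intensity >= 0.7:
        return "step away"           # strong hit
    if location == "back":
        return "look back"           # attend to a hit from behind
    return "idle"                    # ignore weak, unremarkable touches

print(react("back", 0.5, False))    # a moderate hit from behind
```

In practice the reaction would also depend on the character's current state and the optical sensor input, but the core idea is a mapping from sensed stimulus to expressive behavior.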
More details can be found here. (Via Development Memo for Ourselves.)