Tech lead Hartmut Neven:
Imagine you are traveling in Paris and you visit a museum. If a picture catches your attention, you can simply take a photo and send it to the VMS [Visual Mobile Search] service. Within seconds you will receive an audio-visual narrative explaining the image to you. If you happen to be connected to a 3G network, the response time would be below a second. After the museum visit you might step outside and see a coffeehouse. Taking another snapshot from within the VMS client application is all you have to do in order to retrieve travel guide information. In this case, location information available through triangulation or built-in GPS can assist the recognition process. Inside the coffeehouse you study the menu, but your French happens to be a bit rusty. Your image-based search engine supports you in translating words from the menu so that you have at least an idea of what you can order. (source)
At the moment, the visual mobile search application, internally known as Google Goggles, is undergoing a long battery of tests:
Back in California, the visual search team anxiously watched by video link as first time users tested the product. After some initial reviews were less than enthusiastic, Google engineers decided the new technology just wasn’t ready for prime time. So team members were dispatched to fix any remaining problems. (source)
So although it is not an immediate threat to the current leader, SnapTell, we can be sure that Google will not rest until it creates a user-friendly product that uses your photos to serve useful information and, naturally, more ads. In the meantime, if Google is looking for enthusiastic beta testers, my email is on the right :)
Read more at eWeek.com and CNBC. Via Steve Rubel.