Google Reveals Details Of Mobile Goggles, What’s Nearby?
Real-time search ruled the roost at Google’s search event at the Computer History Museum in Mountain View, California
Google’s mobile search team unveiled a few key technologies that risk being drowned out by the noise around the company’s more momentous real-time search announcement on 7 December.
Google unveiled real-time search at an event at the Computer History Museum in Mountain View, California. However, Vic Gundotra, the vice president of engineering who has been spearheading Google’s moves in the green field that is the mobile Web, warmed up the crowd with three key mobile services.
They are: Google Goggles, a visual search application for smartphones; What’s Nearby, a location-based service; and Google search by voice in Japanese.
First, the Goggles app, which eWEEK detailed here on 4 December based on CNBC reporter Maria Bartiromo’s scoop. Google Goggles is a mobile application that lets users take a picture of an object or place with their smartphone and trigger a Google search that pulls up information associated with the image.
Essentially, the image a user snaps with the phone’s camera becomes a query: it is sent to Google’s cloud computing datacenters and processed with computer vision algorithms. Google’s software compares the image’s signature against known items in its image recognition databases, works out how many matches exist and returns search results.
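Google has not published the internals of this matching step, but conceptually it resembles perceptual hashing plus a nearest-neighbour lookup. The sketch below is a toy illustration, not Google’s actual algorithm: it builds a 64-bit “average hash” signature with the Pillow imaging library and matches by Hamming distance, and the filenames and threshold are invented for the example.

```python
from PIL import Image

def signature(path):
    """Toy 64-bit 'average hash': shrink to an 8x8 greyscale
    thumbnail, set a bit for each pixel brighter than the mean."""
    img = Image.open(path).convert("L").resize((8, 8))
    pixels = list(img.getdata())
    avg = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > avg else 0)
    return bits

def distance(a, b):
    """Hamming distance between two 64-bit signatures."""
    return bin(a ^ b).count("1")

# Hypothetical reference database: signature -> label.
known = {signature("eiffel_tower.jpg"): "Eiffel Tower",
         signature("chateau_red_2005.jpg"): "Chateau X red, 2005"}

def recognise(query_path, threshold=10):
    sig = signature(query_path)
    best = min(known, key=lambda k: distance(sig, k))
    if distance(sig, best) <= threshold:
        return known[best]  # close enough to call a match
    return None             # no confident match
```

A production system would use far more robust features and an index scaled to billions of images, but the query, signature and match flow is the same.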
So a user vacationing in a foreign country who sees something they want to know more about can snap a photo with their smartphone’s camera. If Google Goggles recognises the image from the company’s image recognition database, the search app will surface relevant results about it.
This app, which basically lets users run search queries with images instead of words, can work for anything from a famous monument to a bottle of wine, Gundotra explained. Goggles identifies landmarks, works of art, and products, and is available today from Android Market for Android 1.6 devices and up.
On stage, Gundotra actually snapped a shot of a bottle of wine on his Motorola Droid smartphone and Google Goggles identified the product in seconds. However, Gundotra warned, this represents Google’s earliest efforts in the computer vision field; more enhancements are on the way.
“Today you have to frame a picture and snap a photo to get results, but in the future you will simply be able to point to it… as simple and easy as pointing your finger at an object and we’ll be able to treat it like a mouse pointer for the real world,” Gundotra said.
Google is also betting big on mobile Web services that leverage users’ locations to serve them relevant results. In today’s fresh release of Google Maps for mobile 3.3, Google has added What’s Nearby.
This feature, available as a Google Maps for mobile update from Android Market on phones running Google Android 1.6 and up, is useful for users in unfamiliar places. It works like this: users pull up a map of where they are, hold a finger down on the map for a few seconds, tap the bubble that appears, and select “What’s nearby?” from the menu.
Google will return a list of the 10 closest places, including restaurants, shops and other points of interest. Users can also access this feature from the My Location menu or from address search results. This is essentially another GPS killer app in the vein of Google Maps Navigation.
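The underlying calculation is simple to sketch: given the phone’s coordinates, compute the great-circle distance to each candidate place and keep the ten nearest. The Python below is a minimal illustration; the place names and coordinates are invented, and Google’s real service would of course query an indexed places database rather than a short list.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * asin(sqrt(a))

# Hypothetical points of interest: (name, lat, lon).
places = [("Cafe A", 37.4224, -122.0842),
          ("Bookshop B", 37.4300, -122.0900),
          ("Museum C", 37.4140, -122.0770)]

def whats_nearby(lat, lon, n=10):
    """Return the n places closest to the given position."""
    return sorted(places, key=lambda p: haversine_km(lat, lon, p[1], p[2]))[:n]

print(whats_nearby(37.4220, -122.0841))
```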
Eventually, Gundotra said, Google will begin showing local product inventory in search results, and Google Suggest will include location-specific search terms. To demonstrate, he used his Droid phone to combine his location with inventory feeds from nearby retailers: he searched for a Canon EOS camera and found it in stock at a Best Buy store 1.6 miles away.
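In code terms, that demo is a join between a product query and per-store stock feeds, filtered to stores that have the item and sorted by distance. A hedged sketch, with the store names, coordinates and stock figures all made up for illustration:

```python
from math import radians, sin, cos, asin, sqrt

def miles(lat1, lon1, lat2, lon2):
    """Great-circle distance in miles (Earth radius ~3,959 miles)."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 3959 * asin(sqrt(a))

# Hypothetical inventory feeds: store -> (lat, lon, {product: units}).
stores = {
    "Best Buy, Mountain View": (37.406, -122.078, {"Canon EOS": 4}),
    "Camera Shop, Palo Alto":  (37.444, -122.161, {"Canon EOS": 0}),
}

def find_in_stock(product, lat, lon):
    """Stores carrying the product, nearest first."""
    hits = [(miles(lat, lon, slat, slon), name)
            for name, (slat, slon, stock) in stores.items()
            if stock.get(product, 0) > 0]
    return sorted(hits)

print(find_in_stock("Canon EOS", 37.422, -122.084))
```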
Google also added Japanese to Google search by voice, which previously offered English and Mandarin. In 2010, Google will combine voice recognition with its language translation software to provide translation in conversation.
Gundotra also demoed this on stage, saying something in English, sending it to Google’s cloud and having the query recited back to him in Spanish.
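Conceptually, the demo chains three cloud services: speech recognition, machine translation and speech synthesis. The functions in this sketch are placeholders standing in for whatever Google runs server-side; none of them is a real, published API.

```python
def recognise_speech(audio: bytes, lang: str) -> str:
    """Placeholder: send audio to a speech-recognition service."""
    raise NotImplementedError

def translate(text: str, source: str, target: str) -> str:
    """Placeholder: send text to a translation service."""
    raise NotImplementedError

def synthesise(text: str, lang: str) -> bytes:
    """Placeholder: turn text back into spoken audio."""
    raise NotImplementedError

def speak_translated(audio: bytes, source="en", target="es") -> bytes:
    """English speech in, Spanish speech out, via three cloud hops."""
    text = recognise_speech(audio, source)
    translated = translate(text, source, target)
    return synthesise(translated, target)
```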