Google Glass Gets Bone Conduction Audio
Google’s Glass wearable computing device will transmit sounds through bone conduction, rather than speakers, according to a patent filing
Another intriguing Google Glass detail has been revealed: the wearable, eyeglasses-mounted computer will apparently transmit sound to its user via vibrations through human bones, rather than using traditional speakers.
Google filed a US patent application on 24 January for its “wearable computing device with indirect bone-conduction speaker”, which describes a system that delivers audio through the user’s bones rather than through typical earbuds or speakers.
Bone conduction
Bone-conduction systems are already on the market in some headsets and other products, but none of Google’s earlier filings about Glass had mentioned the technology.
Google also filed additional paperwork with the US Federal Communications Commission (FCC) describing the feature for its Google Glass project, which the company hopes to bring to market in 2014, according to a report by Ars Technica.
“The headset, likely the Explorer edition promised to developers at Google I/O last year, includes an 802.11 b/g 2.4GHz WLAN, a low-energy Bluetooth 4.0 radio, and – if one sentence and a corresponding patent are to be believed – a ‘vibrating element’ for transmitting sound to the user’s head via bone conduction,” according to the Ars Technica report.
“Exemplary wearable computing systems may include a head-mounted display that is configured to provide indirect bone-conduction audio,” Google wrote in its patent application. “For example, an exemplary head-mounted display may include at least one vibration transducer that is configured to vibrate at least a portion of the head-mounted display based on the audio signal. The vibration transducer is configured such that when the head-mounted display is worn, the vibration transducer vibrates the head-mounted display without directly vibrating a wearer.”
Instead, “the head-mounted display structure vibrationally couples to a bone structure of the wearer, such that vibrations from the vibration transducer may be indirectly transferred to the wearer’s bone structure,” the patent application states.
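Purely as an illustration of the audio path the patent describes, the toy Python sketch below models that indirect coupling: the transducer vibrates the frame, and the frame, not the transducer, passes the vibrations on to the wearer’s bone structure. The class names and coupling factor are invented for this sketch; it is not code from Google or the filing.

```python
# Toy model of the indirect bone-conduction path described in the patent.
# All classes, names and the coupling factor are invented for illustration.

class VibrationTransducer:
    """Turns audio samples into vibration of the head-mounted frame."""
    def drive(self, samples):
        return list(samples)  # frame vibration follows the audio signal

class HeadMountedFrame:
    """Couples frame vibration to the wearer's bone structure indirectly."""
    COUPLING_LOSS = 0.8  # arbitrary: some energy is lost at the contact point

    def __init__(self, transducer):
        self.transducer = transducer

    def play(self, samples):
        frame_vibration = self.transducer.drive(samples)
        # The wearer is never driven directly: frame -> contact point -> bone.
        return [v * self.COUPLING_LOSS for v in frame_vibration]

frame = HeadMountedFrame(VibrationTransducer())
print(frame.play([0.1, 0.2, -0.1]))  # vibration reaching the bone structure
```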
Testing
The Google Glass head-mounted wearable computer project was officially unveiled in June 2012 at the annual Google I/O conference. Developers who attended the event will have the opportunity to buy the first “Explorer Edition” units when they go on sale this year.
Google recently held two “hackathon” events, in New York City and San Francisco, as part of its “Glass Foundry” programme to gather early developer input on the devices, with an emphasis on development against the Google Mirror API.
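The Mirror API is a REST interface through which applications push “timeline” cards to the device. As a rough sketch of what such a call might look like, the Python snippet below posts a simple text card over HTTP; the endpoint, card fields and token shown here are assumptions for illustration, not details confirmed by the article or the hackathons.

```python
# Hypothetical sketch of inserting a timeline card via a Mirror-style REST API.
# Endpoint, payload fields and the OAuth token are assumptions for illustration.
import requests

ACCESS_TOKEN = "ya29.example-oauth2-token"  # would come from a standard OAuth 2.0 flow

card = {
    "text": "Hello from Glass Foundry",     # a simple text card for the timeline
}

response = requests.post(
    "https://www.googleapis.com/mirror/v1/timeline",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json=card,
    timeout=10,
)
response.raise_for_status()
print("Inserted timeline item:", response.json().get("id"))
```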
Attendees were given access to a Google Glass device for use and testing. The basic Google Glass hardware comprises an Android-powered display, a tiny camera, a GPS receiver and an Internet connection, all built into one side of a pair of glasses.
The glasses are lightweight and may or may not have lenses. According to Google’s patent application for Glass, the device uses a side-mounted touchpad that lets users control its various functions.
User interaction
The glasses will be able to display a wide range of views, depending on user needs and interests. One potential view is a real-time image on the see-through display on the glasses, the patent application states.
One description explains that the side-mounted touchpad could be either a physical or a virtual component, and that the glasses’ heads-up display could include indicator lights that grow brighter as the user’s finger nears the corresponding touchpad button.
On the heads-up display, the side-mounted touchpad buttons would be represented as a series of dots so that users can operate them by feel, the application states.
“The dots may be displayed in different colors. It should be understood that the symbols may appear in other shapes, such as squares, rectangles, diamonds or other symbols.”
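To make the “brighter as the finger approaches” behaviour concrete, one simple way to model it is to scale each dot’s brightness by the finger’s distance from the corresponding touchpad button. The Python sketch below is an invented illustration of that idea, not code or values from the application.

```python
# Invented illustration: each heads-up dot brightens as the user's finger
# nears its touchpad button. Positions are millimetres along the touchpad.

def dot_brightness(finger_pos, button_pos, reach=20.0):
    """Return a 0.0-1.0 brightness that rises as the finger nears the button.

    `reach` is the distance at which a dot begins to light up (arbitrary here).
    """
    distance = abs(finger_pos - button_pos)
    return max(0.0, 1.0 - distance / reach)

button_positions = [10.0, 25.0, 40.0]   # three virtual buttons on the touchpad
finger = 27.0                           # current finger position
print([round(dot_brightness(finger, b), 2) for b in button_positions])
# -> [0.15, 0.9, 0.35]: the dot nearest the finger is brightest
```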
Also described in the patent application are potential uses of a microphone, a camera, a keyboard and a touchpad, either one at a time or together.
The device could even work out and display what the user wants to see, according to the patent application: in the absence of an explicit instruction to display certain content, the system may intelligently and automatically determine content for the multimode input field that it believes the wearer wants.
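As a loose illustration of that idea, a decision rule for the multimode input field might prefer an explicit request when one exists and otherwise fall back on context signals. The signals and default below are invented for the sketch; the patent application does not spell out this logic.

```python
# Invented sketch of implicit content selection for a multimode input field:
# an explicit request wins; otherwise context signals decide what is shown.

def select_content(explicit_request=None, context=None):
    if explicit_request is not None:
        return explicit_request
    context = context or {}
    if context.get("incoming_message"):
        return f"Message: {context['incoming_message']}"
    if context.get("camera_active"):
        return "Live camera viewfinder"
    if context.get("navigating_to"):
        return f"Directions to {context['navigating_to']}"
    return "Clock"  # a neutral default when no signal is present

print(select_content(context={"navigating_to": "Grand Central"}))
# -> "Directions to Grand Central"
```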
An actual Google Glass device was spotted in public on 21 January, being worn by Google co-founder Sergey Brin on a New York subway train.
The sighting was posted by a hardware prototyper and hacker who recognised the device and spoke briefly to Brin about the project.
Originally published on eWeek.