Google’s New Ad Technology Perfect For Those HUD Glasses We Heard About

    March 26, 2012
    Drew Bowling

Google has been awarded a new patent that seems to suggest that the company will be rolling out a new advertising strategy wherein the ads that are delivered to you are based on your immediate surroundings. And no, that’s not with respect to your online surroundings, like sites you visit – this would use your literal, physical environs.

According to the language in the document, the patent covers technology able to detect elements of your environment, such as background noise or even light sources. As you can imagine, speculation has begun as to why Google would want this or what they would possibly use it for. One example cited by Google in the patent document supposes that “advertisements for air conditioners can be sent to users located at regions having temperatures above a first threshold, while advertisements for winter overcoats can be sent to users located at regions having temperatures below a second threshold.”

Another example cited in the document suggests that the technology would give advertisers an immense amount of control over who their ads are sent to. For example, “a seller of noise-canceling headphones may specify that an ad for noise-canceling headphones be served to a user located in an environment where the ambient noise is above a preset level (e.g., 70 dB). The advertiser may specify that the ambient noise level be above the preset level for more than a preset period of time (e.g., noisy levels detected for at least one hour per day for at least two consecutive days).”
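The headphone example boils down to a threshold rule evaluated over time: noise above a level, for long enough, on enough consecutive days. Here's a minimal sketch of how such a rule might be checked, in Python – every name and number here is illustrative, taken from the patent's hypothetical rather than from any actual Google implementation:

```python
from collections import defaultdict

# Illustrative thresholds from the patent's headphone example.
NOISE_THRESHOLD_DB = 70
MIN_NOISY_SECONDS_PER_DAY = 60 * 60   # at least one hour per day
MIN_CONSECUTIVE_DAYS = 2

def should_serve_headphone_ad(samples):
    """samples: list of (timestamp, decibels) pairs, one reading per second.

    Returns True when ambient noise exceeded the threshold for at least
    an hour a day on at least two consecutive days.
    """
    # Count noisy seconds per calendar day.
    noisy_seconds = defaultdict(int)
    for ts, db in samples:
        if db > NOISE_THRESHOLD_DB:
            noisy_seconds[ts.date()] += 1

    # Days that individually satisfy the one-hour requirement.
    qualifying_days = sorted(
        day for day, secs in noisy_seconds.items()
        if secs >= MIN_NOISY_SECONDS_PER_DAY
    )

    # Look for a run of consecutive qualifying days.
    streak = 1
    for prev, cur in zip(qualifying_days, qualifying_days[1:]):
        streak = streak + 1 if (cur - prev).days == 1 else 1
        if streak >= MIN_CONSECUTIVE_DAYS:
            return True
    return False
```

The point of the sketch is just that the advertiser's criteria, as described, are ordinary aggregate conditions over sensor logs – nothing about them requires exotic hardware beyond the microphone doing the sampling.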

As you would expect, people are already decrying the new technology as creepy and Orwellian. You may not know this, but Google has a loosely agreed-upon deal (and by “loosely agreed-upon” I mean “completely made up by me just now”) with the general public that whenever they announce some kind of new technology, it’s the duty of the public to sling words like “creepy” and “Orwellian” at Google. Good job, everybody.

I digress. Unless Android devices are currently hiding some sophisticated, dormant features that Google built in anticipation of only later having the technology to use them, these fears seem a little knee-jerky. Given that the main function (and, seemingly, the success) of environment-based ad deployment would rely on detecting details about your surroundings, this is assuredly meant for something they’ve got in the oven or plan to mix up sometime in the future.

Hmm. Has Google recently been developing something that uses a person’s immediate environment in order to Google-ize their surroundings? Something that might even be capable of scanning an environment, listening for cues, or even having an accelerometer that can detect speed of motion?

Oh yeah! Those neatorific Google HUD glasses that were heavily floating around in semi-rumor form last month.

This environmentally-cued ad technology sounds extremely well-suited for something like the company’s head-up display glasses – much more so than for a smartphone or tablet – since it would require input (such as temperature, humidity, sound, light, air composition, location, and speed of movement) from a device that’s already attuned to its surroundings. You would probably need new hardware for some of that data collection to happen – the type of hardware that may be included in Google’s HUD glasses.

The HUD glasses will presumably be constantly assessing and relaying information about the surroundings – that’s kind of the point, right, to augment your reality via the display? – and would seem ideal for displaying environmentally-associated advertisements, much like a digital billboard or digital kiosk.

Search Engine Watch previously speculated on how advertisements might be deployed via Google’s HUD glasses, suggesting that the device would enable advertisers to “pay an additional bid amount for customers in, say, a three block radius” or even include some kind of real-time bidding. Equipping the HUD glasses with sensors capable of assembling a profile of a user based on the cues of their environment would certainly lend itself to such an advertising strategy.

In the end, though, all of this speculation could fall apart more easily than a bamboo house during monsoon season, and Google may have completely different agendas for the technology, if they even decide to use it at all. Who knows what pots are being stirred down in the Google X labs. Anybody else got any other scenarios they’d like to share? Then share ’em in the comments.