The Moto X lets you search Google without touching the device. It uses “always-on” listening, which makes the phone more useful when you’re busy with other things (like driving).
For now, it seems that detection while the display is off would require the device to be charging, though detection within apps or on the home screen would work at any time…

Essentially, Google wants to build the kind of functionality seen in the Moto X into its own Search app. This means a special focus on Search at times when you can’t or shouldn’t look at your device for extended periods, or when you can’t type to interact with Search. The main objectives appear to be as follows: let users activate Search with minimal effort from anywhere, provide an eyes-free interface for times when users shouldn’t be looking at their device, and return results that don’t require users to look at their device.
So, what about the times when your query returns only web results, and not a spoken or immediately visible answer? Google has thought of that. First, for results that already include voice feedback, Google will speak more detailed answers in the car. Instead of simply saying “here’s the weather in [location],” it will read out the entire card. For results that include only web links, Google is exploring options for “keeping” the results for later, or prompting the user to exit eyes-free mode and view them once it’s safe to do so. This is an ongoing exploration, and Google is apparently still working out how to reconcile the sparse eyes-free interface with queries, such as navigation, that would require the screen to return to the full interface.
Google is already turning to web results for Knowledge Graph-like “answers” to many queries (sometimes to humorous effect), and this should fit in nicely with the always-on search functionality.
Google recently added “Ok, Google” functionality to Chrome for U.S. desktop users.
Image via YouTube