Input
Raymond Wong

'Look and Talk' lets you use Google Assistant on Nest Hub Max without 'Hey Google'

At Google I/O 2022, Google announced a litany of new features coming to its services, including updates to Maps, Docs, and Translate. As always, if you read between the lines, it all lays the groundwork for even more ambient computing, in which AI and machine learning help you understand the world around you like never before. It might even secretly be the groundwork for future AR or mixed reality.

One announcement stood out to me: Look and Talk — a way to interact with a Nest Hub Max smart display without needing to say “Hey Google” over and over. Instead, the camera-equipped Hub Max uses its built-in camera and proximity sensor to identify when a person is in front of it and to start listening for commands.

Google says the Hub Max can identify people using its Face and Voice Match technology. Google’s Sissie Hsiao says the Hub Max uses a number of identifiers for Look and Talk, including head orientation, proximity, gaze detection, lip movement, and contextual awareness.

In a live demo, Hsiao looked right at her Hub Max without ever uttering “Hey Google” and started talking to the Assistant. It sure looked impressive. Anyone who has shouted the wake word at their devices knows how frustrating it can be to repeat it all day, especially at night when you don’t want your neighbors hearing you.

To assist with Look and Talk, the Google Assistant has received an upgrade in understanding the filler words that you might spit out while asking a question or making a voice command. Now, the Assistant can understand your "umms" and other filler words you might stumble over, process them, and carry on.

No love for regular Nest Hubs?

Look and Talk seems to be exclusive to the Hub Max, a product released in 2019. Google hasn’t refreshed the hardware since.

Without cameras, the first- and second-generation Nest Hubs appear to be left out of the voice-free fun. I understand the first-gen Hub not getting Look and Talk, but I would have loved to see Google make use of the Soli radar sensor in the Hub 2 for proximity detection.
