Cyberdyne details how Nest Hub Max processes Look and Talk
If you’ve got a Nest Hub Max, you can use a feature called Look and Talk. I’ve been using it in conjunction with Home/Away Assist to turn Nest Aware on and off, letting me use the device as both a home spy and a smarter version of a Google Home.
But you might want to know how Look and Talk is triggered, and the Google AI Blog has posted a detailed breakdown of how the Nest Hub Max gauges your interest, determines that you actually want to ask it a question, and figures out whether or not you’re Sarah Connor.

One of the more interesting details I found is that you can look at the device and trigger everything, yet Voice Match and lip movement detection can both fail and the query still gets fulfilled. This is pretty much laid out in the query fulfillment GIF above.
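To make that concrete, here’s a minimal sketch of how that kind of gating might work: hard visual gates up front, then a soft fused score where no single signal is mandatory. All of the names, weights, and thresholds below are hypothetical illustrations, not Google’s actual pipeline; the real details are in the linked blog post.

```python
from dataclasses import dataclass


@dataclass
class Signals:
    gaze_at_device: float  # 0..1 confidence the user is looking at the device
    face_match: float      # 0..1 confidence this is an enrolled face
    voice_match: float     # 0..1 confidence the voice matches the enrolled user
    lip_movement: float    # 0..1 confidence visible lip motion matches the audio
    intent: float          # 0..1 confidence the utterance is aimed at the device


def should_fulfill(s: Signals) -> bool:
    # Phase one: the "look" gate. Gaze and Face Match act as hard requirements.
    if s.gaze_at_device < 0.5 or s.face_match < 0.5:
        return False
    # Phase two: soft fusion. Voice Match and lip movement only contribute to
    # a combined score, so either can fail outright and the query can still
    # be fulfilled -- matching the behavior described in the post.
    score = 0.5 * s.intent + 0.3 * s.voice_match + 0.2 * s.lip_movement
    return score >= 0.4


# Covered mouth: lip movement scores zero, but intent and voice carry it.
print(should_fulfill(Signals(0.9, 0.95, 0.8, 0.0, 0.9)))  # True
```

The design point the sketch illustrates is redundancy: because the second phase fuses several weak signals instead of requiring each one, blocking any single channel (like hiding your lips) doesn’t break the feature.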

So I decided to test the behavior shown in the animation above and determined that lip movement is absolutely not a requirement. You can talk like a ventriloquist, cover your mouth, etc., and it will still work. And once you’ve triggered the look portion, you can change your head orientation and it will still pick you up.
It’s an interesting under-the-hood look at how your little wiretap processes video and audio data, and a decent read.
[Google AI blog]