Discovered by 9to5Google in a recent deep dive of APK Insights was a feature for the Nest Hub Max called ‘Look and Talk’. Codenamed “Blue Steel” in reference to the Ben Stiller movie “Zoolander,” the feature was just officially unveiled at Google I/O 2022, and it’s a small part of a much larger initiative in which Google hopes to make its Assistant feel much more natural and comfortable to communicate with.
Look and Talk for the Nest Hub Max
The goal is to easily start a conversation with the Google Assistant on any device and speak and be understood naturally. Our voices are becoming the fastest way to interact with our computers and get questions answered, but so far the experience isn’t perfect. For example, having to say “Hey Google” every time you want to ask your Nest Hub or phone something isn’t natural at all. “Look and Talk” tries to solve that by letting you activate your Nest Hub Max simply by looking at it from up to 1.5 meters away before you speak.
You don’t have to worry about the right way to ask anymore, just relax and talk naturally
Sissie Hsiao, VP/GM, Google Assistant at Google
To accomplish this, the Assistant uses six machine learning models to process more than 100 signals from both the Hub’s camera and microphone in real time: your head orientation, your proximity to the device, the direction of your gaze, how your lips move, and other contextual cues needed to accurately process your request.
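Google hasn’t published the internals, but conceptually this is a signal-fusion problem: several per-signal models each produce a score, and those scores get combined into one activation decision. Here’s a minimal, purely illustrative Python sketch of the idea; every field name, weight, and threshold below is an assumption for the example, not Google’s actual pipeline:

```python
from dataclasses import dataclass

# Hypothetical per-signal scores in [0, 1], as produced by separate
# on-device models (gaze, head pose, proximity, lip movement, ...).
@dataclass
class Signals:
    gaze_on_device: float   # is the user looking at the screen?
    head_facing: float      # is the head oriented toward the device?
    proximity_ok: float     # within roughly 1.5 m of the device?
    lips_moving: float      # does lip motion suggest speech intent?

def should_activate(s: Signals, threshold: float = 0.8) -> bool:
    """Fuse individual signal scores into one activation decision.
    A weighted average is just one plausible fusion strategy; a real
    system would likely run a learned classifier over all ~100 signals."""
    weights = {
        "gaze_on_device": 0.35,
        "head_facing": 0.25,
        "proximity_ok": 0.2,
        "lips_moving": 0.2,
    }
    score = sum(weights[k] * getattr(s, k) for k in weights)
    return score >= threshold

# Example: user close by, looking at the screen, starting to speak.
print(should_activate(Signals(0.95, 0.9, 1.0, 0.85)))  # True
```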
Let me start by saying that this is absolutely wild. We all knew that the need for the hotword would one day fade into the background of smart home use, but until now we weren’t sure exactly how that would happen, and we had concerns about whether Google could do it in a way that is both efficient and respectful of the user’s privacy.
As for Look and Talk, it’s designed with privacy in mind: all data is processed on the device and is never sent to Google or anyone else. It also uses facial recognition and voice recognition at the same time, so it only activates if it recognizes that it’s really you looking at the device and making the request, which is super smart.
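In other words, both checks have to agree before the Assistant responds. A tiny hypothetical sketch of that dual gate (the function and thresholds are made up for illustration; Google’s on-device matching is obviously far more involved):

```python
def is_authorized(face_match_score: float, voice_match_score: float,
                  face_threshold: float = 0.9,
                  voice_threshold: float = 0.9) -> bool:
    """Require BOTH an enrolled-face match and an enrolled-voice match.
    Using an AND gate means a photo of you, or a recording of your
    voice alone, should not be enough to trigger the device."""
    return (face_match_score >= face_threshold
            and voice_match_score >= voice_threshold)
```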
Assistant handles pauses and filler words in voice requests
One of the biggest complaints I have when trying to teach my household how to properly use the Google Assistant is that they often pause halfway through their requests and the Assistant thinks they’re done talking. It processes the partial command, returns results that don’t fulfill the request, and leaves them frustrated. In reality, they weren’t quite done talking; they were just searching for the right word to use. Because of this, many millions of users are apprehensive about using the Assistant, feeling they need to perfect their thoughts before speaking. This is both inconvenient and annoying.
To fix this, Google is building more extensive neural networks that run on its Tensor chip, making for a more polite and patient AI helper. Rather than assume the user has finished speaking just because there is a bit of silence, it will process what has been said and check whether it makes sense as a full sentence before shutting down the microphone. If it’s not quite sure the user has conveyed a complete thought or request (read: if they use awkward pauses or filler words like “ummm” while thinking out loud), it will gently encourage them to finish by saying “mm-hm”.
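For the curious, this kind of logic is usually called “endpointing.” Here’s a minimal, speculative Python sketch of the idea; the is_complete_utterance() heuristic below is a crude stand-in I invented for the semantic model Google describes:

```python
FILLERS = {"umm", "ummm", "uh", "er", "hmm"}

def is_complete_utterance(text: str) -> bool:
    """Placeholder for a semantic completeness model. Here we use a
    crude heuristic: the utterance is 'incomplete' if it ends with a
    filler word. A real system would score the whole parse."""
    words = text.lower().split()
    return bool(words) and words[-1] not in FILLERS

def endpoint(transcript_so_far: str, silence_seconds: float) -> str:
    """Decide what to do after a pause, instead of always closing the mic."""
    if silence_seconds < 0.5:
        return "keep_listening"        # too early to judge
    if is_complete_utterance(transcript_so_far):
        return "close_mic_and_answer"  # sounds like a full request
    return "backchannel_mm_hm"         # nudge the user to finish

print(endpoint("play the new song from ummm", 1.2))  # backchannel_mm_hm
print(endpoint("set a timer for ten minutes", 1.2))  # close_mic_and_answer
```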
In addition, it seems that the Google Assistant is also getting better at completing the user’s thoughts on their behalf. So polite, right? When Sissie Hsiao demonstrated this on stage, she used the word “something” in place of the title, and the Assistant still understood how to complete the song title and automatically played it on Spotify!
Basically, this is meant in the sense of “yes? and what else? Please keep going…” Admittedly, I was over the moon when I saw this demonstrated live at Google I/O 2022, and it may be the one update that solves most of my smart home frustrations.
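Under the hood, filling in a vague word like “something” is essentially fuzzy matching against a catalog. A toy Python illustration of the concept; the catalog entries and scoring here are entirely invented, not how Google actually resolves songs:

```python
import difflib

# Entirely made-up catalog entries for illustration.
CATALOG = [
    "new song by example artist",
    "greatest hits by another artist",
    "workout mix by some band",
]

def complete_request(partial: str) -> str:
    """Drop the vague placeholder and fuzzy-match what remains of the
    request against known titles."""
    cleaned = partial.lower().replace("something", "").strip()
    matches = difflib.get_close_matches(cleaned, CATALOG, n=1, cutoff=0.3)
    return matches[0] if matches else "no match"

print(complete_request("play the new song by example something"))
# -> "new song by example artist"
```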
Quick Phrases and Real Tone Applications
Quick Phrases – Google’s first step toward eliminating the need for the “Okay Google” hotword for specific tasks like setting timers, switching smart bulbs, and more – is expanding to the Nest Hub Max! Finally, Real Tone, an effort to improve Google’s camera and imaging products so that users of different skin tones are displayed properly, will work with Look and Talk on the Nest Hub Max. More work on this front arrived today with the launch of the Monk Skin Tone Scale (really, it’s just been made open source thanks to its creator, Harvard professor Dr. Ellis Monk). Google AI’s skin tone research aims to improve the evaluation of skin tone in machine learning and provides a set of recommended practices for use in ML fairness.
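To make that concrete: the Monk Skin Tone (MST) Scale is a 10-point scale, and a common fairness practice is to break a model’s accuracy down per tone instead of reporting one aggregate number. A small Python sketch of that kind of disaggregated evaluation; the records below are invented purely for illustration:

```python
from collections import defaultdict

# Invented example records: (MST tone 1-10, was the prediction correct?)
results = [(1, True), (1, True), (5, True), (5, False),
           (9, False), (9, True), (10, False), (10, False)]

# Disaggregate accuracy by Monk Skin Tone bucket so that gaps
# between tones become visible instead of averaging out.
by_tone = defaultdict(list)
for tone, correct in results:
    by_tone[tone].append(correct)

for tone in sorted(by_tone):
    outcomes = by_tone[tone]
    accuracy = sum(outcomes) / len(outcomes)
    print(f"MST tone {tone:2d}: accuracy {accuracy:.0%} (n={len(outcomes)})")
```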
If you’d like to watch the full Google I/O 2022 keynote, you can do so below. We also have plenty of coverage of everything announced and revealed yesterday, so be sure to take a look! Let me know in the comments which of these Google Assistant features you’re most excited about and whether you think Google is making history or playing with fire on AI and machine learning.