Google Assistant will soon be activated with a glance

Google hopes its virtual assistant will activate automatically when it detects the user's gaze

Photo: Sean Gallup/Getty Images

Google is working on a new technology that will allow users to activate the virtual assistant built into Android phones with just a glance.

Development of this feature dates back to 2020, when Google announced that its engineers were working on an application called “Look to Speak”. The tool, recently mentioned at the company’s latest event, allows the machine to analyze audio, video and text simultaneously.

In this way, Google wants the assistant’s behavior to be as close as possible to that of a human being, arguing that in normal conversation people do not repeat each other’s names over and over.

“In natural conversations, we don’t say people’s names every time we address them,” they said.

Currently, the only Google device that incorporates this type of technology is the Google Nest Hub. However, certain requirements must be met before it activates.

These include the user being within a minimum distance of the device and having their face oriented toward it. When all of these requirements are met, the system understands that the user wants to interact with it and activates automatically.

The Google Nest Hub analyzes all of this continuously, so the device can anticipate the user’s intentions and activate seconds before the interaction begins.

This helps avoid erroneous activations, which matters because these devices usually record user interactions for later analysis to improve their ability to understand what the person wants.

The Google Nest Hub can also analyze users’ voice commands to determine whether they are queries. To do this, it examines what is known as non-lexical information: tone of voice, speaking speed and certain contextual signals.
