Google is improving the accessibility of Android. A new feature, first available in Android 12, lets you navigate the interface using facial expressions. Here's how to use it.
Android 12 inaugurates a feature that can be very useful for people with reduced mobility: controlling the interface with facial expressions. This tool lets you navigate the on-screen menus, select items, or go back with a raise of the eyebrows, a glance to the left, or a smile.
The functionality, which lives in the accessibility settings, debuts with Android 12, but should arrive on previous versions of the system in a later update. However, Google has not said exactly when, or to which versions, the feature will be backported. Pixel phone owners will therefore get to try it before everyone else, but the tool will become more widely available later.
How to activate facial detection?
To take advantage of this new feature, you must first enable it in your phone's settings. The steps to follow are shown below.
The steps to carry out to activate control via facial expressions.
Source: Numerama screenshot
If everything has been done properly, you should see a small face in a bubble at the top of the screen. This means that your phone has started detecting your movements.
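For the curious, it is also possible to check programmatically whether an accessibility service is enabled, since Android exposes the list of active services. The Kotlin sketch below assumes the feature ships as part of the Switch Access service, whose component id contains "switchaccess"; that id is an assumption and may vary by device and Android version.

```kotlin
import android.content.Context
import android.provider.Settings

// Minimal sketch: reads the system list of enabled accessibility services
// and looks for Switch Access. Assumption: its component id contains
// "switchaccess"; the exact id may vary by device and Android version.
fun isCameraSwitchesLikelyEnabled(context: Context): Boolean {
    val enabledServices = Settings.Secure.getString(
        context.contentResolver,
        Settings.Secure.ENABLED_ACCESSIBILITY_SERVICES
    ) ?: return false
    // The setting is a colon-separated list of ComponentName strings.
    return enabledServices.split(':')
        .any { it.contains("switchaccess", ignoreCase = true) }
}
```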
How to use facial detection?
Once the option is activated, you can start controlling your phone with your facial expressions; each expression is mapped to a navigation action, as sketched below.
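To make the idea concrete, here is a minimal Kotlin sketch of such an expression-to-action mapping, based only on the expressions mentioned in this article (open mouth, smile, raised eyebrows, glances left, right, and up). The names and default assignments are illustrative, not Google's actual implementation.

```kotlin
// Hypothetical model of the expression-to-action mapping described in the
// article. The enum values are illustrative, not Google's actual API.
enum class Expression { OPEN_MOUTH, SMILE, RAISE_EYEBROWS, LOOK_LEFT, LOOK_RIGHT, LOOK_UP }
enum class Action { NEXT_ITEM, SELECT, GO_BACK, CUSTOM }

// Plausible defaults based on the article: keeping the mouth open moves the
// selection, a smile selects, raised eyebrows go back. Glances stay
// unassigned until the user configures them.
val mapping: MutableMap<Expression, Action> = mutableMapOf(
    Expression.OPEN_MOUTH to Action.NEXT_ITEM,
    Expression.SMILE to Action.SELECT,
    Expression.RAISE_EYEBROWS to Action.GO_BACK,
)

fun handle(expression: Expression) {
    when (mapping[expression]) {
        Action.NEXT_ITEM -> println("Move focus to the next on-screen item")
        Action.SELECT -> println("Activate the focused item (simulated tap)")
        Action.GO_BACK -> println("Navigate back")
        Action.CUSTOM -> println("Run the user-assigned shortcut")
        null -> println("No action assigned to $expression")
    }
}

fun main() {
    handle(Expression.SMILE)      // Activate the focused item (simulated tap)
    handle(Expression.LOOK_LEFT)  // No action assigned to LOOK_LEFT
}
```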
During our tests (on a Pixel 3), the tool proved responsive and effective. Keeping your mouth open for a few seconds lets you quickly move from one on-screen item to the next. With a smile, you can simulate a tap on the screen. Navigation is less easy than via the touch interface, but the feature is aimed at people for whom touch is not an option.
Go further with facial detection
In addition to these basic navigation controls, Google offers other expressions for navigating within the OS. It is possible to configure an action for when you look to the left, up, or to the right.
To use them, simply select them in the Camera Switches menu and assign them an action. It is possible, for example, to assign a "return to the home screen" shortcut to an upward glance.
Detection is slightly less precise than with the default movements, but it still works quite well. On the other hand, these gestures mechanically require looking somewhere other than the phone's screen. This complicates navigation a little, since the screen is no longer exactly in the user's field of vision.
Some of the triggers usable via the facial detection feature.
Source: Numerama screenshot
It is also possible to adjust the "facial expression intensity" so that the phone stops detecting involuntary movements, avoiding potential false positives (where your phone reacts to an unintentional movement).
If navigation via these gestures is too fast for you, it is also possible to adjust the "facial expression duration". This slows down the selection frame, letting you navigate more slowly and precisely. The default values tend to move the selection a little too quickly for precisely picking an item on the screen.
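To illustrate how these two settings interact, here is a small Kotlin sketch of the underlying idea: an expression only fires when its detection confidence clears an "intensity" threshold and has been held for a minimum "duration". The thresholds and names are hypothetical, not Google's actual parameters.

```kotlin
// Illustrative debouncing logic: an expression triggers its action only if
// its detection confidence exceeds an "intensity" threshold AND it has been
// held for a minimum "duration". Both defaults are hypothetical.
class ExpressionTrigger(
    private val minConfidence: Float = 0.7f, // the "intensity" setting
    private val minHoldMillis: Long = 500L,  // the "duration" setting
) {
    private var heldSince: Long? = null

    /** Returns true when the expression should fire its action. */
    fun update(confidence: Float, nowMillis: Long): Boolean {
        if (confidence < minConfidence) {
            heldSince = null // expression dropped: reset the hold timer
            return false
        }
        val start = heldSince ?: nowMillis.also { heldSince = it }
        // A real implementation would also reset after firing; this sketch
        // keeps firing for as long as the expression is held.
        return nowMillis - start >= minHoldMillis
    }
}
```

Raising either value makes detection stricter and slower, which is exactly the trade-off the two settings expose.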
Facial recognition is carried out locally on your phone, so there is technically no reason to fear that Google is scanning all your facial expressions.
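As an illustration of how on-device expression detection can work, here is a minimal Kotlin sketch using Google's ML Kit face-detection library, which also runs entirely on the phone. This is an assumption-laden example, not necessarily the implementation behind this feature.

```kotlin
// Requires the com.google.mlkit:face-detection dependency.
import android.graphics.Bitmap
import com.google.mlkit.vision.common.InputImage
import com.google.mlkit.vision.face.FaceDetection
import com.google.mlkit.vision.face.FaceDetectorOptions

// Minimal on-device smile detection with ML Kit (illustration only; not
// necessarily what this feature uses internally). All processing stays on
// the phone: no image leaves the device.
fun detectSmile(frame: Bitmap, onSmile: () -> Unit) {
    val options = FaceDetectorOptions.Builder()
        .setClassificationMode(FaceDetectorOptions.CLASSIFICATION_MODE_ALL)
        .build()
    val detector = FaceDetection.getClient(options)
    val image = InputImage.fromBitmap(frame, /* rotationDegrees = */ 0)

    detector.process(image)
        .addOnSuccessListener { faces ->
            // smilingProbability is null when classification is unavailable.
            val smiling = faces.any { (it.smilingProbability ?: 0f) > 0.8f }
            if (smiling) onSmile() // e.g. simulate a tap on the focused item
        }
        .addOnFailureListener { /* detection failed for this frame */ }
}
```

In such a setup, camera frames are analyzed and discarded on the device; nothing needs to be uploaded, which is the point made above.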