Google said on Thursday that people with speech or physical disabilities can now operate their Android smartphones hands-free by raising their eyebrows or smiling.
Two new tools use machine learning and smartphones' front-facing cameras to detect facial and eye movements.
Users can scan the phone screen and select tasks by smiling, raising their eyebrows, opening their mouth, or looking left, right, or up.
“To make it easier for everyone to use Android, we have introduced new tools that will make it easier for you to control your phone and communicate using facial gestures,” Google said.
The U.S. Centers for Disease Control and Prevention estimates that 61 million adults in the U.S. live with disabilities, a figure that has prompted Google and rivals Apple and Microsoft to make their products and services more accessible.
“Every day, people use voice commands like ‘Hey Google’ or their hands to navigate their phones,” the tech giant said in a blog post.
“However, this is not always feasible for people with severe motor and speech impairments.”
These changes are the result of two new features. The first, called Camera Switches, lets people interact with their smartphones using their face instead of swipes and taps.
The second is Project Activate, a new Android application that lets people use these gestures to trigger an action, such as having the phone speak a recorded phrase, send a text message, or make a phone call.
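Conceptually, tools like these bind a small vocabulary of recognized gestures to phone actions. A minimal sketch of that idea, with the gesture names and action bindings chosen purely for illustration (they are assumptions, not Google's actual API):

```python
# Illustrative only: a toy gesture-to-action dispatcher in the spirit of
# Project Activate. The gesture names and action labels are hypothetical.

GESTURE_ACTIONS = {
    "smile": "speak_phrase",      # e.g., play a recorded phrase aloud
    "raise_eyebrows": "send_text",
    "open_mouth": "make_call",
    "look_left": "scan_previous",  # move scanning focus on screen
    "look_right": "scan_next",
}

def dispatch(gesture: str) -> str:
    """Return the action bound to a recognized gesture, or 'ignore'."""
    return GESTURE_ACTIONS.get(gesture, "ignore")

print(dispatch("smile"))  # speak_phrase
print(dispatch("frown"))  # ignore (unmapped gestures do nothing)
```

In the real features, users can customize which gestures they use and how sensitive detection is, so a mapping like this would be per-user configuration rather than a fixed table.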
“Now anyone can use eye movements and facial gestures tailored to their range of motion to navigate their phone, without hands or voice,” Google said.
The free Project Activate app is available in the Google Play Store in Australia, Canada, the United Kingdom, and the United States.