
Disabled people can now use Android devices with facial gestures

Users can scan their phone screen and select an action by smiling, raising their eyebrows, opening their mouth, or looking to the left, right or up.


Google unveils new ways to control Android devices with facial gestures

San Francisco: People with speech or physical disabilities can now use their Android-powered smartphones hands-free, with a raised eyebrow or a smile.

On September 23, Google said it had put machine learning and smartphones' front-facing cameras to work detecting face and eye movements.

The changes come via two new features. The first, called "Camera Switches," lets people use their faces instead of swipes and taps to interact with their smartphones.

The other is "Project Activate," a new Android application that allows people to use those gestures to trigger an action, such as having the phone play a recorded phrase, send a text, or make a call.

Users can scan their phone screen and select an action by smiling, raising their eyebrows, opening their mouth, or looking to the left, right or up.

"To make Android more accessible for everyone, we're launching new tools that make it easier to control your phone and communicate using facial gestures," Google said.

According to the Centers for Disease Control and Prevention (CDC), nearly 61 million adults in the United States live with disabilities, which has pushed Google and rivals Apple and Microsoft to make their products and services more accessible.

The tech giant said in a blog post, "Every day, people use voice commands, like 'Hey Google,' or their hands to navigate their phones."

"However, that's not always possible for people with severe motor and speech disabilities," it added.

"Now it's possible for anyone to use eye movements and facial gestures that are customized to their range of movement to navigate their phone - sans hands and voice," Google said.

So far, the free Activate app is available in Australia, Britain, Canada and the United States on the Google Play store.

For its part, Apple has built an "AssistiveTouch" feature into the software powering its smartwatch, letting the touchscreen display be controlled by sensing movements such as finger pinches or hand clenches.

AssistiveTouch also works with VoiceOver, so users can navigate the Apple Watch with one hand while using a cane or leading a service animal.

Earlier, Google also expanded its Lookout app, which uses computer vision to assist people with impaired vision.

Now, the app can read out handwritten text in Latin-based languages, so users can read notes or letters sent to them. Additionally, Lookout's currency mode was expanded to include rupees and euros.
