Disabled people can now use Android devices with facial gestures
Users can scan their phone screen and select a task by smiling, raising their eyebrows, opening their mouth, or looking to the left, right or up.

San Francisco: People with speech or physical disabilities can now use their Android-powered smartphones hands-free, with a raised eyebrow or a smile.
On September 23, Google said it has put machine learning and smartphones' front-facing cameras to work detecting face and eye movements.
The changes come through two new features. The first, called "Camera Switches," lets people use their faces instead of swipes and taps to interact with their smartphones.
The other is "Project Activate," a new Android application that lets people use those gestures to trigger an action, such as having the phone play a recorded phrase, send a text, or make a call.
Users can scan their phone screen and select a task by smiling, raising their eyebrows, opening their mouth, or looking to the left, right or up.
"To make Android more accessible for everyone, we're launching new tools that make it easier to control your phone and communicate using facial gestures," Google said.
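Google has not published the internals of Camera Switches, but the general building blocks are available to any Android developer. The Kotlin sketch below is an illustration of the idea, not Google's implementation: it runs ML Kit's on-device face detector over a front-camera frame and turns a confident smile or a sustained head turn into a coarse gesture event that an app could map to a tap, a scan step, or an action. The Gesture labels, the thresholds, and the classifyFrame helper are all assumptions made for this sketch.

```kotlin
// Illustrative sketch only - not Camera Switches' actual code.
// Uses ML Kit face detection (com.google.mlkit:face-detection) to classify a
// single front-camera frame into a coarse facial gesture.
import android.graphics.Bitmap
import com.google.mlkit.vision.common.InputImage
import com.google.mlkit.vision.face.FaceDetection
import com.google.mlkit.vision.face.FaceDetectorOptions

// Hypothetical gesture labels for this sketch; not part of any Google API.
enum class Gesture { SMILE, LOOK_LEFT, LOOK_RIGHT, NONE }

private val detector = FaceDetection.getClient(
    FaceDetectorOptions.Builder()
        .setClassificationMode(FaceDetectorOptions.CLASSIFICATION_MODE_ALL) // enables smile probability
        .setPerformanceMode(FaceDetectorOptions.PERFORMANCE_MODE_FAST)      // keep up with a live preview
        .build()
)

// Classify one camera frame and report the result; thresholds are guesses, not Google's.
fun classifyFrame(frame: Bitmap, rotationDegrees: Int, onGesture: (Gesture) -> Unit) {
    val image = InputImage.fromBitmap(frame, rotationDegrees)
    detector.process(image)
        .addOnSuccessListener { faces ->
            val face = faces.firstOrNull()
            if (face == null) {
                onGesture(Gesture.NONE)
                return@addOnSuccessListener
            }
            val gesture = when {
                (face.smilingProbability ?: 0f) > 0.8f -> Gesture.SMILE
                face.headEulerAngleY > 20f -> Gesture.LOOK_LEFT   // sign may flip with a mirrored front camera
                face.headEulerAngleY < -20f -> Gesture.LOOK_RIGHT
                else -> Gesture.NONE
            }
            onGesture(gesture)
        }
        .addOnFailureListener { onGesture(Gesture.NONE) }
}
```

In the features Google describes, this kind of detection runs continuously rather than on single frames, and each recognized gesture is mapped to an action the user has configured, such as scanning to the next item on screen, playing a recorded phrase, sending a text, or making a call.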
According to the Centers for Disease Control and Prevention (CDC), nearly 61 million adults in the United States live with disabilities, which has pushed Google and rivals Apple and Microsoft to make their products and services more accessible to them.
In a blog post, the tech giant said, "Every day, people use voice commands, like 'Hey Google,' or their hands to navigate their phones."
"However, that's not always possible for people with severe motor and speech disabilities," it added.
"Now it's possible for anyone to use eye movements and facial gestures that are customized to their range of movement to navigate their phone - sans hands and voice," Google said.
So far, the free Activate app is available in Australia, Britain, Canada and the United States on the Google Play store.
Apple, meanwhile, has built an "AssistiveTouch" feature into the software powering its smartwatch, letting the touchscreen display be controlled by sensing movements such as finger pinches or hand clenches.
AssistiveTouch also works with VoiceOver, so users can navigate the Apple Watch with one hand while using a cane or leading a service animal.
Earlier, Google also expanded its Lookout app, which uses computer vision to help people with impaired vision.
Now, the app can read out handwritten text in Latin-based languages, so users can read notes or letters sent to them. Additionally, Lookout's currency mode has been expanded to include rupees and euros.
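Lookout's recognition pipeline is not public either, and its handwriting support goes beyond what standard developer tools offer, but Google's on-device ML Kit text recognizer gives a flavor of the underlying capability. The sketch below is an illustration under that assumption, not Lookout's code; the readTextFrom helper is a name invented for this example. It runs the Latin-script recognizer over a photo and hands the recognized text to a callback, where an app could pass it to a screen reader or a text-to-speech engine.

```kotlin
// Illustrative sketch only - not Lookout's actual code.
// Uses ML Kit text recognition (com.google.mlkit:text-recognition) to pull
// Latin-script text out of a photo, entirely on-device.
import android.graphics.Bitmap
import com.google.mlkit.vision.common.InputImage
import com.google.mlkit.vision.text.TextRecognition
import com.google.mlkit.vision.text.latin.TextRecognizerOptions

private val recognizer = TextRecognition.getClient(TextRecognizerOptions.DEFAULT_OPTIONS)

// Recognize text in a photo and pass the result on, e.g. to a text-to-speech engine.
fun readTextFrom(photo: Bitmap, onText: (String) -> Unit) {
    val image = InputImage.fromBitmap(photo, 0) // 0 = no extra rotation applied
    recognizer.process(image)
        .addOnSuccessListener { result -> onText(result.text) }
        .addOnFailureListener { onText("") }    // nothing recovered from the image
}
```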