Disabled people can now use Android devices with facial gestures
Users can scan their phone screen and select an action by smiling, raising their eyebrows, opening their mouth, or looking to the left, right or up.

San Francisco: People with speech or physical disabilities can now use their Android-powered smartphones hands-free, with a raised eyebrow or a smile.
On September 23, Google said it has put machine learning and smartphones' front-facing cameras to work detecting face and eye movements.
The changes come through two new features. The first, called "Camera Switches," lets people use their faces instead of swipes and taps to interact with their smartphones.
The other is ‘Project Activate’, a new Android application that allows people to use those gestures to trigger an action, like having a phone play a recorded phrase, send a text, or make a call.
“To make Android more accessible for everyone, we’re launching new tools that make it easier to control your phone and communicate using facial gestures,” Google said.
According to the Centers for Disease Control and Prevention (CDC), nearly 61 million adults in the United States live with disabilities, which has pushed Google and rivals Apple and Microsoft to make their products and services more accessible to them.
In a blog post, the tech giant said, "Every day, people use voice commands, like 'Hey Google,' or their hands to navigate their phones".
"However, that's not always possible for people with severe motor and speech disabilities," it added.
"Now it's possible for anyone to use eye movements and facial gestures that are customized to their range of movement to navigate their phone - sans hands and voice," Google said.
So far, the free Activate app is available in Australia, Britain, Canada and the United States on the Google Play store.
On the other hand, Apple built an "AssistiveTouch" feature into the software powering its smart watch to let touchscreen displays be controlled by sensing movements such as finger pinches or hand clenches.
AssistiveTouch also works with VoiceOver, so users can navigate the Apple Watch with one hand while using a cane or leading a service animal.
Earlier, Google also expanded its ‘Lookout’ app, which uses computer vision to help people with impaired vision.
Now, the app can read out handwritten text in Latin-based languages, so users can read notes or letters sent to them. Additionally, Lookout's currency mode was expanded to include rupees and euros.