Meta’s smart glasses can now describe what you’re seeing in more detail
Published May 17th, 2025, 5:00 am
By Web Desk

Meta announced two new features designed to assist blind or low vision users by leveraging the Ray-Ban Meta smart glasses’ camera and its access to Meta AI. The news came as part of Global Accessibility Awareness Day.
Rolling out to all users in the US and Canada in the coming weeks, Meta AI can now be customized to provide more detailed descriptions of what’s in front of users when they ask the smart assistant about their environment. In a short video shared alongside the announcement, Meta AI goes into more detail about the features of a waterside park, including describing grassy areas as being “well manicured.”
[Image: Meta AI can now go into greater detail while describing what you’re looking at. https://platform.theverge.com/wp-content/uploads/sites/2/2025/05/meta_ai_description1.jpg?quality=90&strip=all]
The feature can be activated by turning on “detailed responses” in the Accessibility section of the Device settings in the Meta AI app. Although it’s currently limited to users in the US and Canada, Meta says detailed responses will “expand to additional markets in the future,” but provided no details about when or which countries would get it next.
Meta also confirmed today that its Call a Volunteer feature, first announced last September as part of a partnership with the Be My Eyes organization and released last November in a limited rollout covering the US, Canada, UK, Ireland, and Australia, will “launch in all 18 countries where Meta AI is supported later this month.”
Blind and low vision users of the Ray-Ban Meta smart glasses can use the feature to connect to a network of over 8 million sighted volunteers for help with everyday tasks such as following a recipe or locating an item on a shelf. When a user says, “Hey Meta, Be My Eyes,” a volunteer can see the user’s surroundings through a live feed from the glasses’ camera and provide descriptions or other assistance through the glasses’ open-ear speakers.
