A new iMessage safety feature prompts kids to report explicit images to Apple
Apple is adding a new child safety feature to iMessage that lets kids send a report to Apple when they receive photos or videos containing nudity, according to The Guardian. After reviewing a report, the company can choose to report messages to law enforcement.


The new feature expands on Apple’s Communication Safety feature, which uses on-device scanning to detect nudity in photos and videos received via Messages, AirDrop, or Contact Posters, then blurs them out. In addition to blurring the photo or video, Apple shows a pop-up with options to message an adult, get resources for help, or block the contact.
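Apple hasn’t said exactly how Messages implements this detection, but the company does ship a public on-device classifier to third-party developers through its SensitiveContentAnalysis framework (iOS 17 and later). A minimal sketch of how an app could use it to decide whether to blur an incoming image, assuming the app holds the required com.apple.developer.sensitivecontentanalysis.client entitlement and the user has the feature turned on in Settings (this illustrates the general approach, not Messages’ private code path):

```swift
import SensitiveContentAnalysis

// Decide whether an incoming image should be blurred, using Apple's
// on-device SensitiveContentAnalysis classifier. No image data leaves
// the device during analysis.
func shouldBlur(imageAt url: URL) async -> Bool {
    let analyzer = SCSensitivityAnalyzer()

    // If the user hasn't opted in, the analyzer is disabled and
    // nothing should be flagged.
    guard analyzer.analysisPolicy != .disabled else { return false }

    do {
        let analysis = try await analyzer.analyzeImage(at: url)
        return analysis.isSensitive  // true -> blur and show intervention UI
    } catch {
        return false  // analysis failed; fail open and show the image unblurred
    }
}
```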
As part of this new feature, which is currently being tested in Australia with iOS 18.2, users will also be able to send a report to Apple about any images or videos containing nudity.
“The device will prepare a report containing the images or videos, as well as messages sent immediately before and after the image or video,” The Guardian says. “It will include the contact information from both accounts, and users can fill out a form describing what happened.” From there, Apple will review the report and can take actions such as stopping a user from sending iMessages or reporting the incident to law enforcement.
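Apple hasn’t published a schema for these reports, but The Guardian’s description implies a payload roughly like the sketch below. Every type and field name here is invented for illustration; none of it comes from Apple:

```swift
import Foundation

// Hypothetical model of the report The Guardian describes: the flagged
// media, the messages sent immediately before and after it, contact
// info for both accounts, and an optional user-written description.
struct NudityReport: Codable {
    let flaggedMedia: [Data]           // the reported images or videos
    let surroundingMessages: [String]  // messages just before and after
    let reporterContact: String        // reporting account's contact info
    let senderContact: String          // sending account's contact info
    let userDescription: String?       // form describing what happened
}
```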
Earlier this week, Google announced an expansion of on-device scanning in its Android messaging app, adding an optional Sensitive Content Warning that blurs images containing nudity and offers “help-finding resources and options.” Once it rolls out, the feature will be enabled by default for users under 18.
The Guardian says that Apple plans to make the new feature available globally but didn’t specify when that might happen. Apple didn’t immediately reply to a request for comment.
In 2021, Apple announced a set of child safety features that included scanning a user’s iCloud Photos library for child sexual abuse material and alerting parents when their kids sent or received sexually explicit photos. After privacy advocates spoke out against the plan, Apple delayed the launch of those features to go back to the drawing board, and it dropped its plans to scan for child sexual abuse imagery in December 2022.
