Facebook is starting to feed its AI with private, unpublished photos

Published Jul 4th 2025, 2:00 pm
By Web Desk

For years, Meta trained its AI programs on the billions of public images users uploaded to Facebook and Instagram’s servers. Now, it’s also hoping to access the billions of images that users haven’t uploaded to those servers. Meta tells The Verge that it’s not currently training its AI models on those photos, but it would not answer our questions about whether it might do so in the future, or what rights it will hold over your camera roll images.
On Friday, TechCrunch reported that Facebook users trying to post something on the Story feature have encountered pop-up messages asking if they’d like to opt into “cloud processing”, which would allow Facebook to “select media from your camera roll and upload it to our cloud on a regular basis”, to generate “ideas like collages, recaps, AI restyling or themes like birthdays or graduations.”
By enabling this feature, the message continues, users agree to Meta’s AI terms, which allow its AI to analyze the “media and facial features” of those unpublished photos, as well as the date the photos were taken and the presence of other people or objects in them. Users further grant Meta the right to “retain and use” that personal information.
Meta recently acknowledged that it scraped all the content published on Facebook and Instagram since 2007 to train its generative AI models. Though the company stated that it only used public posts uploaded by adult users over the age of 18, it has long been vague about exactly what “public” entails, as well as what counted as an “adult user” in 2007.
Meta tells The Verge that, for now, it’s not training on your unpublished photos with this new feature. “[The Verge’s headline] implies we are currently training our AI models with these photos, which we aren’t. This test doesn’t use people’s photos to improve or train our AI models,” Meta public affairs manager Ryan Daniels tells The Verge.
Meta’s public stance is that the feature is “very early,” innocuous and entirely opt-in: “We’re exploring ways to make content sharing easier for people on Facebook by testing suggestions of ready-to-share and curated content from a person’s camera roll. These suggestions are opt-in only and only shown to you – unless you decide to share them – and can be turned off at any time. Camera roll media may be used to improve these suggestions, but are not used to improve AI models in this test,” reads a statement from Meta comms manager Maria Cubeta.
On its face, that might not sound altogether different from Google Photos, which similarly might suggest AI tweaks to your images after you opt into Google Gemini. But unlike Google, which explicitly states that it does not train generative AI models on personal data gleaned from Google Photos, Meta offers no such assurance: its current AI usage terms, in place since June 23rd, 2024, do not clarify whether unpublished photos accessed through “cloud processing” are exempt from being used as training data, and Meta would not clear that up for us going forward.
And while Daniels and Cubeta tell The Verge that opting in only gives Meta permission to retrieve 30 days’ worth of your unpublished camera roll at a time, it appears that Meta is retaining some data longer than that. “Camera roll suggestions based on themes, such as pets, weddings and graduations, may include media that is older than 30 days,” Meta writes.
Thankfully, Facebook users can turn off camera roll cloud processing in their settings; once the feature is disabled, Meta will also begin removing unpublished photos from the cloud after 30 days.
The feature suggests a new incursion into our previously private data, one that bypasses the point of friction known as conscientiously deciding to post a photo for public consumption. And according to Reddit posts found by TechCrunch, Meta is already offering AI restyling suggestions on previously uploaded photos, even when users weren’t aware of the feature: one user reported that Facebook had Studio Ghiblified her wedding photos without her knowledge.
Correction, June 27th: An earlier version of this story implied Meta was already training AI on these photos, but Meta now states that the current test does not yet do so. Also added statement and additional details from Meta.
