Anthropic will start training its AI models on chat transcripts

Published 4 months ago, on August 31, 2025, 5:00 am
By Web Desk

Anthropic will start training its AI models on user data, including new chat transcripts and coding sessions, unless users choose to opt out. It’s also extending its data retention policy to five years, again for users who don’t opt out.
All users will have to make a decision by September 28th. For users who click “Accept” now, Anthropic will immediately begin training its models on their data and keeping that data for up to five years, according to a blog post published by Anthropic on Thursday.
The setting applies to “new or resumed chats and coding sessions.” Even if you do agree to Anthropic training its AI models on your data, it won’t do so with previous chats or coding sessions that you haven’t resumed. But if you do continue an old chat or coding session, all bets are off.
The updates apply to all of Claude’s consumer subscription tiers, including Claude Free, Pro, and Max, “including when they use Claude Code from accounts associated with those plans,” Anthropic wrote. But they don’t apply to Anthropic’s commercial usage tiers, such as Claude Gov, Claude for Work, Claude for Education, or API use, “including via third parties such as Amazon Bedrock and Google Cloud’s Vertex AI.”
New users will have to select their preference during the Claude signup process. Existing users must decide via a pop-up, which they can defer by clicking a “Not now” button, though they will be forced to make a decision by September 28th.
It’s worth noting, however, that many users may quickly hit “Accept” without reading what they’re agreeing to.
[Image: Anthropic’s new terms. https://platform.theverge.com/wp-content/uploads/sites/2/2025/08/0.png?quality=90&strip=all]
The pop-up that users will see reads, in large letters, “Updates to Consumer Terms and Policies,” and the lines below it say, “An update to our Consumer Terms and Privacy Policy will take effect on September 28, 2025. You can accept the updated terms today.” There’s a big black “Accept” button at the bottom.
In smaller print below that, a few lines say, “Allow the use of your chats and coding sessions to train and improve Anthropic AI models,” with an on/off toggle switch next to it. It’s automatically set to “On.” Presumably, many users will immediately click the large “Accept” button without changing the toggle, even if they haven’t read it.
If you want to opt out, you can toggle the switch to “Off” when you see the pop-up. If you already accepted without realizing and want to change your decision, navigate to your Settings, then the Privacy tab, then the Privacy Settings section, and, finally, toggle to “Off” under the “Help improve Claude” option. Consumers can change their decision anytime via their privacy settings, but that new decision will just apply to future data — you can’t take back the data that the system has already been trained on.
“To protect users’ privacy, we use a combination of tools and automated processes to filter or obfuscate sensitive data,” Anthropic wrote in the blog post. “We do not sell users’ data to third-parties.”