Tom's Guide
Amanda Caswell

Claude AI will start training on your data soon — here's how to opt out before the deadline

Anthropic, the company behind Claude, is making a significant change to how it handles your data. Starting today, Claude users will be asked to either let Anthropic use their chats to train future AI models or opt out and keep their data private. If you don’t make a choice by September 28, 2025, you’ll lose access to Claude altogether.

Previously, Anthropic took a privacy-first approach: your chats and code were automatically deleted after 30 days unless they had to be retained for legal reasons. Now, however, unless you opt out, your data will be stored for up to five years and fed into training cycles to help Claude get smarter.

The new policy applies to all consumer plans, including Free, Pro and Max, as well as Claude Code used under those tiers. Business, government, education and API customers aren't affected.

How it works

New users will see the choice during sign-up. Existing users will get a pop-up called “Updates to Consumer Terms and Policies.” The big blue Accept button opts you in by default, while a smaller toggle lets you opt out.

If you ignore the prompt, Claude will stop working after September 28.

It's important to note that only future chats and code are affected. Anything you wrote in the past won't be used unless you reopen those conversations, and if you delete a chat, it won't be used for training.

You can change your decision later in Privacy Settings, but once your data has been used for training, it can't be pulled back.

Why Anthropic is making this change

The company says user data is key to making Claude better at coding (both traditional and "vibe" coding), reasoning and safety. Anthropic also stresses that it doesn't sell user data to third parties and uses automated filters to scrub sensitive information before training.

Still, the change shifts the balance from automatic privacy to automatic data sharing unless you say otherwise.

What will your choice be?

Anthropic's decision essentially puts Claude on a faster track to improvement: the more real-world data it trains on, the better it can get at answering complex questions, writing code and avoiding mistakes.

Yet that progress comes at a cost to users, whose conversations may now become part of the training set, stored for years instead of weeks.

For casual users, this might not feel like a big deal. But for privacy-minded users, or anyone who discusses work projects, personal matters or sensitive information in Claude, this update could be a red flag.

With the default switched on, the responsibility is now on you to opt out if you want your data kept private.
