Fortune
Prarthana Prakash

Meta will now give persistent content violators up to 7 chances to be better before sending them to 'Facebook jail'

A man holding his phone against the Facebook logo (Credit: Lorenzo Di Cola/NurPhoto—Getty Images)

Facebook’s parent company Meta is clamping down on inflammatory content with a new set of rules—but is now willing to give persistent offenders numerous opportunities to see the error of their ways.

In its latest move, the social media company says it will explain to users why their content violated its policies for up to seven offenses.

After the eighth offense, it will suspend the user's account, sending them to “Facebook jail,” a term that users have coined to describe being banned from the social media platform. 

Meta announced the policy change on Thursday in response to feedback from its Oversight Board, which also highlighted the platform's enforcement shortcomings last December.

The company said the new policy would avoid “over-penalizing” users who have largely abided by its content rules, while allowing “faster and more impactful action” against serious offenders.

“Under the new system, we will focus on helping people understand why we have removed their content, which is shown to be more effective at preventing re-offending, rather than so quickly restricting their ability to post,” Monika Bickert, the vice president of content policy at Meta, wrote in a statement. 

In cases of serious violations, such as content involving terrorism, human trafficking, or other severe harms, the account will face immediate action, including removal, Bickert said.

“The vast majority of people on our apps are well-intentioned. Historically, some of those people have ended up in ‘Facebook jail’ without understanding what they did wrong or whether they were impacted by a content enforcement mistake,” according to Bickert.

The company’s earlier policies swiftly placed month-long blocks on people’s accounts, even if their violations were minor or accidental.

While the current policy change gives “well-intentioned” users more leeway on the platform, Meta has also been criticized in the past for lax enforcement.

In 2019, Brazilian footballer Neymar shared explicit images of a woman who had accused him of rape with his many millions of followers before Facebook took the posts down.

The same policies also allowed accounts to spread information about political figures such as Hillary Clinton and Donald Trump that was later found to be false.

The Oversight Board, appointed in 2020, found last December that Meta’s “cross-check” program, which provides preferential treatment to VIP accounts, was “structured to satisfy business interests” and was deeply flawed.

The board made over 30 recommendations to Meta on enforcing fair and transparent policies.

It also said earlier this month that it had changed its rules to allow expedited decisions, within two to 30 days, on cases where content policy has been violated.

In response to the new policy change, the Oversight Board said it “welcomed” the move but noted that it addresses only “less-serious violations,” leaving room for further policy reforms.

Meta did not immediately return Fortune’s request for comment.
