NL Team

Of borderline content ignored by Facebook, 40% was problematic: Indian Express

Of the total “borderline content” ignored by moderators at social media giant Meta, formerly known as Facebook, nearly 40 per cent was problematic and included “hate”, “nudity” and “violence”, according to a report in the Indian Express citing an internal memo of the company, dated April 15, 2019.

“Borderline” content is defined by Facebook as content that “does not violate the community standards” but is identifiably problematic in an area covered by these “community standards”. The data was calculated with Facebook’s internal algorithms that counted views and interactions.

The memo revealed that such “borderline” content was viewed 10 times more than content flagged as an outright violation of Facebook’s guidelines.

The memo came almost a year after Mark Zuckerberg, chief executive officer of Meta Platforms Inc, shared a detailed note in 2018 on how the platform planned to filter and demote borderline content.

The report pointed out that the company underlined the “need for intervention to counteract perverse incentive” to reach more views as “borderline” content “gets more interaction than benign posts”.

The Indian Express quoted a Meta spokesperson on the conflict between high engagement with borderline posts and the need to remove such posts.

“We train AI systems to detect borderline content so we can distribute the content less,” he said.

The spokesperson said that content identified as “hate speech or violence and incitement” by those technologies is distributed “significantly” less, to decrease the risk of “problematic content going viral and potentially inciting violence ahead of or during the election”.

This comes after a series of internal documents and a Facebook whistleblower recently revealed that Facebook did not take significant action despite hate speech and “problematic content” being flagged in India in the internal reports of the company.

This isn’t the first time the company’s internal memos have been made public. A report this month, based on an internal strategy note dated August 6, 2019, revealed that Facebook staff tasked with reviewing hate speech on the platform had faced cost cuts even as divisive content increased across most markets, including India.


