Fortune
Rachyl Jones

The anti-hate group sued by Elon Musk released a new report on X’s content moderation

(Credit: Tolga Akmen/EPA/Bloomberg via Getty Images)

The Center for Countering Digital Hate, the nonprofit group that Elon Musk sued earlier this year, released a new report on Tuesday alleging that X has systematically failed to remove antisemitic, anti-Palestinian, and anti-Muslim content from the platform formerly known as Twitter.

According to the group, 98% of 200 tweets that it flagged to X remained visible on the platform after two weeks. The CCDH report comes amid concerns over a surge in social media disinformation and hate speech related to the Israel-Hamas war.

The report suggests ongoing shortcomings in X's content moderation capabilities just months after the CCDH released a similarly critical report about racist tweets on the platform—a move that resulted in a lawsuit from X.

“We will not be cowed by bullies and legal threats,” Imran Ahmed, the nonprofit’s chief executive officer, told Fortune.

On October 31, the Center for Countering Digital Hate (CCDH) reported 200 posts that researchers determined violated the social media platform’s policies by spreading racist content. The posts, which the nonprofit flagged through X’s own reporting system, included Holocaust denial, racist caricatures, calls for violence, and conspiracy theories. One week later, 196 remained visible on the site, the report shows. On Monday, two weeks after the CCDH notified X, the same 196 posts still appeared online, Fortune confirmed.

CCDH's Ahmed said the latest report is especially important amid the new extremes of disinformation and hate tied to the Middle East conflict. “Good information and disinformation are intermingling seamlessly, with disinformation getting the blue badge of approval by Musk and being algorithmically promoted,” he said, referring to X’s amplification of posts from users who pay for verification status. “Also, because [X has] been sacking people in the last year, and we want to know how they’re doing now,” he said. Musk has cut roughly 6,000 jobs, or 80% of the company’s workforce, since he took over in October 2022. The company has lost two heads of trust and safety in the last year, and while it is unclear how many content moderation employees X has, the department has faced multiple rounds of layoffs this year. Musk and CEO Linda Yaccarino are currently managing the team.

It's not clear from the report whether the offensive tweets remained on X because the company determined they did not violate its content policies, or because X had not reviewed them. A CCDH representative said the group could not confirm whether it received a response from X with determinations about the acceptability of the flagged tweets. X didn’t respond to a request for comment.

Posts flagged by the CCDH had garnered more than 25 million views as of Monday, driven largely by a small number of posts; half had fewer than 1,000 views each. The CCDH felt the week-long timeframe was sufficient for X to respond to flagged posts, Ahmed said. Companies sometimes don’t respond to requests for months, though the damage can be done in a day, he said. Of the 101 accounts publishing the content, only two were suspended. X temporarily locked one additional account, but it was unlocked on Monday.

“Elon Musk was annoyed because advertisers listened to what we said”

In May, the nonprofit published a similar report finding that out of 100 racist or hateful tweets the organization reported to X, 99 remained online after four days. In July, X filed a lawsuit against the CCDH, accusing it of embarking on “a scare campaign to drive away advertisers.” The suit says the nonprofit illegally scraped data from X’s platform and improperly gained access to a secured database to obtain and publish information. The lawsuit is ongoing, with the most recent document filed just days ago.

The research was simple and relied on public information, Ahmed said. In both cases, the organization identified a list of posts containing offensive content, which it found by searching through the followers, likes, and retweets of known hateful accounts. Researchers used X’s systems to report the posts and checked back days later to see if they were still visible.

“Elon Musk was annoyed because advertisers listened to what we said, and that’s why he sued us,” Ahmed told Fortune. “He didn’t allege defamation. He didn’t like that we were gathering the data in the first place.” 

X has received widespread criticism in the last month over how it has handled offensive content regarding the Israel-Hamas war. Days after the war began, Fortune observed disturbing photos and videos on X that appeared to show hostages, individuals being beaten, and dead bodies. EU Commissioner Thierry Breton warned the company about its content, to which Yaccarino responded it has “redistributed resources and refocused internal teams” to address the issue. But the CCDH’s report suggests X’s response is still lacking weeks later. 

“He just doesn’t understand how damaging his platform is becoming,” Ahmed said.
