The Independent UK
Holly Evans and Harriette Boucher

How social media is helping the far right spread fear and hate

At around 5.30pm on 7 July, Essex Police officers were called to Epping’s High Street following reports that a man was behaving inappropriately towards a teenage girl.

Hadush Kebatu, 38, an asylum seeker from Ethiopia, allegedly tried to kiss a schoolgirl as she ate pizza. He has since been charged with three counts of sexual assault.

But the news that he had only arrived in the UK eight days earlier via a small boat quickly took hold on social media, sparking a series of protests that then turned violent, and thrusting the historic Essex town into the heart of an anti-immigration row.

What started as a group of locals voicing their grievances outside the Bell Hotel, which is believed to house asylum seekers, has now escalated into what has been described as a “powder-keg situation”, with fears that it could prompt a wave of riots across the country, similar to those seen last summer.

Police seen near the Bell Hotel in Epping, Essex, during a protest (PA Wire)

In the past two weeks, prominent leaders of neo-Nazi groups and far-right organisations have been accused of exploiting the situation by pivoting demonstrations towards violence, with some demanding a “national call for action”.

Their weapon of choice? Social media, which the far right have long been known to harness as a tool to spread fear and hate.

Several right-wing activists have rebranded themselves as citizen journalists or political commentators, which has helped them to accrue millions of followers in the UK and across the globe.

Joe Mulhall, of the charity Hope Not Hate, says this is dangerous at a time when misinformation online spreads fast and can whip up tensions.

“It’s deeply concerning that a rumour or allegation can spread so quickly and take hold. Last year in Southport, misinformation from influencers like Andrew Tate spread like wildfire about the ethnicity and nationality of the perpetrator of the awful murders.

“When misinformation spreads, it can legitimise existing biases, and, as a rumour or allegation takes hold, things can quickly move offline.”

Locals initially came to voice their grievances after a man was arrested for allegedly sexually assaulting a teenage girl (AP)

The protests in Epping are coordinated and advertised on a private Facebook page titled Epping Says No. Its administrators include members of a group that calls itself Homeland. Founded in 2023 by a cohort that split from the neo-Nazi organisation Patriotic Alternative, Homeland has been described as the largest fascist group in the UK.

This week, one of its prominent members has shared several videos of the protests on social media, and has called for future action, urging: “If you live in an area that has a hotel occupied by asylum seekers, start organising.”

Members of other groups, including former neo-Nazi terror group Combat 18, the British National Party, and the Patriots of Britain, have also been spotted at the demonstrations.

Mulhall warns that with police forces overworked and overstretched, racist and anti-immigration rhetoric online can slip under the radar. He says the UK “needs to be ahead of the curve” in order to clamp down on this activity.

Social media platforms such as X have played a crucial role in the spread of misinformation and anti-migrant rhetoric (Getty)

“Tracking these comments and the individuals responsible is tricky,” he explains. “The far right are no longer divided into neat groupings, but are instead thousands of people posting videos outside migrant accommodation, posting rumours and making comments online.

“Gone are the days when the police or social media companies can simply deplatform a particular group to resolve this issue.”

Since Elon Musk’s takeover of X, formerly known as Twitter, the platform has changed significantly, with the Tesla CEO reportedly tweaking its algorithms and removing its fact-checking mechanisms.

This included turning the platform into a pro-Trump Maga echo chamber in the run-up to last year’s US presidential election, and reinstating previously banned figures such as Tommy Robinson and Katie Hopkins.

Hope Not Hate describe a concerning trend in which US figures comment on UK politics and societal issues and boost far-right voices, such as those of anti-Islam activist Robinson, who has hinted that he will attend a protest in Epping on Sunday.

Mulhall says: “The far right has changed dramatically, and ironically, knows no borders. What we’re seeing now is key figures emerging online. We’re no longer looking at organisations, but key people who emerge during a time of crisis.

Far-right figures such as Tommy Robinson have accrued a large following on social media (Getty)

“The far right is international: they move around and they move in packs, and they try to find any weakness. They have no formal leader – there’s no single leader. It’s like they’re a group of fish that move around the internet, exploiting situations.

“It is no surprise that we’ve seen a rise in far-right activity in the UK, US and Europe – these groups and ideas are interconnected.”

Dr Karen Middleton, from the University of Portsmouth, who served as an expert witness in the UK government’s inquiry into social media, misinformation and harmful algorithms, says the protests in Epping have been “in many ways a continuation of the riots from last year”. She explains: “Sensationalist and polarising content gathers more clicks, gathers more engagement, so there is a systemic incentive for spreading misinformation online.”

She believes that large social-media platforms need to go much further in addressing the spread of misinformation, but cautions that this is not about limiting free speech. “This is about taking responsibility for published information that is online, that goes to a large number of people, and is very often spread by people with high profiles,” she says.

A spokesperson for the National Police Chiefs’ Council said communities have a part to play in halting the spread of misinformation, and urged people to “carefully consider” what they read, share, and trust online to avoid stoking tensions.

“We would encourage the public to access formal authorities for accurate information. The spread of disinformation and misinformation by individuals or groups can significantly contribute to community tensions and has real-world implications. We all have a responsibility in this respect, and relevant criminal law applies to online actions,” they added.

They also called on social media companies to be vigilant to the spread of false information, and to “ensure harmful content is detected, challenged and removed in a timely manner”.
