Forbes
Maia Niguel Hoskin, Contributor

Google’s The Current Reveals The New Face Of Violent Online White Supremacist Organizations And How To Get Out Of These Radical Groups

It’s a common misconception that white supremacy is cloaked in white sheets, marching the streets of the rural Deep South with burning torches and tattooed swastikas. Although those images still exist, white supremacist groups have become far more sophisticated in their messaging and recruitment, and despite major tech companies’ attempts to curb extremist content on their platforms, the face and reach of extremist groups now extends beyond the cliché racist prototype and lurks within the crevices of the web. An insightful report was recently released by The Current, a digital publication of Jigsaw, the unit within Google that investigates and confronts emerging digital threats; the report examines the evolution of international violent white supremacy. Drawing on interviews with former white supremacists from around the world, The Current explores the many faces of violent online white supremacy and how current members can escape these radical and violent groups.

The new face of online white supremacy 

White supremacist and alt-right groups have slowly emerged from the dark underbelly in which they once dwelled, propagating hate across the country. In Charlottesville, they marched the streets chanting, “Jews will not replace us,” and they have carried out physical violence against civilians during counterprotests and in various other random attacks. A recent study found that white supremacists and other right-wing extremists have been responsible for 67% of domestic terror attacks and plots so far this year. But the rapid increase in the visibility of white supremacist groups has not been restricted to counterprotests and random acts of physical and verbal violence against members of marginalized groups.

More than ten years ago, Jessie Daniels examined how white supremacy has evolved beyond print media into the digital age in her 2009 book Cyber Racism. Daniels described a new face of white supremacist groups that contradicts the “hillbilly,” technologically illiterate image that once prevailed. In the book, she paints a vivid picture of how extremist groups have used electronic communication to spread politically motivated racial agendas within and beyond American borders. These groups have also mastered the ability to camouflage their sites as harmless, inconspicuous discussion boards aimed at sharing general knowledge and information, all while peddling racist and prejudiced rhetoric and challenging the established history of events such as the Holocaust, the Civil Rights Movement, and slavery.

Daniels’ argument goes hand in hand with The Current’s findings: the spread of violent online white supremacy has been intentionally decentralized so that messaging from these organizations can be accessed transnationally.

“The ‘lone wolf’ is a myth. These intentionally informal white supremacy networks are increasingly transnational. They have become a global problem by constantly collaborating across borders through the internet in an unprecedented manner,” the report reads. 

This aggressive approach has allowed for greater reach in messaging and recruitment. The Current also found that violent white supremacy seems to “thrive in obscure corners of the internet” on closed apps such as Discord as well as on platforms such as BitChute, a video-sharing site similar to YouTube; 8kun, formerly known as 8chan; and Gab, with the latter two having been linked to recent mass shootings and hate crimes. Such sites not only allow for the spread of hateful conspiracy theories and rhetoric about members of marginalized groups, they also create additional channels for like-minded individuals to communicate directly across borders.

Recruiting younger populations

Gone are the days when white supremacist groups had to depend on recruitment strategies such as physically canvassing college campuses for young, impressionable minds. In fact, before the pandemic pushed universities and colleges across the country into online learning, the Anti-Defamation League reported that white supremacist recruitment efforts on college campuses had increased for the third straight year, with 313 cases of white supremacist propaganda reported between September 2018 and May 2019. That was a 7% increase over the previous academic year, in which 292 incidents of extremist propaganda were reported. Now more than ever, extremist groups are turning to online sites to bolster their recruitment efforts, and they aren’t only targeting college students.

Twitter went abuzz last year when Joanna Schroeder’s tweets, which began, “Do you have white teenage sons? Listen up,” went viral. The mother of three issued a strong call to action to other white parents about the insidious nature of violent white supremacist and alt-right sites. Schroeder warned parents about viral social media memes disguised as “humorous” and seemingly harmless but that are anything but: laced with strong undertones of homophobia, anti-Semitism, heterosexism, and racism, these memes, she cautioned, are being used by violent extremist groups to normalize white supremacist perspectives and eventually indoctrinate children into the world of alt-right extremism and white supremacy. Perhaps white parents should take Schroeder’s warning seriously. The editor and founder of the neo-Nazi website Daily Stormer has openly admitted that the site targets children as young as 11 years old.

Looking toward a solution 

In response to the growing number of extremist sites popping up on the internet, some larger online platforms, including YouTube, Twitter, and Facebook, have removed extremist content. Facebook, for example, banned the far-right, anti-Islamic British group Britain First in March 2018. But The Current’s study reveals that there is still a lot of work to be done in the fight against the spread of violent online transnational white supremacy, and Keegan Hankes, interim research director at the Southern Poverty Law Center, agrees.

“White supremacists are typically early adopters of technology. They go and hang out in places where there aren’t strong rules — places they’re more likely to get a foothold,” says Hankes. 

In September, the U.S. Department of Homeland Security released a plan to counter terrorism and targeted violence. The report specified that online spaces appear instrumental to the recent growth of white supremacist groups both domestically and abroad. “Celebration of violence and conspiracy theories about the ‘ethnic replacement’ of whites as the majority ethnicity in various Western countries are prominent in their online circles,” the department said. In the case of Britain First, after being booted from Facebook the group simply relocated its videos to BitChute, Gab, and Telegram, which is how extremist groups typically continue to build momentum after being removed from a larger platform.

That said, it’s not enough for only the larger platforms to do their due diligence. The Current found that addressing online white supremacy requires vigilant monitoring and explicit objection to white supremacist propaganda. “Technology platforms and organizations can provide effective interventions to stop early-stage radicalization – like search and discovery interventions – and provide tools to help violent extremists permanently disengage,” The Current reported. However, platforms shutting down extremist sites is only one piece of the puzzle. The Current reports that safe transitional spaces for former white supremacists must also be created. “Joining a supportive network of formers – a transitional space – that facilitates relearning social norms is key for formers to reinvent themselves and helps in reducing recidivism,” the report reads.
