Online sexism is often dismissed as random — just a few bad comments or offensive jokes. But what appears scattered and spontaneous is increasingly structured, repeated and amplified in ways that make it far more influential.
This shift can be understood through masculinism, an ideology that frames men as a disadvantaged group and defines feminism and gender equality as threats. While individual sexist comments may appear isolated, masculinism provides a shared narrative thread that connects them and reinforces them across online spaces.
Read more: Driven by social media, masculinism has moved from the fringes to the mainstream
Masculinist groups, such as incel (involuntary celibate) communities, Men Going Their Own Way (MGTOW) and men’s rights activists — along with influencers like Andrew Tate — openly reject gender equality and may even encourage violence against women, turning sexism into something more deliberate and far-reaching.
In January 2026, the French High Council for Gender Equality issued a warning that online masculinist groups are no longer niche or innocuous. These organized groups have grown in influence and can affect how women are treated in society.
To understand why this matters, it helps to understand how everyday sexist behaviours and discourse are entangled with, and can evolve into, co-ordinated online movements. Sexism is no longer limited to individual views or fringe pockets of the internet; it is now shared across many online platforms.
The pattern behind the noise
As a researcher in feminist theory and gender studies, specializing in the analysis of narrative and cultural representation, I study how gendered ideas are represented, produced and circulated across different media.
Most people see sexist comments online every day. These range from crude jokes to attacks on feminism or claims that men are the “real victims” in today’s society.
Because these comments often look casual and unplanned, many people see them as random, harmless or just personal opinions. However, research in the social sciences and communication studies shows that they do not spread by accident. Instead, they follow loose patterns of co-ordination.
This type of co-ordination happens when people share the same language, ideas and feelings of resentment online over and over again.
As these messages appear repeatedly across digital platforms, what feels like a personal opinion becomes part of a more organized pattern, even if users are not aware of that bigger picture.
The role of repetition and emotion
Groups such as men’s rights activists and anti-feminist or misogynist communities were once dismissed as fringe, with little influence. But over time, some have developed a growing presence on popular social media platforms, podcasts and video channels.
Their ideas now reach far beyond their original online space. Influencers like Justin Waller and Sneako (featured on Louis Theroux’s latest Netflix documentary, Inside the Manosphere) have played a significant role in popularizing masculinist ideas.
Their content often combines self-help messaging with narratives that portray women as manipulative or men as unfairly disadvantaged. Tate alone has amassed billions of views across platforms, reflecting the scale at which such ideas circulate.
Messages that trigger anger or a sense of unfairness are more likely to be shared. Research in psychological and cognitive sciences shows that emotional and moral language makes political messages more likely to be spread, even among people who disagree with them.
The main concern is not how many people openly support violence against women. The greater risk is what repeated exposure does over time. When certain groups, like women or feminists, are presented repeatedly as dangerous or immoral, people may become more accepting of harsh treatment toward them, even if there is no open call to violence.
Regular exposure to misogynistic content can also make users more likely to move toward extreme views, including far-right content. Radicalization does not happen overnight; it is the result of consistent exposure and gradual normalization over time.
When people see the same messages again and again, harmful language loses its shock value and starts to feel acceptable.
What’s alarming is that the consequences extend beyond digital spaces.
When harmful ideas aren’t questioned
Reports show that sexist language and attitudes are increasingly appearing in schools and family settings.
Teachers report that students repeat the misogynistic messages they’ve seen on social media or online video platforms and treat them as jokes or “common sense” rather than harmful ideas and behaviours.
Similar patterns can appear in workplaces, where women’s contributions may be dismissed through humour. When we become used to harmful content, we stop questioning it.
Understanding these patterns doesn’t mean that nobody is allowed to disagree with gender policies. In a democratic society, it’s healthy for people to have different views on how equality can be achieved. However, there’s a difference between fair disagreements and organized narratives that treat gender equality as a serious threat.
If we want to counter this phenomenon, we have to recognize the impact of how girls and women are portrayed online and how everyday sexist content can influence the way they are treated in real life.
Sepita Hatami does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.
This article was originally published on The Conversation. Read the original article.