The Guardian - AU
Comment
Van Badham

It is no fluke that social media platforms are addictive and causing harm. They were designed that way

‘This week a court in New Mexico and another in Los Angeles determined that social media platforms were legally responsible for harms caused to users,’ Van Badham writes. Photograph: Hispanolistic/Getty Images

A disdain for the notion of “consequence” somewhat defines the contemporary western moment of the powerful. So two recent US court decisions adverse to the interests of – oh my god, would you believe it? – tech companies should be heralded to the full height of every sky.

Within days of each other this week, a court in New Mexico and another in Los Angeles determined that social media platforms were legally responsible for harms caused to users.

In New Mexico, a suit brought by that state’s attorney general claimed that Meta – the owner of Instagram and Facebook, among other platforms – had misled its users about its safety processes, and enabled child sexual exploitation on its services.

Evidence cited by the plaintiffs demonstrated failures to implement basic safety measures such as effective age verification, while undercover agents posing as children on the platforms reported they had been contacted by adults and subjected to sexualised communication.

Inconveniently for the Meta defence, internal documents acknowledging risks of exploitation and harm on its platforms were provided to the court.

A jury was convinced that Meta had violated consumer protection law and engaged in practices that were both deceptive and unfair. Knowingly exploiting its users’ lack of knowledge around platform safety was deemed “unconscionable”, and $US375m in civil penalties was imposed.

Yet the Los Angeles case suggests potentially even greater consequences for tech giants.

Here, a lawsuit was brought by a young woman against Meta (Instagram) and Google (YouTube), alleging that these platforms were deliberately designed to be addictive, and that the resulting addiction caused her debilitating mental health harm, including depression, anxiety, dysmorphia and even suicidal ideation.

She began using the platforms as a primary school-aged child – and at one point her use was recorded at up to 16 hours a day.

Here, $US3m in compensation for the plaintiff and an additional $US3m in punitive damages were awarded, with Meta found liable for 70% of the harm caused, and Google for the other 30%.

That the platforms have been found liable for damage they have caused feels like it shouldn’t be news to anyone who, you know, uses them.

What made the Los Angeles outcome notable, however, wasn’t the individual harm it acknowledged, or even the size of the individual compensation received.

Other social media platforms – Snapchat and TikTok – had both settled with the plaintiff on confidential terms before the trial commenced. Legal scholars have suggested that platforms tend to settle in order to avoid the discovery of liabilities – and the precedents that would be established – should such cases go to trial.

Wisely, if not morally – as the jury in Los Angeles found that Meta’s and Google’s platforms were inherently and deliberately designed to be addictive for children. Tech scholar Rob Nicholls, writing in The Conversation, discussed the case’s claim that “companies borrowed heavily from the behavioural and neurobiological techniques used by poker machines and exploited by the cigarette industry to maximise youth engagement and drive advertising revenue”.

Oh.

Infinite scroll, algorithmic recommendations, autoplay and engagement loops, and vanishing, time-sensitive content were among the features cited for their addictive function.

This structural engineering of platform design to encourage addictive behaviour is the most significant finding from the process, given that what has consistently protected the platforms against liability for harms has been the claim that they were not responsible for user-produced content on their sites.

No longer. Turns out the killer isn’t in the building. It is the building.

With thousands of similar cases against the tech giants pending, Meta and YouTube have – unsurprisingly – begun appeals, because, according to Nicholls, these verdicts “could also be used as the basis for both class actions and individual actions on a global basis”.

Commentators are suggesting these cases may comprise social media’s Big Tobacco moment.

The defence in the Los Angeles case argued that the young plaintiff was facing difficulties in her home life and therefore the social media platforms she was using all her waking hours may even have been, you know, “healthy” for her.

From memory, not even tobacco companies claimed in the legal discussion around their own liabilities that stepping outside for a ciggie constituted prevention of workplace violence.

And yet, we have put the dominant means for human communication in the hands of corporations with moral standards that are the equivalent. Or below.

As more countries speed towards adopting their own versions of Australia’s landmark social media ban for teenagers and children, these findings should deeply embarrass everyone who claimed the ban was “boomer” moralising, rather than one of the most consequential public health decisions of the decade.

And in the spirit of a new age of consequence that may finally be dawning, I suggest it’s time for adults to revisit the known, demonstrable damage derived from adult use of these platforms and ask ourselves: just what is this addictive hard-wiring doing to us?

  • Van Badham is a Guardian Australia columnist
