The Guardian - UK
Technology
Alex Hern

TechScape: What should social media giants do to protect children?

Time to talk … GCHQ and the National Cyber Security Centre want a conversation with tech giants about tackling child safety online. Photograph: SolStock/Getty Images

This week, the technical leads of GCHQ and the National Cyber Security Centre made a powerful intervention into an incredibly controversial debate: what should social media companies do to protect children on their platforms?

But that wasn’t how the intervention was taken by all parties. Others heard something rather different: tired arguments against end-to-end encryption, dressed up in new clothes but disguising the same attack on privacy rights with the same excuse that’s always wheeled out by law enforcement.

From our story:

Tech companies should move ahead with controversial technology that scans for child abuse imagery on users’ phones, the technical heads of GCHQ and the UK’s National Cyber Security Centre have said.

So-called “client-side scanning” would involve service providers such as Facebook or Apple building software that monitors communications for suspicious activity without needing to share the contents of messages with a centralised server.

Ian Levy, the NCSC’s technical director, and Crispin Robinson, the technical director of cryptanalysis – codebreaking – at GCHQ, said the technology could protect children and privacy at the same time. “We’ve found no reason why client-side scanning techniques cannot be implemented safely in many of the situations one will encounter,” they wrote in a new discussion paper.
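To illustrate the idea in the excerpt above, here is a deliberately simplified sketch in Python. It is not the scheme from Levy and Robinson’s paper or any vendor’s implementation: the function names, the sample data and the use of exact SHA-256 matching (deployed systems rely on perceptual hashes that survive re-encoding) are all assumptions for the example.

```python
import hashlib

# Illustrative "client-side" matching: hash each local photo and compare it
# against a database of known-image digests held on the device. Only the fact
# of a match is reported onward; the photos themselves never leave the phone.

def build_known_database(known_images):
    """Digest the reference images supplied to the device."""
    return {hashlib.sha256(img).hexdigest() for img in known_images}

def scan_photos(photos, known_db):
    """Return only how many local photos matched – never the photos."""
    return sum(hashlib.sha256(p).hexdigest() in known_db for p in photos)

# Hypothetical byte strings standing in for image files.
known = build_known_database([b"known-image-1", b"known-image-2"])
local_photos = [b"holiday-photo", b"known-image-2", b"cat-photo"]
print(scan_photos(local_photos, known))  # 1
```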

You may remember the debate around client-side scanning from a year ago. To quote myself:

Apple is taking a major step into the unknown. That’s because its version of this approach will, for the first time from any major platform, scan photos on the users’ hardware, rather than waiting for them to be uploaded to the company’s servers.

By normalising on-device scanning for CSAM [child sexual abuse material], critics worry, Apple has taken a dangerous step. From here, they argue, it is simply a matter of degree for our digital life to be surveilled, online and off. It is a small step in one direction to expand scanning beyond CSAM; it is a small step in another to expand it beyond simple photo libraries; it is a small step in yet another to expand beyond perfect matches of known images.

So why is the intervention from Levy and Robinson important? To me, it’s a sincere attempt to address the concerns of these critics, to lay out the advantages of client-side scanning in tackling specific categories of threat – and to propose meaningful solutions to common fears.

The devil is in the details

To take one example from the 70-page paper: the pair try to tackle the fear that the lists of images scanned for could expand beyond known CSAM to include, say, images with a political character. In plainer language, what would stop China from demanding that Apple include the famous pictures of Tank Man in its scanning apparatus, and from forcing the company to flag any iPhone containing that picture as potentially criminal?

Robinson and Levy suggest a system designed to prevent exactly that. They propose that the list of images be assembled by child protection groups around the world – organisations like the National Center for Missing and Exploited Children in the US, or Britain’s Internet Watch Foundation (IWF). Each of those groups already maintains a database of “known” CSAM, which they cooperate to keep as comprehensive as possible, and the scanning database would contain only those images that appear on every group’s list.

They can then publish a hash – a short cryptographic fingerprint – of that database when they hand it over to tech companies, who can show the same hash when it is loaded on to your phone. Even if China were able to force its domestic child protection group to include Tank Man in its list, it would be unable to do the same for the IWF, so the image wouldn’t make it onto devices; and if it forced Apple to load a different database for China, then the hash would change accordingly, and users would know that the system was no longer trustworthy.
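To make that concrete, here is a minimal sketch of the kind of arrangement described above: the on-device database is the intersection of the groups’ lists, and a single published digest lets anyone check that the database on their phone is the one the groups agreed. The group names, sample entries and choice of SHA-256 are assumptions for illustration, not details from the paper.

```python
import hashlib

def intersect_databases(group_lists):
    """Keep only image hashes that appear on every child-protection group's list."""
    common = set.intersection(*[set(lst) for lst in group_lists])
    return sorted(common)  # canonical order so the digest is reproducible

def database_digest(database):
    """A single SHA-256 digest of the agreed database; a device can compare the
    digest of the list it was given against the published value."""
    h = hashlib.sha256()
    for entry in database:
        h.update(entry.encode("utf-8"))
    return h.hexdigest()

# Hypothetical lists of known-image hashes from three groups. "tankman" appears
# on only one list, so it never reaches the on-device database; and any swapped
# database would produce a different published digest.
ncmec_list = ["img_a", "img_b", "img_c"]
iwf_list = ["img_a", "img_b", "img_c"]
coerced_list = ["img_a", "img_b", "img_c", "tankman"]

db = intersect_databases([ncmec_list, iwf_list, coerced_list])
print(db)                   # ['img_a', 'img_b', 'img_c']
print(database_digest(db))  # changes if the database is tampered with
```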

The point is not that the proposed solution is the best possible way to solve the problem, Levy and Robinson write, but to demonstrate that “details matter”: “Discussing the subject in generalities, using ambiguous language or hyperbole will almost certainly lead to the wrong outcome.”

The fear and fury is genuine

In a way, this is a powerful rhetorical move. Insisting that the conversation focus on the details is an insistence that people who dismiss client-side scanning on principle are wrong to do so: if you believe that the privacy of personal communications is and should be an inviolable right, then Levy and Robinson are effectively arguing that you be cut out of the conversation in favour of more moderate people who are willing to discuss trade-offs.

But it is frustrating that much of the response has been the same generalities that accompanied Apple’s announcement a year ago. Technology news site the Register, for instance, published a furious editorial saying: “The same argument has been used many times before, usually against one of the Four Horsemen of the Infocalypse: terrorists, drug dealers, child sexual abuse material (CSAM), and organized crime.”

I’ve spent enough time talking to people who work in child protection to know that the fear and fury about the harm caused by some of the world’s largest companies is genuine, regardless of whether you think it is correctly targeted. I don’t claim to know Levy and Robinson’s motivations, but this paper represents an effort to create conversation, rather than continue with a shouting match between two irreconcilable sides of an argument. It deserves to be treated as such.

It’s not ‘yourscraft’

What’s Minecraft is mine. Photograph: Chris Bardgett/Alamy

Minecraft is big. You might have heard of it. So when the game makes a moderation decision, it’s a bit more important than when Bungie decided to nerf scout rifles in Destiny 2. Particularly when the moderation decision is this:

Minecraft will not allow non-fungible tokens (NFTs) to be used on the popular gaming platform, with the company describing them as antithetical to Minecraft’s “values of creative inclusion and playing together”.

Minecraft represented an attractive potential market for NFTs, with a user base – estimated at more than 141 million by August 2021 – already engaged in sharing unique digital items developed for the game.

But the Microsoft-owned development studio behind Minecraft, Mojang, has put an end to speculation NFTs could be allowed in the game. In a blog post on Wednesday, the developers said blockchain technology was not permitted, stating it was antithetical to Minecraft’s values.

Minecraft’s incredible success is thanks to its extensibility. As well as the built-in creative aspects of the game – often described as the 21st century’s answer to Lego – users can modify it in ways large and small, producing new experiences. That flexibility proved tempting for NFT creators, who settled on the idea of creating new features in Minecraft and selling them as digital assets.

In theory, it’s the perfect NFT opportunity: a digital-native creation, with a use case that is actually achievable, and a demonstrably viable market. Startups flocked to the field: NFT Worlds sells pre-generated Minecraft landscapes, on which people can build experiences and resell them for profit; Gridcraft operates a Minecraft server with its own crypto-based economy.

Or they did. Now, it seems, NFTs have become such a toxic phenomenon that even passive acceptance is too much for a company like Mojang. If you want to make it in this world, you have to go it alone.

  • If you want to read the complete version of the newsletter, please subscribe to receive TechScape in your inbox every Wednesday.
