The Guardian - UK
Technology
Dan Milmo Global technology editor

TechScape: UK parliament pushes back on the online safety bill

A teenager lies on the floor with his tablet device. Photograph: Yalana/Getty Images/iStockphoto

The digital wild west met its sheriff this week. The joint committee on the online safety bill announced on Tuesday that it was reining in the tech industry’s “land of the lawless” with its report on the pioneering legislation. Unveiling the document, the committee’s Conservative chair, Damian Collins MP, said: “The Committee were unanimous in their conclusion that we need to call time on the wild west online.”

The 192-page report recommends a sweeping overhaul of the bill, which is aimed at companies that provide user-generated content – ie social media networks like Facebook and Twitter and video sharing platforms like YouTube and TikTok – as well as search engines like Google. The bill imposes a duty of care on tech firms to protect users from harmful content, at the risk of a substantial fine brought by Ofcom, the communications industry regulator implementing the act.

The bill matters because it represents an attempt to properly regulate social media companies, video sharing sites and search engines under one legislative roof for the first time. And the report matters too. It is a thorough piece of work by a cross-party group of MPs and peers who know their stuff and supported the 170 conclusions unanimously. The government has already pledged to give it serious consideration and the committee is also likely to endure as a watchdog for the bill once it becomes law – an oversight structure modelled on the human rights joint committee. Here are some of the changes to the bill that the report recommends.

Make the objectives of the bill clear

The committee says the bill should set out its core objectives “clearly at the beginning”. This is also quite handy for any TechScape readers who want to stay on top of a complex piece of legislative work. The report says Ofcom should protect UK citizens online by ensuring that tech firms do the following: comply with UK law and do not endanger public health or national security; provide a higher level of protection for children than for adults; identify and deal with the risk of “reasonably foreseeable harm” arising from the operation and design of their platforms (algorithms and the like); recognise and respond to the disproportionate level of harms experienced by people on the basis of protected characteristics (disability, age, sexual orientation, religion etc); make sure their systems are designed to be safe, ie that they don’t steer users down dangerous content rabbit holes; safeguard freedom of expression and privacy; and operate with transparency and accountability in respect of online safety.

Legal but harmful content

One of the most controversial parts of the draft bill was clause 11, which covered the duty of care that applied to adults: protecting them from legal but harmful content. This caused concern because under the draft bill not only would the culture secretary have a key role in defining such content – meaning Nadine Dorries would, at least technically, have a censorship role over what is acceptable speech online – but it also contained an amorphous threat against freedom of expression.

The report proposes doing away with clause 11 altogether and replacing it with categories of transgression that mirror illegality in the offline world. This would mean banning online content that constitutes abuse, harassment or stirring up violence or hatred based on the protected characteristics in the Equality Act 2010. This, the committee hopes, will make tech platforms deal with hate speech.

The new approach would also mean banning other forms of content that are illegal in the offline world, such as intimidation of candidates in elections and facilitating human trafficking. The report’s reasoning is that because these harms are already illegal in the offline world, “society has recognised they are legitimate reasons to interfere with freedom of speech rights [online].”

Journalistic exemptions – and citizen journalists

While we’re on the subject of freedom of speech, the report also recommends a toughening up of the bill’s exemption of news organisation content from takedowns by tech platforms. Under the committee’s new recommendation, if it isn’t illegal, it stays up: “We recommend that the news publisher content exemption is strengthened to include a requirement that news publisher content should not be moderated, restricted or removed unless it is content the publication of which clearly constitutes a criminal offence.”

There is also an attempt to provide cover for “citizen journalists”, such as bloggers, by addressing the draft bill’s protections for content of “democratic importance”. The report recommends instead that the bill protects content that is in the “public interest”. Citizen journalists whose content has been taken down erroneously or unfairly would be able to have their work reinstated rapidly via a dedicated, expedited complaints procedure.

Protecting children

In the draft bill, one of the three duties of care is to protect children from harmful content (the other two are protecting users from illegal harms and protecting adults from legal but harmful content). The committee recommends a number of new protections for children. These include: requiring all pornography sites to prevent children from accessing their content, which could involve introducing age assurance measures; taking the definition of internet services likely to be accessed by children from the information commissioner’s age appropriate design code; introducing minimum standards for age assurance measures (from entering your date of birth on to a pop-up form to more stringent age verification); and requiring Ofcom to draw up a code of practice for protecting children online, which should refer to the UN Convention on the Rights of the Child, the AADC and children’s right to receive information under the European convention on human rights.

Algorithms, misinformation, anonymity and safety by design

The report recommends tackling harmful algorithms, anonymous abuse and the spread of misinformation via a “safety by design code of practice” overseen by Ofcom. This code of practice would require platforms to examine how they operate, how they are designed and how that might harm users. For instance, tech companies would have to look at the algorithms that push content at users and prevent them from steering people down dangerous “rabbit holes”. Mass-spreading of misinformation would be addressed by drawing up measures to prevent frictionless sharing of content at scale, as well as being covered in the code of practice. Anonymous trolling – and the spreading of misinformation by anonymous accounts – would also be included in the code of practice as a specific category, with platforms required to come up with measures to deal with vexatious anonymous accounts, including the ability to prevent banned trolls from setting up new accounts.

The big players should also be required to commission annual, independent third-party audits of the effects of their algorithms, their risk assessments (where they outline to Ofcom the harms their services could cause) and their transparency reports (which will include stuff like the incidences of illegal and harmful content, and how many users have encountered such content). Ofcom should also have the power to inspect these audits and do its own checks.

New criminal categories

The report recommends the creation of new criminal offences including: cyberflashing; encouraging someone to commit self-harm; intentionally sending flashing images to someone with epilepsy (with the intent of causing a seizure); and knowingly sending false, persistent or threatening communications. Tech execs are also hit with an expansion of criminal liability. The report calls for tech companies to appoint a boardroom-level executive who will be designated the firm’s “safety controller” and will be liable for a new criminal offence: failing to deal with “repeated and systemic failings that result in a significant risk of serious harm to users”. The committee sees the latter offence as a back-stop, but tech companies obviously do not like it one bit.

Advertising

Martin Lewis, the consumer champion, made an impassioned appearance at a committee hearing in October, telling MPs and peers that people’s lives were being “destroyed” by fraudsters using his image in scam online advertisements. The committee listened and has recommended that fraudulent ads be brought within the scope of the bill. Under its proposal, Ofcom will be responsible for acting against tech companies that consistently allow fraudulent or harmful ads on their platforms.

No currency for crypto

The Bank of England came out with more cryptocurrency warnings on Tuesday. The bank’s deputy governor, Sir Jon Cunliffe, said the price of digital money such as bitcoin could “theoretically or practically drop to zero”. Meanwhile the bank’s staff blog, looking at bitcoin, said: “It’s just a bunch of code that exists only in cyberspace. It’s not backed by the state.” Even if Threadneedle Street is right, it only serves to underline the anti-establishment status of cryptocurrencies.

If you want to read the complete version of the newsletter please subscribe to receive TechScape in your inbox every Wednesday.
