Security in a digital world requires that our communications are safe from digital eavesdroppers. The way we do that is to encrypt our messages using mathematical tools. The most powerful of these use trapdoor functions – that is, ones that are easy to compute in one direction (making encryption easy) but very hard to reverse without a secret piece of information (making decryption by an eavesdropper difficult).
Trapdoor functions exploit an asymmetry in arithmetic. It’s simple to multiply two numbers together, for example, 971 and 1,249, to get 1,212,779, but it’s quite hard to start with 1,212,779 and work out which two prime numbers (its factors) have to be multiplied to produce it. And the task becomes dramatically harder the bigger the original numbers are. That is why, up to now, computer scientists have believed that it’s impossible in practice for a conventional computer, no matter how powerful, to factorise a well-chosen number that’s 2,048 bits long. Why so? Because it would take the machine 300tn years, or about 22,000 times longer than the age of the universe (to use just one of the popular analogies), to crack the problem.
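For the curious, the asymmetry can be seen in a few lines of Python: multiplying the two primes from the example above is instant, while recovering them by the most naive method, trial division, means searching divisor by divisor (a toy sketch – real factoring attacks on large numbers use far more sophisticated algorithms, and still take astronomical time at 2,048 bits):

```python
def factor(n):
    """Recover two factors of n by trial division (toy method, assumed here
    purely for illustration -- hopeless at cryptographic sizes)."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return n, 1  # n is prime

product = 971 * 1249        # the easy direction: one multiplication
print(product)              # 1212779
print(factor(1212779))      # the hard direction: a search for a divisor
```

At six or seven digits the search finishes in a blink; double the number of digits and the work roughly squares, which is the asymmetry encryption relies on.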
This explains why 2,048-bit keys are the basis for the most commonly used form of asymmetric encryption today, the RSA system, which relies on the difficulty of factoring the product of two large prime numbers, namely, numbers that are divisible only by themselves and 1. That doesn’t mean that RSA encryption is unbreakable (mathematicians never say never) – just that it won’t be broken in the near future, and so the world can rest assured that it’ll be good for, say, the next 25 years.
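The RSA mechanics themselves fit in a few lines. Here is a toy version using the two small primes from earlier (real keys use primes hundreds of digits long; these values are illustrative and trivially breakable):

```python
# Toy RSA key generation and round trip. The security of the real thing
# rests entirely on n being too large to factor.
p, q = 971, 1249
n = p * q                    # public modulus
phi = (p - 1) * (q - 1)      # Euler's totient of n -- secret, needs p and q
e = 65537                    # standard public exponent
d = pow(e, -1, phi)          # private exponent: modular inverse of e mod phi

message = 42
ciphertext = pow(message, e, n)    # anyone can encrypt with (e, n)
recovered = pow(ciphertext, d, n)  # only the holder of d can decrypt
print(recovered)                   # 42
```

Note that computing d requires phi, and phi requires knowing p and q – so whoever can factor n can read the messages. That is the whole story of why factoring difficulty matters.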
Being an alert reader, you will already have spotted the critical fly in this soothing ointment – the assumption that the computers we will be using in 25 years’ time will be similar to the ones we use today. Since the early 1980s, physicists and computer scientists such as Richard Feynman, Paul Benioff, Yuri Manin (who died last weekend at the age of 85) and Britain’s David Deutsch have been thinking about a different idea – using some ideas from subatomic physics to design a new and very distinct kind of computing engine – a quantum computer. In 1985, Deutsch published a proposal for one. And in recent times, companies such as Google and IBM have begun building them.
Why is that relevant? Basically because quantum computers are potentially much more powerful than conventional ones, which are based on digital bits – entities that have only two possible states, on and off (or 1 and zero). Quantum machines are built around qubits, or quantum bits, which can simultaneously be in two different states.
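For readers who like to see things concretely, a single qubit can be modelled as a pair of complex amplitudes rather than a single 0-or-1 value (a pencil-and-paper sketch, not anything resembling real quantum hardware):

```python
import math

# A qubit as amplitudes (a, b) for the states |0> and |1>.
# Measurement yields |0> with probability |a|^2 and |1> with |b|^2.
def hadamard(state):
    """Apply a Hadamard gate, which turns a definite state into a superposition."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

qubit = (1.0, 0.0)                 # like a classical bit: definitely |0>
qubit = hadamard(qubit)            # now an equal superposition of |0> and |1>
probs = (abs(qubit[0]) ** 2, abs(qubit[1]) ** 2)
print(probs)                       # (0.5..., 0.5...)
```

The power comes from scale: n qubits carry 2^n amplitudes at once, which is why simulating even modest quantum machines overwhelms conventional computers.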
At this point, you may be anxiously checking for the nearest exit. Before doing so, remember that to understand subatomic physics you need first of all to divest yourself of everything you think you know about the physical world we ordinary mortals inhabit. We may sometimes be rude about people who believe in fairies, but particle physicists fervently believe in the neutrino, a subatomic particle that can pass right through the Earth without stopping – and we take these scientists seriously.
Way back in 1994, the mathematician Peter Shor showed why we might be right to do so. Any entity equipped with a powerful enough quantum computer, he argued, could potentially break most commonly used cryptographic codes, including RSA. The problem was that the dream machine would need a billion qubits to do the job reliably. Other researchers recently calculated that it would need “just” 20m qubits but could do the requisite calculation in about eight hours.
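The number-theoretic heart of Shor’s insight can be sketched classically (the quantum computer is needed only to make one step – finding the period – fast; here it is done by brute force on a tiny example):

```python
import math

def find_period(a, n):
    """Find the period r of a^x mod n, i.e. the smallest r with a^r = 1 (mod n).
    Brute force here; this is the step a quantum computer accelerates."""
    x, r = a % n, 1
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

n, a = 15, 7                       # toy modulus; assume gcd(a, n) == 1
r = find_period(a, n)              # the period of 7^x mod 15 is 4
f1 = math.gcd(pow(a, r // 2) - 1, n)
f2 = math.gcd(pow(a, r // 2) + 1, n)
print(r, f1, f2)                   # 4 3 5 -- the factors of 15 drop out
```

Once the period r is known, a couple of greatest-common-divisor calculations reveal the factors – which is why fast period-finding breaks RSA.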
However, a new paper by a group of Chinese researchers claiming that 2,048-bit RSA can be broken has caused a brief flurry in cryptographic circles. It was rapidly debunked by a couple of experts, including US computer scientist Scott Aaronson, who described it as “one of the most actively misleading quantum computing papers I’ve seen in 25 years and I’ve seen… many”.
There will be more where that came from. So it’s time for a reality check. Quantum computers are interesting, but experience so far suggests they are exceedingly tricky to build and even harder to scale up. There are now about 50 working machines, most of them minuscule in terms of qubits. The biggest is one of IBM’s, which has – wait for it – 433 qubits, which means scaling up to 20m qubits might, er, take a while. This will lead realists to conclude that RSA encryption is safe for the time being and critics to say that it’s like nuclear fusion and artificial general intelligence – always 50 years in the future. That doubtless will not prevent Rishi Sunak from declaring his intention to make the UK “a world leader in quantum” but my money is on RSA being secure for my lifetime – and possibly even Sunak’s.
What I’ve been reading
Exit by Hari Kunzru is a terrific essay in Harper’s magazine on the ideological underpinnings of the tech industry.
Life of illusion
Worth catching on the Literary Hub platform is Nothing Is Real: Craig Brown on the Slippery Art of Biography.
What ChatGPT Reveals About the Collapse of Political/Corporate Support for Humanities/Higher Education is a sobering piece by Eric Schliesser on the Crooked Timber blog.