February 2020
Paul Benioff (born 1930) is a US physicist who wrote a paper in 1980 that imagined the feats computing might achieve if it could harness quantum mechanics, where the word quantum refers to the tiniest amount of something needed to interact with something else – it’s basically the world of atoms and sub-atomic particles. Benioff’s imagination helped give rise to the phrase ‘quantum computing’, a term that heralds how the storage and manipulation of information at the sub-atomic level would usher in computing feats far beyond those of ‘classical’ computers.[1]
Benioff was, coincidentally, writing about a concept being outlined at the same time by Russian mathematician Yuri Manin (born 1937), who that year talked up the promise of quantum computing in his book Computable and Uncomputable. Since then, others such as US physicist Richard Feynman (1918-1988) have promoted the potential of computing grounded in the concept of ‘superposition’, whereby matter can be in different states at the same time.[2]
Quantum computing is built on manipulating the superposition of the qubit, the name of its computational unit. Qubits, which are often atoms, electrons or photons, are said to be in the ‘basis states’ of 0 and 1 at the same time when in superposition, whereas a computational unit in classical computing can only be 0 or 1. This characteristic, on top of the ability of qubits to engage with qubits that are not physically connected (a property known as entanglement), is what proponents say gives quantum computers the theoretical ability to calculate millions of possibilities in seconds, something far beyond the power of the transistors powering classical computers.
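In the standard notation of quantum mechanics (a supplementary sketch, not drawn from the article’s sources), a qubit in superposition is written as a weighted blend of its two basis states:

\[
|\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1,
\]

where the complex ‘amplitudes’ α and β set the probabilities, |α|² and |β|², of the qubit reading out as 0 or 1 when measured.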
In 2012, five years after Canada’s privately owned D-Wave built the world’s first rudimentary (28-qubit) quantum computer,[3] US physicist and academic John Preskill (born 1953) devised the term ‘quantum supremacy’ to describe how quantum machines one day could make classical computers look archaic.[4]
In October last year, a long-awaited world first arrived. NASA and Google claimed to have attained quantum supremacy when something not “terribly useful” was computed “in seconds what would have taken even the largest and most advanced supercomputers thousands of years”.[5] The pair were modest about the feat, noting that their computation on a 53-qubit machine meant they were only able “to do one thing faster, not everything faster”. Yet peers at IBM dismissed the claim as “grandiosity” anyway, saying one of IBM’s supercomputers could have done the same task in two-and-a-half days.[6]
Nonetheless, most experts agreed the world had edged closer to the transformative technology. Hundreds of millions of dollars are pouring into research because advocates claim that quantum computing promises simulations, searches, encryption and optimisations that will lead to advances in artificial intelligence, communications, cryptography, finance, medicine, space exploration, even traffic flows, to name just some areas.
No one questions that practical quantum computing has the potential to change the world.[7] But the hurdles are formidable to accomplish a leap built on finicky qubits in superposition, entanglement and ‘error correction’, the term for overcoming the ‘decoherence’ caused by errant qubits, which cannot be inspected for faults while they are in superposition.[8] There’s no knowing when, or if, a concept reliant on mastering so many tricky variables will eventuate.[9] While incremental advances will be common, the watershed breakthrough could prove elusive for a while yet.
To be clear, quantum computers are expected to work alongside classical computers, not replace them. Quantum computers are large machines that require their qubits to be kept at temperatures near absolute zero (minus 273 degrees Celsius), so don’t expect them in your smartphones or laptops. And rather than the large number of relatively simple calculations done by classical computers, quantum computers are only suited to a limited number of highly complex problems with many interacting variables, such as the modelling of climate, traffic, molecules and economies, where classical computers fall short. Quantum computing would come with drawbacks too. The most-flagged disadvantage is the warning that a quantum computer could quickly crack the encryption that protects classical computers. Another concern is that quantum computing’s potential would add to global tensions if one superpower gained an edge – China is investing heavily and in 2017 claimed to have used quantum techniques to create hack-free communications.[10] In the commercial world, the same applies if one company dominates. Like artificial intelligence, quantum computing has had its ‘winters’ – periods when its challenges smothered the excitement and research dropped off.
That points to the biggest qualification to today’s optimism about quantum computing: it might take a long time to get beyond today’s rudimentary levels, where quantum machines are no more powerful than classical supercomputers and can’t do practical things. But if quantum computing becomes mainstream, a new technological era will have begun.
Problems to solve
Lisbon, the capital of Portugal, is snarled in traffic. Why not use a quantum algorithm to find the best route? That’s what Volkswagen and D-Wave did in November. Their algorithm calculated the best way for buses to skirt traffic along a flexible route between stops. D-Wave CEO Vern Brownell said the pilot program “could be historic” because it was the “first time a quantum computer has been used in a real-time workload”.[11]
D-Wave's quantum computer that tackled Lisbon’s congestion was built to solve such optimisation problems. The many variables associated with traffic and the different interactions or constraints between those variables are said to be beyond the ability of classical computers to solve within a useful time frame – in this case, before the bus trip is over.[12]
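To make the class of problem concrete: D-Wave’s annealers accept problems phrased as quadratic unconstrained binary optimisation (QUBO), where every decision is a 0-or-1 variable and pairs of decisions carry rewards or penalties. The Python sketch below is a toy illustration with made-up weights, not the Volkswagen/D-Wave code; it brute-forces a three-variable instance, whereas the annealer’s job is to find such low-cost assignments when brute force is hopeless.

from itertools import product

# Hypothetical QUBO weights: Q[i, i] is the cost of picking route
# segment i; Q[i, j] penalises (or rewards) picking i and j together.
Q = {(0, 0): -2, (1, 1): -3, (2, 2): -1,
     (0, 1): 4,  (1, 2): 1}

def qubo_cost(bits):
    # Quadratic cost of one 0/1 assignment.
    return sum(w * bits[i] * bits[j] for (i, j), w in Q.items())

# Brute force all 2**3 assignments; a quantum annealer instead samples
# low-cost states directly, which matters once 2**n is astronomical.
best = min(product((0, 1), repeat=3), key=qubo_cost)
print(best, qubo_cost(best))   # (0, 1, 0) -3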
Quantum computing’s theoretical advantages are that a quantum computer can process all the states its qubits can hold at once and that its computational power increases exponentially with each additional qubit. For three qubits, there are eight states to work with simultaneously; for four there are 16; for 10 there are 1,024; and for 20 there are 1,048,576 states, as Wired calculates.[13]
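A rough sketch in Python (our illustration, not from the Wired piece) of what that exponential growth means in practice: merely storing the 2^N complex amplitudes that describe an N-qubit state, at 16 bytes apiece, defeats any classical machine well before N reaches 100.

# Memory needed to hold the 2**n amplitudes of an n-qubit state,
# assuming 16 bytes per complex number.
for n in (3, 4, 10, 20, 50):
    states = 2 ** n
    print(f"{n:>2} qubits: {states:,} amplitudes, "
          f"{states * 16:,} bytes to store them")

At 50 qubits the tally is already about 18 million gigabytes.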
Brownell says that because quantum computers are probabilistic by nature they team well with AI, which is based on probabilistic models. The two are already being paired to address problems, as shown when Woodside Energy in November signed an AI and quantum-computing contract with IBM to develop an ‘intelligent plant’. The dual aims of the deal are, first, to reduce corrosion-driven maintenance costs that amount to A$1 billion a year and, second, to protect the company from cyberattack.[14] The quantum algorithms would help to optimise the flow of hydrocarbon fluids around its facilities while protecting computer systems from hackers, even those who might one day be armed with quantum computers.
The prospect of quantum computing excites many industries. Aeroplane and satellite manufacturers think that quantum computing will lead to sturdier and lighter alloys for their products. Battery makers hope quantum simulations will help develop batteries that outperform lithium-ion ones. Pharmaceutical companies reckon that quantum grunt can devise medicines (compounds) to tackle diseases that are now untreatable, and that they could bring drugs and vaccines to market much faster and more cheaply by using quantum computers to model molecules in ways that are impossible on classical computers. And so on for climate change solutions, financial modelling and many other areas. More intriguing, perhaps, is that quantum computers could help provide answers to science’s most fundamental abstract questions.
The European Organisation for Nuclear Research’s Large Hadron Collider under the French-Swiss border is scheduled to restart, more powerful than before, in 2020, which is likely to boost the number of proton collisions per second by 150%. That’s a problem because when the collider was shut down in 2018 it was already producing about 300 gigabytes of data every second, which had to be divided between 170 computing centres in 42 countries for processing. To handle the looming data torrent, scientists will need 50 to 100 times more computing power than they have at their disposal today. Such blockages to research explain the urgency for quantum computation.[15]
Advocates say that, in time, up to half of existing computing workloads could be executed by quantum devices, which would help a world running up against the limits of ‘Moore’s Law’, the observation that the speed and capability of classical computers doubled every couple of years.
But quantum computers could come with mischief too. A big problem was flagged in 1994 when US mathematician Peter Shor (born 1959) published an algorithm[16] that, if run on a quantum computer, could crack in seconds the encryption, or maths puzzles, that protect classical computers.[17] Many fear that even rudimentary quantum computers could attain this ability, which would mean that quantum’s disadvantages could precede its touted benefits.
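For the technically curious, the number-theoretic reduction at the heart of Shor’s algorithm can be sketched classically: factoring is reduced to finding the period of modular exponentiation. In this Python toy (an illustration only, not the quantum algorithm), the period-finding step that a quantum computer would perform exponentially faster is done by slow brute force.

import math
import random

def order_mod(a, n):
    # Brute-force the smallest r with a**r = 1 (mod n) -- the step a
    # quantum computer would replace with fast period finding.
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_factor(n):
    # Return a non-trivial factor of n (odd, composite, not a prime power).
    while True:
        a = random.randrange(2, n)
        g = math.gcd(a, n)
        if g > 1:
            return g                 # lucky draw: a already shares a factor
        r = order_mod(a, n)
        if r % 2 == 1:
            continue                 # need an even period; try another a
        y = pow(a, r // 2, n)
        if y == n - 1:
            continue                 # unusable case; try another a
        return math.gcd(y - 1, n)    # guaranteed non-trivial factor

print(shor_factor(15))     # 3 or 5
print(shor_factor(3233))   # 53 or 61, since 3233 = 53 x 61

The danger Shor exposed is that the only expensive step here, order_mod, is exactly the one a quantum computer is suited to.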
Adherents and doubters
Michelle Simmons (born 1967), professor of quantum physics at the University of New South Wales, was named 2018 Australian of the Year and in 2019 was appointed an Officer of the Order of Australia for services to quantum computing.[18] Adding to her prestige, in July last year Simmons’s team of researchers announced a leap that would “provide a route to the realisation” of quantum computing.[19] The innovation was the world’s first two-qubit gate between phosphorus donor electrons in silicon, which Simmons described as a “massive result, perhaps the most significant of my career”.[20]
Simmons’s team is said to follow a unique approach that requires not only placing individual atomic qubits in silicon but also building all the associated circuitry to initialise, control and read out the qubits at the nanoscale – a concept of such precision it was thought impossible. The researchers not only brought the qubits to just 13 nanometres, or 13 one-billionths of a metre, apart, but engineered all the control circuitry with sub-nanometre precision – for comparison, the width of a human hair is about 60,000 nanometres.[21] Such are the technicalities of the advances needed to inch the world towards quantum computation.
Mikhail Dyakonov (born 1940) is a Russian professor of physics who works at the University of Montpellier in France. He has spent decades studying quantum and condensed-matter physics. Such are his achievements that his name is attached to physical marvels such as a spin-relaxation mechanism, a plasma-wave instability and a type of surface wave. He has won prizes for physics in France, Russia and the US.[22] He is perhaps the world’s most credible naysayer on whether quantum computation will meet the optimism that surrounds it. “The proposed strategy relies on manipulating with high precision an unimaginably huge number of variables” is how he summarised the case against quantum computing in 2018 in IEEE Spectrum, the magazine of the Institute of Electrical and Electronics Engineers.[23]
Dyakonov explains that while a conventional computer with N bits at any given moment must be in one of its 2^N possible states, the state of a quantum computer with N qubits is described by the values of the 2^N quantum amplitudes, which are continuous parameters (ones that can take on any value, not just a 0 or a 1). This is where the hoped-for power of the quantum computer comes from, “but it is also the reason for its great fragility and vulnerability”, he says.
Experts estimate that between 1,000 and 100,000 qubits are needed for a useful quantum computer, he says. But the number of continuous parameters describing the state of such an effective quantum computer at any given moment is at least 10^300. How big is that number, asks Dyakonov? “It is much, much greater than the number of subatomic particles in the observable universe.”
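A quick check of that arithmetic: at the bottom of the range, N = 1,000 qubits gives

\[
2^{1000} = 10^{1000 \log_{10} 2} \approx 10^{301},
\]

comfortably more than the roughly 10^80 subatomic particles usually estimated to make up the observable universe.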
Then, there are the effects of errors. In a classical computer, errors happen when transistors are switched off when they are supposed to be on, and vice versa. Error-correction programs within a classical computer can override these mistakes. “Could we ever learn to control the more than 10^300 continuously variable parameters defining the quantum state of such a system? My answer is simple. No, never,” Dyakonov says.
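For contrast, classical error correction is straightforward precisely because bits can be copied and read at will. A minimal Python sketch (our illustration): store each bit three times and take a majority vote – something forbidden for qubits, which can be neither cloned nor read mid-computation without destroying the superposition.

# Classical repetition code: store each bit 3 times, majority-vote it back.
def encode(bit):
    return [bit] * 3

def decode(triple):
    return 1 if sum(triple) >= 2 else 0

noisy = encode(1)
noisy[0] = 0                 # a single bit flip...
print(decode(noisy))         # ...is voted away: prints 1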
The vast number of scientists backed by hundreds of millions of dollars and some of the world’s biggest governments, organisations and companies expect to prove Dyakonov wrong by turning the theoretical musings of Benioff, Manin and others into a new technological era.
By Michael Collins, Investment Specialist