
Google's Willow chip: A quantum leap in error correction and scalability

Quantum computing has to tackle error correction if it is to become viable in the commercial sense

Business Standard Editorial Comment Mumbai
Last Updated : Dec 24 2024 | 11:05 PM IST
Google’s latest quantum chip, Willow, may have achieved a big breakthrough in solving the intractable challenge of error correction in quantum computing. If the result is replicable, it opens the possibility of scaling up quantum computers to tackle many problems that are impossible for conventional, classical supercomputers. Google claims Willow, a 105-qubit (quantum bit) system, has demonstrated two key capabilities. One, it solves certain problems much faster. The benchmark used was random circuit sampling (RCS) — a task in which the machine must produce samples from the output of a randomly chosen quantum circuit, something that quickly becomes infeasible to simulate on classical hardware. Willow took around five minutes to complete a task that would take the fastest classical supercomputer an estimated 10 septillion years (10 followed by 24 zeros) — vastly longer than the age of the universe. However, RCS has no commercial applications.
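To make the scale concrete, here is a minimal, back-of-the-envelope Python sketch (not from the editorial; it uses the standard state-vector counting argument) of why brute-force classical simulation of a circuit on Willow’s 105 qubits is out of reach:

```python
# Rough illustrative sketch: a brute-force classical simulation of an
# n-qubit circuit must track 2**n complex amplitudes, which is one
# intuition for why random circuit sampling becomes infeasible on
# classical machines as qubit counts grow.
n_qubits = 105                      # Willow's qubit count
amplitudes = 2 ** n_qubits          # state-vector entries to track
bytes_per_amplitude = 16            # one double-precision complex number
memory_bytes = amplitudes * bytes_per_amplitude

print(f"Amplitudes to store: {amplitudes:.3e}")
print(f"Memory needed: {memory_bytes / 1e21:.3e} zettabytes")
```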
 
The second key capability Willow demonstrated has more important implications. Quantum computing has to tackle error correction if it is to become commercially viable. Qubits are easily disturbed by their environment, and errors build up rapidly as the capacity of a quantum computer increases. Error correction is therefore critical. Willow demonstrated what is called a “below threshold” capability of error correction: as information was encoded across larger arrays of qubits, the error rate fell rather than rose. This offers hope that large quantum computers can be built with even lower error rates, using the insights gleaned from Willow. Quantum computing works by exploiting the quantum effects of superposition and entanglement. Entanglement correlates two quantum particles even at a distance: measuring the state of one instantly determines the state of the other, however far apart they are. While this does not allow faster-than-light messaging, it crucially underpins quantum encryption, because any attempt to intercept an entangled key disturbs it and can be detected. Many research organisations, including India’s Space Applications Centre and Physical Research Laboratory, have demonstrated the use of entanglement in quantum-encryption systems.
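A toy numerical sketch of what “below threshold” means (the numbers are illustrative assumptions, not Willow’s measured figures): once physical errors are rare enough, each step up in error-correcting code size divides the logical error rate by a roughly constant factor, so bigger qubit arrays make the computation more reliable, not less.

```python
# Illustrative "below threshold" behaviour (assumed numbers, not Google's):
# each increase in code distance suppresses the logical error rate by a
# roughly constant factor, so larger qubit arrays yield fewer logical errors.
base_logical_error = 3e-3   # assumed logical error rate at the smallest code
suppression_factor = 2.0    # assumed suppression per step up in code distance

for step, distance in enumerate([3, 5, 7, 9, 11]):
    logical_error = base_logical_error / (suppression_factor ** step)
    print(f"code distance {distance:2d}: logical error rate ~ {logical_error:.1e}")
```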
 
Superposition allows a quantum particle to exist in a combination of states at the same time. This means a qubit can carry or process more information than a classical bit. A conventional computer uses bits, each of which stores either a one or a zero. Thanks to superposition, a qubit can hold both values simultaneously. Simplistically, a classical array of, say, four bits can store any one of 16 different numbers at a given time, whereas an array of four qubits can hold all 16 possibilities at once. Superposition therefore allows much faster calculations for certain problems. But quantum computers need to be kept in very stable environments at very low temperatures (even vibrations from passing trucks can trigger errors or cause the superposition to collapse), and errors still build up as qubit arrays grow. Hence, Willow’s demonstration that errors can actually be reduced as qubit arrays grow is of extreme importance. It promises that really large quantum arrays can be assembled, with the power to tackle problems that classical supercomputers cannot handle. Quantum computing could find uses in areas as diverse as encryption and decryption, drug discovery, industrial chemistry, managing power grids and nuclear reactors, and even fundamental research. Coupled with advances in artificial intelligence, quantum computing could jumpstart an entirely new era of supercomputing.
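The counting argument above can be sketched in a few lines of Python (an illustrative sketch, not Google’s code): four classical bits hold one of 16 values at a time, while a four-qubit register is described by 16 amplitudes at once.

```python
# Minimal sketch of the bits-versus-qubits counting argument in the text.
import itertools

n = 4
classical_states = list(itertools.product([0, 1], repeat=n))
print(f"{n} classical bits: {len(classical_states)} possible values, "
      "but only one can be stored at any moment")

# An equal superposition over all 4-qubit basis states: 2**4 = 16 amplitudes,
# each observed with probability |amplitude|**2 = 1/16 on measurement.
amplitude = (1 / 2 ** n) ** 0.5
state_vector = [amplitude] * (2 ** n)
print(f"{n} qubits: state vector with {len(state_vector)} amplitudes, "
      f"each {amplitude:.4f}")
```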
