Although quantum computing is still in its infancy, IBM has made two breakthroughs: improved error detection and a scalable chip design.
Quantum computing refers to theoretical computation systems that don’t process information as the binary “on” and “off” states (1s and 0s) of today’s machines. A quantum chip instead relies on units called “qubits,” which exploit a uniquely quantum ability to occupy multiple states simultaneously.
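A minimal sketch may help make the superposition idea concrete. This is not IBM's hardware, just the standard textbook model: a single qubit is described by two complex amplitudes whose squared magnitudes give the measurement probabilities.

```python
import numpy as np

# A classical bit is either 0 or 1. A qubit is described by two complex
# amplitudes (a, b) with |a|^2 + |b|^2 = 1, and can sit in a
# superposition of both basis states at once.
zero = np.array([1, 0], dtype=complex)  # the |0> state
one = np.array([0, 1], dtype=complex)   # the |1> state

# An equal superposition: measuring it yields 0 or 1 with 50% probability each.
plus = (zero + one) / np.sqrt(2)

probabilities = np.abs(plus) ** 2
print(probabilities)  # [0.5 0.5]
```

Until it is measured, the qubit genuinely carries both amplitudes, which is what lets a quantum register represent many classical bit patterns at once.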
If a quantum computer could be built with just 50 qubits, no combination of today’s top 500 supercomputers could outperform it.
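The arithmetic behind that claim is simple: an n-qubit state is described by 2^n complex amplitudes, so a classical machine that wanted to simulate it exactly would have to track all of them. A quick back-of-the-envelope calculation (the 16-bytes-per-amplitude figure assumes double-precision complex numbers):

```python
# Number of complex amplitudes needed to describe a 50-qubit state.
n = 50
amplitudes = 2 ** n
print(amplitudes)  # 1125899906842624 (about 10^15)

# Storing each amplitude as a 16-byte double-precision complex number
# would require roughly 18 petabytes of memory.
bytes_needed = amplitudes * 16
print(bytes_needed / 1e15)  # ~18.0 petabytes
```

At that scale, merely holding the state in memory outstrips any single classical system, before a single operation is simulated.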
The ability to detect errors is important, said Jay Gambetta, manager of the Theory of Quantum Computing and Information group at IBM. “In quantum mechanics, it’s very hard to make quantum information resilient to errors. We’ve shown the ability to detect and measure both kinds of quantum errors simultaneously.”
IBM has demonstrated it can simultaneously detect and measure the two types of quantum errors (bit-flip and phase-flip). “We’ve done it in a square array,” said Gambetta. “One dimension does bit flips and the other does phase flips.”
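A minimal sketch of the two error types on a single qubit, using the standard Pauli-operator model rather than IBM's actual circuit: X flips the basis states (a bit-flip), while Z flips the sign of the |1> amplitude (a phase-flip), an error with no classical analogue.

```python
import numpy as np

# Pauli X is the quantum bit-flip; Pauli Z is the phase-flip.
# Classical error correction only ever has to handle the first kind;
# a quantum code must detect both.
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

# The superposition state (|0> + |1>)/sqrt(2).
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)

bit_flipped = X @ plus    # X leaves this particular state unchanged
phase_flipped = Z @ plus  # becomes (|0> - |1>)/sqrt(2)

# A measurement in the 0/1 basis sees identical probabilities for both,
# which is why a bit-level check alone cannot catch the phase-flip.
print(np.abs(phase_flipped) ** 2)  # [0.5 0.5]
```

This is why an architecture that can only watch for one error type at a time is considered insufficient: a phase-flip is invisible to a pure bit-flip check, and vice versa.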
Google recently announced an approach that lines up qubits in a row, but IBM said that Google’s "linear" approach is not scalable because it prevents the simultaneous detection of both types of quantum errors.
A quantum computer needs to deal with both errors at the same time in order to provide practical error detection, which must come before error correction, the next major step.
“It’s a long road, but [quantum computing] is much closer than a lot of people think,” said Gambetta. “We’re only in the early stages, doing little tiny logic circuits. Next step is full error correct, then logical qubit encoding.”
IBM's other breakthrough is a new four-qubit circuit design with a square lattice structure, which the company says is the only physical architecture that can scale. IBM claims the square lattice is the best configuration for continuing to add qubits on the way to a working system.
The world is inherently quantum, so to solve anything in nature requires quantum computing, according to Gambetta. “There are things we cannot touch yet – such as simulating quantum chemistry. We also know it can search bigger data faster, but this requires a way to load every bit of data.”
Nobel Prize winner Richard Feynman first laid out the concept of a quantum computer at an MIT conference sponsored by IBM in 1981, and Big Blue has been researching it ever since.