
Landmark hot qubit research promises bigger, cheaper quantum computers

• New Atlas

Traditional computers, which perform their wonderfully quick calculations using billions of simple on/off transistors organized into logic gates, have spent the last half-century getting faster and faster, to the point where we could reasonably expect the number of transistors on a chip to double every couple of years while the cost per transistor roughly halved. To keep pace with this famous "Moore's Law," transistors have become smaller and smaller, and thus faster and faster, to the point where human manufacturing ingenuity has run up against a hard obstacle.

The latest transistors are so small that they can no longer reliably control the flow of electrons: at distances measured in just a few atomic widths, electrons can "quantum tunnel," essentially disappearing and reappearing on the other side of a transistor's barrier, or hopping to an adjacent path, causing all sorts of errors in computing. Next-generation transistor-based chips therefore can't get any smaller, and this physical boundary threatens to grind processor development to a halt.

Quantum computing appears to be a promising solution, using the extraordinary weirdness of quantum-scale physics to unlock a new path forward. Instead of a transistor-based bit, which either lets current through or blocks it, quantum "qubits" use nano-scale physics to express different states: the clockwise or counter-clockwise spin of an electron, for example, or the horizontal or vertical polarization of a photon – these become your ones and zeroes.

And where a transistor-based bit can only be open or shut, 1 or 0, a qubit takes advantage of superposition – effectively existing in both states at once, and indeed at any level of probability between the two. Like Schrödinger's cat, a qubit is only forced to collapse into a single 1-or-0 reality when it's measured. While in superposition, it can run multiple calculations simultaneously, firing computing into a probability-based dimension that will be vastly superior for a certain subset of tasks.
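
To make the description of superposition and measurement a little more concrete, here is a minimal illustrative sketch (not from the article) that models a single qubit as a normalized pair of complex amplitudes and simulates a measurement collapsing it to 0 or 1; the equal-superposition amplitudes, the NumPy dependency, and the variable names are assumptions chosen for the example.

```python
# Illustrative sketch only: a single qubit as a normalized 2-component state vector.
import numpy as np

# Amplitudes for the |0> and |1> states; here an equal superposition,
# i.e. "both states at once" as described above. (Chosen for illustration.)
alpha, beta = 1 / np.sqrt(2), 1 / np.sqrt(2)
state = np.array([alpha, beta], dtype=complex)

# Born rule: the probability of each outcome is the squared magnitude of its amplitude.
probs = np.abs(state) ** 2  # -> approximately [0.5, 0.5]

# "Measuring" the qubit forces it to collapse to a single 0 or 1,
# sampled according to those probabilities.
rng = np.random.default_rng()
outcome = rng.choice([0, 1], p=probs)
print(f"P(0) = {probs[0]:.2f}, P(1) = {probs[1]:.2f}, measured outcome: {outcome}")
```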

