Advanced Computer Architecture Big Picture Summary 8

Updated Tuesday June 4, 2024 8:51 AM GMT+3

Revised 5/17/2024 - This article is not standalone and is not a complete treatment of its topic. It is meant to cap a detailed reading of a similarly titled set of technical papers (see the course calendar).

Quantum Computers

The term quantum, from physics, originated in a breakthrough scientific discovery a century ago. It remains, unfortunately, somewhat mysterious to people who may need to use it in a technical context. The following is a cursory glance at quantum computing from the viewpoint of a classical computer architect. Getting into the physics is avoided on purpose.

Energy, a primal feature of the physical world, seems to “condense”, or form matter, in discrete, definite amounts called quanta. The resulting particles, the building blocks of matter, behave similarly at a small scale: they spin, or acquire and release energy, in fixed amounts. This quantum behavior, among many others, was astonishing at first, and there are other, even less familiar behaviors at the quantum scale. The physical world at the scale with which humans interact, and above, behaves very differently: it seems to change continuously at those scales. Our electronic computing machines also behave in ways consistent with the direct human experience of the world.

Conventional computers switch between two distinct states that follow one another deterministically in time. In other words, the binary states are separate in time, and each is always observable with certainty. These states usually derive from a large-scale physical behavior, such as the flow of electrons in a semiconducting material. The underlying behavior is analog and continuous, but only two of its levels are considered, to simplify the logical analysis of the system [Feynman Lectures on Computation, Westview Press 1996]. In a quantum computer, the focus shifts to quantum properties at the particle level, such as spin or position. See Big Picture 1 for more on classical switching.


Classical switching: simulating a binary digital signal. The ON/OFF switching threshold depends on large-scale, non-quantum physical characteristics of the device used to generate the signal waveform.
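
As a loose illustration of that thresholding, here is a minimal sketch in Python. The voltage levels are made-up assumptions for demonstration, not the parameters of any particular device:

    # Minimal sketch: quantizing a continuous analog signal into binary logic levels.
    # The two thresholds are illustrative assumptions only.
    V_LOW_MAX = 0.8   # at or below this voltage, the signal reads as logic 0
    V_HIGH_MIN = 2.0  # at or above this voltage, the signal reads as logic 1

    def logic_level(voltage):
        """Map a continuous voltage to a definite binary state, if possible."""
        if voltage <= V_LOW_MAX:
            return 0
        if voltage >= V_HIGH_MIN:
            return 1
        return None  # forbidden region: ignored by design

    # A sampled waveform settles into states that are separate in time:
    samples = [0.1, 0.2, 2.9, 3.1, 0.3, 2.8]
    print([logic_level(v) for v in samples])  # [0, 0, 1, 1, 0, 1]

Everything between the two thresholds is deliberately ignored, which is what lets a continuous analog waveform be analyzed as a clean sequence of 0s and 1s.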

Quantum properties have physical states that can only be distinguished probabilistically. Those states overlap in time in unfamiliar ways, not in line with how we experience the physical world. At times, the physical properties behind the quantum states cannot be observed with the certainty that non-quantum computers rely on. Hence, in addition to the familiar binary states encoded logically as 0 or 1, they effectively introduce a third physical state that is neither 0 nor 1 with certainty. The desired computation occurs in this quantum state. Maintaining it is not easy; it is a big challenge for the quantum machine. To sum up, instead of a bit (binary digit), the new unit of information is tri-state, called a qubit (quantum bit).
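
To make that third state concrete, the standard formalism (included here only as a reference point; it is not developed further in this summary) writes a qubit's state as a weighted overlap, or superposition, of the two binary states:

    |\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1

Observing the qubit yields 0 with probability |\alpha|^2 and 1 with probability |\beta|^2. The in-between state itself is never seen directly, which is exactly the probabilistic behavior described above.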

Clearly, quantum computers demand very different technology. What may be less obvious, but is more important, is the fundamental change in the underlying switching behavior of the physical machine. Rapidly changing states that overlap in time are naturally parallel at a level below that of the computation. Switching is much faster but also, unfortunately, much harder to predict and control.
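
One toy way to see that parallelism on paper is a state-vector sketch in Python (with a hypothetical choice of n = 3 qubits; real hardware does not work by multiplying matrices): n qubits carry 2^n amplitudes at once, and a single operation updates all of them together.

    import numpy as np

    # Toy state-vector view: n qubits are described by 2**n complex amplitudes.
    n = 3
    state = np.zeros(2**n)
    state[0] = 1.0                                # all qubits start as logic 0

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate on one qubit
    op = H
    for _ in range(n - 1):                        # build H applied to every qubit
        op = np.kron(op, H)

    state = op @ state                            # one step touches all 2**n amplitudes
    print(state**2)                               # uniform probabilities: 1/8 each

Each added qubit doubles the size of the machine's state, which hints at why such a machine is hard to predict and control: the state grows exponentially while our ability to observe it stays limited.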

Quantum switching promises to overcome a fundamental efficiency limitation of conventional computers. On those machines, solving arbitrarily large instances of a problem in a reasonable time requires that the solution time grow no faster than a polynomial function of the input size; otherwise, we are limited to tiny problem instances. Yet, for reasons not fully understood, some seemingly simple problems with practical uses, such as the Knapsack problem, can only be solved universally (the way sorting, say, can be) in times that grow like an exponential function of the input size, or worse, on non-quantum computers. The new machines are not subject to that limit on run-time growth, though they will need new algorithms that exploit the different physics. Applications that suffer from the growth limitation should see fundamental advantages. Those that rely on it, most notably secure computing, will face a huge problem.
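
To make those growth rates concrete, here is a sketch in Python (with made-up item weights and values) of the brute-force 0/1 Knapsack search, which examines all 2^n subsets of n items:

    # Illustrative brute-force 0/1 Knapsack: tries every subset of the items,
    # so its running time grows like 2**n in the number of items n.
    # The items and capacity below are made-up values for demonstration.
    from itertools import combinations

    def knapsack_brute_force(items, capacity):
        """items: list of (weight, value) pairs. Returns the best total value."""
        best = 0
        for r in range(len(items) + 1):
            for subset in combinations(items, r):   # all 2**n subsets overall
                if sum(w for w, _ in subset) <= capacity:
                    best = max(best, sum(v for _, v in subset))
        return best

    items = [(2, 3), (3, 4), (4, 5), (5, 6)]  # (weight, value)
    print(knapsack_brute_force(items, capacity=5))  # 7: items (2, 3) and (3, 4)

Each added item doubles the number of subsets to check, so a few dozen items already put this search out of reach. Faster exact methods exist, but as far as anyone knows, the worst case remains super-polynomial on conventional machines.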

Some experts have thought at times that quantum computers would be the next big thing, eventually replacing conventional computers in at least the main impact areas and thus radically changing our computing experience. Others remain skeptical and think, at best, that quantum computers may serve a specialized role, and only if significant hardware and software issues can be overcome. Time will tell. As of this writing, quantum computing is no longer a futuristic thing: major technology players are shipping real-world production machines and continue to pour enormous resources into the field, with substantial, steady progress reported.