Advanced Computer Architecture

Big Picture Summary 6

Updated Friday November 20, 2020 9:20 PM GMT+3

This article is not standalone. It is meant to cap a detailed reading of a similarly titled set of technical papers (see the course calendar).

Quantum Computers

The term quantum, as in quantum physics, originated in a breakthrough scientific discovery a century ago. It remains, unfortunately, somewhat mysterious to many people, some of whom may need to use the term in a technical context.

Energy, the most primal feature of the physical world, seems to “condense” into matter in discrete, definite amounts (termed quanta). The resulting building-block particles of matter also behave similarly at small scales, e.g., they spin or absorb/release energy in fixed amounts as well.

Conventional computers switch between two distinct, deterministically time-sequential states. These states rest on predictable large-scale physical behavior, such as the flow of electrons in a semiconducting material. Only two points of an otherwise analog, continuous behavior are considered, in order to simplify the logical analysis of the system [Feynman Lectures on Computation, Westview Press 1996]. In a quantum computer, the focus shifts to individual quantum properties, like spin or position. Such properties have probabilistically-distinguishable states which overlap in time in unfamiliar ways (not how we experience the physical world).
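The contrast can be sketched in a few lines of Python. This is a toy model, not real quantum hardware: the amplitudes a and b, the measure function, and the trial count are all illustrative assumptions, chosen only to show that a classical bit has a definite value while a qubit-like state yields one of two outcomes probabilistically.

```python
import math
import random

# Classical: one of two definite, predictable states.
classical_bit = 0  # or 1, but always exactly one of them

# Toy qubit model (an assumption for illustration): a pair of complex
# amplitudes (a, b) with |a|^2 + |b|^2 = 1. Measuring yields 0 with
# probability |a|^2 and 1 with probability |b|^2.
a, b = 1 / math.sqrt(2), 1 / math.sqrt(2)  # equal superposition

def measure(a, b):
    """Simulate one measurement: probabilistic, unlike the classical case."""
    return 0 if random.random() < abs(a) ** 2 else 1

counts = {0: 0, 1: 0}
for _ in range(10_000):
    counts[measure(a, b)] += 1

print(counts)  # roughly half the trials give each outcome
```

Repeating the "measurement" many times recovers the probability distribution, which is the only handle the model gives us on the overlapping states.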

Clearly, quantum computers demand very different technology. What may be less clear, but more important, is the fundamental change in the underlying switching behavior of the physical machine. Rapidly changing states which overlap in time are naturally parallel at a level lower than a computation. Switching is much faster but much more difficult to predict and control, unfortunately.

Quantum switching promises to overcome the polynomial efficiency limitation of conventional computers (i.e., only problems whose solution time grows no faster than a polynomial function of input size can be solved in a reasonable time for arbitrary instances). We will need new algorithms to exploit the different physics, however. Applications that suffer from the limitation should see significant advantages. Those that rely on it, most notably secure computing, will face a huge problem.
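One way to see where the limitation bites: a register of n qubits is described by 2**n amplitudes at once, so even storing a classical description of it grows exponentially with n. The short sketch below (the function name is ours, purely for illustration) just tabulates that growth, which is the flavor of blow-up that quantum parallelism is hoped to sidestep.

```python
# Number of complex amplitudes needed to describe an n-qubit register.
# The exponential growth here is why classically simulating quantum
# states quickly becomes infeasible as n grows.
def amplitudes_needed(n_qubits: int) -> int:
    return 2 ** n_qubits

for n in (1, 10, 50):
    print(n, amplitudes_needed(n))
# 50 qubits already require 2**50 (about 10**15) amplitudes.
```

A classical machine must track all of those amplitudes explicitly; the quantum machine, in effect, carries them in its physical state.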

Experts, increasingly it seems, believe that quantum computers will be the next big thing, eventually replacing conventional computers and radically changing our computing experience. Others remain skeptical and, at best, think quantum computers may serve a limited, specialized role, that is, if we can overcome hard technical software and hardware issues. Time will tell. As of this writing, major technology players are pouring enormous resources into the field, with significant progress reported.