Advanced Computer Architecture Big Picture Summary 1

Updated Sunday January 21, 2024 11:27 AM GMT+3

Revised 9/3/2023 - This article is not standalone and is not a complete treatment of its topics. It's meant to cap a detailed reading of a similarly titled set of technical papers.

Beginnings

Modern digital computers are electronic. They rely on transistors, a basic physical building block of electronic circuits, to implement digital logic. It is well known that computers employ a language of 0s and 1s, but why? Where do 0s and 1s originate? How do you get there from transistors?

From Physical Device to Code

To implement digital logic, we do not care about the smooth variations of physical signals in the circuits; we only care about drastic changes. The discrete physical states corresponding to those changes can encode discrete symbols, at least two of them (binary), the practical minimum for serious coding (a single symbol is usable but very limiting). We could instead track ten discrete positions, or states, to encode the familiar ten numeric digits. The resulting physical machine would not do anything better or more than a machine based on two states could; it would only be more complicated.


Simulating the digital signals generated by switching action: a continuous waveform must cross a threshold to change state. The ON/OFF threshold depends on the physical characteristics of the device used to generate the signals. The figure depicts the sequence ON-OFF-ON-OFF of switch actions.
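The thresholding described above can be sketched in a few lines. This is a minimal illustration, not a model of any real device: the threshold value and the waveform samples are invented for the example.

```python
# Digitize a continuous waveform by thresholding: a sample above the
# threshold reads as ON (1), a sample below reads as OFF (0).
# THRESHOLD and the samples below are illustrative, not device data.

THRESHOLD = 0.5  # the ON/OFF switching point, device-dependent in reality

def digitize(samples, threshold=THRESHOLD):
    """Map each analog sample to a discrete 0/1 state."""
    return [1 if v > threshold else 0 for v in samples]

# A noisy waveform swinging ON-OFF-ON-OFF, as in the figure.
waveform = [0.9, 0.8, 0.1, 0.2, 0.95, 0.7, 0.05, 0.3]
print(digitize(waveform))  # [1, 1, 0, 0, 1, 1, 0, 0]
```

Note that real circuits also add hysteresis (two thresholds) so that noise near the switching point does not cause spurious state flips; the single threshold here is the simplest possible version.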

Therefore, an electronic digital computer utilizes a transistor as an elementary 2-state switch that provides a physical basis for binary coding. The two states denote the ON and OFF actions of a switching device. A binary code, in turn, provides a basis for describing a simplified truth-based version of Boolean logic (one that focuses on the binary true or false outcomes of testing logical propositions).
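The truth-based version of Boolean logic mentioned above reduces to operations over the two states. A quick sketch, using 0 for OFF/false and 1 for ON/true:

```python
# Truth table for AND and OR over the two switch states.
# 0 stands for OFF/false, 1 for ON/true.
for a in (0, 1):
    for b in (0, 1):
        print(a, b, "AND:", a & b, "OR:", a | b)
```

Each row of the output is one test of a logical proposition: AND is ON only when both inputs are ON, OR is ON when at least one input is ON, exactly the behavior a pair of switches wired in series or in parallel would exhibit.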

A binary coding scheme can provide a way to describe the switching states of the transistors in an electronic circuit that implements a digital device. With some supporting circuitry, it can also instruct the device to take on an operational configuration by specifying the ON/OFF states the transistors should assume, or have the circuit report its current switching settings. In other words, a binary-coded language for communicating with the device, specific to its circuitry, naturally arises.

In addition, the ON/OFF binary states of transistors may encode a stored value interpreted as a number in base-2; a base-2 number is a natural binary code. The number may denote its numeric value, a character shape, or a selector code, perhaps for a specific sub-circuit or data pathway (the opcode of an ADD circuit, for example). Fundamentally, 0s and 1s provide a way to store numbers and operations the same way, in the same memory.
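The point that one bit pattern can carry several meanings is easy to demonstrate. In the sketch below, the same 8-bit pattern is read as a number, as a character, and as an operation; the opcode mnemonic shown is the Intel 8080 reading of that byte, stated here as an assumption for illustration rather than from the figure's table.

```python
# One 8-bit pattern, three interpretations.
bits = "01000001"        # the stored ON/OFF pattern

value = int(bits, 2)     # as a base-2 number: 65
char = chr(value)        # as an ASCII character: 'A'

# As a selector code: on the Intel 8080 this byte is assumed to
# select the "copy register C into register B" circuit (MOV B,C).
OPCODES = {0b01000001: "MOV B,C"}

print(value, char, OPCODES[value])  # 65 A MOV B,C
```

Nothing in the stored pattern itself says which interpretation applies; the circuitry that fetches the bits (as data, as text, or as an instruction) determines the meaning.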


(a) Some letters of the alphabet in Morse (grayed), an older binary coding scheme, and ASCII (ISO/IEC 646), a modern one. The last column shows the numeric values when we view ASCII codes as base-2 numbers. In Morse, 0s and 1s may replace the conventional dots and dashes. (b) Codes for some Intel 8080 operations (opcodes), defined relative to an internal store, where results may accumulate.
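The "ASCII codes as base-2 numbers" column of the table is straightforward to reproduce. A minimal sketch, printing a few letters with their binary and decimal ASCII values:

```python
# View ASCII codes as base-2 numbers: each character maps to a
# 7-bit code, shown here padded to 8 bits alongside its decimal value.
for letter in "ABC":
    code = ord(letter)                  # the character's ASCII code
    print(letter, format(code, "08b"), code)
# A 01000001 65
# B 01000010 66
# C 01000011 67
```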

Operations and Operands

A modern computer is a complex machine composed of thousands of specialized electronic circuits. Its logical switching states may not directly relate to those of individual transistors, but they do trace back to them. The circuitry-specific binary language of a computer works, by design, at a level meaningful to its users. It typically provides a way to specify operations, called ops for short, and their operands. Binary codes for the circuits that perform the ops (such as ADD), together with coded operands or codes for the circuit parts that store them, are assembled in specific ways to form a machine instruction. The instruction is then a binary coding of a command that the machine can execute. The bits that encode the operations and operands eventually cause some transistors to take on the ON/OFF switching states needed to run the machine instruction.


To add 5 to an operand from a store (CX) and put the result in a different store (AX), a user may consult some code tables to put together (assemble) a machine instruction. The codes assigned to operations and other machine resources, such as operand stores, reflect details specific to the machine and the coding choices made by its designers. Assembling instructions this way is simple but rather tedious, a job a machine can do itself. The user may instead express the desired calculation in a less cryptic symbolic notation that an assembler program can process. The friendlier assembly language is also specific to the machine.
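The table-lookup-and-pack job described above can be sketched directly. The opcode table, register codes, and bit layout below are entirely hypothetical, invented for this example; they do not correspond to any real machine's encoding.

```python
# Sketch of what an assembler does: look up codes for the operation
# and the operand stores, then pack them into one instruction word.
# All tables and the 16-bit layout here are hypothetical.

OPS = {"ADD": 0b0001}             # hypothetical opcode table
REGS = {"AX": 0b00, "CX": 0b01}   # hypothetical operand-store codes

def assemble(op, dest, src, immediate):
    """Pack op(4 bits) | dest(2) | src(2) | immediate(8) into 16 bits."""
    return (OPS[op] << 12) | (REGS[dest] << 10) | (REGS[src] << 8) | (immediate & 0xFF)

# "Add 5 to the operand in CX, result in AX."
instr = assemble("ADD", "AX", "CX", 5)
print(format(instr, "016b"))  # 0001000100000101
```

A real assembler does the same lookups and bit packing, just against the tables published in the machine's manual rather than ones made up on the spot.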

Programming languages of all types (at various levels above zeros and ones) facilitate communicating that operation and operand information in terms more meaningful to humans and the problems they are trying to solve. A compiler program then translates the higher-level notation and collects the pieces needed to build the binary code that a machine may run.

In Summary

A breakthrough feature of the modern computer is that it utilizes a binary coding that a) may be transformed by rules of logic and b) maps to simple physical switching devices, leading to relatively simple hardware. Any physical switch would do. A transistor is a highly reliable switch, tiny in both size and cost. The transistor was transformative (see the Spectrum magazine link). It enabled the faster, more logically complex computing machines of today.


Three key ingredients make up the modern computer: a 0/1-based binary coding, physical 2-state ON/OFF switches, and a true-or-false switching version of Boolean logic, where a variable depicts the result of testing a logical proposition for truth rather than the proposition itself.