Beginnings
Modern digital computers are electronic. They rely on transistors, the basic physical building blocks of electronic circuits, to implement digital logic. It is well known that computers employ a language of 0s and 1s, but why? Where do the 0s and 1s originate? How do you get there from transistors?
From Physical Device to Code
To implement digital logic, we do not care about the smooth variations of physical signals in the circuits; we care only about sharp transitions between distinguishable states. The discrete physical states corresponding to those transitions can encode discrete symbols, at least two of them (binary), the minimum for practical coding (a single symbol is usable but severely limiting). We could instead track ten discrete positions, or states, to encode the familiar ten numeric digits. The resulting physical machine, however, would do nothing better or more than a machine based on two states could. It would only be more complicated.
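A quick sketch in Python (an illustration, not anything from the hardware itself) makes the point that two states suffice: four two-state positions already yield sixteen distinct patterns, more than enough to cover the ten decimal digits.

```python
# Four binary (two-state) positions give 2**4 = 16 distinct patterns,
# so all ten decimal digits fit with room to spare.
def to_bits(n, width=4):
    """Return the binary digits of n as a string, padded to `width`."""
    return format(n, f"0{width}b")

for digit in range(10):
    print(digit, "->", to_bits(digit))
# e.g. 9 -> 1001
```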
Therefore, an electronic digital computer uses the transistor as an elementary two-state switch that provides a physical basis for binary coding. The two states correspond to the ON and OFF positions of a switching device. A binary code, in turn, provides a basis for describing a simplified, truth-based version of Boolean logic (one that focuses on the binary true or false outcomes of testing logical propositions).
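As a small illustration (a Python sketch, not part of the original text), the two binary states map directly onto the true/false values of that simplified Boolean logic, and the basic logical operations can be computed on the bits themselves:

```python
# 1 stands for ON/true, 0 for OFF/false.
# Each Boolean operation is computed directly on the binary values.
def NOT(a):    return a ^ 1
def AND(a, b): return a & b
def OR(a, b):  return a | b

# Truth table for AND: only 1 AND 1 yields 1.
for a in (0, 1):
    for b in (0, 1):
        print(a, b, AND(a, b))
```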
A binary coding scheme can provide a way to describe the switching states of transistors in an electronic circuit that implements a digital device. With some circuitry, it can also instruct the device to take on an operational position by specifying the ON/OFF states that the transistors should take. Or it could have the circuit report its current switching settings. In other words, a binary-coded language for communicating with the device, specific to its circuitry, naturally arises.
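A toy model can make this two-way communication concrete. The class and method names below are invented for illustration; the idea is simply a bank of switches that can be commanded, and read back, as a single bit pattern:

```python
# Hypothetical sketch: a device with n two-state switches whose
# settings are written and read as an n-bit pattern.
class SwitchBank:
    def __init__(self, n):
        self.n = n
        self.state = 0            # all switches start OFF

    def command(self, pattern):
        """Instruct the device: set each switch per the bit pattern."""
        self.state = pattern & ((1 << self.n) - 1)

    def report(self):
        """Have the device tell its current switching settings."""
        return format(self.state, f"0{self.n}b")

bank = SwitchBank(4)
bank.command(0b1010)       # switch 3 ON, 2 OFF, 1 ON, 0 OFF
print(bank.report())       # -> 1010
```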
In addition, the transistor ON/OFF binary states may encode a stored value interpreted as a base-2 number. A base-2 number is a natural binary code. The number may denote its numeric value, a character shape, or a selector code, perhaps for a specific sub-circuit or data pathway (the opcode that selects an ADD circuit, for example). Fundamentally, 0s and 1s provide a way to store numbers and operations the same way, in the same memory.
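A short sketch illustrates that dual reading of the same bits. The opcode table here is invented for illustration; no real machine's codes are implied:

```python
# The same bit pattern can be read as a numeric value or as a
# selector code. The opcode values below are hypothetical.
OPCODES = {0b0001: "ADD", 0b0010: "SUB"}

word = 0b0001
print(word)            # read as a number: 1
print(OPCODES[word])   # read as a selector: the ADD circuit
```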
Operations and Operands
A modern computer is a complex machine. It is composed of thousands of specialized electronic circuits. Its logical switching states may not directly relate to those of individual transistors, but they do trace back to them. The circuitry-specific binary language of a computer works, by design, at a level meaningful to its users. It typically provides a way to specify operations, called ops for short, and their operands. Binary codes for the circuits that perform the ops (such as ADD), together with coded operands or codes for the circuit parts that store them, are assembled in specific ways to form a machine instruction. The instruction is then a binary coding of a command that the machine can execute. The bits that encode the operations and operands eventually cause some transistors to take on the ON/OFF switching states needed to run the machine instruction.
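The assembly of opcode and operand bits into an instruction can be sketched with a toy format. The layout below (a 4-bit opcode in the high bits, a 4-bit operand in the low bits) is invented for illustration and does not describe any real machine:

```python
# Toy instruction format: [ 4-bit opcode | 4-bit operand ].
ADD = 0b0001   # hypothetical code for the ADD circuit

def encode(opcode, operand):
    """Pack opcode and operand bits into one instruction word."""
    return (opcode << 4) | (operand & 0xF)

def decode(instruction):
    """Split an instruction word back into its two fields."""
    return instruction >> 4, instruction & 0xF

ins = encode(ADD, 7)
print(format(ins, "08b"))   # -> 00010111
print(decode(ins))          # -> (1, 7)
```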
Programming languages of all types (at various levels above zeros and ones) facilitate communicating that operation and operand information in terms more meaningful to humans and the problems they are trying to solve. A compiler program then translates the higher-level notation and collects the pieces needed to build the binary code that a machine may run.
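A minimal sketch of that translation step, with everything invented for illustration (the mnemonics, the opcode values, and the one-operand instruction layout are all hypothetical), shows a human-readable command becoming a binary instruction word:

```python
# Hypothetical translator: a human-readable line such as "ADD 7"
# becomes one binary instruction word. Opcode values are invented.
MNEMONICS = {"ADD": 0b0001, "SUB": 0b0010}

def assemble(line):
    """Translate 'OP operand' into an 8-bit instruction word."""
    op, operand = line.split()
    return (MNEMONICS[op] << 4) | (int(operand) & 0xF)

print(format(assemble("ADD 7"), "08b"))   # -> 00010111
```

Real compilers do far more (parsing, optimization, linking), but the essence is the same: human-meaningful notation in, machine-runnable binary out.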
In Summary
A breakthrough feature of the modern computer is that it uses a binary coding that a) can be transformed by the rules of logic and b) maps onto simple physical switching devices, leading to relatively simple hardware.
Any physical switch would do. The transistor is a highly reliable switch, tiny in both size and cost. The transistor was transformative (see the Spectrum magazine link). It enabled the faster, more logically complex computing machines of today.