FactOTD

Why All Computing Is Built on Just Two Numbers

March 28, 2026 · 4 min read

The Fact

Binary code, using only 0s and 1s, is the foundation of all digital computing because the transistor, computing's basic building block, is reliable in exactly two states: on and off.

It seems almost absurd that the full complexity of modern civilization (the internet, smartphones, artificial intelligence, digital medicine) rests on a number system that uses only two digits. A child could write down every symbol in binary: 0 and 1. Yet those two symbols, combined in sufficiently complex patterns, can represent any number, any letter, any color, any sound, any instruction a computer might execute. Understanding why computing uses binary requires understanding a physical truth about the materials computers are made of.

The Physics of On and Off

A transistor is a semiconductor device, typically made of silicon, that controls the flow of electrical current. Its fundamental behavior is to allow current to flow or to block it, depending on a control voltage applied to a gate terminal. In digital circuits, transistors are driven into one of two distinct states: conducting (on) and non-conducting (off). The physical mechanism is the movement of charge carriers (electrons or holes) through a channel of semiconductor material that can be opened or closed by the electric field from the gate.

This on/off behavior is less a design whim than an engineering necessity. A silicon transistor can be held in intermediate conducting states (analog amplifiers depend on exactly this), but such states are hard to pin down precisely: small variations in temperature, manufacturing, and noise blur finely spaced levels into one another. Engineering a reliable three-state or ten-state switching element from silicon is extremely difficult; engineering a reliable two-state one is comparatively straightforward, because fully-on and fully-off are widely separated and easy to distinguish even through noise.

This physical reality dictated the choice of number system. If your hardware only reliably represents two states, you use a two-state number system. Binary (base-2, using only 0 and 1) maps perfectly onto on/off transistor states.

Counting in Binary

Binary works exactly like the decimal system we use for everyday arithmetic, except it uses powers of 2 instead of powers of 10. In decimal, the number 345 means 3 × 100 + 4 × 10 + 5 × 1. In binary, the number 1011 means 1 × 8 + 0 × 4 + 1 × 2 + 1 × 1, which equals 11 in decimal. Any integer can be represented in binary; it just takes more digits. The decimal number 255 requires the binary sequence 11111111: eight bits, or one byte.
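The place-value arithmetic above can be checked in a few lines of Python (an illustrative sketch, not part of the original article):

```python
# Expand binary 1011 into its place values, exactly as in the text:
# 1*8 + 0*4 + 1*2 + 1*1 = 11.
bits = "1011"
value = sum(int(b) * 2**i for i, b in enumerate(reversed(bits)))
print(value)           # 11
print(int("1011", 2))  # Python's built-in base-2 parser agrees: 11
print(bin(255))        # 0b11111111 -- eight bits, one byte
```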

A byte is the fundamental unit of digital information: eight bits, capable of representing 256 different values (2 to the power of 8). Text, in the ASCII standard, represents each letter and symbol as a specific byte value: the letter 'A' is 65 in decimal, or 01000001 in binary. An image is stored as a sequence of bytes representing the color of each pixel. Sound is stored as bytes representing the amplitude of the audio waveform at each time step. Every type of data, whether text, images, video, code, or databases, is ultimately a sequence of binary numbers.
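The ASCII mapping is easy to see for yourself. A short Python example (added here for illustration):

```python
# The letter 'A' is byte value 65, or 01000001 in binary.
code = ord("A")
print(code)                        # 65
print(format(code, "08b"))         # 01000001
# A whole string is just a sequence of such byte values:
print(list("Hi".encode("ascii")))  # [72, 105]
```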

From Binary to Logic Gates to Computation

The connection between transistors and computing is made through logic gates: circuits built from combinations of transistors that implement logical operations. An AND gate outputs 1 only if both its inputs are 1. An OR gate outputs 1 if either input is 1. A NOT gate outputs the opposite of its input. These three fundamental gates, combined in sufficiently complex circuits, can implement any logical or arithmetic operation imaginable.
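A minimal Python model of these gates (illustrative only; real gates are transistor circuits, not function calls). As a small demonstration of combining them, XOR, which is needed for addition, is built here using only AND, OR, and NOT:

```python
# The three fundamental gates, modeled on bits 0 and 1.
def AND(a, b): return a & b
def OR(a, b):  return a | b
def NOT(a):    return 1 - a

# XOR composed from the three primitives: (a OR b) AND NOT (a AND b).
def XOR(a, b): return AND(OR(a, b), NOT(AND(a, b)))

# Print the full truth table.
for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", AND(a, b), OR(a, b), XOR(a, b))
```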

An adder circuit, which adds two binary numbers, is built from logic gates. A multiplier is built from adders. A processor is built from millions or billions of such circuits operating in coordinated parallel. The extraordinary complexity of a modern processor performing trillions of operations per second is entirely constructed from variations on these simple logical primitives, all ultimately built from transistors that are either on or off.
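The claim that an adder is nothing but logic gates can be sketched as a toy model in Python (the gate functions stand in for transistor circuits; this is a ripple-carry design, one common adder among several):

```python
# Gate primitives, modeled on bits 0 and 1.
def AND(a, b): return a & b
def OR(a, b):  return a | b
def XOR(a, b): return a ^ b

def full_adder(a, b, carry_in):
    # One column of binary addition: two input bits plus a carry bit.
    s1 = XOR(a, b)
    sum_bit = XOR(s1, carry_in)
    carry_out = OR(AND(a, b), AND(s1, carry_in))
    return sum_bit, carry_out

def ripple_add(x, y, width=8):
    # Chain full adders, least significant bit first; the final carry
    # is dropped, so results wrap around at 2**width like real hardware.
    carry, result = 0, 0
    for i in range(width):
        a, b = (x >> i) & 1, (y >> i) & 1
        s, carry = full_adder(a, b, carry)
        result |= s << i
    return result

print(ripple_add(11, 5))  # 16
```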

Why Not Analog Computing?

Analog computers, which represent values as continuously varying voltages rather than discrete 0s and 1s, were common before digital computers became dominant. They are inherently faster for certain tasks, such as differential equations and signal processing, because they compute continuously rather than through discrete steps. But they suffer a fundamental weakness: analog signals degrade. Noise accumulates, precision is limited by physical tolerances, and results drift over time and temperature. Digital binary signals, by contrast, can be perfectly regenerated at each logic gate; a 0 or 1 is always refreshed to a clean 0 or a clean 1. This regenerative property makes digital systems arbitrarily scalable: you can chain billions of logic gates and the signal at the end is as clean as the signal at the beginning. That scalability is why digital computing conquered the world.
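The regenerative property can be illustrated with a toy Python model (an assumption-laden sketch: "voltage" is a float, noise is uniform and bounded, and the decision threshold is 0.5; real circuits are far messier):

```python
import random

def regenerate(voltage, threshold=0.5):
    # A gate snaps its input back to a clean logic level:
    # anything above the threshold becomes 1.0, anything below becomes 0.0.
    return 1.0 if voltage >= threshold else 0.0

def noisy_wire(voltage, noise=0.2):
    # Each wire segment adds bounded random noise to the signal.
    return voltage + random.uniform(-noise, noise)

signal = 1.0
for _ in range(10_000):  # a long chain of gates and noisy wires
    signal = regenerate(noisy_wire(signal))
print(signal)  # still exactly 1.0
```

An analog value passed through the same noisy chain without regeneration would accumulate the noise at every step; the binary signal is restored to a perfect 1.0 at each gate.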


FactOTD Editorial Team


The FactOTD editorial team researches and verifies every fact before publication. Our mission is to make learning effortless and accurate.

Related Articles

technology · Your Smartphone Is Millions of Times More Powerful Than Apollo's Computers
The computer that guided Apollo 11 to the Moon and back operated at 0.043 MHz and had 4 kilobytes of RAM. A mid-range smartphone in 2026 has a processor running at 3,000 MHz and 8 gigabytes of RAM. The comparison is so extreme it borders on the surreal, which makes it a perfect lens for understanding how radically computing has transformed in 55 years.

technology · Moore's Law: The Prophecy That Powered 50 Years of Computing
In 1965, Gordon Moore looked at four years of data points and extrapolated a trend line. His prediction that transistor density on microchips would double roughly every two years turned out to be one of the most accurate long-range forecasts in industrial history, reshaping civilization for half a century.

history · Nintendo Was Founded in 1889 to Make Playing Cards, 130 Years Before Mario
Nintendo is one of the world's most recognizable video game companies, but it existed for over 90 years before it made a single video game. Its story begins in 1889 in Kyoto, where a craftsman named Fusajiro Yamauchi began hand-painting playing cards, and from that modest origin grew one of the most consequential entertainment companies in history.

technology · Quantum Computing: Why Qubits Can Be Both 0 and 1 at the Same Time
Quantum computers do not compute faster by running the same algorithms with bigger numbers; they compute differently, exploiting quantum mechanical phenomena that have no analog in classical physics. The qubit's ability to exist in a superposition of states is the foundation of that difference, and understanding it requires a brief trip into the physics of the very small.