FactOTD

Moore's Law: The Prophecy That Powered 50 Years of Computing

March 28, 2026 · 4 min read

The Fact

Moore's Law, proposed in 1965, predicted that transistor counts on microchips would double roughly every two years, a trend that held for over 50 years.

In April 1965, Gordon Moore, then director of R&D at Fairchild Semiconductor, published a short paper in Electronics magazine in which he charted the number of transistors in integrated circuits from 1959 to 1965 and extrapolated the trend forward. The number had been doubling roughly every year, he observed, and he predicted this trend would continue for at least another decade. In 1975 he revised the pace to a doubling roughly every two years, the form in which the law is usually quoted today. He was right about the decade; what he underestimated was the staying power. In its revised form, the trend continued for five decades.

What a Transistor Is and Why Density Matters

A transistor is a semiconductor device that acts as a switch, controlling the flow of electrical current. In digital computing, transistors represent the binary digits 0 and 1, off and on, that underlie all computation. The more transistors you can fit on a chip, the more operations it can perform per second, and typically the more power-efficient those operations become, because smaller transistors switch faster and require less energy.

In 1965, a state-of-the-art microchip contained roughly 64 transistors. By 1971, the Intel 4004, the first commercially available microprocessor, had 2,300. By 1989, Intel's 486 processor had 1.2 million. By 2006, the Intel Core 2 Duo had 291 million. By 2019, Apple's A13 Bionic chip had 8.5 billion. Modern high-end chips contain well over 100 billion transistors in an area smaller than a fingernail.
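
Those milestones are enough to recover the doubling cadence directly. The sketch below (Python; the counts are simply the figures quoted above, and it assumes smooth exponential growth between milestones) computes the implied doubling time for each interval:

```python
import math

# (year, transistor count) milestones quoted above
milestones = [
    (1971, 2_300),         # Intel 4004
    (1989, 1_200_000),     # Intel 486
    (2006, 291_000_000),   # Intel Core 2 Duo
    (2019, 8_500_000_000), # Apple A13 Bionic
]

# If count2 = count1 * 2^(years / T), then T = years / log2(count2 / count1).
for (y1, c1), (y2, c2) in zip(milestones, milestones[1:]):
    doublings = math.log2(c2 / c1)
    t = (y2 - y1) / doublings
    print(f"{y1}-{y2}: doubling roughly every {t:.1f} years")
```

Run as written, it prints roughly 2.0 years for 1971-1989, 2.1 for 1989-2006, and 2.7 for 2006-2019: a cadence close to the two-year prediction, with a hint of the slowdown discussed below.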

This exponential growth meant that the computing power available at a given price point doubled approximately every two years for half a century. A computer that would have cost millions of dollars and occupied an entire room in 1970 could be surpassed by a device costing $5 by 2010. This relentless improvement made possible the personal computer revolution, the internet, smartphones, streaming video, artificial intelligence, and virtually every other digital technology that defines modern life.
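
That "$5 by 2010" claim can be sanity-checked with nothing more than the doubling rule. A back-of-the-envelope sketch, taking $5,000,000 as a stand-in figure for "millions of dollars" and assuming cost per unit of computing halves every two years:

```python
import math

# Stand-in figures: a $5,000,000 room-sized machine in 1970 matched by a
# $5 device, with cost per unit of computing halving every two years.
start_cost, end_cost = 5_000_000, 5
halvings = math.log2(start_cost / end_cost)  # log2(1,000,000) ≈ 19.9
years = 2 * halvings                         # ≈ 40 years
print(f"{halvings:.1f} halvings ≈ {years:.0f} years: 1970 + 40 = 2010")
```

The arithmetic lands almost exactly on the article's dates, which is the point: a factor of a million is only about twenty doublings away.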

Why It Worked So Long

Moore's prediction continued to hold not because of a single technology but because the semiconductor industry repeatedly reinvented how it built transistors to maintain the trend. When one approach began to hit physical limits, engineers developed new techniques, from simple scaling of existing designs to new materials, three-dimensional chip architectures, and novel transistor geometries, each extending the trend for another generation.

The manufacturing process is measured in nanometers, a number that historically tracked the typical feature size of transistors on a chip (in recent generations the node name has drifted into a marketing label, though the shrinkage it advertises is real). In 1970 the process node was 10,000 nanometers. By 2020 it had reached 5 nanometers, or 5 billionths of a meter: a span only about 20 silicon atoms wide. At these dimensions, quantum mechanical effects that were irrelevant at larger scales become significant engineering constraints. Electrons tunnel through barriers that classical physics would consider impenetrable. Heat dissipation becomes a severe challenge. Manufacturing variation at atomic scales creates reliability problems that require sophisticated error correction.
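
The "about 20 atoms" figure is simple division, assuming a silicon-silicon bond length of roughly 0.235 nanometers (the value in crystalline silicon; this sketch treats atoms as evenly spaced, which real lattices are not):

```python
feature_nm = 5.0    # the 2020 process-node figure quoted above
si_bond_nm = 0.235  # approximate Si-Si bond length in crystalline silicon
print(f"about {feature_nm / si_bond_nm:.0f} atoms across")  # ≈ 21
```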

The Slow Bending of the Curve

Around 2005-2010, the straightforward scaling of transistor size โ€” making them smaller and running them faster โ€” began to hit thermal limits. Chips were generating more heat than could be dissipated, creating the "power wall" that ended the era of dramatically increasing clock speeds. The industry shifted from making individual processors faster to putting multiple processor cores on a single chip, maintaining the transistor count doubling while changing the nature of the performance gains.
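
That shift changed what it takes for software to benefit: extra cores only speed up work that can be divided among them. A minimal illustration in Python (standard library only; work is a hypothetical stand-in for any CPU-bound task, and actual timings depend on the machine):

```python
import time
from concurrent.futures import ProcessPoolExecutor

def work(n: int) -> int:
    # Stand-in for a CPU-bound task: sum of squares up to n.
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    tasks = [2_000_000] * 8

    t0 = time.perf_counter()
    serial = [work(n) for n in tasks]      # one task at a time, one core
    t1 = time.perf_counter()

    with ProcessPoolExecutor() as pool:    # spreads tasks across cores
        parallel = list(pool.map(work, tasks))
    t2 = time.perf_counter()

    assert serial == parallel
    print(f"serial {t1 - t0:.2f}s  parallel {t2 - t1:.2f}s")
```

The parallel run approaches a speedup equal to the core count only because these tasks are fully independent; programs with unavoidable serial sections gain less, which is why multicore gains feel different from the old clock-speed gains.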

By the 2020s, Moore's Law had clearly slowed. Transistor density is still increasing, but the doubling period has stretched from roughly two years to closer to three or four, and the relationship between transistor count and practical computing performance has become more complex as other bottlenecks (memory bandwidth, interconnect speed, software efficiency) increasingly limit what raw transistor counts can achieve.
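
What that stretching means in practice is easy to quantify: the compounding factor over a fixed horizon falls quickly as the doubling period lengthens (a sketch of the arithmetic, not a forecast):

```python
# Compounded density growth over one decade for different doubling periods.
for t_double in (2, 3, 4):
    factor = 2 ** (10 / t_double)
    print(f"doubling every {t_double} years -> {factor:.1f}x per decade")
# every 2 years -> 32.0x; every 3 -> 10.1x; every 4 -> 5.7x
```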

Gordon Moore himself acknowledged before his death in 2023 that the physical limits of silicon were real and that the era of exponential transistor scaling was approaching its end. The industry is now exploring entirely new computational paradigms, such as quantum computing, neuromorphic chips, and photonic processors, to preserve the spirit of continuous improvement even as the literal transistor-doubling mechanism approaches its limits.


FactOTD Editorial Team


The FactOTD editorial team researches and verifies every fact before publication. Our mission is to make learning effortless and accurate. Learn about our process →

Related Articles

Why All Computing Is Built on Just Two Numbers (technology)
Every email, every photograph, every video game, every financial transaction, and every AI response is ultimately expressed as a sequence of 0s and 1s. This is not an arbitrary choice; it is a physical necessity imposed by the most fundamental behavior of the semiconductor devices that power all digital computing.

A 'Jiffy' Is a Real Unit of Time in Computer Science, Not Just an Expression (technology)
The phrase 'in a jiffy' implies something happening very fast, and in computer science that casual expression has been formalized into a precise technical unit representing a single cycle of a computer's system clock.

Ada Lovelace: The Mathematician Who Invented Computer Programming in 1843 (technology)
In 1843, a woman translated an Italian paper about a machine that had never been built, then added notes three times as long as the original, including the first algorithm ever designed to be executed by a computer. The machine was never completed. The algorithm was correct.

Your Smartphone Is Millions of Times More Powerful Than Apollo's Computers (technology)
The computer that guided Apollo 11 to the Moon and back operated at 0.043 MHz and had 4 kilobytes of RAM. A mid-range smartphone in 2026 has a processor running at 3,000 MHz and 8 gigabytes of RAM. The comparison is so extreme it borders on the surreal, which makes it a perfect lens for understanding how radically computing has transformed in 55 years.