FactOTD

The Intel 4004: From 2,300 Transistors to 100 Billion in 50 Years

March 28, 2026 · 4 min read

The Fact

The first commercially available microprocessor, the Intel 4004, released in 1971, had 2,300 transistors. Modern chips have over 100 billion.

In 1969, a Japanese company called Busicom approached Intel with a commission: design a set of chips for a new desktop calculator. Intel engineer Marcian "Ted" Hoff looked at the proposed design and suggested something radical: instead of building a set of fixed-function chips optimized for the calculator, build a single general-purpose processing chip that could be programmed for any application. The calculator would be programmed with calculator instructions, but the chip itself would not be a calculator. It would be a computer.

That decision, negotiated over months and eventually resulting in Intel buying back the design rights from Busicom for $60,000, produced the Intel 4004, released to the general market on November 15, 1971. It was the world's first commercially available microprocessor: all the core components of a computer's central processing unit implemented on a single chip.

What 2,300 Transistors Could Do

The 4004 was a 4-bit processor, meaning it processed data in chunks of 4 bits (one nibble) at a time. It ran at a clock speed of 740 kilohertz, about 740,000 cycles per second, and could perform approximately 92,000 simple arithmetic operations per second, working out to roughly eight clock cycles per instruction. It was manufactured on a 10-micron process, meaning the smallest features on the chip were 10 micrometers (10,000 nanometers) wide. The die itself measured 3.8 x 2.8 millimeters, smaller than a thumbnail.
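Working in 4-bit chunks means any value wider than one nibble has to be handled one nibble at a time, propagating a carry between passes through the ALU. A minimal sketch of that idea (the function names here are illustrative, not the 4004's actual instruction mnemonics):

```python
# Sketch of 4-bit (nibble) arithmetic, the style of math a 4-bit CPU
# like the 4004 performs. Helper names are hypothetical illustrations.

def add_nibbles(a: int, b: int, carry_in: int = 0):
    """Add two 4-bit values; return (4-bit sum, carry out)."""
    total = (a & 0xF) + (b & 0xF) + carry_in
    return total & 0xF, total >> 4

def add_wide(xs, ys):
    """Add two numbers stored as nibble lists, least significant first,
    one nibble per pass - how a 4-bit ALU handles wider numbers."""
    carry, out = 0, []
    for a, b in zip(xs, ys):
        s, carry = add_nibbles(a, b, carry)
        out.append(s)
    if carry:
        out.append(carry)
    return out

# 0x7A + 0x96 = 0x110, nibbles stored least significant first
print(add_wide([0xA, 0x7], [0x6, 0x9]))  # [0, 1, 1]
```

Each call to `add_nibbles` corresponds to one trip through the 4-bit adder; an 8-bit addition costs two trips, which is part of why 8-bit successors like the 8008 were a meaningful step up.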

In context, this was astonishing. Just two years earlier, the Apollo Guidance Computer that helped navigate the Apollo 11 mission to the Moon used integrated circuits containing only a handful of transistors each, and its entire construction required thousands of such chips wired together in a custom assembly. The 4004 put the equivalent of a basic computer's logic onto a single die that could be manufactured in quantity and sold for roughly $300.

For Busicom's calculator application, the 4004 was ideal. But Hoff's original insight, that a programmable general-purpose chip could be used for anything, proved prophetic. Within a year, Intel's Federico Faggin and his team had developed the 8008, a more capable 8-bit processor. By 1974 came the 8080. By 1978 came the 8086, whose instruction set architecture forms the basis of the x86 architecture used in virtually all personal computers and servers today, half a century later.

The Journey to 100 Billion Transistors

The progression from 2,300 transistors in 1971 to over 100 billion today traces almost exactly the curve predicted by Moore's Law. A brief tour of milestones makes the scale of the achievement visceral: Intel's 80386 in 1985 had 275,000 transistors. The Pentium Pro in 1995 had 5.5 million. The Core 2 Duo in 2006 had 291 million. By 2012, Intel's Ivy Bridge reached 1.4 billion. Apple's M1 chip in 2020 contained 16 billion. NVIDIA's H100 GPU, used in AI computing, contains 80 billion. Recent generations exceed 100 billion.
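A back-of-envelope check shows how closely these milestones track the classic formulation of Moore's Law. Assuming the often-quoted two-year doubling period (an assumption; Moore's original 1965 observation was a doubling every year, later revised):

```python
# Back-of-envelope Moore's Law check.
# Assumption: transistor counts double every 2 years.
start_transistors = 2_300        # Intel 4004, 1971
years = 2021 - 1971              # a 50-year span
doublings = years / 2            # 25 doublings at a 2-year cadence
predicted = start_transistors * 2 ** doublings
print(f"{predicted:,.0f}")       # 77,175,193,600
```

Twenty-five doublings of 2,300 lands at about 77 billion, the same order of magnitude as the 80 to 100+ billion transistors in recent chips, which is why the 4004-to-present progression is often described as tracing Moore's curve "almost exactly."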

Each doubling required not just making transistors smaller but solving new engineering problems: heat dissipation, quantum tunneling, photolithography beyond the resolution limits of visible light, dopant atom distribution at scales where individual atoms matter. Modern chips are fabricated using extreme ultraviolet lithography, which uses light with a wavelength of 13.5 nanometers, far shorter than visible light and bordering on the X-ray band, to pattern features on silicon with nanometer-scale precision.

Why the 4004 Was a Threshold

What made the 4004 historically significant was not its performance; it was modest even by the standards of 1971 minicomputers. It was the concept: a complete processor on a single mass-producible chip, programmable for any purpose, available to any customer willing to pay a few hundred dollars. That concept meant that computing intelligence could be embedded in any product, any device, any application imaginable. The programmable microprocessor is what made the personal computer possible, which made the internet possible, which made smartphones possible. The entire digital economy traces its lineage to the decision Ted Hoff made in 1969 to consolidate a calculator's logic onto a single general-purpose chip.


FactOTD Editorial Team


The FactOTD editorial team researches and verifies every fact before publication. Our mission is to make learning effortless and accurate.

Related Articles

technology · The First Internet Message Was 'LO' — Because the System Crashed After Two Letters
The first message ever sent over the network that would become the internet was supposed to be 'LOGIN.' Instead it was 'LO' — because the receiving computer crashed after two characters. The accidental poetry of that truncated greeting, inadvertently echoing 'hello' or 'lo and behold,' seems fitting for the birth of the technology that would eventually connect most of humanity.

technology · Why All Computing Is Built on Just Two Numbers
Every email, every photograph, every video game, every financial transaction, and every AI response is ultimately expressed as a sequence of 0s and 1s. This is not an arbitrary choice — it is a physical necessity imposed by the most fundamental behavior of the semiconductor devices that power all digital computing.

technology · Cloud Computing Was Predicted in 1961 — Long Before the Internet Existed
In April 1961, at MIT's centennial celebration, a mathematician named John McCarthy suggested that 'computation may someday be organized as a public utility.' He made this prediction a decade before the internet, three decades before the World Wide Web, and four decades before Amazon Web Services would make it a commercial reality. The idea was so far ahead of its time that it had to be reinvented independently by a new generation of engineers before the technology could support it.

technology · ENIAC: The 30-Ton Computer That Launched the Digital Age
In 1945, a machine filled an entire room, weighed as much as a loaded semi-truck fleet, and drew so much power it reportedly dimmed the lights of an entire Philadelphia neighborhood when switched on. It was called ENIAC — and it could perform 5,000 additions per second, making it the fastest calculator in the world.