FactOTD

Your Smartphone Is Millions of Times More Powerful Than Apollo's Computers

March 28, 2026 · 4 min read

The Fact

Modern smartphones have far more computing power than the computers used in NASA's Apollo moon-landing missions.

When Apollo 11 descended toward the lunar surface on July 20, 1969, its navigation was managed by a computer that had been designed specifically for the mission and was, at the time, one of the most advanced computing devices ever built for a real-time control application. The Apollo Guidance Computer weighed about 32 kilograms, consumed 55 watts of power, and ran at a clock speed of approximately 2.048 MHz. It had 4 kilobytes of erasable memory (RAM) and 72 kilobytes of read-only memory. It was remarkable engineering for its era. It is also less powerful than a cheap digital watch today.

The Apollo Guidance Computer: What It Had to Do

The AGC, designed at MIT's Instrumentation Laboratory under the direction of Charles Stark Draper, was required to perform real-time navigation calculations: continuously tracking the spacecraft's position and velocity, computing course corrections, and managing the timing of engine burns, all while monitoring dozens of other systems and responding to crew inputs. It did this with hand-wired magnetic-core memory and early integrated circuits custom-designed for the application.

The AGC's 4 kilobytes of RAM held temporary calculations; its 72 kilobytes of ROM held the mission software, permanently woven into magnetic cores by a process so delicate it was done by hand. Individual wires were threaded through tiny magnetic rings by teams of skilled workers, mostly women, at Raytheon. The software itself, including the famous alarm-handling code that allowed the mission to continue when an overload alarm triggered during the lunar descent, was a masterpiece of constrained programming.

For all its ingenuity, the AGC would be outperformed by a programmable calculator bought for $50 at a pharmacy today.

The Scale of the Difference

A mid-range smartphone processor in 2026 runs at clock speeds of roughly 3,000 MHz, about 1,500 times faster than the AGC's 2 MHz. The comparison on RAM is even more dramatic: 8 gigabytes versus 4 kilobytes is a ratio of 2,097,152, over two million times more. Processing power measured in FLOPS (floating-point operations per second) shows a similar picture: the AGC managed roughly 14,000 operations per second, while a modern smartphone processor can exceed 1 trillion floating-point operations per second under the right workload. That is approximately 70 million times more.

Storage comparisons are similarly mind-bending. The AGC's 76 kilobytes total versus the 256 gigabytes of flash storage in a modern smartphone represents a ratio of roughly 3.4 million. A single high-resolution photograph taken on a smartphone contains more data than the entire memory capacity of the computer that sent humans to the Moon.
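The ratios above are simple arithmetic, and they are easy to check. The sketch below recomputes each comparison from the article's own approximate figures; the smartphone numbers are the round estimates quoted above, not benchmarks of any particular device.

```python
# Recompute the article's comparisons from its approximate figures.
KB = 1024
GB = 1024 ** 3

agc_clock_hz = 2.048e6        # AGC clock, ~2 MHz
phone_clock_hz = 3.0e9        # mid-range 2026 phone, ~3,000 MHz

agc_ram_bytes = 4 * KB        # erasable (RAM)
phone_ram_bytes = 8 * GB

agc_ops_per_sec = 14_000      # rough AGC instruction throughput
phone_flops = 1e12            # ~1 TFLOPS peak on a modern phone

agc_storage_bytes = 76_000    # 4 KB RAM + 72 KB ROM, decimal approximation
phone_storage_bytes = 256e9   # 256 GB flash (decimal, as marketed)

print(f"clock:   {phone_clock_hz / agc_clock_hz:,.0f}x")            # ~1,465x
print(f"RAM:     {phone_ram_bytes / agc_ram_bytes:,.0f}x")          # 2,097,152x
print(f"compute: {phone_flops / agc_ops_per_sec:,.0f}x")            # ~71 million x
print(f"storage: {phone_storage_bytes / agc_storage_bytes:,.0f}x")  # ~3.4 million x
```

Note that the exact storage ratio depends on whether kilobytes and gigabytes are counted in binary or decimal units; the "roughly 3.4 million" figure above uses decimal, as storage vendors do.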

What Made the Apollo Achievement So Extraordinary

The paradox of this comparison is that it makes the Apollo achievement more impressive, not less. The engineers who sent humans to the Moon did so with computing resources that are today considered insufficient for a digital wristwatch. They compensated through extraordinary software efficiency: every byte of memory was precious, every processor cycle was budgeted, and the code was tested to exhaustion.

The AGC's software included priority scheduling, error detection and recovery, and real-time response guarantees that modern embedded systems engineers still study and respect. When the 1202 alarm triggered during the final Apollo 11 descent, indicating that the computer was overloaded with navigation requests, the AGC's operating system correctly identified it as a recoverable overflow condition and restarted the critical navigation tasks rather than crashing the system. That behavior was not accidental: it was deliberately designed, carefully tested, and it saved the mission.
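The AGC's actual Executive was written in assembly for custom hardware, but the recovery pattern it used can be illustrated in a few lines. The sketch below is hypothetical (the `Scheduler` class, the `Overload` exception, and the job functions are all invented for illustration): a fixed-size priority job table that, when it overflows, is cleared and re-seeded with only the critical tasks, which is the essence of the restart behavior described above.

```python
# Illustrative sketch, NOT AGC code: a tiny priority scheduler that
# drops its backlog and restarts only critical jobs on overload.
import heapq

class Overload(Exception):
    """Raised when the job table is full, analogous to a 1202 alarm."""

class Scheduler:
    MAX_JOBS = 7  # the real Executive had room for roughly 7 job slots

    def __init__(self):
        self.jobs = []  # min-heap of (priority, name, fn); 0 = most critical

    def submit(self, priority, name, fn):
        if len(self.jobs) >= self.MAX_JOBS:
            raise Overload(f"job table full while adding {name!r}")
        heapq.heappush(self.jobs, (priority, name, fn))

    def run(self):
        while self.jobs:
            _, _, fn = heapq.heappop(self.jobs)
            fn()

def restart_critical(sched, critical_jobs):
    """Software restart: discard everything, re-queue only critical work."""
    sched.jobs.clear()
    for priority, name, fn in critical_jobs:
        sched.submit(priority, name, fn)

# Usage: flood the scheduler until it overloads, then recover.
log = []
critical = [(0, "navigation", lambda: log.append("navigation")),
            (1, "engine-burn", lambda: log.append("engine-burn"))]

sched = Scheduler()
try:
    for i in range(20):  # low-priority requests pile up faster than they run
        sched.submit(5, f"radar-{i}", lambda: None)
except Overload:
    restart_critical(sched, critical)  # keep flying, drop the backlog

sched.run()
print(log)  # ['navigation', 'engine-burn']
```

The key design point, in the sketch as in the real system, is that the overflow is detected and handled rather than allowed to corrupt state: the response to "too much work" is a clean restart of what matters most.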

The lesson of the Apollo computer is that raw computing power matters less than how intelligently it is applied. The engineers of 1969 accomplished with 4 kilobytes what many modern software projects fail to accomplish with gigabytes: proof that constraints, rather than limiting achievement, often force the creativity that makes it possible.


FactOTD Editorial Team


The FactOTD editorial team researches and verifies every fact before publication.
