Computing
Fun computing facts to improve your knowledge and get better at trivia.
Ada Lovelace Wrote the First Computer Program in 1843 — For a Machine That Wasn't Built
Ada Lovelace wrote the world's first computer program in 1843 for a machine that was never completed. Her vision of what computing could become was so far ahead of its time that it wasn't fully appreciated for over a century.
Nintendo Was Founded in 1889 to Make Playing Cards — Nearly a Century Before Mario
Nintendo is one of the world's most recognizable video game companies, but it existed for nearly 90 years before it made a single video game. Its story begins in 1889 in Kyoto, where a craftsman named Fusajiro Yamauchi began hand-painting playing cards — and from that modest origin grew one of the most consequential entertainment companies in history.
A 'Jiffy' Is a Real Unit of Time in Computer Science — Not Just an Expression
The phrase 'in a jiffy' implies something happening very fast — and in computer science, that casual expression has been formalized into a precise technical unit: one tick of the operating system's timer interrupt, typically a few milliseconds.
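In the Linux kernel, for instance, a jiffy is 1/HZ seconds, where HZ is the kernel's configured timer-interrupt frequency. A minimal Python sketch of the conversion (HZ = 250 is an assumed, though common, configuration):

```python
# A jiffy is one tick of the kernel timer: 1/HZ seconds.
# HZ varies by kernel build; 250 is assumed here for illustration.
HZ = 250

def jiffies_to_seconds(jiffies: int) -> float:
    """Convert a count of timer ticks into seconds."""
    return jiffies / HZ

print(jiffies_to_seconds(250))  # 250 ticks at HZ=250 is exactly 1.0 second
```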
The Word 'Robot' Was Invented in a 1920 Play — And It Already Imagined the AI Problem
Before robots existed in any physical form, a Czech playwright named Karel Čapek gave them their name — and their defining narrative: artificial beings who do humanity's labor, grow conscious of their condition, and eventually rise against their creators.
Before Copy-Paste Existed: How the Apollo Moon Landing Code Was Written by Hand
Long before integrated development environments or version control systems existed, the software that guided astronauts to the Moon was drafted on paper by teams of engineers working under extraordinary pressure. The story of Apollo's software is one of human ingenuity operating at the very edge of what was technically possible.
IBM's 1980 Hard Drive: 500 Pounds, $40,000, and One Gigabyte of Storage
In 1980, IBM shipped the world's first gigabyte-capacity hard drive. The IBM 3380 weighed more than 500 pounds, required a refrigerator-sized cabinet, and carried a price tag of $40,000. Today, the same capacity fits on a chip smaller than a fingernail.
TYPEWRITER: The One Word That Lives Entirely on the Top Row of Your Keyboard
The word TYPEWRITER uses only the letters Q, W, E, R, T, Y, U, I, O, and P — the exact keys on the top row of a QWERTY keyboard. Depending on whom you ask, this is a clever design choice, a remarkable coincidence, or a very useful sales demo trick.
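The claim is easy to check mechanically; a quick Python sketch:

```python
# Letters on the top row of a QWERTY keyboard.
TOP_ROW = set("QWERTYUIOP")

def uses_only_top_row(word: str) -> bool:
    """True if every letter of the word sits on the QWERTY top row."""
    return set(word.upper()) <= TOP_ROW

print(uses_only_top_row("typewriter"))  # True
print(uses_only_top_row("keyboard"))    # False
```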
Google Was Almost Called 'Backrub' — The Naming Story Behind the World's Biggest Search Engine
Before Google became a verb, a noun, and arguably the most recognized brand on Earth, it operated under the unglamorous name 'Backrub.' Understanding why reveals something important about how the early internet worked — and how close we came to a very different technological era.
The First VCR Was the Size of a Piano — And Cost More Than a House
In 1956, Ampex Corporation unveiled a machine that could record and play back television video. It weighed 750 pounds, stood as tall as a piano, and cost $50,000 — roughly half a million dollars in today's money. It also changed the world.
The Wooden Mouse: Doug Engelbart's 1964 Invention That Redefined How We Interact with Computers
In 1964, Doug Engelbart carved a small wooden box with two perpendicular wheels on its underside and a single button on top. He called it an 'X-Y position indicator for a display system.' The world would eventually call it a mouse.
Mark Twain's Typewritten Manuscript: How America's Greatest Writer Embraced New Technology
Mark Twain purchased one of the first Remington typewriters available to the public in the 1870s, and 'Life on the Mississippi,' published in 1883, is widely cited as the first book-length manuscript delivered to a publisher in typed rather than handwritten form.
The World's First Webcam Was Watching a Coffee Pot — The Full Story
The Trojan Room Coffee Pot camera, installed in the Cambridge University Computer Laboratory in 1991, was the world's first webcam. It was built for the most human of reasons: computer scientists were tired of making the trip to the kitchen only to find an empty coffee pot.
Moore's Law: The Prophecy That Powered 50 Years of Computing
In 1965, Gordon Moore looked at four years of data points and extrapolated a trend line. His prediction — that transistor density on microchips would double roughly every two years — turned out to be one of the most accurate long-range forecasts in industrial history, reshaping civilization for half a century.
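The power of that compounding is easy to underestimate. A small sketch of the arithmetic, starting from the Intel 4004's 2,300 transistors in 1971 (the result is a pure extrapolation of the trend, not a measured count):

```python
def projected_transistors(start_count: int, start_year: int,
                          year: int, doubling_period: float = 2.0) -> float:
    """Extrapolate a transistor count, assuming a doubling every
    `doubling_period` years (Moore's observation)."""
    doublings = (year - start_year) / doubling_period
    return start_count * 2 ** doublings

# 50 years of doubling every two years is 25 doublings.
print(f"{projected_transistors(2300, 1971, 2021):,.0f}")  # 77,175,193,600
```

Twenty-five doublings turn 2,300 into roughly 77 billion, which is in the same range as the largest chips actually shipping by the early 2020s.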
Why Your Keyboard Isn't Alphabetical: The QWERTY Origin Story
Every time you type on a keyboard, you are using a layout designed in the 1870s for a machine that no longer exists, to solve a mechanical problem that modern technology eliminated decades ago. QWERTY persists not because it is optimal, but because of one of the most powerful forces in human behavior: the cost of changing something that already works well enough.
SQL: The 50-Year-Old Language Still Running the World's Data
In the early 1970s, IBM researchers developed a language for querying databases that was so well-designed it has outlasted generations of programming languages, computing paradigms, and technology revolutions. SQL — Structured Query Language — is still how the world talks to its most important data.
Python Was Named After Monty Python, Not a Snake — Here's Why That Matters
In December 1989, a Dutch programmer named Guido van Rossum started a new programming language as a holiday project. He wanted it to be fun to use, and he named it after the British comedy group Monty Python. Thirty-five years later, Python is the most widely used programming language on Earth.
Linux: How a Hobby Project Became the Software Powering the World
In August 1991, a 21-year-old Finnish computer science student posted a message to an internet newsgroup describing the free operating system he was writing as 'just a hobby, won't be big and professional.' That casual announcement introduced Linux to the world — a kernel that now runs more than 90% of the world's servers, all Android smartphones, and every supercomputer on the TOP500 list.
ENIAC: The 30-Ton Computer That Launched the Digital Age
In 1945, a machine filled an entire room, weighed 30 tons — as much as a fully loaded semi-truck — and drew so much power it reportedly dimmed the lights of an entire Philadelphia neighborhood when switched on. It was called ENIAC — and it could perform 5,000 additions per second, making it the fastest calculator in the world.
Ada Lovelace: The Mathematician Who Invented Computer Programming in 1843
In 1843, a woman translated an Italian paper about a machine that had never been built, then added notes three times as long as the original — including the first algorithm ever designed to be executed by a computer. The machine was never completed. The algorithm was correct.
Why Your Computer Forgets Everything When You Turn It Off
Every time you shut down your computer without saving, you lose your work. This is not a design flaw or a software failure — it is a direct consequence of the physical mechanism that makes RAM fast enough to be useful in the first place. Understanding why RAM is volatile reveals something fundamental about the tradeoffs at the heart of computer architecture.
The Altair 8800: The Kit Computer That Sparked the Personal Computer Revolution
In January 1975, a mail-order electronics kit appeared on the cover of Popular Electronics magazine. It had no keyboard, no monitor, no storage, and no software. Programming it meant flipping switches on the front panel. It sold out immediately, inspired Bill Gates to drop out of Harvard and write software for it with Paul Allen, and started a revolution.
The Original Computer Bug Was an Actual Bug: The Story Behind the Term
Software engineers spend a substantial portion of their professional lives hunting bugs. The word's origin in computing can be traced to a specific afternoon in September 1947, when a technician at Harvard found a moth caught between the relay contacts of the Mark II computer — and taped it into the logbook with the note 'First actual case of bug being found.'
The Intel 4004: From 2,300 Transistors to 100 Billion in 50 Years
The Intel 4004, released in November 1971, was designed to power a Japanese desktop calculator. It had 2,300 transistors and could perform about 92,000 operations per second. It was also the ancestor of every microprocessor in every computer, smartphone, and data center on Earth today.
Creeper: The World's First Computer Virus Just Wanted to Play Tag
In 1971, a programmer wrote a program that copied itself across ARPANET computers and displayed a taunting message. It was not malicious — it was more of an experiment, perhaps even a prank. But Creeper was the first self-replicating program in computing history, and it inadvertently launched an arms race that has never stopped.
The IBM 350: The First Hard Drive Weighed a Ton and Stored 5 MB
In 1956, IBM introduced a data storage device that weighed over a ton, occupied the space of two large refrigerators, required its own air compressor, and stored five megabytes of data. Renting it cost around $3,200 per month. Today, a device 10,000 times smaller stores 10 million times more data.
The First Touchscreen Was Built in 1965 — 40 Years Before the iPhone
Touch interfaces feel like a natural, intuitive way to interact with technology. What feels natural took over 40 years to move from a defense research lab in rural England to the pocket of every smartphone user on the planet — a journey through military technology, ATMs, musical instruments, and eventually the consumer electronics revolution.
Quantum Computing: Why Qubits Can Be Both 0 and 1 at the Same Time
Quantum computers do not compute faster by running the same algorithms with bigger numbers — they compute differently, exploiting quantum mechanical phenomena that have no analog in classical physics. The qubit's ability to exist in a superposition of states is the foundation of that difference, and understanding it requires a brief trip into the physics of the very small.
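In standard Dirac notation, that superposition is usually written as a weighted sum of the two basis states:

```latex
% A qubit is a unit vector in a two-dimensional complex space:
%   |psi> = alpha|0> + beta|1>
\lvert \psi \rangle = \alpha \lvert 0 \rangle + \beta \lvert 1 \rangle,
\qquad \alpha, \beta \in \mathbb{C},
\qquad \lvert \alpha \rvert^{2} + \lvert \beta \rvert^{2} = 1
```

Measuring the qubit yields 0 with probability |α|² and 1 with probability |β|², which is why the two amplitudes must satisfy the normalization condition above.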
Why All Computing Is Built on Just Two Numbers
Every email, every photograph, every video game, every financial transaction, and every AI response is ultimately expressed as a sequence of 0s and 1s. This is not an arbitrary choice — it is a physical necessity imposed by the most fundamental behavior of the semiconductor devices that power all digital computing.
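Concretely, even ordinary text is stored this way: each ASCII character occupies one byte, eight bits. A short Python sketch:

```python
def to_bits(text: str) -> str:
    """Render each character of an ASCII string as 8 binary digits."""
    return " ".join(format(ord(ch), "08b") for ch in text)

print(to_bits("Hi"))  # 01001000 01101001
```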
Cloud Computing Was Predicted in 1961 — Long Before the Internet Existed
In April 1961, at MIT's centennial celebration, a mathematician named John McCarthy suggested that 'computation may someday be organized as a public utility.' He made this prediction a decade before the internet, three decades before the World Wide Web, and four decades before Amazon Web Services would make it a commercial reality. The idea was so far ahead of its time that it had to be reinvented independently by a new generation of engineers before the technology could support it.
How Linus Torvalds Built Git in Two Weeks — and Why It Conquered Software Development
In April 2005, Linus Torvalds sat down to solve a specific problem: the Linux kernel project needed a new version control system after the one it had been using became unavailable under acceptable terms. He gave himself one week to have something working. He had a usable system in about two weeks. He named it Git — British slang for an unpleasant person — and it has since become the universal foundation of modern software development.
Your Smartphone Is Millions of Times More Powerful Than Apollo's Computers
The computer that guided Apollo 11 to the Moon and back operated at 0.043 MHz and had 4 kilobytes of RAM. A mid-range smartphone in 2026 has a processor running at 3,000 MHz and 8 gigabytes of RAM. The comparison is so extreme it borders on the surreal — which makes it a perfect lens for understanding how radically computing has transformed in under six decades.
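Using the figures above, the ratios work out as follows (a back-of-the-envelope sketch; clock speed alone actually understates the gap, since it ignores architecture, core counts, and instructions per cycle):

```python
# Figures from the paragraph above.
agc_clock_mhz = 0.043          # Apollo Guidance Computer, effective clock
agc_ram_bytes = 4 * 1024       # 4 KB of RAM

phone_clock_mhz = 3000.0       # 3 GHz mid-range smartphone
phone_ram_bytes = 8 * 1024**3  # 8 GB of RAM

print(round(phone_clock_mhz / agc_clock_mhz))  # ~70,000x the clock speed
print(phone_ram_bytes // agc_ram_bytes)        # 2,097,152x the RAM
```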
Computing — Frequently Asked Questions
Did you know that the first computer programmer was a woman named Ada Lovelace, who wrote an algorithm for the Analytical Engine in 1843?
The first computer programmer was a woman named Ada Lovelace, who wrote an algorithm for the Analytical Engine in 1843. Source: The British Library
Did you know that Nintendo was founded in 1889 as a company that produced handmade playing cards?
Nintendo was founded in 1889 as a company that produced handmade playing cards. Source: Nintendo Co., Ltd.
Did you know that a 'jiffy' is an actual unit of time used in computer science, equal to one cycle of a computer's system clock?
A 'jiffy' is an actual unit of time used in computer science, equal to one cycle of a computer's system clock. Source: NIST
Did you know that the term 'Robot' was first used in a 1920 play called R.U.R. by Czech writer Karel Čapek?
The term 'Robot' was first used in a 1920 play called R.U.R. by Czech writer Karel Čapek. Source: The New Yorker
Did you know that the code for the first Apollo moon landing was written by hand on paper before being typed into the computer?
The code for the first Apollo moon landing was written by hand on paper before being typed into the computer. Source: Smithsonian
Did you know that the first 1GB hard drive, released by IBM in 1980, weighed over 500 pounds and cost $40,000?
The first 1GB hard drive, released by IBM in 1980, weighed over 500 pounds and cost $40,000. Source: IBM Museum
Did you know that the word 'Typewriter' can be typed using only the top row of keys on a QWERTY keyboard?
The word 'Typewriter' can be typed using only the top row of keys on a QWERTY keyboard. Source: Oxford English Dictionary
Did you know that the original name of 'Google' was 'Backrub'?
The original name of 'Google' was 'Backrub'. Source: Google History