Technology
Fun technology facts to sharpen your knowledge and your trivia game. Use these to win quiz nights and always have an interesting fact to share.
Ada Lovelace Wrote the First Computer Program in 1843 — For a Machine That Wasn't Built
Ada Lovelace wrote the world's first computer program in 1843 for a machine that was never completed. Her vision of what computing could become was so far ahead of its time that it wasn't fully appreciated for over a century.
The Only Alarm in History That Could Wake You at 4 A.M.: The World's First Alarm Clock
The world's first alarm clock, built by Levi Hutchins of New Hampshire in 1787, could only ring at one time: 4:00 in the morning. Hutchins built it to wake himself for work; it rang at 4 a.m. simply because that was when he wanted to get up, and he never intended anyone else to use it.
Larry the Bird: How an NBA Legend Accidentally Became Twitter's Mascot
The bird logo that became one of the most recognized symbols on the internet is officially named Larry. Twitter co-founder Biz Stone gave it the name as a tribute to Larry Bird, the NBA Hall of Famer who played for the Boston Celtics.
Octothorpe: The Real Name of the # Symbol and the Debate Over Its Origin
The # symbol has been called many things: the number sign, the pound sign, the hash, and since 2007, the hashtag. Its most obscure name, octothorpe, was coined by engineers at Bell Telephone Laboratories in the 1960s, though exactly who coined it, and why, remains disputed.
Nintendo Was Founded in 1889 to Make Playing Cards — Nearly a Century Before Mario
Nintendo is one of the world's most recognizable video game companies, but it existed for the better part of a century before it made a single video game. Its story begins in 1889 in Kyoto, where a craftsman named Fusajiro Yamauchi began hand-painting playing cards — and from that modest origin grew one of the most consequential entertainment companies in history.
A 'Jiffy' Is a Real Unit of Time in Computer Science — Not Just an Expression
The phrase 'in a jiffy' implies something happening very fast — and in computer science, that casual expression has been formalized into a technical unit: one tick of a computer's system clock, typically a matter of milliseconds.
Leo Fender Invented the World's Most Iconic Guitars and Never Learned to Play One
The man who built the instruments that shaped rock and roll, country, blues, and jazz never played a chord. Leo Fender's story is one of the great paradoxes in the history of music — and a testament to the idea that understanding users can matter more than being one.
The Word 'Robot' Was Invented in a 1920 Play — And It Already Imagined the AI Problem
Before robots existed in any physical form, a Czech playwright named Karel Čapek gave them their name — and their defining narrative: artificial beings who do humanity's labor, grow conscious of their condition, and eventually rise against their creators.
Amazon's First Sale: The Obscure Academic Book That Launched an E-Commerce Empire
When Amazon.com opened for business in 1995, the very first book a customer purchased was not a bestseller or a popular novel — it was an academic volume on artificial intelligence and cognitive science by Douglas Hofstadter.
Before Copy-Paste Existed: How the Apollo Moon Landing Code Was Written by Hand
Long before integrated development environments or version control systems existed, the software that guided astronauts to the Moon was drafted on paper by teams of engineers working under extraordinary pressure. The story of Apollo's software is one of human ingenuity operating at the very edge of what was technically possible.
McDonald's Made Bubblegum-Flavored Broccoli — and Kids Hated It Anyway
At some point in McDonald's research and development history, someone sat down and proposed making broccoli taste like bubblegum. The resulting product never reached menus, but the story of why it was tried — and why it failed — says something interesting about nutrition, child psychology, and the limits of food technology.
IBM's 1980 Hard Drive: 500 Pounds, $40,000, and One Gigabyte of Storage
In 1980, IBM shipped the world's first gigabyte-capacity hard drive. The IBM 3380 weighed more than 500 pounds, required a refrigerator-sized cabinet, and carried a price tag of $40,000. Today, the same capacity fits on a chip smaller than a fingernail.
One Pencil, 35 Miles: The Remarkable Engineering Hidden in a Simple Writing Tool
The humble pencil is one of the most efficient writing instruments ever devised. A single standard pencil contains enough graphite to draw a continuous line stretching 35 miles or produce roughly 45,000 words before the core is exhausted.
The Eiffel Tower Was Meant to Be Torn Down — A Radio Antenna Saved It
When Gustave Eiffel completed his iron tower for the 1889 Paris World's Fair, the agreement was clear: it would stand for twenty years and then be dismantled. The structure that would become the world's most visited monument was nearly destroyed on schedule — until radio technology gave it an unexpected new purpose.
TYPEWRITER: The One Word That Lives Entirely on the Top Row of Your Keyboard
The word TYPEWRITER can be typed using only the top row of a QWERTY keyboard — the keys Q, W, E, R, T, Y, U, I, O, and P. Depending on who you ask, this reflects a clever design choice, a remarkable coincidence, or a very useful sales-demo trick.
NERF Stands for Non-Expandable Recreational Foam — And Has a Stranger Origin Than You Think
NERF — Non-Expandable Recreational Foam — is one of the most recognized toy brand names in the world. But the acronym came second: the foam came first, invented almost by accident when a toy designer wanted a ball you could throw inside without breaking anything.
Google Was Almost Called 'Backrub' — The Naming Story Behind the World's Biggest Search Engine
Before Google became a verb, a noun, and arguably the most recognized brand on Earth, it operated under the unglamorous name 'Backrub.' Understanding why reveals something important about how the early internet worked — and how close we came to a very different technological era.
The First VCR Was the Size of a Piano — And Cost More Than a House
In 1956, Ampex Corporation unveiled a machine that could record and play back television video. It weighed 750 pounds, stood as tall as a piano, and cost $50,000 — roughly half a million dollars in today's money. It also changed the world.
Just Setting Up My Twttr: The First Tweet and the Platform That Changed Public Discourse
At 12:50 PM on March 21, 2006, Jack Dorsey sent five words that would eventually reshape how billions of people communicate, argue, organize, and consume news. The platform wasn't even called Twitter yet.
The First Mobile Phone Call Was Made in 1973 — and It Was to a Rival
On April 3, 1973, Martin Cooper stood on a New York City sidewalk and made a phone call from a device that weighed over a kilogram and could only hold a charge for 20 minutes. The person he called was his chief competitor. It was the first mobile phone call in history.
The First Email Was Sent in 1971 — and the 'E' Just Stands for 'Electronic'
Ray Tomlinson sent the first email in 1971 and doesn't remember what it said. He also chose the @ symbol for email addresses — a decision that turned an obscure typewriter key into one of the most recognized symbols in the world.
The Wooden Mouse: Doug Engelbart's 1964 Invention That Redefined How We Interact with Computers
In 1964, Doug Engelbart carved a small wooden box with two perpendicular wheels on its underside and a single button on top. He called it an 'X-Y position indicator for a display system.' The world would eventually call it a mouse.
Mark Twain's Typewritten Manuscript: How America's Greatest Writer Embraced New Technology
Mark Twain purchased one of the first Remington typewriters available to the public in the early 1870s, and 'Life on the Mississippi,' published in 1883, is widely cited as the first book-length manuscript submitted to a publisher in typed rather than handwritten form.
The Phone That Started It All: Motorola DynaTAC 8000X and the Birth of Mobile Communication
When Motorola introduced the DynaTAC 8000X in 1983, it changed human communication forever. The brick-sized device that cost nearly $4,000 laid the foundation for the smartphone era.
'Me at the Zoo': The 19-Second Video That Launched YouTube
The first video ever uploaded to YouTube is almost comically understated: 19 seconds of a young man standing in front of the elephant enclosure, noting that elephants have long trunks. What it started was anything but small.
The 'V' in V8 Engine Means Exactly What You Think — and the Shape Has a Purpose
The V8 engine's name describes its geometry precisely: eight cylinders arranged in two banks of four, forming a V shape when viewed from the front. This configuration is not arbitrary — it solves specific engineering problems in a space-efficient way.
The First Domain Name Ever Registered: Symbolics.com and the Dawn of the Internet Address
On March 15, 1985, a computer manufacturer called Symbolics Inc. registered Symbolics.com — the first .com domain name in internet history, predating Google by 13 years and Facebook by 19.
The World's First Webcam Was Watching a Coffee Pot — The Full Story
The Trojan Room Coffee Pot camera, installed in the Cambridge University Computer Laboratory in 1991, was the world's first webcam. It was built for the most human of reasons: computer scientists were tired of making the trip to the kitchen only to find an empty coffee pot.
The First Car Had Three Wheels and Was Invented in 1885 — Here's Why
The world's first true automobile, the Benz Patent-Motorwagen, was built by Karl Benz in 1885. It had three wheels, a single-cylinder engine, and a top speed of about 16 km/h — and it changed the world.
The Black Box Is Actually Orange: Why Aviation's Most Important Recorder Is Misnamed
Every commercial aircraft carries devices universally known as 'black boxes,' yet they are not black — they are a vivid, high-visibility orange. The misnomer persists, but the color choice is deliberate and potentially life-saving: these devices need to be found in the wreckage of catastrophic accidents.
The 1936 Berlin Olympics: When Television First Brought the Games to the World
The 1936 Summer Olympics in Berlin were simultaneously a dark moment in Olympic history and a landmark in the development of broadcast media. For the first time, a television camera pointed at the Games and sent moving images of athletic competition to viewers who were not there.
Your Smartphone Is Millions of Times More Powerful Than Apollo's Computers
The computer that guided Apollo 11 to the Moon and back operated at 0.043 MHz and had 4 kilobytes of RAM. A mid-range smartphone in 2026 has a processor running at 3,000 MHz and 8 gigabytes of RAM. The comparison is so extreme it borders on the surreal — which makes it a perfect lens for understanding how radically computing has transformed in under six decades.
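Using the figures quoted above, the ratios are easy to check with a quick back-of-the-envelope calculation (a rough sketch; the 2026 smartphone numbers are the article's own round figures):

```python
# Rough ratio between a 2026 mid-range smartphone and the Apollo
# Guidance Computer, using the figures quoted above.
agc_clock_mhz = 0.043          # Apollo Guidance Computer clock (MHz)
agc_ram_bytes = 4 * 1024       # 4 KB of RAM

phone_clock_mhz = 3000         # ~3 GHz smartphone processor
phone_ram_bytes = 8 * 1024**3  # 8 GB of RAM

clock_ratio = phone_clock_mhz / agc_clock_mhz
ram_ratio = phone_ram_bytes // agc_ram_bytes

print(f"Clock speed: ~{clock_ratio:,.0f}x faster")   # ~69,767x
print(f"RAM capacity: {ram_ratio:,}x larger")        # 2,097,152x
```

The clock ratio alone is tens of thousands, and raw clock speed understates the gap: modern chips also execute many instructions per cycle across multiple cores.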
Creeper: The World's First Computer Virus Just Wanted to Play Tag
In 1971, a programmer wrote a program that copied itself across ARPANET computers and displayed a taunting message. It was not malicious — it was more of an experiment, perhaps even a prank. But Creeper was the first self-replicating program in computing history, and it inadvertently launched an arms race that has never stopped.
The IBM 350: The First Hard Drive Weighed a Ton and Stored 5 MB
In 1956, IBM introduced a data storage device that weighed over a ton, occupied the space of two large refrigerators, required its own air compressor, and stored five megabytes of data. Renting it cost around $3,200 per month. Today, a fingernail-sized memory card stores hundreds of thousands of times more data.
The First Touchscreen Was Built in 1965 — 40 Years Before the iPhone
Touch interfaces feel like a natural, intuitive way to interact with technology. What feels natural took over 40 years to move from a defense research lab in rural England to the pocket of every smartphone user on the planet — a journey through military technology, ATMs, musical instruments, and eventually the consumer electronics revolution.
Quantum Computing: Why Qubits Can Be Both 0 and 1 at the Same Time
Quantum computers do not compute faster by running the same algorithms with bigger numbers — they compute differently, exploiting quantum mechanical phenomena that have no analog in classical physics. The qubit's ability to exist in a superposition of states is the foundation of that difference, and understanding it requires a brief trip into the physics of the very small.
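As a loose illustration of the idea, here is a toy single-qubit sketch in plain Python. It is not a quantum simulator — the `measure` helper and the amplitudes are invented for illustration — but it shows the Born rule: a superposition is a pair of amplitudes, and measurement collapses it to 0 or 1 with probabilities given by the amplitudes squared.

```python
import random

# Toy single-qubit sketch (illustrative only, not a real quantum simulator).
# A qubit in superposition is described by two complex amplitudes (a, b)
# with |a|^2 + |b|^2 = 1. Measuring it yields 0 with probability |a|^2
# and 1 with probability |b|^2 -- the Born rule.

def measure(a: complex, b: complex) -> int:
    """Collapse the superposition a|0> + b|1> to a classical bit."""
    p0 = abs(a) ** 2
    assert abs(p0 + abs(b) ** 2 - 1.0) < 1e-9, "amplitudes must be normalized"
    return 0 if random.random() < p0 else 1

# An equal superposition, like the state a Hadamard gate produces:
a = b = 2 ** -0.5  # 1/sqrt(2) for each amplitude

samples = [measure(a, b) for _ in range(10_000)]
print(f"fraction of 1s: {sum(samples) / len(samples):.2f}")  # close to 0.50
```

Each individual measurement gives a definite 0 or 1; only the statistics over many measurements reveal the underlying superposition — which is why quantum algorithms must be designed to make the desired answer interfere constructively.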
Why All Computing Is Built on Just Two Numbers
Every email, every photograph, every video game, every financial transaction, and every AI response is ultimately expressed as a sequence of 0s and 1s. This is not an arbitrary choice — it is a physical necessity imposed by the most fundamental behavior of the semiconductor devices that power all digital computing.
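A tiny illustration of that reduction: the letter 'A' is stored as the number 65, which is itself just the bit pattern 01000001.

```python
# Everything a computer stores reduces to bits. A minimal illustration:
# the letter 'A' is the number 65, which is the bit pattern 01000001.

char = "A"
code = ord(char)            # the character's numeric code: 65
bits = format(code, "08b")  # the same number written as eight binary digits

print(code)  # 65
print(bits)  # 01000001

# Reassemble the number from its bits to show nothing was lost:
reassembled = sum(int(bit) * 2 ** i for i, bit in enumerate(reversed(bits)))
print(chr(reassembled))  # A
```

Text, images, and video differ only in how many of these bit patterns they use and what convention assigns meaning to them.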
Why Your Computer Forgets Everything When You Turn It Off
Every time you shut down your computer without saving, you lose your work. This is not a design flaw or a software failure — it is a direct consequence of the physical mechanism that makes RAM fast enough to be useful in the first place. Understanding why RAM is volatile reveals something fundamental about the tradeoffs at the heart of computer architecture.
Cloud Computing Was Predicted in 1961 — Long Before the Internet Existed
In April 1961, at MIT's centennial celebration, a mathematician named John McCarthy suggested that 'computation may someday be organized as a public utility.' He made this prediction a decade before the internet, three decades before the World Wide Web, and four decades before Amazon Web Services would make it a commercial reality. The idea was so far ahead of its time that it had to be reinvented independently by a new generation of engineers before the technology could support it.
How Linus Torvalds Built Git in Two Weeks — and Why It Conquered Software Development
In April 2005, Linus Torvalds sat down to solve a specific problem: the Linux kernel project needed a new version control system after the one it had been using became unavailable under acceptable terms. He gave himself one week to have something working. He had a usable system in about two weeks. He named it Git — British slang for an unpleasant person — and it has since become the universal foundation of modern software development.
Moore's Law: The Prophecy That Powered 50 Years of Computing
In 1965, Gordon Moore looked at a handful of data points and extrapolated a trend line. His prediction — that the number of transistors on a microchip would double at a regular cadence, later settled at roughly every two years — turned out to be one of the most accurate long-range forecasts in industrial history, reshaping civilization for half a century.
Over 1.1 Billion Websites Exist — But Most Are Digital Ghost Towns
The number 1.1 billion sounds impossibly large until you realize that the overwhelming majority of those websites are effectively abandoned — digital structures sitting empty and unmaintained in corners of the internet that almost no one ever visits.
Why Your Keyboard Isn't Alphabetical: The QWERTY Origin Story
Every time you type on a keyboard, you are using a layout designed in the 1870s for a machine that no longer exists, to solve a mechanical problem that modern technology eliminated decades ago. QWERTY persists not because it is optimal, but because of one of the most powerful forces in human behavior: the cost of changing something that already works well enough.
SQL: The 50-Year-Old Language Still Running the World's Data
In the early 1970s, IBM researchers developed a language for querying databases that was so well-designed it has outlasted generations of programming languages, computing paradigms, and technology revolutions. SQL — Structured Query Language — is still how the world talks to its most important data.
Python Was Named After Monty Python, Not a Snake — Here's Why That Matters
In December 1989, a Dutch programmer named Guido van Rossum started a new programming language as a holiday project. He wanted it to be fun to use, and he named it after the British comedy group Monty Python. Thirty-five years later, Python is the most widely used programming language on Earth.
Netflix Uses 15% of Global Internet Traffic — Here's How It Manages That Scale
One company accounts for roughly one in seven bytes of downstream internet traffic during peak evening hours worldwide. Netflix streams to over 260 million subscribers across nearly every country, serving ultra-high-definition video continuously — a logistical and engineering challenge that required the company to invent new approaches to content delivery.
Linux: How a Hobby Project Became the Software Powering the World
In August 1991, a 21-year-old Finnish computer science student posted a message to an internet newsgroup saying he was writing a free operating system: 'just a hobby, won't be big and professional.' That casual announcement introduced Linux to the world — a kernel that now runs more than 90% of the world's servers, all Android smartphones, and every supercomputer on the TOP500 list.
Why the Padlock in Your Browser Actually Matters: The Science of HTTPS
Every time you see a padlock icon in your browser's address bar, a sophisticated cryptographic handshake has just protected your connection from anyone attempting to intercept it. Understanding how that works reveals one of the most elegant systems in modern computing.
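At the heart of that handshake is a key exchange that lets both sides derive the same secret without ever transmitting it. A toy Diffie-Hellman exchange sketches the idea — the numbers here are deliberately tiny for readability; real TLS uses 2048-bit-plus groups or elliptic curves, and this sketch offers no actual security:

```python
import random

# Toy Diffie-Hellman key exchange -- the core idea behind the TLS handshake.
# Everything an eavesdropper sees (p, g, A, B) is public, yet both sides
# end up with the same secret.

p = 23  # a small public prime (public knowledge)
g = 5   # a public generator (public knowledge)

a = random.randrange(2, p - 1)  # Alice's private key (never transmitted)
b = random.randrange(2, p - 1)  # Bob's private key (never transmitted)

A = pow(g, a, p)  # Alice sends g^a mod p over the open network
B = pow(g, b, p)  # Bob sends g^b mod p over the open network

# Each side combines its own secret with the other's public value:
shared_alice = pow(B, a, p)  # (g^b)^a mod p
shared_bob = pow(A, b, p)    # (g^a)^b mod p

assert shared_alice == shared_bob
print(f"shared secret: {shared_alice}")  # identical on both sides
```

Once both ends hold the same secret, it is used to derive symmetric keys that encrypt everything else in the session — which is what the padlock ultimately represents.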
ENIAC: The 30-Ton Computer That Launched the Digital Age
In 1945, a machine filled an entire room, weighed about 30 tons — as much as a fully loaded semi-truck — and drew so much power it reportedly dimmed the lights of an entire Philadelphia neighborhood when switched on. It was called ENIAC — and it could perform 5,000 additions per second, making it the fastest calculator in the world.
The First Internet Message Was 'LO' — Because the System Crashed After Two Letters
The first message ever sent over the network that would become the internet was supposed to be 'LOGIN.' Instead it was 'LO' — because the receiving computer crashed after two characters. The accidental poetry of that truncated greeting, inadvertently echoing 'hello' or 'lo and behold,' seems fitting for the birth of the technology that would eventually connect most of humanity.
Ada Lovelace: The Mathematician Who Invented Computer Programming in 1843
In 1843, a woman translated a French-language paper by an Italian engineer about a machine that had never been built, then added notes three times as long as the original — including the first algorithm ever designed to be executed by a computer. The machine was never completed. The algorithm was correct.
90% of All Human Data Was Created in Just Two Years — What That Number Actually Means
The statistic seems impossible at first: roughly 90 percent of all the data that exists in the world was generated in just the last two years. Yet this figure, popularized by IBM and echoed by other research groups, reveals something profound about the accelerating pace of digital creation.
The First Web Browser Was Built in 1990 — and Almost Nobody Knows Its Real Name
In 1990, a British scientist at a particle physics laboratory in Switzerland wrote a piece of software that would transform human civilization. The browser he created, called WorldWideWeb, looked nothing like Chrome or Firefox — but it started everything.
The Intel 4004: From 2,300 Transistors to 100 Billion in 50 Years
The Intel 4004, released in November 1971, was designed to power a Japanese desktop calculator. It had 2,300 transistors and could perform about 92,000 operations per second. It was also the ancestor of every microprocessor in every computer, smartphone, and data center on Earth today.
The $7.5 Million Domain Name: How Business.com Sparked the Gold Rush of the Early Web
In 1999, a seven-character domain name sold for $7.5 million — more than many companies were worth at the time. That transaction was not merely expensive; it was a signal that the internet had developed its own real estate market, with its own logic, its own speculation, and its own boom.
The Altair 8800: The Kit Computer That Sparked the Personal Computer Revolution
In January 1975, a mail-order electronics kit appeared on the cover of Popular Electronics magazine. It had no keyboard, no monitor, no storage, and no software. Programming it meant flipping switches on the front panel. It sold out immediately, prompted Bill Gates to drop out of Harvard and write software for it with Paul Allen, and started a revolution.
The First Spam Email Was Sent in 1978 — and People Were Furious
On May 3, 1978, a marketing manager named Gary Thuerk sent an unsolicited commercial message to every user he could find on the western region of ARPANET — 393 people. The response was immediate and nearly universally negative. He was effectively told never to do it again. He did it again anyway, decades before spam would grow to account for nearly half of all email traffic worldwide.
'Just Setting Up My Twttr': The First Tweet and the Birth of Microblogging
On March 21, 2006, Jack Dorsey typed five lowercase words into a prototype messaging service and posted them to the world: 'just setting up my twttr.' That mundane message marked the beginning of a platform that would reshape politics, journalism, and public discourse across the globe.
Me at the Zoo: The 19-Second Video That Launched YouTube
It is nineteen seconds long, shot on a basic digital camera, and features a young man standing in front of the elephant enclosure at the San Diego Zoo commenting on the length of elephant trunks. It is also one of the most historically significant videos ever recorded.
The Internet of Things: How Your Thermostat Joined the Global Network
When technologists coined the phrase 'Internet of Things' in the late 1990s, it described a vision so ambitious it sounded like science fiction: a world in which every physical object capable of generating useful data would be connected to a global network. That future is now the present.
Who Coined 'Surfing the Web'? A Librarian From New York You've Never Heard Of
The phrase 'surfing the web' is so woven into everyday language that almost nobody wonders where it came from. The answer is a public librarian in Liverpool, New York, who chose a mousepad image to inspire one of the most enduring coinages of the digital age.
Tim Berners-Lee Invented the Web in 1989 to Help Physicists Share Papers
In March 1989, a software engineer at CERN named Tim Berners-Lee submitted a proposal to his manager titled 'Information Management: A Proposal.' His manager wrote 'Vague but exciting' on the cover. That vague but exciting proposal described the architecture of the World Wide Web — the system that now underpins much of modern life.
How TikTok Conquered the World: The App That Broke the Google-Facebook Duopoly
For over a decade, only apps made by Facebook and Google managed to cross the 3 billion download threshold. TikTok shattered that duopoly — and the story of how it did so reveals just how dramatically digital culture has shifted.
Wikipedia: How the World's Largest Encyclopedia Is Written by Volunteers
On January 15, 2001, a website went live that was built on a premise that most knowledge professionals considered absurd: that a credible encyclopedia could be written and maintained by anonymous volunteers with no editorial gatekeepers, no pay, and no requirement that contributors have expertise in the topics they edited. Within a decade, it was the most consulted reference work in human history.
347 Billion Emails Per Day: Inside the World's Most Trafficked Communication System
Every day, roughly 347 billion emails travel across the global internet — more than 4 million per second, 24 hours a day. Of those, nearly half are unsolicited junk. Understanding how email infrastructure handles this volume, and how spam filters manage to keep inboxes functional, reveals one of the internet's most important and least visible engineering achievements.
Email Is Older Than the Web — Ray Tomlinson Invented It in 1971
Most people encounter email and the web as parts of the same internet ecosystem, but email is roughly 20 years older than the web. Ray Tomlinson sent the first email between two computers in 1971 — and in doing so, invented the @ symbol as an addressing convention that has become one of the most recognized typographic marks in the world.
Google Processes 8.5 Billion Searches a Day — How Is That Even Possible?
Every second of every day, Google fields roughly 99,000 searches, each expected to return an answer within a fraction of a second. The infrastructure required to deliver those answers — across all languages, all topics, all geographies simultaneously — is one of the largest and most sophisticated computing systems ever built.
HTTP at 35: The Invisible Protocol That Runs the Entire Web
Every time you load a webpage, watch a video, or submit a form online, a protocol designed in 1989 by a single scientist is silently coordinating the exchange. HTTP is so fundamental to the web that it is easy to forget it had to be invented by someone.
The Internet Ran Out of Addresses: The IPv4 Exhaustion Crisis Explained
When the engineers who designed IPv4 in 1983 allocated 32 bits for internet addresses, they created space for roughly 4.3 billion unique addresses. At the time, this seemed not just sufficient but almost unimaginably abundant. They were wrong.
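The 4.3 billion ceiling is just arithmetic, and every dotted address is one of those 32-bit numbers in disguise (8.8.8.8 is used here purely as an example):

```python
# 32 bits give exactly 2^32 possible IPv4 addresses.
total = 2 ** 32
print(f"{total:,}")  # 4,294,967,296 -- the ~4.3 billion ceiling

# Each dotted-quad address is just that 32-bit number written as four
# bytes. For example, 8.8.8.8 packed into a single integer:
octets = [8, 8, 8, 8]
as_int = 0
for octet in octets:
    as_int = (as_int << 8) | octet
print(as_int)  # 134744072
```

With billions of people, phones, and connected devices each wanting addresses, the 32-bit ceiling was bound to be hit — hence NAT as a stopgap and IPv6, with 128-bit addresses, as the long-term fix.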
The Original Computer Bug Was an Actual Bug: The Story Behind the Term
Software engineers spend a substantial portion of their professional lives hunting bugs. The word's origin in computing can be traced to a specific afternoon in September 1947, when a technician at Harvard found a moth caught between the relay contacts of the Mark II computer — and taped it into the logbook with the note 'First actual case of bug being found.'
GPS Was a Military Secret — The Cold War Technology That Now Lives in Your Pocket
The technology guiding your every turn was once a top-secret military asset — and it took a Cold War tragedy to make it available to the public.
Technology — Frequently Asked Questions
Did you know that the first computer programmer was a woman named Ada Lovelace, who wrote an algorithm for the Analytical Engine in 1843?
The first computer programmer was a woman named Ada Lovelace, who wrote an algorithm for the Analytical Engine in 1843. Source: The British Library
Did you know that the first alarm clock could only ring at 4 a.m.?
The first alarm clock could only ring at 4 a.m. Source: MIT Museum
Did you know that the Twitter bird's official name is Larry, named after NBA legend Larry Bird?
The Twitter bird's official name is Larry, named after NBA legend Larry Bird. Source: Twitter Archive
Did you know that the hashtag symbol is technically called an octothorpe?
The hashtag symbol is technically called an octothorpe. Source: Merriam-Webster
Did you know that Nintendo was founded in 1889 as a company that produced handmade playing cards?
Nintendo was founded in 1889 as a company that produced handmade playing cards. Source: Nintendo Co., Ltd.
Did you know that a 'jiffy' is an actual unit of time used in computer science?
A 'jiffy' is an actual unit of time used in computer science, equal to one tick of a computer's system clock. Source: NIST
Did you know that Leo Fender, the inventor of the Telecaster and Stratocaster guitars, never learned to play the guitar?
Leo Fender, the inventor of the Telecaster and Stratocaster guitars, never actually knew how to play the guitar. Source: Fender Musical Instruments
Did you know that the term 'robot' was first used in a 1920 play called R.U.R. by Czech writer Karel Čapek?
The term 'Robot' was first used in a 1920 play called R.U.R. by Czech writer Karel Čapek. Source: The New Yorker