FactOTD

Cloud Computing Was Predicted in 1961 — Long Before the Internet Existed

March 28, 2026 · 4 min read

The Fact

The concept of cloud computing dates to 1961, when John McCarthy suggested that computing might one day be organized and sold as a public utility.

John McCarthy is best known for coining the term "artificial intelligence" at the 1956 Dartmouth Conference, where he and a group of colleagues essentially founded the field of AI research. But five years later, at MIT's centennial convocation in April 1961, he offered a prediction that would prove equally far-sighted: he suggested that computing would eventually be organized and sold like a public utility, the way electric power or water service is provided to households and businesses through centralized infrastructure.

The Utility Model of Computing

McCarthy's analogy to electricity was precise and prescient. Electric utilities work by building central generating plants and distributing power through a grid to customers who pay for what they use, without needing to own or understand generating equipment. McCarthy's vision was that computing would work similarly: large centralized computers would provide computational capacity, and users would access that capacity through terminals, paying for what they consumed without needing to own or manage the hardware.

In 1961, this vision was informed by "time-sharing," a then-novel approach to operating computers that let multiple users work on a single machine simultaneously. McCarthy had championed time-sharing at MIT, proposing it in a 1959 memo; the resulting Compatible Time-Sharing System (CTSS), led by Fernando Corbató and first demonstrated in November 1961, allowed multiple terminals to connect to a single mainframe and share its processing time. The concept of a utility emerged naturally from this work: if many users could share one machine, why not make that service available to anyone willing to pay?
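The core mechanism is easy to sketch. The toy round-robin scheduler below is an illustration of the time-sharing idea, not a description of how CTSS actually worked: one machine serves each user a short slice of work in turn, so everyone sees steady progress.

```python
from collections import deque

def run_time_shared(jobs, quantum):
    """Toy round-robin time-sharing.

    jobs: list of (user, units_of_work) pairs.
    quantum: maximum work units served per turn.
    Returns the order in which slices were served, showing how a
    single processor is interleaved among all users.
    """
    queue = deque(jobs)
    timeline = []
    while queue:
        user, remaining = queue.popleft()
        slice_ = min(quantum, remaining)
        timeline.append((user, slice_))
        if remaining > slice_:          # unfinished jobs rejoin the back of the queue
            queue.append((user, remaining - slice_))
    return timeline

# Three "terminals" sharing one machine, two work units per turn:
print(run_time_shared([("alice", 5), ("bob", 3), ("carol", 4)], quantum=2))
```

Each user's job finishes eventually, but no single job monopolizes the machine, which is exactly what made selling slices of one expensive mainframe to many customers plausible.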

Why It Took 40 Years

McCarthy's vision was technically sound but premature by several decades. The missing components were a universal networking infrastructure to connect users to the central computers, sufficiently cheap and powerful hardware to make large-scale shared computing economical, and software frameworks to manage multi-tenant computing securely and flexibly.

Telecommunications networks in 1961 were circuit-switched telephone systems, expensive and incompatible with the kind of open, flexible, packet-based networking that internet-era cloud computing requires. ARPANET, the precursor to the internet, was still eight years away. The World Wide Web was 30 years away.

As these prerequisites gradually fell into place — internet ubiquity through the 1990s, server hardware reaching mass-market prices, the development of virtualization technologies that allowed physical servers to be divided into multiple isolated virtual machines — the utility computing model became technically and economically viable.

From Vision to Amazon Web Services

The company that built the first commercial cloud at scale was not a technology vendor in the traditional sense. Amazon, primarily an online retailer, had built enormous internal computing infrastructure to support its e-commerce business. By the mid-2000s, Amazon's engineers had found that building new internal services was painfully slow because every team had to request and provision servers through a cumbersome manual process. (The often-repeated claim that AWS began as a way to rent out Amazon's excess capacity has been disputed by AWS's own leadership; the provisioning bottleneck was the stated driver.)

The solution — provide computing resources as a programmable API, available on demand and billed per use — was Amazon Web Services, launched publicly in 2006 with S3 (storage) and EC2 (virtual computers). Customers could provision a virtual server in minutes, pay by the hour, and scale up or down instantly. McCarthy's 1961 vision was finally real.
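The pay-per-use model itself is simple enough to express in a few lines. The sketch below is illustrative only; the instance names and hourly rates are made up, not actual AWS prices:

```python
# Hypothetical hourly rates in dollars per instance-hour (illustrative, not real prices)
RATES = {"small": 0.05, "large": 0.20}

def metered_bill(usage):
    """usage: list of (instance_type, hours) records.

    Like an electric meter, the bill reflects only what was consumed;
    there is no upfront purchase of hardware.
    """
    return sum(RATES[instance_type] * hours for instance_type, hours in usage)

# A bursty workload: run a small instance all day, plus a large one for a 3-hour spike.
bill = metered_bill([("small", 24), ("large", 3)])
print(f"${bill:.2f}")   # 24 * 0.05 + 3 * 0.20 = $1.80
```

Scaling down is just as frictionless as scaling up: stop the instance and the meter stops, which is the economic core of McCarthy's utility analogy.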

AWS, Google Cloud, and Microsoft Azure are now the three largest cloud providers, and together they host a large share of the world's internet applications, from startups to multinational corporations. The shift to cloud computing has eliminated billions of dollars of capital expenditure on private data centers, let businesses scale almost instantly, and created entirely new economic models for software delivery. The idea that McCarthy floated as speculation at a university ceremony in 1961, that computing would be a utility like electricity, has been so thoroughly realized that many people have never used computing any other way.


FactOTD Editorial Team


The FactOTD editorial team researches and verifies every fact before publication. Our mission is to make learning effortless and accurate.
