The Internet Ran Out of Addresses: The IPv4 Exhaustion Crisis Explained
March 28, 2026 · 3 min read
The Fact
IPv4, which uses 32-bit addresses, was expected to provide more than enough of them when the internet adopted it in 1983. The global pool of unallocated IPv4 addresses ran out in 2011.
Four Billion Addresses and a Catastrophic Miscalculation
In the early 1980s, the internet was a research network used by a small community of academics and military researchers. The idea that billions of ordinary people would one day carry internet-connected devices in their pockets, or that homes would contain dozens of connected appliances, was not yet even speculative science fiction. It was simply outside the frame of possibility.
IPv4 — Internet Protocol version 4 — was formalized in 1981 and defined the rules by which computers on the internet identify themselves and route data. Each device on an IPv4 network is assigned a numerical address: four numbers between 0 and 255, separated by dots, such as 192.168.1.1. With 32 bits allocated for these addresses, the system supports 2 to the 32nd power unique addresses — approximately 4.3 billion.
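The dotted-quad notation is just a human-friendly rendering of a single 32-bit number. A short Python sketch (illustrative only; the function name is ours, not part of any standard) shows the packing and the size of the resulting address space:

```python
# Convert a dotted-quad IPv4 address to its underlying 32-bit integer,
# and show how many addresses 32 bits can represent.

def ipv4_to_int(address: str) -> int:
    """Pack four octets (each 0-255) into one 32-bit integer."""
    octets = [int(part) for part in address.split(".")]
    assert len(octets) == 4 and all(0 <= o <= 255 for o in octets)
    value = 0
    for octet in octets:
        value = (value << 8) | octet  # shift in 8 bits per octet
    return value

print(ipv4_to_int("192.168.1.1"))  # 3232235777
print(2 ** 32)                     # 4294967296 -- about 4.3 billion
```

Four octets of 8 bits each give exactly the 32 bits the protocol allocates, which is where the 4.3 billion ceiling comes from.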
In 1983, when the internet officially adopted the TCP/IP protocols that included IPv4, 4.3 billion seemed more than enough. The entire internet at that time consisted of a few hundred machines. Even the most optimistic projections did not envision a network of that size for decades, if ever.
The Arithmetic of Exhaustion
The internet grew far faster than anyone predicted. The commercialization of the web in the early 1990s triggered explosive growth in the number of connected computers. By the late 1990s, with the dot-com boom underway and personal computers becoming household items, the eventual exhaustion of IPv4 addresses moved from theoretical concern to engineering emergency.
Compounding the problem was inefficiency in how addresses had been allocated. In the early internet, large blocks of addresses — entire /8 networks containing 16 million addresses each — had been assigned to universities, corporations, and government agencies on the assumption that they would need them. MIT, for example, was allocated a block of 16 million addresses for its campus network. Many of these large allocations were underused, effectively withdrawing addresses from circulation while demand elsewhere went unmet.
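The size of those early allocations is easy to verify: a /8 fixes the first 8 bits of the address, leaving 24 free bits, or 2^24 addresses per block. Python's standard `ipaddress` module confirms the arithmetic (18.0.0.0/8 is the block historically held by MIT):

```python
import ipaddress

# A /8 network fixes 8 of the 32 address bits, leaving 24 host bits.
block = ipaddress.ip_network("18.0.0.0/8")
print(block.num_addresses)           # 16777216
print(block.num_addresses == 2**24)  # True
```

At that rate, just 256 such blocks account for the entire IPv4 address space, which is why a handful of generous early grants mattered so much.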
Engineers developed techniques to extend IPv4's useful life. Network Address Translation, or NAT, allowed multiple devices on a local network (all the computers in a home or office, say) to share a single public IPv4 address. This dramatically reduced the number of public addresses needed and bought years of additional time, but the extra translation layer created complications for applications that rely on direct inbound connections, such as peer-to-peer software and self-hosted servers.
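The core NAT idea can be sketched in a few lines. This is a deliberately toy model of port-based NAT (all names, addresses, and port numbers here are hypothetical; real NAT happens inside the router, not in application code): each private (address, port) pair is mapped to a distinct port on the one shared public address.

```python
# Toy sketch of port-based NAT: many private (ip, port) pairs share one
# public IP by being assigned distinct public ports. Illustrative only.

PUBLIC_IP = "203.0.113.7"  # documentation-range address standing in for the router

class ToyNat:
    def __init__(self):
        self.table = {}          # (private_ip, private_port) -> public_port
        self.next_port = 40000   # arbitrary start of the shared port pool

    def translate_outbound(self, private_ip: str, private_port: int):
        """Return the (public_ip, public_port) this private flow appears as."""
        key = (private_ip, private_port)
        if key not in self.table:
            self.table[key] = self.next_port  # allocate a fresh public port
            self.next_port += 1
        return (PUBLIC_IP, self.table[key])

nat = ToyNat()
print(nat.translate_outbound("192.168.1.10", 51000))  # ('203.0.113.7', 40000)
print(nat.translate_outbound("192.168.1.11", 51000))  # ('203.0.113.7', 40001)
```

The mapping only exists for outbound traffic the router has seen, which is precisely why unsolicited inbound connections through NAT are awkward.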
The Day the Pool Ran Dry
The Internet Assigned Numbers Authority (IANA), the body responsible for managing the global pool of unallocated IPv4 addresses, formally exhausted its supply of addresses to distribute to regional registries on February 3, 2011. Individual regional registries continued distributing addresses from their own reserves for varying periods after that, but the fundamental supply had been consumed.
The transition to IPv6 — which uses 128-bit addresses, creating a theoretical address space of 340 undecillion unique addresses (a number so large it effectively cannot be exhausted by human civilization) — had been planned since the mid-1990s. But the transition proved far slower than hoped, complicated by the need to update enormous quantities of existing networking infrastructure, software, and devices to support the new protocol.
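The scale jump from 32 to 128 bits is hard to intuit, but trivial to compute:

```python
# IPv6 widens addresses from 32 to 128 bits.
print(2 ** 128)   # 340282366920938463463374607431768211456
# That is about 3.4 x 10**38 -- "340 undecillion" in the short scale --
# and 2**96 (~7.9 x 10**28) times larger than the entire IPv4 space.
print(2 ** 128 // 2 ** 32 == 2 ** 96)  # True
```

Every additional bit doubles the space, so the 96 extra bits multiply the old 4.3 billion by a factor of 2^96.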
Progress has been substantial: major internet service providers, operating systems, and websites have adopted IPv6, and global IPv6 traffic has grown significantly. Yet IPv4 and IPv6 continue to coexist, with complex translation mechanisms bridging devices that speak only one language. The miscalculation of 1983 left an infrastructure legacy that the internet is still working to fully overcome.
FactOTD Editorial Team
The FactOTD editorial team researches and verifies every fact before publication.