FactOTD

A 'Jiffy' Is a Real Unit of Time in Computer Science — Not Just an Expression

March 28, 2026 · 3 min read

The Fact

A 'jiffy' is an actual unit of time used in computer science, equal to one tick of a system's timer interrupt.

When Slang Becomes Specification

Language often moves in surprising directions. A colloquial expression for "very quickly" has, in the world of computer science, been assigned a precise technical definition. In computing, a jiffy is a formal unit of time equal to one tick of the system timer interrupt — the periodic signal an operating system uses to keep time and schedule work. (It is not a single CPU clock cycle, which is a far shorter interval.)

The value of one jiffy depends on the specific system. In Linux and many Unix-like operating systems, a jiffy is defined as 1/HZ seconds, where HZ is the configured timer-interrupt frequency in ticks per second. Historically, many systems used an HZ of 100, making one jiffy equal to 10 milliseconds. Newer systems often use HZ values of 250 or 1000, making a jiffy 4 milliseconds or 1 millisecond, respectively. The specific value is less important than the concept: a jiffy is the resolution of the kernel's tick-based timekeeping, the smallest interval that can be meaningfully counted.
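The relationship between HZ and jiffy length is simple arithmetic; as an illustration (this is ordinary Python, not kernel code), the values above can be checked directly:

```python
# Illustrative sketch: the length of one jiffy as a function of the
# kernel's HZ setting (timer ticks per second).

def jiffy_ms(hz: int) -> float:
    """Duration of one jiffy, in milliseconds, for a given HZ."""
    return 1000.0 / hz

for hz in (100, 250, 1000):
    print(f"HZ={hz:4d} -> one jiffy = {jiffy_ms(hz):g} ms")
# HZ= 100 -> one jiffy = 10 ms
# HZ= 250 -> one jiffy = 4 ms
# HZ=1000 -> one jiffy = 1 ms
```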

Why Computers Need Their Own Time Units

Human beings measure time in seconds, minutes, and hours — units calibrated to the rhythms of daily life. Computers operate at timescales that have no natural analog in human experience. A modern processor executing three billion clock cycles per second completes millions of operations in the time it takes a human to blink. The conventional vocabulary of time is simply too coarse-grained to describe what happens inside a modern computer.

The jiffy fills a specific niche in operating system design. It serves as the base unit for the kernel's timer — the mechanism that decides when to switch between running processes, when to update system clocks, and when to respond to hardware interrupts. By defining a jiffy as one timer tick, operating system developers gain a convenient unit for specifying time intervals in code without hardcoding specific millisecond values that might change when the timer frequency is configured differently.
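To make the idea concrete, here is a minimal sketch of expressing timeouts in jiffies, loosely modeled on the Linux kernel's msecs_to_jiffies() helper. The HZ value and function names here are illustrative assumptions, not actual kernel code:

```python
# Sketch: converting between milliseconds and jiffies, in the spirit
# of the kernel's msecs_to_jiffies(). HZ is an assumed configuration.
import math

HZ = 250  # assumed timer-interrupt frequency: 250 ticks per second

def msecs_to_jiffies(ms: int) -> int:
    """Convert milliseconds to jiffies, rounding up so a requested
    delay is never shorter than asked for."""
    return math.ceil(ms * HZ / 1000)

# A "two second" timeout written as 2 * HZ stays correct even if HZ
# is reconfigured, because the code never mentions milliseconds.
timeout = 2 * HZ
print(timeout)                 # 500 jiffies at HZ=250
print(msecs_to_jiffies(8))     # 2 jiffies at HZ=250
```

Writing `2 * HZ` instead of a literal jiffy count is the pattern the paragraph above describes: the interval's meaning ("two seconds") survives any change to the tick rate.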

This is a common pattern in computing: defining abstract units that represent "one fundamental interval of this system" rather than specifying absolute values in advance. The jiffy's flexibility — its value can change without breaking code that references it — is precisely what makes it useful.

The Word's Pre-Computing History

The word "jiffy" predates computers by at least two centuries, appearing in English slang as far back as the late 18th century. It was used colloquially to mean a very short, unspecified period of time — "I'll be back in a jiffy" — with no more precision than that.

The physicist Gilbert N. Lewis used the term in a different technical sense in 1927, defining a "jiffy" as the time it takes light to travel one centimeter — approximately 33.3564 picoseconds. This scientific usage was never widely adopted, but it established a precedent for giving the informal term a formal quantitative definition.
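Lewis's figure is easy to verify from the speed of light; a quick back-of-the-envelope check in Python:

```python
# Check of Lewis's 1927 definition: the time for light to travel 1 cm.
C = 299_792_458.0           # speed of light in m/s (exact, by definition)
jiffy_s = 0.01 / C          # one centimeter, expressed in meters
jiffy_ps = jiffy_s * 1e12   # convert seconds to picoseconds
print(f"{jiffy_ps:.4f} ps")  # 33.3564 ps
```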

Computer science picked it up independently, giving it yet a third technical definition that has now become the most common formal usage. The story of the jiffy — from 18th-century slang, through a physicist's classroom definition, to the timing architecture of modern operating systems — is a small example of how technical vocabulary grows: often by borrowing familiar words and lending them unfamiliar precision.


FactOTD Editorial Team


The FactOTD editorial team researches and verifies every fact before publication. Our mission is to make learning effortless and accurate.
