The BLS publishes two headline productivity measures. One divides output by hours worked. The other divides output by workers employed. Over 78 years, the gap between them has reversed direction entirely — and the reason reveals something fundamental about how the American economy has changed.
How productive is the American worker? The answer depends on which question you ask. If you ask how much output is produced per hour of labor, you get one number. If you ask how much is produced per worker, regardless of how many hours that person puts in, you get a different number. Both are published by the Bureau of Labor Statistics, indexed to 2017 = 100, covering the nonfarm business sector, and tracked quarterly since 1947.
Over nearly eight decades, the relationship between them has undergone a quiet reversal. In the late 1940s, output per worker was substantially higher than output per hour — a reflection of long working hours in an industrial economy. By the 2010s, output per hour had overtaken output per worker — a reflection of shorter weeks, more part-time employment, and a service economy that runs on different schedules.
That crossover tells you whether productivity gains come from people working more efficiently in each hour, or from changes in how many hours they work. This episode traces both measures across 78 years and explains what the widening gap reveals.
Output per hour is the standard measure of labor productivity. It divides real output by total hours worked by all persons — it does not matter whether those hours were worked by 10 people or 1,000.
Output per worker divides the same total output by the number of persons employed, regardless of hours. A worker who puts in 20 hours and one who puts in 50 hours each count as one worker.
When average hours rise, output per worker grows faster than output per hour. When average hours fall, the reverse happens. The gap between the two measures is the hours effect.
Most economists prefer output per hour because it isolates efficiency — how much is produced from each unit of labor input. Output per worker captures something different: the lived economic reality of how much each employed person actually produces, which depends on both their efficiency and their working time. For wages, tax revenue, and living standards, per-worker output matters. For technological progress and growth potential, per-hour output matters.
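The relationship between the two measures can be sketched in a few lines. This is a minimal illustration with made-up figures, not BLS data:

```python
# Illustrative sketch of the two measures (all numbers invented).
output = 1000.0          # real output for the period
workers = 10.0           # persons employed (millions)
avg_weekly_hours = 38.0  # average hours per worker
hours = workers * avg_weekly_hours  # total hours worked

output_per_hour = output / hours      # efficiency: output per unit of labor time
output_per_worker = output / workers  # per-person contribution

# Output per worker is just output per hour scaled by average hours,
# so any change in average hours drives a wedge between the two.
assert abs(output_per_worker - output_per_hour * avg_weekly_hours) < 1e-9
```

The identity in the final line is the hours effect in miniature: if average hours rise, per-worker output outpaces per-hour output, and vice versa.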
The chart below plots both measures from 1947 to 2025 (Q4 data, 2017 = 100). Output per hour has risen 5.2-fold and output per worker 4.35-fold: American workers produce four to five times as much as their grandparents did after World War II.
But look at the gap. In the early decades, the blue line (output per worker) sits above the slate line (output per hour) — workers put in longer hours, boosting their total per-person output. By the late 1980s, the two lines converge. From roughly 2010 onward, they reverse: output per hour now exceeds output per worker. That crossover is the story of the American working week — from 40+ hour manufacturing shifts in 1947 to an average of 34 hours across all private industries by 2025.
The most dramatic short-term divergence came in 2020, when both measures spiked — output per hour to 110.6, output per worker to 109.9 — not because workers became more efficient, but because lower-productivity workers disproportionately lost their jobs. The composition of the surviving workforce shifted toward higher-output sectors. By 2025, the structural spread had reasserted itself: output per hour at 119.6, output per worker at 117.1.
If both measures are expressed as indices with 2017 = 100, then their ratio tells you how average hours per worker have changed relative to 2017. A ratio below 1.0 means workers were putting in more hours than in 2017. Above 1.0 means fewer. The table below traces this relationship at key milestones.
| Year | Output/Hour | Output/Worker | Ratio | What It Means |
|---|---|---|---|---|
| 1947 | 23.0 | 26.9 | 0.85 | Workers averaged ~17% more hours than in 2017 |
| 1970 | 41.7 | 46.0 | 0.91 | Hours declining; service sector growing |
| 1990 | 57.9 | 59.6 | 0.97 | Measures nearly converge |
| 2000 | 73.4 | 75.5 | 0.97 | Dot-com era; composition effects balance out |
| 2010 | 95.0 | 95.0 | 1.00 | Crossover: hours match 2017 level exactly |
| 2020 | 110.6 | 109.9 | 1.01 | Pandemic composition effect |
| 2025 | 119.6 | 117.1 | 1.02 | Per-hour measure now structurally higher |
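The ratio column can be rebuilt directly from the published index values. A short sketch (index values copied from the table above, 2017 = 100):

```python
# Rebuild the ratio column: ratio = output per hour / output per worker.
# Its inverse is an index of average hours per worker relative to 2017,
# so a ratio below 1.0 means longer-than-2017 hours.
data = {  # year: (output per hour, output per worker, published ratio)
    1947: (23.0, 26.9, 0.85),
    1970: (41.7, 46.0, 0.91),
    1990: (57.9, 59.6, 0.97),
    2000: (73.4, 75.5, 0.97),
    2010: (95.0, 95.0, 1.00),
    2020: (110.6, 109.9, 1.01),
    2025: (119.6, 117.1, 1.02),
}
for year, (oph, opw, published) in data.items():
    # Each recomputed ratio matches the table to within rounding.
    assert abs(oph / opw - published) < 0.006, year
```

The inversion matters for interpretation: a 1947 ratio of 0.85 implies average hours about 17% above the 2017 level (1/0.855 ≈ 1.17), not 15% below it.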
In 1947, the ratio stood at 0.85 — workers were putting in roughly 17% more hours per week than the 2017 average. The postwar economy was hours-intensive: manufacturing ran on long shifts, agriculture still employed millions at dawn-to-dusk schedules.
By 1970, the ratio had risen to 0.91 as unions negotiated shorter workweeks and the service sector expanded. By 1990 it reached 0.97, and the two measures were nearly indistinguishable. The crossover came around 2010, when the ratio hit 1.00 exactly; this was also the aftermath of the Great Recession, when the economy relied more heavily on part-time work. Since then, the ratio has stayed above 1.0. By 2025 it reached 1.02: workers are more productive per hour than ever, but work fewer hours per person.
What drove the decline? The rise of the service economy (from 60% of nonfarm employment in 1947 to 85% by 2025, with more part-time schedules). The growth of part-time work (from under 15% of workers in the 1950s to above 17% by the 2010s). The entry of women into the workforce, many initially at part-time schedules, increasing head counts faster than total hours. And overtime regulation, which discouraged excessive hours by requiring premium pay.
Another way to see the divergence is to compare cumulative growth across major periods. In every era, output per hour grew faster than output per worker — the consistent signature of declining average hours. But the size of the gap has varied.
The postwar boom (1947–1973) was the golden age. Output per hour nearly doubled (+98%). Output per worker rose 86%. The 12-point gap reflects modest hours decline as the 40-hour standard became entrenched. About seven-eighths of the hourly efficiency gain (86/98) translated into higher per-worker output.
The slowdown era (1973–2000) saw sharp deceleration. Output per hour grew 61%; output per worker 51%. Oil shocks, stagflation, and the shift from factory to office weighed on both measures, while the service-sector transition and expanding part-time work kept the gap at 10 points.
The modern era (2000–2025) shows modest revival. Output per hour grew 63%, boosted by technology and e-commerce. Output per worker grew 55%. The 8-point gap is the narrowest of the three periods, suggesting that the decline in average hours may be moderating.
| Period | Output/Hour | Output/Worker | Gap | Annualized Difference |
|---|---|---|---|---|
| 1947–1973 (26 yrs) | +98% | +86% | 12 pts | ~0.25 pp/yr |
| 1973–2000 (27 yrs) | +61% | +51% | 10 pts | ~0.24 pp/yr |
| 2000–2025 (25 yrs) | +63% | +55% | 8 pts | ~0.21 pp/yr |
| Full: 1947–2025 (78 yrs) | +420% | +335% | 85 pts | ~0.23 pp/yr |
Over the full period, output per hour has risen 420% while output per worker has risen 335%. Per hour, each worker produces 5.2x as much as in 1947; per worker, the multiple is 4.35x. Both are correct. The 85-point gap is the cumulative effect of a transformed workweek.
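The full-period multiples can be converted into compound annual growth rates with a one-line calculation. A sketch using the 1947 and 2025 index values cited above:

```python
# Back out compound annual growth rates (CAGR) from the 78-year span.
years = 78
oph_multiple = 119.6 / 23.0    # output per hour, 2025 vs 1947 (≈ 5.2x)
opw_multiple = 117.1 / 26.9    # output per worker, 2025 vs 1947 (≈ 4.35x)

oph_cagr = oph_multiple ** (1 / years) - 1   # ≈ 2.1% per year
opw_cagr = opw_multiple ** (1 / years) - 1   # ≈ 1.9% per year
gap = oph_cagr - opw_cagr                    # ≈ 0.23 percentage points per year
```

A wedge of roughly a quarter of a percentage point per year looks tiny, yet compounded over 78 years it produces the 85-point cumulative gap.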
Use output per hour when measuring efficiency — technological progress, capital investment, and long-run wage potential. This is the Fed’s preferred measure. When economists cite “productivity growth,” they almost always mean output per hour.
Use output per worker when measuring per-person economic contribution. Living standards depend on total output, not hourly rates. Tax revenue and fiscal planning also track total income more closely than hourly efficiency.
Use both to diagnose the labor market. If output per hour is growing but output per worker is flat, average hours are declining — and you should ask whether that is voluntary or involuntary. The gap is a diagnostic signal that neither measure alone provides.
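That diagnostic can be made concrete. Since output per worker equals output per hour times average hours, growth in average hours is (to first order) per-worker growth minus per-hour growth. A minimal sketch with hypothetical readings, not actual BLS figures:

```python
# Hypothetical year-over-year growth readings (illustrative only).
oph_growth = 0.020   # output per hour up 2.0%
opw_growth = 0.005   # output per worker up only 0.5%

# OPW = OPH x average hours, so to first order:
hours_growth = opw_growth - oph_growth   # about -1.5%

if hours_growth < 0:
    print("Average hours per worker are declining; ask whether that is "
          "voluntary (flexibility) or involuntary (underemployment).")
```

The sign of `hours_growth` is the signal neither series provides on its own.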
The BLS publishes two productivity measures for the nonfarm business sector, and over 78 years they have told increasingly different stories. Output per hour — the purer measure of efficiency — has risen 420% since 1947, reaching an index of 119.6 in Q4 2025. Output per worker — the measure of each person’s total contribution — has risen 335%, reaching 117.1.
The gap is the story of American working hours: from the long-shift 1940s (ratio 0.85) through the convergence of the 1990s (0.97) to today’s reversal (1.02). Output per hour captures efficiency. Output per worker captures economic reality. The difference captures the hidden variable: how many hours Americans actually work. The final episode examines the future of productivity — whether AI, remote work, and demographic change will accelerate the trend or reshape what we measure.