Episode 7 of 8: The Productivity Paradox

Two Measures, Two Stories: Output per Hour vs Output per Worker

The BLS publishes two headline productivity measures. One divides output by hours worked. The other divides output by workers employed. Over 78 years, the gap between them has reversed direction entirely — and the reason reveals something fundamental about how the American economy has changed.

Finexus Research • March 27, 2026 • BLS Productivity and Costs, Nonfarm Business Sector

How productive is the American worker? The answer depends on which question you ask. If you ask how much output is produced per hour of labor, you get one number. If you ask how much is produced per worker, regardless of how many hours that person puts in, you get a different number. Both are published by the Bureau of Labor Statistics, indexed to 2017 = 100, covering the nonfarm business sector, and tracked quarterly since 1947.

Over nearly eight decades, the relationship between them has undergone a quiet reversal. In the late 1940s, output per worker was substantially higher than output per hour — a reflection of long working hours in an industrial economy. By the 2010s, output per hour had overtaken output per worker — a reflection of shorter weeks, more part-time employment, and a service economy that runs on different schedules.

That crossover tells you whether productivity gains come from people working more efficiently in each hour, or from changes in how many hours they work. This episode traces both measures across 78 years and explains what the widening gap reveals.

The Two Measures Explained

Definitions

Output per Hour = Total Output ÷ Total Hours Worked   (BLS Series PRS85006093)

The standard measure of labor productivity. It divides real output by total hours worked by all persons — it does not matter whether those hours were worked by 10 people or 1,000.

Output per Worker = Total Output ÷ Number of Workers   (BLS Series PRS85006163)

Divides the same total output by the number of persons employed, regardless of hours. A worker who puts in 20 hours and one who puts in 50 hours each count as one worker.

Key identity: Output per Worker = Output per Hour × Average Hours per Worker

When average hours rise, output per worker grows faster than output per hour. When average hours fall, the reverse happens. The gap between the two measures is the hours effect.

Most economists prefer output per hour because it isolates efficiency — how much is produced from each unit of labor input. Output per worker captures something different: the lived economic reality of how much each employed person actually produces, which depends on both their efficiency and their working time. For wages, tax revenue, and living standards, per-worker output matters. For technological progress and growth potential, per-hour output matters.

78 Years, Two Lines

The chart below plots both measures from 1947 to 2025 (Q4 data, 2017 = 100). Both have risen four- to five-fold: American workers produce several times as much as their grandparents did after World War II.

But look at the gap. In the early decades, the blue line (output per worker) sits above the slate line (output per hour): workers put in longer hours, boosting their total per-person output. By the late 1980s, the two lines converge. From roughly 2010 onward, they reverse: output per hour now exceeds output per worker. That crossover is the story of the American workweek, from 40-plus-hour manufacturing schedules in 1947 to an average of 34 hours across all private industries by 2025.

The Two Productivity Measures: 1947–2025
Nonfarm business sector, Q4 data, index 2017 = 100. Output per hour (slate) overtook output per worker (blue) around 2010.

The most dramatic short-term divergence came in 2020, when both measures spiked — output per hour to 110.6, output per worker to 109.9 — not because workers became more efficient, but because lower-productivity workers disproportionately lost their jobs. The composition of the surviving workforce shifted toward higher-output sectors. By 2025, the structural spread had reasserted itself: output per hour at 119.6, output per worker at 117.1.
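That composition effect can be illustrated with a toy two-sector economy (all numbers are invented for illustration): when low-productivity hours drop out, average output per hour rises even though no individual worker became more efficient.

```python
def average_productivity(sectors: dict) -> float:
    """Hours-weighted average output per hour across sectors."""
    total_output = sum(per_hour * share for per_hour, share in sectors.values())
    total_hours = sum(share for _, share in sectors.values())
    return total_output / total_hours

# Hypothetical economy: (output per hour, share of total hours)
before_shock = {"high-productivity": (80.0, 0.50),
                "low-productivity":  (40.0, 0.50)}
# Pandemic-style shock: the low-productivity sector loses half its hours.
after_shock = {"high-productivity": (80.0, 0.50),
               "low-productivity":  (40.0, 0.25)}

before = average_productivity(before_shock)
after = average_productivity(after_shock)
print(round(before, 1), round(after, 1))  # 60.0 66.7
```

No sector's output per hour changed, yet the measured average jumped about 11%, purely because the mix of surviving hours shifted.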

The Hours Effect — Why the Measures Diverge

If both measures are expressed as indices with 2017 = 100, then the ratio of output per hour to output per worker tells you how average hours per worker have changed relative to 2017. A ratio below 1.0 means workers were putting in more hours than in 2017; above 1.0 means fewer. The table below traces this relationship at key milestones.

Year | Output/Hour | Output/Worker | Ratio | What It Means
1947 | 23.0        | 26.9          | 0.85  | Workers averaged ~17% more hours than in 2017
1970 | 41.7        | 46.0          | 0.91  | Hours declining; service sector growing
1990 | 57.9        | 59.6          | 0.97  | Measures nearly converge
2000 | 73.4        | 75.5          | 0.97  | Dot-com era; composition effects balance out
2010 | 95.0        | 95.0          | 1.00  | Crossover: hours match the 2017 level
2020 | 110.6       | 109.9         | 1.01  | Pandemic composition effect
2025 | 119.6       | 117.1         | 1.02  | Per-hour measure now structurally higher
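As a quick sketch, the ratio logic can be checked against the index values quoted in this episode (ratios computed from the rounded indices can differ from the tabulated figures in the last digit):

```python
# Q4 index values (2017 = 100) quoted in this episode.
indices = {
    1947: (23.0, 26.9),   # (output per hour, output per worker)
    1970: (41.7, 46.0),
    1990: (57.9, 59.6),
    2010: (95.0, 95.0),
    2025: (119.6, 117.1),
}

for year, (per_hour, per_worker) in indices.items():
    ratio = per_hour / per_worker          # = 2017 avg hours / that year's avg hours
    hours_vs_2017 = per_worker / per_hour  # that year's avg hours relative to 2017
    print(f"{year}: ratio {ratio:.3f}, hours at {hours_vs_2017:.0%} of the 2017 level")
```

For 1947 this prints hours at 117% of the 2017 level (about 17% more hours); for 2025, about 98% (slightly fewer).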

In 1947, the ratio stood at 0.85 — workers were putting in roughly 17% more hours per week than the 2017 average. The postwar economy was hours-intensive: manufacturing ran on long shifts, and overtime was routine.

By 1970, the ratio had risen to 0.91 as unions negotiated shorter workweeks and the service sector expanded. By 1990 it reached 0.97, and the two measures were nearly indistinguishable. The crossover came around 2010, when the ratio hit 1.00 exactly; in the aftermath of the Great Recession, the economy relied more heavily on part-time work. Since then, the ratio has stayed above 1.0. By 2025 it reached 1.02: workers are more productive per hour than ever, but work fewer hours per person.

In 1947, a ratio of 0.85 meant Americans worked about 17% more hours per week than they would in 2017. By 2025, a ratio of 1.02 means they work slightly fewer — a quiet revolution in American working life that shows up only when you compare the two measures.

What drove the decline? The rise of the service economy (from 60% of nonfarm employment in 1947 to 85% by 2025, with more part-time schedules). The growth of part-time work (from under 15% of workers in the 1950s to above 17% by the 2010s). The entry of women into the workforce, many initially at part-time schedules, increasing head counts faster than total hours. And overtime regulation, which discouraged excessive hours by requiring premium pay.

Growth Rates — Which Measure Grew Faster?

Another way to see the divergence is to compare cumulative growth across major periods. In every era, output per hour grew faster than output per worker — the consistent signature of declining average hours. But the size of the gap has varied.

Cumulative Productivity Growth by Period
Output per hour (slate) consistently outpaces output per worker (blue), reflecting the steady decline in average working hours across all eras.

The postwar boom (1947–1973) was the golden age. Output per hour nearly doubled (+98%). Output per worker rose 86%. The 12-point gap reflects modest hours decline as the 40-hour standard became entrenched. About seven-eighths of the hourly efficiency gain translated into higher per-worker output.

The slowdown era (1973–2000) saw sharp deceleration. Output per hour grew 61%; output per worker 51%. Oil shocks, stagflation, and the shift from factory to office weighed on both measures, while the service-sector transition and expanding part-time work kept the gap at 10 points.

The modern era (2000–2025) shows modest revival. Output per hour grew 63%, boosted by technology and e-commerce. Output per worker grew 55%. The 8-point gap is the narrowest of the three periods, suggesting that the decline in average hours may be moderating.

Period                   | Output/Hour | Output/Worker | Gap    | Annualized Difference
1947–1973 (26 yrs)       | +98%        | +86%          | 12 pts | ~0.25% per year
1973–2000 (27 yrs)       | +61%        | +51%          | 10 pts | ~0.24% per year
2000–2025 (25 yrs)       | +63%        | +55%          | 8 pts  | ~0.21% per year
Full: 1947–2025 (78 yrs) | +420%       | +335%         | 85 pts | ~0.23% per year

Over the full period, output per hour has risen 420% while output per worker has risen 335%. Per hour, each worker produces 5.2x as much as in 1947; per worker, the multiple is 4.35x. Both are correct. The 85-point gap is the cumulative effect of a transformed workweek.
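One way to annualize the cumulative figures above is to compound them; a sketch:

```python
def annualized(multiple: float, years: int) -> float:
    """Convert a cumulative growth multiple into a compound annual rate."""
    return multiple ** (1.0 / years) - 1.0

# Cumulative growth multiples from the figures above: +98% -> 1.98, etc.
periods = {
    "1947-1973": (1.98, 1.86, 26),
    "1973-2000": (1.61, 1.51, 27),
    "2000-2025": (1.63, 1.55, 25),
    "1947-2025": (5.20, 4.35, 78),
}

for name, (per_hour, per_worker, years) in periods.items():
    gap = annualized(per_hour, years) - annualized(per_worker, years)
    print(f"{name}: per-hour {annualized(per_hour, years):.2%}/yr, "
          f"per-worker {annualized(per_worker, years):.2%}/yr, gap {gap:.2%}/yr")
```

Compounding yields gaps of roughly 0.2 to 0.25 percentage points per year in every period, smaller than simply dividing the point gap by the number of years, because growth multiplies rather than adds.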

Which Measure Should You Use?

Use output per hour when measuring efficiency — technological progress, capital investment, and long-run wage potential. This is the Fed’s preferred measure. When economists cite “productivity growth,” they almost always mean output per hour.

Use output per worker when measuring per-person economic contribution. Living standards depend on total output, not hourly rates. Tax revenue and fiscal planning also track total income more closely than hourly efficiency.

Use both to diagnose the labor market. If output per hour is growing but output per worker is flat, average hours are declining — and you should ask whether that is voluntary or involuntary. The gap is a diagnostic signal that neither measure alone provides.

Since 1947, output per hour has risen 420% while output per worker has risen 335%. The 85-point gap is not a measurement error — it is the cumulative footprint of a transformed American workweek.

The Bottom Line

The BLS publishes two productivity measures for the nonfarm business sector, and over 78 years they have told increasingly different stories. Output per hour — the purer measure of efficiency — has risen 420% since 1947, reaching an index of 119.6 in Q4 2025. Output per worker — the measure of each person’s total contribution — has risen 335%, reaching 117.1.

The gap is the story of American working hours: from the long-shift 1940s (ratio 0.85) through the convergence of the 1990s (0.97) to today’s reversal (1.02). Output per hour captures efficiency. Output per worker captures economic reality. The difference captures the hidden variable: how many hours Americans actually work. The final episode examines the future of productivity — whether AI, remote work, and demographic change will accelerate the trend or reshape what we measure.