
Fourth Quarter 2018

FEDERAL RESERVE BANK OF RICHMOND

Navigating Energy
Booms and Busts
How they affect job and educational choices

Are CEOs
Overpaid?

The Fault
in R-Star

Interview with
Preston McAfee

Volume 23
Number 4
Fourth Quarter 2018

FEATURES

10

Navigating Energy Booms and Busts
The fracking revolution has created new job opportunities,
but are workers prepared for the fluctuations of the energy
economy?

Director of Research

Kartik Athreya
Editorial Adviser

Aaron Steelman
Editor

Renee Haltom

14

Are CEOs Overpaid?
Incentives for chief executives have important economic
implications

Econ Focus is the
economics magazine of the
Federal Reserve Bank of
Richmond. It covers economic
issues affecting the Fifth Federal
Reserve District and
the nation and is published
on a quarterly basis by the
Bank’s Research Department.
The Fifth District consists of the
District of Columbia,
Maryland, North Carolina,
South Carolina, Virginia,
and most of West Virginia.

Senior Editor

David A. Price
Managing Editor/Design Lead

Kathy Constant
Staff Writers

Jessie Romero
Tim Sablik
Editorial Associate

Lisa Kenney

DEPARTMENTS

1  President's Message/What's Happening to Productivity Growth?
2  Upfront/Regional News at a Glance
3  Federal Reserve/The Fault in R-Star
6  Jargon Alert/Fiscal Multiplier
7  Research Spotlight/Has AI Improved Productivity?
8  At the Richmond Fed/What Happens When Bubbles Pop?
9  The Profession/The Economist's Apprentice
18  Interview/R. Preston McAfee
24  Economic History/When a South Carolina City Tried to Become Motor City
27  Book Review/The Gift of Global Talent: How Migration Shapes Business, Economy & Society
28  District Digest/Understanding Recent Trends in Labor Market Participation
36  Opinion/Does the Fed Need Room to Cut?

Contributors

R. Andrew Bauer
Eric LaRose
Akbar Naqvi
Emma Yeager
Design

Janin/Cliff Design, Inc.
Published quarterly by
the Federal Reserve Bank
of Richmond
P.O. Box 27622
Richmond, VA 23261
www.richmondfed.org
www.twitter.com/RichFedResearch
Subscriptions and additional
copies: Available free of
charge through our website at
www.richmondfed.org/publications or by calling Research
Publications at (800) 322-0565.
Reprints: Text may be reprinted
with the disclaimer in italics
below. Permission from the editor
is required before reprinting
photos, charts, and tables. Credit
Econ Focus and send the editor a
copy of the publication in which
the reprinted material appears.
The views expressed in Econ Focus
are those of the contributors and not
necessarily those of the Federal Reserve Bank
of Richmond or the Federal Reserve System.
ISSN 2327-0241 (Print)
ISSN 2327-025x (Online)

President’s Message

What’s Happening to Productivity Growth?

Over the past several years, monetary policymakers
have been gradually raising the target federal
funds rate to align with the “neutral” rate of
interest. As Tim Sablik discusses in “The Fault in R-star”
in this issue, our calculations of the neutral rate are
imprecise; even the economist who helped develop one
widely used estimate has described them as a “fuzzy blur.”
Blurry as our estimates might be, they all point to the
same general trend: a decline in the neutral rate. And if
the neutral rate is the rate consistent with the economy
performing at potential, then a lower rate implies lower
potential as well. What’s holding us back?
One major contributor appears to be a decline in productivity growth. Between 1985 and 2005, the United States
had a productivity boom, with average annual growth of
2.3 percent. Over the past decade or so, however, productivity growth has slowed — with average annual growth of just
1.3 percent between 2006 and the present. I have to admit
I find this very surprising from my perspective as a business
consultant. I didn’t observe any particular cliff around
2005. In fact, I saw management equally motivated to drive
a focus on the bottom line. I saw new, powerful practices
being implemented, such as artificial intelligence, voice recognition, digitization, and offshoring. I saw my individual
clients get more productive.
One possibility is that the mix of businesses has shifted,
for example, because of the growth in services or productive
sectors moving to foreign locales. But the slowdown is widespread. Nearly every sector has experienced some decline
in productivity growth since the mid-2000s (although the
extent varies across sectors).
Another possibility is mismeasurement. Some surely
exists; for example, the leisure value of free apps on a
smartphone isn’t measured, while toys are. (Of course, the
economic statistics do include the ads that pay for many of
those free apps.) But again, the widespread nature of the
decline makes mismeasurement unlikely as an across-the-board explanation.
Productivity growth could also be hampered by regulatory costs and the expense of implementing cybersecurity:
Costs have certainly been created that don’t generate revenues. But back-of-the-envelope calculations suggest these
costs aren’t large enough to explain the slowdown.
So what do I think I know? I believe the productivity
slowdown is real, and part of the explanation is nearly
two decades of business underinvestment. Since 2000,
investment has been low relative to measures of corporate
profitability, driven by industry leaders not investing in
growth the way they once did. Airlines have moderated
capacity growth, banks aren’t adding branches, and even
successful retailers aren’t adding stores. And in my view, it’s easy to draw a line from lower investment to lower productivity growth.
Why has investment been low?
My sense is that several things
are going on. Short-termism
has been increasing as CEO
tenure has decreased and corporate activism has escalated.
Share repurchases have become
a compelling alternate use of
capital. Cyclical industries have
learned the lessons of overcapacity. And finally, companies are still feeling skittish after
the Great Recession. For example, I’ve spoken with business leaders who, even if they see opportunities for investment, are reluctant to take them. They continually see the
next recession as “just around the corner.” That’s certainly
true today.
Another factor in slowing productivity growth is declining startup rates. Successful entrants drive innovation,
which drives productivity. But the data show a massive
reduction in entry rates in all states and all sectors. Startups
accounted for 12 percent of all firms in the late 1980s. That
fell to 10.6 percent in the mid-2000s and to 8 percent
after 2008. As with investment, some of this decline might
reflect lingering risk aversion after the Great Recession.
Some might be the impact of regulation. Research also
points to the slow growth of the working-age population as
an explanation. In addition, I hear that there are tangible
impediments — such as acquiring the necessary technology and talent — to building the scale and sophistication
entrants require to be successful.
The good news is that change is possible. As the Great
Recession fades further into memory, economic tailwinds
may give both entrepreneurs and existing firms more
confidence. Technological innovations such as AI aren’t
going away. And policymakers can promote a healthy environment for business investment. American businesses
are practical and innovative. If the rules are clear and the
environment is stable, they will find a way to become more
productive.
EF

Tom Barkin
President
Federal Reserve Bank of Richmond


Upfront

Regional News at a Glance

By Lisa Kenney

MARYLAND — Maryland is ranked the third-best state for science and
technology capabilities related to economic growth, according to the 2018 State
Technology and Science Index from the Milken Institute. The index uses dozens
of measurements in five categories to determine states’ abilities to grow and
sustain a tech sector. The report says Maryland earned its ranking due to its high
concentration of computer science, engineering, and life science employment,
its high level of research and development funding, and its creation of state
programs that foster high-tech business growth.
NORTH CAROLINA — In late January, the North Carolina Department of
Commerce announced it is investing more than $250,000 in Downtown Strong, a
program that provides economic development and revitalization resources to local
governments across the state. The program will be run by the department’s Main
Street and Rural Planning Center and will provide services such as staff training and
technical assistance to towns that want to revitalize or preserve existing downtown
commercial districts.
SOUTH CAROLINA — Greenville launched the seventh year of its Minority
Business Accelerator program in January with 17 participants. The year-long
program aims to encourage the development of minority-, women-, and veteran-owned businesses as well as ones operating in low-to-moderate income areas.
Participants receive business development services, mentoring, technical
assistance, and opportunities to partner with larger businesses. Since the program’s
2013 launch, participating companies have created about 200 jobs and have gained
more than $22 million in contract awards and revenue increases.
VIRGINIA — With Amazon headed to Northern Virginia, the University of
Virginia announced in January that it will open a School of Data Science to help
meet tech industry demand. Most of the funding will come from a $120 million
donation by the Quantitative Foundation, a private foundation based in
Charlottesville; it is the largest private gift in UVA’s history. The new school will
build on the university’s existing Data Science Institute and is expected to offer
undergraduate, doctoral, and certificate programs. Elsewhere in the state, Virginia
Tech plans to build a $1 billion tech innovation campus near the Amazon site, and
George Mason University plans to create a school of computing and almost triple
its computer science enrollment over the next five years.
WASHINGTON, D.C. — College athletic programs at eight D.C. universities
generated more than $122 million in sports revenue in 2016, according to a
recent Washington D.C. Business Daily analysis of Department of Education data
from colleges and universities that receive funding through federal financial aid
programs. In D.C., George Washington University had the highest revenue per
athlete, $74,153. Georgetown University had the highest total sports revenue,
with more than $44 million.
WEST VIRGINIA — Out-of-work residents in the Upper Kanawha Valley will
soon be able to apply for a new program that will provide loans of up to $10,000
to help people start small businesses in their communities. UKAN, which was
created in January, aims to assist those who may not be able to get a traditional
business loan. Applicants will need to submit a business plan and agree to audits
to ensure the loan is being spent on business purposes. If a business stays open
continuously for two years, the loan will be forgiven.

FEDERAL RESERVE

The Fault in R-Star
Has the natural rate of interest lost its luster as
a navigation aid for monetary policy?
By Tim Sablik

In a 2018 speech at the annual Economic Policy
Symposium in Jackson Hole, Wyo., Fed Chairman
Jerome Powell compared monetary policymakers to
sailors. Like sailors before the advent of radio and satellite
navigation, Powell said, policymakers should navigate by the
stars when plotting a course for the economy. Powell wasn’t
referring to stars in the sky, however. He was talking about
economic concepts such as the natural rate of unemployment and the natural real interest rate. In economic models,
these variables are often denoted by an asterisk, or star.
The natural rate of interest in particular sounds like
the perfect star to guide monetary policy. The real, adjusted-for-inflation interest rate is typically represented in
economic models by a lowercase “r.” The natural rate
of interest, or the real interest rate that would prevail
when the economy is operating at its potential and is in
some form of an equilibrium, is known as r* (pronounced
“r-star”). It is the rate consistent with the absence of any
inflationary or deflationary pressures when the Fed is
achieving its policy goals of maximum employment and
stable prices. Since the financial crisis of 2007-2008, Fed
officials have often invoked r-star to help describe the
stance of monetary policy. But lately, r-star seems to have
lost some of its luster.
“Navigating by the stars can sound straightforward,”
Powell said in his Jackson Hole address. “Guiding policy
by the stars in practice, however, has been quite challenging of late because our best assessments of the location of
the stars have been changing significantly.”
Even New York Fed President John Williams, who
helped pioneer estimating r-star, recently bemoaned the
challenges of using the natural rate as a guide for policy.
“As we have gotten closer to the range of estimates of neutral, what appeared to be a bright point of light is really a
fuzzy blur,” he said in September 2018.
Why did r-star become so prominent in monetary policy discussions following the Great Recession, and why do its fortunes seem to have waned?
A Star is Born
The concept of the natural rate of interest dates back
more than 100 years. In an 1898 book titled Interest and
Prices: A Study of the Causes Regulating the Value of Money,
Swedish economist Knut Wicksell argued that one could
not judge inflation by looking at interest rates alone. High
market rates did not necessarily mean that inflation was

speeding up, as was commonly believed at the time, nor
did low rates mean that the economy was experiencing
deflation. Rather, inflation depended on where interest
rates stood relative to the natural rate.
Wicksell’s natural rate seemed like an ideal benchmark
for monetary policy. The central bank could slow down an
economy in which inflation was accelerating by steering
interest rates above the natural rate, while aiming below the
natural rate could help stimulate an economy that had fallen
below its potential. Indeed, Fed officials in the past made
occasional reference to the natural rate of interest as a way to
explain monetary policy. During testimony before Congress
in 1993, then-Fed Chairman Alan Greenspan explained that
“in assessing real rates, the central issue is their relationship
to an equilibrium interest rate… Rates persisting above that
level, history tells us, tend to be associated with slack, disinflation, and economic stagnation -- below that level with
eventual resource bottlenecks and rising inflation, which
ultimately engenders economic contraction.”
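In modern notation, Wicksell's benchmark can be stated schematically; this is a minimal sketch in our notation, not a formula drawn from Wicksell or the Fed. With r denoting the actual real rate and r* the natural rate:

\[
r > r^{*} \;\Rightarrow\; \text{downward pressure on inflation,} \qquad r < r^{*} \;\Rightarrow\; \text{upward pressure on inflation.}
\]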
Despite some passing references to the natural rate of
interest, however, Wicksell’s idea didn’t truly rise to prominence until the early 2000s when Columbia University
economist Michael Woodford incorporated it into a
modern macroeconomic framework to describe how central banks should behave. In his book, titled Interest and
Prices: Foundations of a Theory of Monetary Policy in a nod
to Wicksell’s work, Woodford argued that a central bank
should seek to close the gaps between actual economic
conditions and the economy’s potential for output and
employment (y-star and u-star, respectively) as well as the
gap between actual real interest rates and the natural rate
(r-star) all at the same time to obtain an optimal outcome.
There was just one problem: No one knows exactly what
r-star, or any of the stars, is equal to.
“R-star, just like potential GDP or the natural rate
of unemployment, is fundamentally unobservable,” says
Thomas Lubik, a senior advisor in the research department at the Richmond Fed.
In 2003, New York Fed President Williams, then an
economist at the San Francisco Fed, and Thomas Laubach,
an economist with the Fed Board of Governors, published a paper in the Review of Economics and Statistics that
attempted to estimate the natural rate of interest.
“The paper was highly cited, but it took some time
before policymakers began to view r-star as a potential
operational guide,” says Lubik.

[Chart: Finding R-Star — The Lubik-Matthes Estimate of the Natural Rate of Interest, percent, 1966-2018, median estimate with upper/lower bound. Source: Thomas A. Lubik and Christian Matthes, "Calculating the Natural Rate of Interest: A Comparison of Two Alternative Approaches," Federal Reserve Bank of Richmond Economic Brief No. 15-10, October 2015.]

From the perspective of monetary policymakers, a key
problem was that estimates of r-star are highly uncertain.
This can be seen in the r-star measure developed by Lubik
and fellow Richmond Fed economist Christian Matthes.
Their median estimate represents the most likely value of
r-star, which was 1.56 percent at the end of 2018, but that
estimate exists in a range of potential values. (See chart.)
The inability to measure the natural rate of interest precisely seemed to limit its usefulness as a benchmark for
setting monetary policy. But after the Great Recession,
policymakers began to take a closer look at r-star.
The New Normal
Given the severity of the financial crisis of 2007-2008
and the recession that followed, it was not entirely surprising when the Fed dramatically reduced the federal
funds rate to nearly zero. But as the crisis subsided and
the economy slowly started to recover after 2009, interest
rates remained near zero year after year. In part, this was
because the Fed held the federal funds rate low to keep
monetary policy accommodative during the recovery, but
it was also the case that low inflation and weak economic
conditions left little room for rates to rise.
“I think most people expected that as the economy
rebounded, interest rates would also rebound. But that
didn’t happen,” says Andrea Tambalotti, a vice president
in the research and statistics group at the New York Fed.
“So the question became: Why?”
The answer, it turned out, could be found in r-star. In
previous decades, many economists assumed the natural
rate of interest was fairly constant over time. But in the
wake of the Great Recession, new estimates by Laubach
and Williams pointed to a dramatic collapse in the value
of r-star, from 2.5 percent to less than 1 percent.
“It became pretty clear that r-star, at least in the short
run and possibly even in the long run, may not be constant,” says Marco Del Negro, also a vice president in the
research and statistics group at the New York Fed.
Alongside Tambalotti and other New York Fed
colleagues, Del Negro developed estimates for the natural
rate of interest to complement the earlier work by Laubach
and Williams. Around the same time, Lubik and Matthes in
Richmond also developed their alternative methodology to
estimate r-star. All of these estimates pointed to the same
trend: The natural rate of interest had fallen dramatically
since the financial crisis of 2007-2008, continuing a trend
that had started in the 1990s.
Fed officials stipulated that some of this decline was
likely transitory. On Dec. 2, 2015, then-Chair Janet Yellen
remarked that “the neutral nominal federal funds rate …
is currently low by historical standards and is likely to
rise only gradually over time.” Two weeks later, when the
Federal Open Market Committee (FOMC) voted to raise
the federal funds rate for the first time since the Great
Recession began, it noted that “the neutral short-term real
interest rate was currently close to zero and was expected
to rise only slowly as headwinds restraining the expansion
receded,” according to the minutes from the meeting. But
estimates of r-star also pointed to a longer-run problem.
“The whole world was stuck at low interest rates long
after the financial crisis had passed,” says Tambalotti.
“Researchers began looking at the work that John Williams
and Thomas Laubach had done on r-star in the early
2000s. They realized that there was something unusual
going on. It was not just the financial crisis. Something
else was keeping interest rates low.”
While monetary policy can influence short-term interest rates, economists believe that long-run interest rates
are driven by forces outside the central bank’s control.
One such force is the demand for global savings. Before
becoming chairman of the Fed, Ben Bernanke gave a
speech in 2005 in which he talked about the “global saving
glut.” Increased global demand for safe assets, such as U.S.
Treasuries, was bidding up their price and driving down
interest rates, he said. As long-run interest rates remained
low in the wake of the Great Recession, the global savings
glut re-entered the policy discussion as a possible explanation. Economists also pointed to slowing productivity
growth and aging populations in advanced economies as
additional factors depressing r-star.
If changes in the global economy had caused a longer-run
decline in r-star, then returning monetary policy to neutral
might look quite different from past economic recoveries.
In December 2016, when the FOMC raised the federal
funds rate for only the second time since the financial
crisis of 2007-2008, it signaled that the factors holding
down interest rates might be long-lasting and outside of its
control.
According to the minutes from that meeting, “Many
participants expressed a view that increases in the federal
funds rate over the next few years would likely be gradual
in light of a short-term neutral real interest rate that currently was low — a phenomenon that a number of participants attributed to the persistence of low productivity
growth, continued strength of the dollar, a weak outlook

for economic growth abroad, strong demand for safe longer-term assets, or other factors.”
Fading Light?
Despite the difficulties in estimating r-star, it helped
monetary policymakers identify a decline in the natural
rate of interest. It also proved to be both a useful guide for
policy during the recovery from the Great Recession and
a helpful communication device to explain to the public
why interest rates had been so low for so long. Why, then,
have policymakers recently downplayed r-star’s utility? As
Powell suggested in Jackson Hole, it has to do with the
different context the Fed finds itself in today.
“When interest rates were close to zero, it was pretty
safe to assume that we were far from the long-run natural
rate, regardless of the uncertainty surrounding estimates
of r-star,” says Del Negro. “Now that nominal interest
rates are above 2 percent, pinpointing the actual long-run level for the federal funds rate matters more, and the
uncertainty around estimates of r-star plays a bigger role.”
To be sure, Fed officials have always stressed the
imprecision of r-star in their public communications. In
a January 2017 speech, then-Chair Yellen remarked that
“figuring out what the neutral interest rate is and setting
the right path toward it is not like setting the thermostat
in a house: You can’t just set the temperature at 68 degrees
and walk away. … We must continually reassess and adjust
our policies based on what we learn.”
Failing to stay on top of changes to r-star and other unobservable economic indicators may result in the Fed drawing
the wrong conclusions for monetary policy. During the
Great Inflation of the 1970s, for example, loose monetary
policy contributed to mounting inflation. Some economists
have blamed this on incorrect estimates of the natural rate
of unemployment at the time. On the other hand, the Fed
has correctly interpreted hard-to-measure changes in the
economy before. During the tech boom of the late 1990s,
falling unemployment led many on the FOMC to call for
raising interest rates to head off inflation. Then-Chairman
Greenspan resisted, arguing that the data were pointing to
rising productivity. He was vindicated when unemployment
fell but inflation remained low and stable. During his 2018
Jackson Hole speech, Powell focused on a similar challenge
now facing the Fed.
“The FOMC has been navigating between the shoals of
overheating and premature tightening with only a hazy view
of what seem to be shifting navigational guides,” he said.

Even setting aside questions of measurement, some
economists have questioned whether r-star should be used
as a benchmark for monetary policy at all. While economists have traditionally assumed that long-run interest
rates are driven by fundamental factors in the economy
rather than monetary policy, Claudio Borio and Phurichai
Rungcharoenkitkul of the Bank for International
Settlements and Piti Disyatat of the Bank of Thailand
argued in a 2018 paper that monetary policy decisions in
the short run may in fact influence the long-run natural
rate of interest. Easy policy in the short term may lead to
“financial imbalances,” which can generate losses in the
long run when the economy goes bust. This boom and bust
cycle may influence the natural rate of interest, according
to the authors, compromising its ability to serve as an
independent guide for policy.
One among Many
In a sense, the Fed’s view on r-star hasn’t changed. Early
in the recovery, policymakers used it to help explain why
interest rates were low and why they were likely to remain
low for some time. But they were always careful to communicate the uncertainty surrounding r-star. As the federal
funds rate has risen and that uncertainty has become more
relevant, the Fed’s communications have reflected that
heightened concern. One thing has changed in the last
decade, though. The renewed interest in r-star has spawned
more efforts to better estimate and understand it.
“Multiple Reserve Banks are now contributing to the
effort to measure r-star,” says Lubik. “Some estimates
are on the high end and some are on the low end, but
together they provide a good assessment of the most
likely value for r-star under a variety of assumptions and
methodologies.”
The Fed is making use of these and other data to gain
a better picture of the economy while it shifts monetary
policy into neutral. At the FOMC’s September 2018 meeting following Powell’s Jackson Hole speech, participants
noted that “estimates of the level of the neutral federal
funds rate would be only one among many factors that
the Committee would consider in making its policy decisions,” according to the meeting’s minutes.
R-star has become an important tool in the Fed’s kit
following the Great Recession, but it should not come as a
surprise to see its fortunes wax and wane as economic conditions change over time. It’s a rare kind of navigational
aid, one that becomes blurrier as it gets closer.
EF

Readings
Borio, Claudio, Piti Disyatat, and Phurichai Rungcharoenkitkul.
“What Anchors for the Natural Rate of Interest?” Paper prepared
for the Federal Reserve Bank of Boston 62nd Annual Economic
Conference, Sept. 7-8, 2018.
Del Negro, Marco, Domenico Giannone, Marc P. Giannoni, and
Andrea Tambalotti. “Safety, Liquidity, and the Natural Rate of
Interest.” Brookings Papers on Economic Activity, Spring 2017,
pp. 235-316.

Laubach, Thomas, and John C. Williams. “Measuring the Natural
Rate of Interest.” Review of Economics and Statistics, November 2003,
vol. 85, no. 4, pp. 1063-1070.
Lubik, Thomas A., and Christian Matthes. “Calculating the Natural
Rate of Interest: A Comparison of Two Alternative Approaches.”
Federal Reserve Bank of Richmond Economic Brief No. 15-10,
October 2015.

Jargon Alert

Fiscal Multiplier

By Renee Haltom

When the government spends money or cuts
taxes, by how much does overall economic
output change? The answer is called the “multiplier” on government spending. A multiplier of one, for
example, means that an added dollar of government spending boosts economic output by a dollar.
The size of the multiplier matters because it indicates
the potential effectiveness of the government’s efforts to
boost the economy. But it is exceedingly hard to estimate.
The government tends to undertake stimulus precisely
when the economy is weak — but because the economy
is already behaving in a certain way, it is statistically hard
to isolate the economy’s response to fiscal policy. And if
a spending package is announced far in advance, people
may respond in anticipation, making
it hard to identify the effect of the
actual spending.
One popular way to overcome this
problem is to look at instances where
extra spending took place at the local
level for reasons separate from local
economic conditions — if economic
activity responds in lockstep with the
stimulus, it’s more likely they’re linked.
Using this method, Richmond Fed
economist Marios Karabarbounis and
co-authors used regional variations
in federal spending under the 2009 American Recovery
and Reinvestment Act. The law’s massive fiscal stimulus
package of $840 billion included $228 billion in government contracts, grants, and loans, of which the researchers
identified $46 billion spent locally based on factors like
household characteristics rather than local economic conditions. Aggregating local multipliers to the national level,
they estimated that a one-dollar increase in spending boosts
consumer spending by about 64 cents — for a multiplier of
1.64. Other recent studies have provided a wide range of
estimates of the multiplier, from 0.5 to 2.0.
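One way to read that arithmetic, offered as our gloss rather than the study's own notation: the multiplier counts both the dollar of government purchases itself and the consumer spending it induces.

\[
\text{multiplier} \;=\; \frac{\Delta\,\text{output}}{\Delta\,\text{government spending}} \;\approx\; \underbrace{1.00}_{\text{direct purchase}} + \underbrace{0.64}_{\text{induced consumption}} \;=\; 1.64
\]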
Economic theory does provide some guidance on which
fiscal policies are likely to produce bigger versus smaller
multipliers. One important insight is that people generally
“smooth” their consumption across time — rather than
spending all of a one-time gain immediately, they’ll spread
it over months or years. For this reason, economists argue
that temporary tax cuts or one-time rebates are likely to
have a smaller multiplier than permanent tax changes.
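A stylized illustration of the smoothing logic, using hypothetical numbers rather than figures from any study: a household that spreads a one-time $1,000 rebate evenly over five years raises its first-year spending by only

\[
\frac{\$1{,}000}{5} \;=\; \$200,
\]

while a tax change it believes to be permanent raises spending by close to the full annual amount, making the near-term multiplier correspondingly larger.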
But one challenge for fiscal policymakers in a recession
is that it might not be credible to announce a permanent
tax change in response to a temporary recession — people
might expect the policy to be undone once the need is
gone, muting the multiplier. Another factor is that not all
households respond the same to a tax cut or a tax increase
— so who benefits and who bears the cost of a fiscal effort
can affect the size of the multiplier.
Research has typically assumed that a multiplier
is symmetric — that is, the effects of a fiscal change
are the same size whether the spending is going up or
down. But it’s not hard to imagine why there might be
different magnitudes for tightening versus loosening.
Households might be constrained from borrowing when
their incomes fall, or wages and prices may be less likely
to fall in bad times than they are to adjust upward in
good times. Research from Richmond Fed economist
Christian Matthes and the San Francisco Fed’s Regis
Barnichon found that multipliers
from spending contractions might
be twice as big over the business
cycle as they are for spending
increases, peaking in recessions.
Their results both weaken the case
for fiscal stimulus and provide some
caution against fiscal austerity.
The overall economic environment also matters. Standard
economic models predict the government spending multiplier to be
much larger when interest rates are
very, very low. In fact, standard models predict lots
of economic phenomena behave unexpectedly at the
so-called “zero lower bound.” The reason is that at low
interest rates, the real interest rate — the nominal rate
adjusted for inflation — is close to negative territory. If
a fiscal boost produces inflation, the real interest rate
could tip negative and penalize households for saving.
That induces them to spend more today, adding to any
boost in demand resulting from government purchases.
A popular workhorse model suggests the government
spending multiplier might be as large as 3.7 at the zero
lower bound.
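The mechanism runs through the Fisher relation, a standard identity rather than a feature of any particular model: the real rate equals the nominal rate minus expected inflation. With the nominal rate i pinned near zero at the lower bound, any inflation the fiscal boost generates lowers the real rate r one-for-one:

\[
r \;=\; i - \pi^{e} \;\approx\; -\pi^{e} \quad \text{when } i \approx 0.
\]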
The overwhelming conclusion of research on fiscal
multipliers is that they depend critically on the environment and design of the fiscal package. Moreover, economists are quick to caution that the multiplier is not the
only success measure of fiscal policy: The taxes that fund
fiscal stimulus can distort economic activity; the long-term budget impact may reduce future economic activity;
and whether the dollars are spent on things that make the
economy more productive over time can make the long-run multiplier much bigger. Suffice it to say, there is no
“one” multiplier.
EF

Illustration: Timothy Cook


Research Spotlight

Has AI Improved Productivity?

By Emma Yeager

"Does Machine Translation Affect International Trade? Evidence from a Large Digital Platform." Erik Brynjolfsson, Xiang Hui, and Meng Liu. National Bureau of Economic Research Working Paper No. 24917, August 2018.

Economists interested in artificial intelligence (AI) have been puzzled by what some call a "productivity paradox." The paradox is the gap between the optimistic expectations about the economic effects of AI and the effects that appear in the data. On one hand, predictive technologies like image and speech recognition have experienced breakthroughs in recent years. The multitude of potential uses for such technologies are why AI has been called a "general purpose technology" like electricity and the steam engine, whose diverse and far-reaching applications changed the ways we work and live. On the other hand, contributions from AI are nonexistent in measures of aggregate productivity.

The productivity paradox isn't new. A similar phenomenon accompanied the advances of information technology in the 1970s and '80s when the Nobel Prize-winning economist Robert Solow famously remarked, "You can see the computer age everywhere but in the productivity statistics." As productivity in IT-intensive sectors eventually picked up around the turn of the millennium, researchers proposed that the paradox was simply an issue of timing. It seemed that IT implementation required the development of complementary innovations and the reshaping of production processes before its effects could be fully felt. The AI productivity story may prove to be much the same.

If slow implementation of AI is responsible for its absence from aggregate productivity numbers, then those sectors that can most readily adopt AI should be the first to experience its economic effects. A recent National Bureau of Economic Research paper by Erik Brynjolfsson of MIT and Xiang Hui and Meng Liu of Washington University in St. Louis considers whether this may be happening.

Brynjolfsson, Hui, and Liu looked at the effect of machine translation on international trade. Machine translation is an AI technology that has become increasingly capable of producing near-human-quality translations. The authors focused on the 2014 rollout of eBay's in-house translation tool, eBay Machine Translation (eMT). The eMT's implementation in Russia, Latin America, and the European Union represented only a moderate quality improvement over the platform's prior translation tool; even so, the authors found its introduction was associated with a sizable 17.5 to 20.9 percent increase in trade flows between Latin American consumers and U.S. sellers over eBay.

Machine translation makes sense for early AI adoption because it can be easily embedded into a digital platform's existing production process. The eMT translation is automatic and requires no change in behavior from buyers or sellers; in fact, they need not even be aware of the tool's existence to use it.

Brynjolfsson, Hui, and Liu quantified the effects of the eMT rollout using a natural experiment research design. Much like a scientific experiment performed in a laboratory setting, a natural experiment identifies the effect of a treatment — in this case, access to the eMT — through a comparison with a control group. The authors identify the effect of the eMT rollout using U.S. exports over eBay as the measurable outcome. If the eMT reduces barriers to international trade as the authors predict, then consumers in the eMT treatment group countries should buy more from U.S. exporters relative to the control group.

Determining who exactly this control group should include is imperative to producing meaningful results with a natural experiment. A natural experiment employs a "differences in differences" methodology to isolate the effects of a treatment while controlling for confounding effects. In this paper, the first difference is a comparison of an individual country's consumption of U.S. exports over eBay before and after the eMT rollout, regardless of whether the country is in the treatment group. This controls for differences in the magnitude of trade flows by country that had nothing to do with the eMT. The second difference is a comparison of the results of the first stage. It measures the change in trade flows among eMT countries versus the change in flows among non-eMT countries. This stage controls for changes occurring over time that are the same for all countries, like a global expansion or recession that affects trade flows in both eMT and non-eMT groups.
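Schematically, the differences-in-differences comparison described above can be written as follows; the notation is ours, not the paper's. With \(\bar{y}\) denoting average purchases of U.S. exports over eBay in a group of countries:

\[
\widehat{\text{eMT effect}} \;=\; \left(\bar{y}^{\,\text{treated}}_{\text{post}} - \bar{y}^{\,\text{treated}}_{\text{pre}}\right) - \left(\bar{y}^{\,\text{control}}_{\text{post}} - \bar{y}^{\,\text{control}}_{\text{pre}}\right)
\]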
The authors interpreted the increase in trade flows due to eMT as an indication of the obstacle to international trade imposed by language barriers. The effect on trade flows was even larger for buyers and products with higher search costs, meaning inexperienced eBay users and consumers in the market for nonstandardized products responded most to the eMT implementation.

Similar effects in other sectors may become apparent as the development of AI technologies continues to make leaps. Of equal importance is the adjustment of firms to technological change. Optimism surrounding the productive capabilities of AI may prove true after all: Radical changes to production processes just take time.
EF

At the Richmond Fed

What Happens When Bubbles Pop?
By Eric LaRose

Highlighted Research

“Asset Bubbles and Global Imbalances.” Toan Phan
and Daisuke Ikeda. American Economic Journal:
Macroeconomics, forthcoming.

For economist Toan Phan, who joined the Richmond
Fed in 2017, traveling to Japan frequently and witnessing the economic stagnation of the country’s “Lost
Decade” sparked an interest in asset bubbles. During the
1980s, Japan experienced a massive rise in prices of stocks,
housing, and other assets. In 1991, these prices began to fall
sharply, coinciding with the start of a long period of low
growth; annual per capita GDP growth averaged under
1 percent between 1992 and 2000, compared with nearly
4 percent in the preceding 10 years.
During an asset bubble — a sharp rise in the price of an
asset that is unsupported by underlying fundamentals — the
appreciation of the asset increases buyers’ net worth, which
can encourage investment and lead to an expansion of the
economy. Eventually, however, the bubble bursts. As the
price of the asset comes down, investment may decrease
and the economy may contract. Most prior literature on the
topic takes a relatively benign view of this trade-off between
the expansion and subsequent contraction and, Phan says,
“does not necessarily highlight the downside of a bubble
collapse.” That is, this literature predicts that when a bubble bursts, the economy will contract only to its pre-bubble
trend, leaving the economy no worse off than before.
Historically, however, the bursting of asset bubbles has
frequently been followed by deep recessions, such as the
Great Recession and Japan’s Lost Decade, that do leave
the economy worse off. Many of these recessions share
some common characteristics: low inflation (or deflation),
low interest rates, and sharp increases in unemployment
with little, if any, change in wages. Paying particular attention to these similarities, Phan says his research seeks to
“provide theoretical mechanisms as to why collapses of
bubbles tend to precede recessions.”
Nominal wage rigidity — the historical tendency of
wages not to fall, even in a recession — is one factor that
can help explain this historical pattern. In a 2017 article
with Andrew Hanson in Economics Letters and a 2018
working paper with Hanson and Siddhartha Biswas (both
doctoral students at the University of North Carolina
at Chapel Hill, where Phan was on the faculty prior to
joining the Richmond Fed), Phan and his co-authors
embedded wage rigidity into a model of asset bubbles. In
the absence of such rigidity, they found that once a bubble
collapses, contractions in investment and credit decrease
wages while maintaining full employment, consistent with
prior literature on asset bubbles. The presence of wage
rigidity, however, leads to unemployment instead of
wage decreases. Such unemployment reduces the returns
to capital, creating a cycle in which investors’ net worth
declines, reducing investment and leading to additional
unemployment. This cycle can produce what Phan calls
“a long period of unemployment and low growth.”
Low interest rates coinciding with a bubble bursting can
also exacerbate the resulting economic downturn. Phan,
Hanson, and Biswas showed that overinvestment in capital
during a bubble’s expansion can, once the bubble collapses,
potentially push interest rates down. If rates go low enough,
the economy may reach a liquidity trap in which there is
little room for central banks to lower interest rates, greatly
reducing the ability of expansionary monetary policy to
stimulate the economy. Phan argues that the 2001 recession
following the bursting of the dot-com bubble was relatively
mild in part because “the Fed had a lot of room to lower
interest rates” without entering a liquidity trap.
Additionally, when investment in a bubbly asset is financed
through borrowing, as in the case of banks financing mortgage lending through mortgage-backed securities during the
U.S. housing bubble preceding the Great Recession, the
resulting bubble can reduce economic well-being. In a 2016
article in Economic Theory, Phan and Daisuke Ikeda at the
Bank of Japan developed a model featuring such “leveraged”
bubble investment. Leveraging shifts the risks associated
with the bubble collapsing from borrowers to lenders, and
the possibility of default can cause borrowers to focus only
on the potential gains from investment and ignore possible
losses. As the authors show, this risk shifting can make the
bubble larger and more risky.
“Overall, it has been surprisingly hard to formalize
the idea of a boom-bust trade-off, especially in the case
of Japan,” says Phan. “This has motivated me to keep
thinking about the effects of bubbles bursting through
future work.” He is especially interested in further
exploring asset bubbles in an open economy (one shaped
by the economies of other countries). In a forthcoming paper with Ikeda in the American Economic Journal:
Macroeconomics, the two build a framework in which flows
of credit between a developing economy, such as China,
and a developed one, such as the United States, can lead
to a bubble in the developed economy by decreasing its
interest rates. “There has been relatively little literature
exploring asset bubbles from an open economy perspective,” Phan observes. “But investigating the consequences
of large capital flows into the U.S. from China, which
occurred during the recent U.S. housing bubble, can further our understanding of how bubbles can form.” EF

The Profession

The Economist’s Apprentice

By David A. Price

This summer, a new class of a half-dozen or so recent
college grads will enter a two-year boot camp in
economics research, joining the Richmond Fed’s
Research Department as research associates, or RAs.
Bearing degrees in economics, math, or statistics, they
will work with Richmond Fed economists studying a
variety of fields, including monetary policy, labor markets,
and payments systems. Other recent grads will start as
RAs at the 11 other regional Reserve Banks, at the Fed’s
Board of Governors in Washington, and at nonprofits
such as the Brookings Institution and the American
Enterprise Institute. (At some institutions, RAs have the
title “research assistant,” not to be confused with student
research assistants.)
According to Arantxa Jarque, a microeconomist who
also manages the Richmond Fed’s RA program, most of
them come for a couple of reasons. Some are interested
in economics research as a career, but they aren’t sure
enough to make the five- or six-year commitment to pursue a Ph.D. “They come to figure out whether they really
like it,” she said. And both they and the ones who are
already sure come “to beef up their applications to have
more of a chance of getting into a top school.”
An RA stint is a popular path to economics grad
school — and from there, to jobs in academia, public
policy, and finance. The prevalence of the RA path has
been documented by, appropriately enough, economists’
research: According to a 2005 article in the Journal of
Economic Perspectives by Middlebury College economist
David Colander, a “slight majority” of students at highly
selective graduate programs in economics worked after
college and before grad school, most of them as RAs. In
later research, Colander and co-authors found that students at those programs are more likely to have worked
as RAs than students in middle-tier programs.
The institutions, for their part, get top-flight junior
staff members whose labors help the economists to be
more productive.
As usual, the transition from college to full-time work
involves some adjustments. “When RAs arrive, they’re
good at getting good grades,” Jarque says. “But they’re
unused to the lack of structure in their time. They may
have to learn to balance their time among multiple significant projects.”
At the Richmond Fed, RAs commonly assist economists by writing code to analyze data with statistical software packages such as Stata. On other projects, they may
work on code for constructing model economies that
are used in frontier macro research. While RAs at some
institutions are hired to work with one economist or just
a few of them, those at others, including the Richmond
Fed, potentially may work with economists in multiple
subject areas based on the RA’s interests and the institution’s needs.
Sara Ho, an RA nearing the end of her second year,
says that in her first year and a half, she did mostly empirical work. “I worked with the National Establishment
Time Series dataset for Nico Trachter and Pierre Sarte’s
paper [with Esteban Rossi-Hansberg at Princeton]
‘Diverging Trends in National and Local Concentration.’
I also contributed to Nico and Bruno Sultanum’s paper
[with Zachary Bethune at the University of Virginia] on
financial intermediation by analyzing big data on credit
default swap trades.” More recently, she says, she has
been focused on banking-related research.
When an RA makes an exceptional contribution to
a project, he or she may be named a co-author on the
resulting paper or journal article. Since 2010, nineteen
RAs have been named co-authors on articles in the
Richmond Fed’s economics research journal, Economic
Quarterly. A co-author credit on an article submitted to
an outside peer-reviewed journal is uncommon but does
occur every so often.
During an RA’s second summer and fall, the question
of grad school becomes more concrete as winter application deadlines draw closer. “They come to the one-year
mark and they have to decide in a couple of months
whether they should apply to grad school,” Jarque says.
“Suddenly, the other shoe drops and they know what they
want. That’s one thing that changes in them: They’re a lot
more informed about what they like and what they aspire
to do in the future.”
A little more than half do go on to doctoral study.
From 2010 to 2018, some 55 percent of Richmond Fed
RAs went to Ph.D. programs and another 31 percent
went directly into employment. The rest went into
master’s programs, law school, or, in one case, full-time
motherhood.
But the experience may change more than their
career outlook. “I’ve gotten more confident over the
two years since being here,” Ho says. “I really learned
how to up my technical standards and anticipate other
people’s questions and be able to think those through
in advance.”
By the end of the two-year cycle, Jarque says, an RA
typically has honed his or her skills to the point that the
economists rue them departing. “When they’re ready to
leave, we’re thinking, ‘No, please, don’t leave!’ But at the
same time, they need to keep growing and move on with
their lives.”
EF

Navigating Energy
Booms and Busts
The fracking revolution has created new job opportunities, but
are workers prepared for the fluctuations of the energy economy?
By Tim Sablik

In 1956, Shell Oil Co. researcher M. King Hubbert
predicted that U.S. oil and gas production would begin
to decline after 1970. This theory of “peak oil” caught
on quickly when it seemed that Hubbert was spot on.
According to the U.S. Energy Information Administration,
crude oil production grew to just shy of 10 million barrels
per day in 1970 and then declined to roughly half that over
the next three decades. Natural gas production kept growing a bit longer, until 1973, before declining as well.
Recently, however, oil and gas drilling have been making a comeback. Oil production is nearly back to its previous peak, and natural gas production has surpassed its 1973
high point. In 2017 and 2018, the United States extracted
so much oil and gas that it became a net exporter for the
first time in over half a century. The twin developments of
hydraulic fracturing (“fracking”) and horizontal drilling are
responsible for this boom. They have allowed firms to tap
into previously difficult to reach deposits of oil and natural
gas in shale rock formations throughout the country. (See
“The Once and Future Fuel,” Region Focus, Second/Third
Quarter 2012.)
For states sitting on top of rich shale oil and gas
reserves, such as North Dakota, Texas, Pennsylvania,
Ohio, and West Virginia, the fracking boom has brought
huge job opportunities. From 2007 through 2014, the oil
and gas industry added roughly 60,000 jobs on net during
a period when many industries were still reeling from the
Great Recession.
Much of the boom in natural gas extraction has been
driven by activity along the Marcellus shale formation underlying the area where Pennsylvania, Ohio, and West Virginia meet.
From the beginning of 2007 to the end of 2018, the Marcellus
shale region went from producing a million cubic feet of gas
per day to over 21 billion cubic feet per day, a 21,000-fold
increase. (See chart.) The shale revolution has resulted in huge economic opportunities in energy extraction, construction, and related fields. But are some workers giving up their education, and future opportunities, to get in on the boom?

[Chart: Shale Gas Production Takes Off in Appalachia. The Marcellus and Utica shale formations lie under PA, OH, and WV. Natural gas production from the Marcellus and Utica shales, billions of cubic feet per day, 2007-2019. Source: U.S. Energy Information Administration, "Natural Gas Weekly Update"]

Energy Boom, Empty Classrooms?
To fuel the boom in the Marcellus region, firms have been
willing to pay a premium for drillers and construction
crews to build wells and lay pipelines. A 2017 study by the
RAND Corporation found that in 2010-2014, wages for
construction and extraction in the Ohio, Pennsylvania,
and West Virginia counties affected by the shale boom
were about $10,000 higher on average than for the rest of
the country. While this represents a good opportunity for
workers in those areas, one concern is that this premium
might draw students away from school, potentially harming their long-term employment prospects as well as the
overall human capital of the region.
“In southwestern Pennsylvania, there was a surge in
low-skill employment over a very short period when the
fracking pads were being constructed,” says Jim Denova,
vice president of the Claude Worthington Benedum
Foundation, a nonprofit that promotes education in West
Virginia and southwestern Pennsylvania. “I think those
jobs tended to draw students out of community colleges.”
It’s a problem the region is all too familiar with. Coal
mined in the Appalachians helped fuel the Industrial
Revolution in America in the 18th and 19th centuries and
production across two world wars, but since then, the industry has mostly been in decline. About nine out of 10 West
Virginia coal mining jobs disappeared between 1940 and
2000. (See “The Future of Coal,” Econ Focus, Fourth Quarter
2013.) Coal did enjoy a bit of a comeback in the 1970s as oil
and natural gas declined. This led to a sudden increase in
demand for coal miners in West Virginia, Pennsylvania,
Kentucky, and Ohio that lasted about a decade.
Dan Black of the University of Chicago, Terra
McKinnish of the University of Colorado Boulder, and
Seth Sanders of Duke University found that during this
coal boom, the wage gap between high school graduates
and non-graduates shrank. In economics, the potential
loss associated with choosing one investment over another
is known as the opportunity cost. In this case, the opportunity cost of staying in school went up as wages for miners increased. As this happened, Black, McKinnish, and
Sanders found that high school enrollment rates declined.
A similar dynamic played out in Alberta, Canada, during
the same period. There, rising oil prices driven by the
OPEC embargoes increased oil production and demand
for workers. J.C. Herbert Emery of the University of New
Brunswick, Ana Ferrer of the University of Waterloo, and
David Green of the University of British Columbia found
that enrollment in postsecondary education fell in Alberta
during the 1973-1981 oil boom.
Early evidence suggests that the shale boom may be
having a similar effect on students’ decisions about staying in school. Elizabeth Cascio of Dartmouth College
and Ayushi Narayan, a Ph.D. candidate in economics at Harvard University, found that fracking increased high
school dropout rates, particularly for young men. And
another study by Dan Rickman and Hongbo Wang of
Oklahoma State University and John Winters of Iowa
State University found that the shale boom reduced
high school and college attainment among residents of
Montana, North Dakota, and West Virginia. And declining student attendance isn’t the only way energy booms
could hurt education outcomes. Even students who remain
in school may be affected.
“In the case of Texas, we saw no effect on completion
rates and some small decline in student attendance,”
says Jeremy Weber, an economist at the University of
Pittsburgh. “But changes in the labor market brought
about by the shale boom influenced whether teachers
stayed in the classroom.”
In a recent paper with Joseph Marchand of the
University of Alberta, Weber found that the average experience of teachers fell during the boom as teacher turnover
went up. Some teachers may have been drawn to other
private sector opportunities made more attractive by the
boom, while others may have been able to retire thanks
to royalties on property connected to drilling. Indeed, in
another paper with Jason Brown of the Kansas City Fed
and Timothy Fitzgerald of Texas Tech University, Weber
found that the largest shale oil and gas regions generated
$39 billion in private royalties in 2014. Whatever the cause,
as the turnover of experienced teachers went up, standardized test performance at Texas schools went down.
This evidence seems to suggest that energy booms
reduce educational attainment, at least in the short run.
But in the case of the shale revolution, there may also be
other factors pushing in the opposite direction.
A Different Kind of Boom
Early on, fracking companies needed a lot of labor to
transport materials and build the wells and pipelines.
But Denova says that in Pennsylvania those jobs were
short-lived. Dropping out of school to work may be less
attractive if the job is expected to only last about a year
rather than a decade, as in the case of the coal and oil
booms of the 1970s.
Additionally, many of the shale well construction jobs
don’t always go to locals, says Jen Giovannitti, president
of the Benedum Foundation and a former community
development manager at the Richmond Fed. “The companies doing the initial drilling and exploration are often
out-of-town companies that have the ability to move their
workforce from site to site.”
A study by Riley Wilson of Brigham Young University
confirmed that the surge in demand for fracking workers
generated a “sizable migration response” across shale
regions. These effects may have muted some of the incentives for local students to drop out and work. Once the
wells were constructed, shale firms needed workers to
operate them, but those positions are not low skill.
“The technicians who run the wells all need at least
two years of training to operate the complex systems,”
says Paul Schreffler. From 2011 to 2016, he served as
dean of the School of Workforce Education at Pierpont
Community and Technical College in Fairmont, W. Va.
“Companies couldn’t find enough of those workers, no
matter how much they were willing to pay.”
Firms began turning to local community colleges and
technical schools, like Pierpont, to train workers for those
jobs. Pierpont was an early participant in ShaleNET, an
effort to develop those training and certification programs
across shale oil and gas regions. The program received initial federal funding from the U.S. Department of Labor in
2010. Energy companies helped to develop curricula and
also provided funding, instructors, and apprenticeship
opportunities for students. Although Schreffler says firms
were committed to student development, some students
were still lured away from their studies by the opportunities in the industry.
“The companies right now are so eager for workers
that they are hiring students right out of programs,” says
Elizabeth McIntyre, director of the Tristate Energy and
Advanced Manufacturing (TEAM) Consortium that connects schools and employers across western Pennsylvania,
eastern Ohio, and northern West Virginia.
In the case of both technicians and lower-skilled
positions, though, students who left school to work in
the shale industry may not be out for good. The study by
Emery, Ferrer, and Green that looked at the oil boom in
Alberta during the 1970s found that while postsecondary
education attainment fell initially, it later recovered after
the boom ended. The authors hypothesized that individuals who went to work in the oil fields instead of going to
school were able to save enough money to make it easier
to go back to school once the boom ended. In contrast,
they found that the cohorts of students who came of age
after the oil boom had gone bust were less likely to go
to college, perhaps because they did not have the same
opportunity to earn the premium wages in the energy
sector that would have helped them cover the costs of
higher education.
“Are people worse off for having not pursued college
because of an energy boom?” asks University of Pittsburgh’s
Weber. “Suppose I graduate from high school and instead
of going to college, I go to work in a shale-related industry.
When the boom goes bust, maybe I get a two-year degree
in a field I’m interested in and see a demand for, or maybe
I go to college with a clearer focus and more money so I
don’t need to borrow as much. It’s not clear to me that that
scenario is so problematic.”
Of course, that partly depends on the drive and circumstances of each individual and may also depend on
his or her age when the boom ends. Kerwin Charles and
Erik Hurst of the University of Chicago and Matthew
Notowidigdo of Northwestern University studied the
educational effects of the U.S. housing boom and bust that
lasted from the late 1990s to the late 2000s. They found
that the boom in housing demand drew many young people into related sectors, including construction and real
estate. But unlike the case of the Alberta oil workers, after
the housing market collapsed, educational attainment for

individuals who had deferred school remained low, suggesting that many did not return to their studies.
“Once you start working and start a family, it can
become very difficult to go back to college,” says Weber.
“So I could see different scenarios playing out.”
Preparing for the Future
Fulfilling the boom demand for workers is important but
so is having a plan for the bust.
“I think everyone knows that the energy sector is very
volatile when it comes to employment,” says Gabriella
Gonzalez, a researcher at the RAND Corporation who
studies the energy sector in Pennsylvania, Ohio, and West
Virginia. She has also been involved in promoting education and industry partnerships in that region.
At the national level, signs of a slowdown are already
here. Employment in shale oil and gas extraction peaked
in 2014 and has now declined to pre-boom levels. In places
like West Virginia, where the shale boom started a bit
later, employment has held steady so far, but growth has
largely plateaued. (See charts.) Both signs point to one
truth that experienced workers in the energy sector know
well: Booms don’t last forever.
The Appalachian region has been through slumps before.
Past declines in coal mining and manufacturing displaced
workers who came from long lines of coal miners or steel
workers and strongly identified with that work. Despite
efforts to retrain those workers for new advanced manufacturing or shale-related energy jobs, some reports suggest
that the take-up rate of those programs has been low.
“Many people are still looking to find that one company
where they can get hired and work until retirement,” says
John Goberish, the dean of workforce and continuing
education at the Community College of Beaver County
(CCBC) in Pennsylvania, where TEAM is headquartered.
“But that’s just not as likely as it was 30 or 40 years ago.”
To that end, programs developed under ShaleNET and
TEAM aim to give students a foundation of basic skills
such as problem solving and teamwork while also teaching them the technical skills to meet a variety of industry
needs. For example, a degree in mechatronics combines
skills from mechanical and electrical engineering that
apply to jobs in advanced manufacturing as well as natural
gas extraction and processing.

“Those basic skills are critical across industries,” says
Schreffler. “I would tell my students all the time that once
they understand the basic properties of mechanical or
electronic systems, it’s very easy to jump from one sector
to another.”
And to encourage students to stick with their training
until they graduate, schools like CCBC and Pierpont
offer flexible programs that allow students to take classes
piecemeal and build toward certifications and a degree
over time.
“We are trying to give students a lot of options, including an ‘earn and learn’ approach that includes internships,
apprenticeships, and other on-the-job training where they
don’t have to choose between going to school and going to
work. They can do both,” says TEAM’s McIntyre.
Companies have also expressed their support. “Firms
want our students to have that associate’s degree,” says
Goberish. “Many of our instructors are from industry and
they know it will be beneficial to everyone if students finish their training.”
Firms and schools are also looking ahead to the jobs
to come and finding ways to ensure that the activity
surrounding the shale boom doesn’t just disappear once
the wells are in place and the gas is flowing. Shell is building an ethane cracker plant in Beaver County to turn the
ethane gas extracted from the shale there into plastics
that can be used in a variety of products. Construction
of the plant has employed thousands of workers, and
once the plant is in place, it will represent hundreds of
advanced manufacturing jobs for graduates of CCBC’s
programs. There have been discussions about building
additional cracker plants along the Ohio River Valley,
including in West Virginia.
By collaborating with industry, educators are trying to
provide relevant and flexible programs to prepare workers
for the next jobs. That constant change requires both students and teachers to be nimble.
“No one really knows what’s coming down the pipeline
next,” says Gonzalez. “There’s continuous innovation in
technology, and it makes it hard for educators to keep
pace with those changes. Likewise, employers may not
know how many people they will need next year because
the economy or the price of oil and gas could change. So
everyone is just trying to do the best they can.”
EF

Readings
Black, Dan A., Terra G. McKinnish, and Seth G. Sanders. “Tight
Labor Markets and the Demand for Education: Evidence from
the Coal Boom and Bust.” Industrial and Labor Relations Review,
October 2005, vol. 59, no. 1, pp. 3-15.
Cascio, Elizabeth U., and Ayushi Narayan. “Who Needs a Fracking
Education? The Educational Response to Low-Skilled Biased
Technological Change.” Manuscript, Feb. 21, 2019.
Emery, J.C. Herbert, Ana Ferrer, and David Green. “Long-term
Consequences of Natural Resource Booms for Human Capital Accumulation." Industrial and Labor Relations Review, June 2012,
vol. 65, no. 3, pp. 708-734.
Marchand, Joseph, and Jeremy G. Weber. “The Local Effects of the
Texas Shale Boom on Schools, Students, and Teachers.” University
of Alberta Working Paper No. 2017-12, January 2019.
Rickman, Dan S., Hongbo Wang, and John V. Winters. “Is Shale
Development Drilling Holes in the Human Capital Pipeline?”
Energy Economics, February 2017, vol. 62, pp. 283-290.

PAY

Are CEOs Overpaid?
Incentives for chief executives have
important economic implications
By Jessie Romero

The 2019 proxy season will mark the second year firms have to disclose how their
CEOs’ compensation compares to the pay of their median employee. The ratios are
likely to generate quite a few headlines, as they did last year, and perhaps some outrage,
especially in light of relatively stagnant wages for most workers in recent years.
(See “Will America Get a Raise?” Econ Focus, First Quarter 2016.) But do CEOs actually
earn hundreds, or even thousands, of times more money than their employees? And does
that necessarily mean they’re paid too much?
Calculating CEO Pay
It should be easy to determine how much CEOs
earn. The Securities and Exchange Commission
(SEC) requires publicly traded companies to
disclose in detail how they compensate their
chief executives, including base salaries, bonuses,
stock options, stock grants, lump-sum payments
such as signing bonuses, and retirement benefits.
Firms are also required to report any perks worth
more than $10,000, such as use of the corporate
jet or club memberships, that aren’t directly
related to the executive’s job duties.
The challenge for researchers is that some
forms of compensation, such as stock options,
can’t be turned into cash until some later date.
That means there’s a difference between
expected pay — the value of compensation
on the day it’s granted, which depends on the
current market value of the stock and expected
value of stock options — and realized pay, or what a CEO actually receives as a result of selling stock or exercising options.
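As a simplified illustration of how the two measures can diverge for the same pay package (all figures below are invented; grant-date option values in practice come from pricing models):

# Simplified expected-vs.-realized pay for one pay package (invented figures).
salary, bonus = 1.0, 2.0           # $ millions
options = 0.5                      # millions of options granted
value_at_grant = 8.0               # estimated $ value per option at grant
gain_at_exercise = 5.0             # actual $ gain per option when exercised

expected_pay = salary + bonus + options * value_at_grant      # 7.0
realized_pay = salary + bonus + options * gain_at_exercise    # 5.5
print(f"expected ${expected_pay}M vs. realized ${realized_pay}M")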
One widely used measure of expected pay
comes from the Execucomp database, which is
published by a division of Standard and Poor’s.
The database includes about 3,000 firms,
including current and former members of the
S&P 1500, and contains information gleaned
from firms’ proxy statements. Between 1993
and 2017, according to Execucomp, median
CEO pay increased more than 120 percent in
inflation-adjusted terms. (The Execucomp data
begin in 1992, the year before Congress passed
a law limiting the tax deductibility of CEO
compensation.) The increase was greater for
bigger firms: Median CEO pay in the S&P 500
increased 275 percent, from $3.2 million to $12.1
million, in 2017 dollars. (See chart.) Over the
same time period, median wages for workers
overall increased just 10 percent.

Calculating realized pay is tricky, since data about the specific vesting schedules for stock grants or the exercise dates of options can be difficult to obtain. One approach is to approximate realized pay using the information in Execucomp about the value of a CEO's stock and options holdings at the end of a year, as Richmond Fed economist Arantxa Jarque and former research associate John Muth did in a 2013 article. They found that between 1993 and 2012, median realized pay followed the same general upward trend as expected pay, although it was usually a little bit lower.
Both measures are valuable for researchers to study. "Expected pay is based on the history of returns of the firm's stock," says Jarque. "But the insiders — the board of directors setting the pay, and the CEO — have private information about how those future outcomes may change. That is, they calculate expected pay using a private distribution that we researchers cannot observe. Because of this difficulty, complementing a measure of expected pay with a measure of realized pay can be informative."

[Chart: Bigger is Better. Compensation for CEOs at large firms has increased faster than for CEOs more broadly. Median direct compensation as reported on firms' proxy statements, millions of 2017 dollars, 1992-2016, for all Execucomp firms and for the S&P 500; shaded areas denote recessions. Source: For all firms, Execucomp and Econ Focus calculations. For the S&P 500, Murphy (2013)/Execucomp; Wall Street Journal/MyLogic IQ; The Conference Board/Arthur J. Gallagher & Co; Associated Press/Equilar]

The CEO Multiplier
How much do CEOs earn relative to their employees?
According to the AFL-CIO, the average CEO of a company in the S&P 500 earned $13.94 million in 2017 —
361 times more than the average worker’s salary of $38,613.
(To calculate the average worker’s salary, the AFL-CIO
uses Bureau of Labor Statistics data on the wages of
production and nonsupervisory workers, who make up
about four-fifths of the workforce.) But this ratio may be
overstated for several reasons. One is that the AFL-CIO
uses average CEO pay, which is typically much higher than
median pay because of outliers. Another is that the data
used for CEO compensation include nonsalary benefits,
while the data for average workers include only salary. In
addition, workers’ salaries aren’t adjusted for firm size,
industry, or hours worked, so a CEO who works 60-hour
weeks at a company employing 50,000 people is compared to, say, a part-time bookkeeper at a firm employing
10 people. Still, even adjusting for hours worked and
fringe benefits, CEOs earn between 104 and 177 times
more than the average worker, according to Mark Perry
of the University of Michigan-Flint and the American
Enterprise Institute, a conservative think tank.
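The sensitivity to these measurement choices is easy to demonstrate. In the sketch below, the CEO pay list is invented; the worker salary is the AFL-CIO figure cited above:

# How the choice of average moves the pay ratio.
from statistics import mean, median

ceo_pay_millions = [4.0, 5.0, 6.0, 8.0, 90.0]   # invented; one outlier
worker_salary = 38_613                           # AFL-CIO average worker salary

print(round(mean(ceo_pay_millions) * 1e6 / worker_salary))    # ~585
print(round(median(ceo_pay_millions) * 1e6 / worker_salary))  # ~155
# A single outlier drags the mean-based ratio far above the median-based one.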
The pay ratio required by the Dodd-Frank Act does take
into account firm size and industry, since it compares CEOs
to the median workers at their own companies. According
to the corporate governance consultancy Equilar, the
median pay ratio in 2017 was 166 to 1 for the 500 largest
publicly traded companies by revenue. Among a broader
group of 3,000 publicly traded firms, the median pay ratio
was 70 to 1. But this comparison can be skewed in one
direction if a CEO receives a large one-time payment, or in
the other direction if a CEO declines all compensation, as
do the chief executives at Twitter, fashion company Fossil,
and several other firms. (Twitter’s and Fossil’s CEOs both

have significant stock holdings in the firms.) In addition,
pay ratios might appear especially high at companies with a
large number of part-time or overseas employees, who tend
to earn lower annual wages.
Power and Stealth
Until the turn of the 20th century, most firms were
small and run by their owners. But between 1895 and
1904, nearly 2,000 small manufacturing firms merged into
157 large corporations, which needed executives with
specialized management skills. These executives didn’t
have equity stakes in the companies, which created a “separation of ownership and control,” as lawyer Adolf Berle
and economist Gardiner Means described in their seminal
1932 book, The Modern Corporation and Private Property.
In modern economics terms, this creates what is known
as a “principal-agent” problem. A manager, or agent, has
wide discretion operating a firm but doesn’t necessarily
have the same incentives as the owners, or principals,
and monitoring is unfeasible or too costly. For example, a
CEO might try to avoid a takeover even if that takeover is
in the shareholders’ best interest. Certainly, managers are
motivated by career concerns, that is, proving their value
to the labor market to influence their future wages. But
the primary approach to aligning managers’ and shareholders’ interests has been to make the executive’s pay vary
with the results of the firm, for example via stock ownership or performance bonuses. (Of course, this can go awry,
as it famously did when executives at the energy company
Enron engaged in fraudulent accounting to boost short-term results.)
At the same time, a talented CEO is unlikely to want to
work for a company without some guarantee of compensation in the event of circumstances beyond his or her control, such as regulatory changes or swings in the business
cycle. So firms also have to provide some insurance, such
as a base salary or guaranteed pension.
In theory, a company’s board of directors acts in the
best interest of shareholders and dispassionately negotiates a contract that efficiently balances incentives and
insurance, which economists refer to as “arm’s-length
bargaining.” But as Lucian Bebchuk and Jesse Fried of
Harvard Law School described in their 2004 book Pay
Without Performance, there may be circumstances when
CEOs are able to exert significant influence over their pay
packages. This might happen because the CEO and the
directors are friends, or because the chief executive has a
say in setting board compensation and perks, or because
the directors simply don’t have enough information about
the firm’s operations. And if the shareholders’ power is
relatively weak, they are unlikely to check the directors.
Bebchuk and Fried cited research finding that CEO pay
is lower when investors have larger stakes, and thus more
control, and when there are more institutional investors,
who are likely to spend more time on oversight.
Even when CEOs have a lot of power, they and their
boards might still be constrained by what Bebchuk
and Fried called “outrage costs,” or the potential for
obviously inefficient pay packages to damage the firm’s
reputation. That can lead to “stealth compensation,”
or compensation that is difficult for investors or other
outsiders to discern. In the 1990s, for example, it was
common for firms to give their CEOs below-market-rate
loans or even to forgive those loans. (These practices
were outlawed by the Sarbanes-Oxley Act in 2002.) And
until the SEC tightened pension disclosure rules in 2006,
firms could give CEOs generous retirement benefits
without reporting their value. CEOs might also receive
stealth compensation in the form of dividends paid on
unvested shares.
Stealth compensation does face some constraints, as
Camelia Kuhnen of the University of North Carolina at
Chapel Hill and Jeffrey Zwiebel of Stanford University
found in a 2009 article. For example, hidden compensation
could be sufficiently large and inefficient to weaken a firm’s
performance and lead the shareholders to fire the CEO.
Kuhnen and Zwiebel concluded that CEOs are more likely
to earn stealth compensation when a firm’s production process is “noisy,” meaning it’s difficult to determine the factors
that contribute to the firm’s performance.
Talent and Value
While some research suggests that CEOs’ pay reflects
their power over their boards, other research suggests
they’re worth it. (The two explanations aren’t necessarily mutually exclusive — a CEO could significantly
increase shareholder value while still influencing a board
to pay more than the market rate.) In a 2016 article, Alex
Edmans of London Business School and Xavier Gabaix of
Harvard University summarize the research on the latter
perspective as the “shareholder value” view. In short, from
this perspective, CEOs’ contracts reflect their significant
influence on a company relative to rank-and-file employees and the fact that it may be necessary to pay a premium
to attract talent in a competitive market.
In this view, one explanation for high and rising CEO
pay might be technological change. In a 2006 article,
Luis Garicano of the London School of Economics
and Political Science and Esteban Rossi-Hansberg of
Princeton University described firms as “knowledge hierarchies,” in which workers specialize in either production
or problem solving. The hardest problems eventually filter
up to the workers with the most knowledge, and new
tools that make it cheaper to communicate mean that
firms rely more on problem-solvers, which decreases the
knowledge necessary for production work. The end result
is higher pay for those at the top of the hierarchy.
It’s also well-documented that CEO pay increases
with firm size, which could be the result of a CEO’s
ability. For example, more-talented CEOs might be able
to hire more people and purchase more capital equipment, enlarging their firms. In addition, the dollar value
of a more-talented CEO is higher at a larger firm. So
when firms get bigger on average, the competition for
talented CEOs increases. In a 2008 article, Gabaix and
Augustin Landier of HEC Paris and the Toulouse School
of Economics concluded that the increase in CEO pay
in the United States between 1980 and 2003 was fully
attributable to large companies’ increase in market capitalization over the same time period. In addition, in a
market where both CEO positions and talented CEOs
are rare, even very small differences in talent can lead to
large differences in pay, according to research by Marko
Terviö of Aalto University in Finland — although Terviö
also notes this does not necessarily mean that CEOs
aren’t “overpaid.”
Unintended Consequences
The composition and level of CEO pay might reflect
not only power and talent, but also the consequences
— often unintended — of government intervention.
Between 1993 and 2001, median CEO pay more than
tripled, driven almost entirely by increases in stock
options, according to research by Kevin J. Murphy of
the University of Southern California. The increase
in stock options, in turn, was fueled by several tax and
accounting changes that made options more valuable
to the executive and less costly to the firm. In 1991, for
example, the SEC made a rule change that allowed CEOs
to immediately sell shares acquired from exercising
options. Previously, CEOs were required to hold the
shares for six months and could owe taxes on the gain
from exercising the option even if the shares themselves
had fallen in value. And in 1993, Congress capped the
amount of executive compensation publicly held firms
could deduct from their tax liability at $1 million unless
it was performance based, with the goal of reducing

“excessive” compensation. (The cap applied to the five
highest-paid executives.) But stock options were considered performance based and thus were deductible. The
cap also induced some companies to raise CEO salaries
from less than $1 million to exactly $1 million.
Regulators took steps that curbed the use of option
grants in 2002, when the Sarbanes-Oxley Act tightened
the reporting standards, and again in 2006, when the
Financial Accounting Standards Board mandated that
they be expensed. Both of these changes decreased the
attractiveness of stock options relative to stock grants,
which led some firms to stop awarding options and others
to start granting stock in addition to options, according to
research by Jarque with former Richmond Fed research
associate Brian Gaines.
Regulation might also have increased the use of perquisites in the 1980s. In the late 1970s, the SEC started
requiring more disclosure of perks such as entertainment
and first-class air travel; one SEC official said the “excesses
just got to the point where it became a scandal.” But as
Murphy and others have documented, the disclosure rules
actually increased the use of perquisites (although they
remained a fairly small portion of total compensation),
as executives learned what their peers at other firms were
receiving.
Since 2011, large publicly traded firms have been required
to allow their shareholders a nonbinding vote on executive
pay packages. The goal of “Say on Pay,” which was part
of the Dodd-Frank Act after the financial crisis, was to
rein in executive compensation and enable shareholders
to tie pay more closely to performance. (See “Checking
the Paychecks,” Region Focus, Fourth Quarter 2011.) But
research by Jill Fisch of the University of Pennsylvania
Law School, Darius Palia of Rutgers Business School,
and Steven Davidoff Solomon of Berkeley Law suggests
shareholders are highly influenced by the company’s performance; that is, they tend to approve pay packages when
the stock is doing well. That could encourage executives to
focus on the short-term stock price rather than the firm’s
long-term value. Other research has found that Say on
Pay has made firms more reliant on outside compensation
experts, who tend to design homogenous pay packages
geared toward shareholder approval rather than what’s
most effective for the firm.

Does CEO Pay Matter?
“CEO pay can have substantial effects, which spill over
into wider society,” says Edmans. “Incentives can backfire with severe societal consequences. In contrast,
well-designed incentives can encourage CEOs to create
value — and hold accountable those who do not.”
In the 1970s, for example, CEOs were largely rewarded
for making their companies bigger — at the expense of
their firms’ value, according to Murphy. “The implicit
incentives to increase company revenue help explain the
unproductive diversification, expansion, and investment
programs in the 1970s, which in turn further depressed
company share prices,” he wrote in a 2013 article.
More recently, many observers and researchers believe
that compensation practices played a role in the financial
crisis. As Scott Alvarez, former general counsel of the
Fed, observed in 2009 testimony before the U.S. House
Committee on Financial Services, “Recent events have
highlighted that improper compensation practices can
contribute to safety and soundness problems at financial institutions and to financial instability.” Many of
these practices also applied to lower-level executives and
employees, but CEOs might have been incentivized to
ignore the risks their employees were taking.
There is also the question of fairness. To the extent
high pay is the result of managerial power or efforts to
take advantage of tax laws, rather than the result of higher
output or performance, workers might not be getting their
share of the fruits of economic growth. This is an opinion
that’s been voiced since at least the early 1930s, when the
public first started to learn what executives were paid as
the result of a series of lawsuits. Recently, some research
attributes the rise in income inequality at least in part
to executive compensation, although, as Edmans notes,
the top 1 percent comprises many professions, including
lawyers, bankers, athletes, authors, pop stars, and actors,
to name a few. In Edmans’ view, fairness isn’t necessarily
the right reason to be concerned about CEO pay. “Often
people care about CEO pay because there’s a pie-splitting
mentality — the idea that there’s a fixed pie and anything
given to the CEO is at the expense of others,” he says. “But
if we have a pie-growing mentality, we should care because
the correct incentives affect the extent to which the CEO
creates value for society.”
EF

Readings
Bebchuk, Lucian Arye, and Jesse M. Fried. "Executive
Compensation as an Agency Problem.” Journal of Economic
Perspectives, Summer 2003, vol. 17, no. 3, pp. 71-92.
Edmans, Alex, and Xavier Gabaix. “Executive Compensation: A
Modern Primer.” Journal of Economic Literature, December 2016,
vol. 54, no. 4, pp. 1232–1287.
Jarque, Arantxa, and John Muth. “Evaluating Executive
Compensation Packages.” Federal Reserve Bank of Richmond
Economic Quarterly, Fourth Quarter 2013, vol. 99, no. 4, pp. 251-285.

Murphy, Kevin J. “Executive Compensation: Where We Are,
and How We Got There.” In Constantinides, George M., Milton
Harris, and Rene M. Stulz (eds.), Handbook of the Economics of
Finance, vol. 2, part A. Amsterdam: North Holland, 2013.
Wells, Harwell. “‘No Man Can Be Worth $1,000,000 A Year’: The
Fight over Executive Compensation in 1930s America.” University of
Richmond Law Review, January 2010, vol. 44, pp. 689-769.

Interview

R. Preston McAfee

Just about every economist, of course, is excited about economics. And many economists are excited about technology. Few, however, have mashed those two interests together as thoroughly as Preston McAfee. Following a quarter-century career in academia at the California Institute of Technology, the University of Texas, and other universities, McAfee was among the first economists to move from academia to a major technology firm when he joined Yahoo in 2007 as chief economist. Many of the younger economists he recruited to Yahoo are now prominent in the technology sector. He moved to Google in 2012 as director of strategic technologies; in 2014, he joined Microsoft, where he served as chief economist until last year.
McAfee combined his leadership roles in the industry with continued research, including on the economics of pricing, auctions, antitrust, and digital advertising. He is also an inventor or co-inventor on 11 patents in such wide-ranging areas as search engine advertising, automatically organizing collections of digital photographs, and adding user-defined gestures to mobile devices. While McAfee was still a professor in the 1990s, he and two Stanford University economists, Paul Milgrom and Robert Wilson, designed the first Federal Communications Commission auctions of spectrum.
Among his current activities, McAfee advises the FCC on repurposing satellite communications spectrum and advises early-stage companies. The latter include Telescent, a network switching company; Prysm Group, a blockchain governance company; Merlin, an online employment market; CG, a digital security company in stealth mode; OpenX, a digital advertising exchange; and the Luohan Academy, a not-for-profit research institute created by Alibaba. He also serves on the visiting committee of the MIT Institute for Data, Systems, and Society and on the boards of the Pardee RAND Graduate School and the Mathematical Sciences Research Institute.
McAfee served as editor of Economic Inquiry for six years and co-editor of the American Economic Review for nine years and is a founding co-editor of the economics and computer science journal ACM [Association for Computing Machinery] Transactions on Economics and Computation.
He is also a confirmed iconoclast. In the pages of the Journal of Economic Literature, he opined that "the most important reason for China's success" was that "China ignored the advice of Harvard economists."
David A. Price interviewed McAfee in Washington, D.C., in November 2018.
EF: How did you become interested in economics?
McAfee: When I was a high school student, I read The
Worldly Philosophers by Robert Heilbroner. It’s a highly
readable history of economic thought. I didn’t know anything about economics — I didn’t even know who Adam
Smith was — and I found it fascinating. I was pretty familiar
with the science of atoms and electrons and planets and
stars, but the idea of a science of people was not something
I had encountered.
EF: You were one of the first academic economists to
move to a major technology company when you joined
Yahoo as chief economist. You’ve since spent more
than a decade as an economist at major technology
companies. What has changed in the way that economic research is used in these firms?
McAfee: The major change is the relevance of microeconomics — the study of individual markets.
Economists have had a big role in companies doing macroeconomics for forever, worrying about inflation, GDP,
and how those broad aggregates influenced demand for the
firm’s products. Microeconomists bring a very different
skill set and answer very different questions.
That’s a major change in roles. Amazon, for instance, has
more than 150 microeconomists. A really big thing there,
and at Microsoft and at Google, is the problem of causality.


Microeconomists have been studying how to get at causality — what caused something as opposed to what's just correlated with it — for 40 or 50 years, and we have the best toolset.
Let me give an example: Like most computer firms, Microsoft runs sales on its Surface computers during back-to-school and the December holidays, which are also the periods when demand is highest. As a result, it is challenging to disentangle the effects of the price change from the seasonal change since the two are so closely correlated. My team at Microsoft developed and continues to use a technology to do exactly that and it works well. This technology is called "double ML," double machine learning, meaning it uses machine learning not once but twice.
This technique was originally created by some academic economists. Of course, as with everything that's created by academic economists, including me, when you go to apply it, it doesn't quite work. It almost works, but it doesn't quite work, so you have to change it to suit the circumstances.
What we do is first we build a model of ourselves, of how we set our prices. So our first model is going to not predict demand; it's just going to predict what decision-makers were doing in the past. It incorporates everything we know: prices of competing products, news stories, and lots of other data. That's the first ML. We're not predicting what demand or sales will look like, we're just modeling how we behaved in the past. Then we look at deviations between what happened in the market and what the model says we would have done. For instance, if it predicted we would charge $1,110, but we actually charged $1,000, that $110 difference is an experiment. Those instances are like controlled experiments, and we use them in the second process of machine learning to predict the actual demand. In practice, this has worked astoundingly well.
The pace at which other companies like Amazon have been expanding their microeconomics teams suggests that they're also answering questions that the companies weren't getting answered in any other way. So what's snowballing at the moment is the acceptance of the perspective of economists. When I joined Yahoo, that was still fairly fragile.
EF: In both your academic work and in your published work as a corporate economist, you've done a lot of research on market design, including auction design. And of course, you collaborated on the design of the FCC wireless spectrum auctions. What are some of the main things you've learned about designing markets?

McAfee: First, let's talk about just what market design is. It's a set of techniques for improving the functioning of markets. Specifically, it uses game theory, economic theory, experimental research, behavioral economics, and psychology, all of those disciplines, to make markets work better.
In politics, you have people who don't want to use markets, and then you have people who say just let the market do it — as if that didn't have any choices attached to it. But in fact, often how you make a market work determines whether it works well or poorly. Setting the rules of the game to make markets more efficient is what market design is all about. Thus, whether to hold an auction, whether to sell or lease, who bears responsibility for problems, and what information is communicated to whom are all questions answered by market design. At least four Nobel Prizes have gone for developments in this area.
One thing we learned is to design for mistakes by participants. People will make mistakes, and to encourage participation and efficient outcomes, it is desirable that those mistakes not be catastrophic.
Moreover, there is a trade-off between the potential efficiency of a market and the generation of mistakes. Give people the ability to express complex demands, for example, and the potential efficiency rises, because people can express exactly what they want. But the number of mistakes will rise as well, and the actual performance can decline. I often find myself supporting a simpler design for this reason; I push back on complexity unless that complexity buys a lot of efficiency.
When we designed the PCS [personal communications services] auctions, the spectrum auctions, we were aware that if you made them complicated, people weren't likely to function that well. We had empirical evidence of that. Take a situation where you have seven properties up for auction. One regime is that I bid independently on each of the properties, and if I am the winning bidder on all seven, I get the seven. Another is to allow the bidder to submit a contingent bid — to say I only want all seven. That's called package bidding or combinatorial bidding. We were aware that in practice those don't work so well, because it winds up taking a long time to figure out who should win what. But there is some potential loss from not having a package. Because if, let's say, I'm selling shoes, most people don't have much use for a single shoe. So you would not want to sell the shoes individually, even though there are a few people who want only the left shoe or the right shoe. And in fact, I am a person who would like to get different sizes in a left shoe and a right shoe. So there's this trade-off between simplicity, which makes it easier for most, and expressiveness. There is value in that simplicity not only in terms of getting to an answer more quickly, but also in helping bidders avoid mistakes.
Another example is a second-price auction, where you don't pay what you bid; if you're the highest bidder, you pay the second-highest bid, as opposed to paying your
own bid. It has a certain resilience
to it. There was a guy who actually
submitted a bid that was 1,000 times
higher than he intended. Just added
three zeroes by accident. But in that
auction, if you’re paying not your bid
but the next highest bid, it takes two
to make the mistake in order for that
to actually cause him to go broke.
He wouldn’t have gone broke under
the second-price auction, whereas he
would under the first-price auction.
In that specific instance, we had put
in a withdrawal rule that allowed him,
at some penalty but not a ruinous
penalty, to withdraw.
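That resilience is easy to see in a toy calculation. The bids below are invented for illustration:

# Toy second-price auction: the winner pays the runner-up's bid,
# so a single fat-fingered bid does not set the price paid.
def second_price(bids):
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    price = ranked[1][1]   # pay the second-highest bid
    return winner, price

bids = {"A": 5_000_000, "B": 4_000, "C": 3_500}   # A meant to bid 5,000
print(second_price(bids))
# -> ('A', 4000): the mistaken bid wins but pays only the runner-up's
# 4,000; a first-price rule would have charged A the full 5,000,000.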

➤ Selected Positions
Chief Economist and Vice President, Microsoft, 2014-2018; Director, Google Strategic Technologies, 2012-2014; Chief Economist, Vice President, and Research Fellow, Yahoo Research, 2007-2012; J. Stanley Johnson Professor of Business, Economics, and Management, California Institute of Technology, 2004-2009; chaired professor, University of Texas at Austin, 1990-2003

➤ Education
Ph.D. (1980), Purdue University; B.A. (1976), University of Florida

➤ Selected Publications
"Machine Learning in an Auction Environment," Journal of Machine Learning Research, 2016 (with Patrick Hummel); "Evaluating the 'Big Deal' Journal Bundles," Proceedings of the National Academy of Sciences, 2014 (with Ted Bergstrom, Paul Courant, and Michael Williams); "The Cost of Annoying Ads," Proceedings of the 22nd International Conference on the World Wide Web, 2013 (with Dan Goldstein and Sidharth Suri); "Capacity Choice Counters the Coase Conjecture," Review of Economic Studies, 2008 (with Thomas Wiseman); "Signaling Character in Electoral Competition," American Economic Review, 2007 (with Navin Kartik); "Coarse Matching," Econometrica, 2002; "Bidding Rings," American Economic Review, 1992 (with John McMillan); "Correlated Information and Mechanism Design," Econometrica, 1992 (with Philip Reny)

EF: Much of the economic research that has been publicly discussed by technology companies has focused on outward-facing decisions such as pricing and, as we discussed, market design. Are tech companies also using research to structure the incentives of their employees, and is there more they can be doing?

McAfee: I've hired a lot of people over the years, more than 50 anyway, probably more than 60. And among those have been several people, some quite distinguished economists, who decided that the first thing they wanted to do was get involved in compensation.
Your leverage regarding compensation is greatest in the sales force. If you've got a salaried engineer, let's say, there's not as much you can do. But in sales, the financial incentives are large and strong. I try to prevent economists on my teams from ever messing with sales force compensation, because there's no quicker way to be fired. The sales force is very persuasive. That's their job; they're supposed to be persuasive.
There was a case where we had an executive vice president come to us and say, "We really want to run some experiments and learn about the sales force." As I said, I did my best to keep my team out of such matters, but when management comes to me and asks for help, I feel I have to oblige. Not only that, I had people chomping at the bit wanting to get involved. We designed some incentives and then what happened next was fully predictable, which is that the EVP got fired. Fortunately, my team was safe because it hadn't come from them.
My teams have worked with HR on other issues. There's always some ongoing work with HR. It can be on promotion, recruiting, collaborating — anything but compensation.

EF: Based on the literature and on your own experiences at Google and Microsoft especially, what is the role of entrepreneurship within large tech companies and has it been evolving?

McAfee: Most tech companies have been extremely skewed toward trying to encourage entrepreneurship, as well as giving a lot of lip service to tolerating failure, so as to recreate entrepreneurial activity inside the firm. The "maximize entrepreneurship" approach works pretty well for certain kinds of projects, in particular the kind where a small team can build a functioning product. But there are other products where it is a terrible idea — do you really want to fly in an airplane where each piece was designed and built by separate entrepreneurial teams aiming to maximize their own success?
Indeed, the economic theory of the firm suggests that firms arise when markets don't work well. We know markets work well when complementarities are weak and tend to fail when complementarities are strong. The term "complementarity" is economics jargon for synergy. As a result, the economic theory of the firm suggests that when complementarities are strong, we should see firms arise to internalize these complementarities and use nonmarket control — dictators, hierarchies, committees, and so on — to direct activities. Thus, from an economic perspective, the frequently encountered goal of recreating a market, entrepreneurial or otherwise, inside a firm involves a misunderstanding of the reason for a firm to exist. If a market can work inside a firm, there shouldn't be a firm in the first place!
Four firms — General Motors, Standard Oil, DuPont, maybe Sears — developed the multidivisional firm. These were firms where pieces of the firm operated as separate firms. And they were doing that just because they had gotten to the stage where they were too large for any one person to operate. It's unsurprising that Silicon Valley's version of the multidivisional firm is to say we're going to run a venture capital firm inside.
I'm generally a voice, not all that successful a voice, against this trend. And the reason is, first, Silicon Valley's

venture capital is an extremely finely tuned machine. It
works extraordinarily well. And if you think about business strategy 101, one of the first rules is that if you’ve got
a competitive market doing something, buy it from them,
don’t do it yourself.
There are a few exceptions. You might want to do
it yourself if the market won’t produce the quality you
need. Also, we’ve had actually a long-running challenge
where American companies like Cisco will subcontract to
Chinese manufacturers that eventually go into business
against them — so you might not want to buy it where you're
going to create future competitors.
But otherwise, in general, no. Venture capital does a
great job, and it’s a competitive market. So the idea of
trying to replicate venture capital inside the company is
usually misguided.
EF: How do you expect the exploitation of big data
and machine learning to affect market structure and
competition?
McAfee: AI is going to create lots of opportunities for
firms in every industry. By AI, I mean machine learning,
usually machine learning that has access to large volumes
of data, which enables it to be very clever.
We’re going to see changes everywhere: from L’Oréal
giving teenagers advice about what makeup works best for
them to airplane design to logistics, everywhere you look
within the economy.
Take agriculture. With AI, you can start spot-treating
farms for insect infestation if you can detect insect infestations, rather than what we do today, which is spread
the treatment broadly. With that ability to finely target,
you may be able to reduce pesticides to 1 percent of what
you’re currently using, yet still make them more effective
than they are today and have them not deteriorate so rapidly in terms of the bugs evolving around them.
If you look back at the history of big firms, what you see
is that when there are these big innovations — electricity
and the automobile are good examples — these innovations fundamentally change the way things are done. So
what we see and will continue to see is that companies
in the face of AI technology have to change their way of
doing things. We expect to see a lot of entry into these
spaces from firms that have mastered an adjacent technology and can use AI to push themselves into a business.
Meanwhile, the existing firms of course are going to fight
back, and in some cases they’ll push into other areas. This
will likely be very disruptive. You’ll also get the creation of
completely new markets.
Some of those markets are likely to be ones in which a
single firm becomes dominant. Digital commerce was an
example of this; there was a period when there were lots
of companies in digital commerce, but Amazon has clearly
stepped out as the leader.
We will also see a lot of mergers and acquisitions. If you

look at the history of merger waves, they tend to follow disruptive technologies. Indeed, all of them followed extensive
technological change except the 1980s merger wave, which
came about from deregulation. Such merger waves arise as
firms struggle to change their business model, due to the
changing environment the technological change brought
about, and purchase new capabilities via merger. I expect
to see a large merger wave from AI, lasting a decade or
more, that could change competition in many or even most
sectors.
The provision of AI technology is itself quite competitive. Google, Microsoft, Amazon, and IBM offer general
AI technologies that, while somewhat differentiated, are
competitive with each other, and a plethora of small firms
offer more specialized technologies. When electricity
disrupted industry, typically there was only one local
provider. When business machines disrupted industry,
there was one dominant vendor, IBM. But with AI, there
are three or four strong vendors. That is positive both
for advancing the technology and for maintaining competition. Competition among AI vendors will limit the
antitrust problems in other verticals as they adapt to AI.
Indeed, the shortage today is in humans: ML experts to
implement and operate AI and data scientists to clean the
data, prepare pipelines, and structure the output.
EF: What are the implications of machine learning, if
any, for regulators?
McAfee: It is likely to get a lot harder to say why a firm
made a particular decision when that decision was driven
by machine learning. As companies come more and more
to be run by what amount to black box mechanisms, the
government needs more capability to deconstruct what
those black box mechanisms are doing. Are they illegally
colluding? Are they engaging in predatory pricing? Are
they committing illegal discrimination and redlining?
So the government’s going to have to develop the capability to take some of those black box mechanisms and
simulate them. This, by the way, is a nontrivial thing. It’s
not like a flight recorder; it’s distributed among potentially thousands of machines, it could be hundreds of
interacting algorithms, and there might be hidden places
where thumbs can be put on the scale.
I think another interesting issue now is that price-fixing
historically has been the making of an agreement. In fact,
what’s specifically illegal is the agreement. You don’t have
to actually succeed in rigging the prices, you just have to
agree to rig the prices.
The courts have recognized that a wink and a nod is an
agreement. That is, we can agree without writing out a contract. So what’s the wink and a nod equivalent for machines?
I think this is going somewhat into uncharted territory.
EF: Is part of the difficulty that’s emerging the result
of machine learning in particular? As opposed to a
company making decisions based on an algorithm
that’s in code or using an econometric model?
McAfee: Yes. If you’re using a deep neural net, which is a
way of simulating how brains might work, it’s really hard
to say what the factor was, and actually you’re seeing a
bunch of interesting examples of this.
Deep neural nets are what have gotten people excited
about artificial intelligence now. AI is a field that came
and went repeatedly. People were excited in 1980. They
get excited and then it never delivers. But this time was
different, and what was different was the deep neural net
and its capabilities.
Let me give the example of classifying photos. With
deep neural nets, both Google and Microsoft can classify
photos better than humans. The way we measure this is
that we first have humans classify the photos — this is the
Golden Gate Bridge, that’s a dog running in a field. We
have humans do it and then we have machines do it. Then
we show a human the photo and the two answers, and we
ask which one is better. And the machines win. That is,
the human picks the machine’s interpretation over the
human interpretation.
So they use a deep neural net, which is a kind of statistical process that’s just wildly complicated because it has
multiple layers — 150, 170, 200 of these layers that each
have numerical weights attached, so there may be thousands of parameters in each layer and hundreds of layers.
It’s a wildly complicated system. It doesn’t look like a
regression where I can say, “Oh yeah, the coefficient on
income in a loan is 0.2.”
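For scale, a back-of-the-envelope weight count for a network of roughly the shape he describes (the per-layer width here is an assumed round number):

# Rough weight count for a deep, fully connected net (assumed sizes:
# 150 layers, 64 units per layer, one weight per pair of adjacent units).
layers, width = 150, 64
weights_per_layer = width * width
print(f"{weights_per_layer:,} weights per layer")    # 4,096
print(f"{layers * weights_per_layer:,} in total")    # 614,400
# No single weight has a readable meaning the way a regression
# coefficient on income does.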
EF: What should antitrust policy be doing more generally, if anything, to respond to the dominance of
some online firms in terms of market share?
McAfee: I disagree with those who find the antitrust laws
inadequate. With few exceptions, I find our laws adequate
for preventing monopolistic mergers, sanctioning anticompetitive behavior, and potentially offering the powerful ability to break up a firm that abuses its dominance.
I do sometimes question the application of the laws.
There have been many tech acquisitions where the target might have grown into a serious competitor for the
acquirer. Facebook, Instagram, and WhatsApp all offer
competing services. Perhaps more of a recognition that
tech firms in adjacent markets grow into challengers is
warranted, though even the merger guidelines recognize
the potential for entry.
We can address monopoly power, even when legally
acquired, with regulation. I realize this is incredibly
unpopular at the moment, but regulation is a pendulum
that swings back and forth. When electricity generation,
with its sizeable scale economies, was subject to monopolization, we responded both by regulating private provision and by creating municipal utilities. We should
do the same with Internet provision and for exactly the
same reasons.
Of course, a lot of the discussion today is focused
on FAANG — Facebook, Apple, Amazon, Netflix, and
Google. I see the issues somewhat differently. First, let’s
be clear about what Facebook and Google monopolize:
digital advertising. The accurate phrase is “exercise market
power,” rather than monopolize, but life is short. Both
companies give away their consumer product; the product
they sell is advertising. While digital advertising is probably
a market for antitrust purposes, it is not in the top 10 social
issues we face and possibly not in the top thousand. Indeed,
insofar as advertising is bad for consumers, monopolization,
by increasing the price of advertising, does a social good.
Amazon is in several businesses. In retail, Walmart’s
revenue is still twice Amazon’s. In cloud services, Amazon
invented the market and faces stiff competition from
Microsoft and Google and some competition from others.
In streaming video, they face competition from Netflix,
Hulu, and the verticals like Disney and CBS. Moreover,
there is a lot of great content being created; I conclude
that Netflix’s and Amazon’s entry into content creation
has been fantastic for the consumer. Who would have
thought that tech geeks could actually teach Hollywood,
with a century of experience, a thing or two?
That leaves Apple, and the two places where I think we
have a serious tech antitrust problem. We have become
dependent on our phones, and Apple does a lot of things
to lock in its users. The iMessage program and FaceTime
are designed to force people into the Apple ecosystem.
Also, Apple’s app store is wielded strategically to lock in
users (apps aren’t portable), to prevent competition with
Apple services, and to prevent apps that would facilitate
a move to Android. My concern is that phones, on which
we are incredibly dependent, are dominated by two firms
that don’t compete very strongly. While Android is clearly
much more open than Apple, and has competing handset
suppliers, consumers face switching costs that render
them effectively monopolized.
So there are issues as to how the antitrust laws should
be applied, but by and large, the framework of antitrust
is fine. We shouldn’t want competition for competition’s
sake; we want competition because it delivers innovation
and good and cheap products. That’s how the antitrust
laws have been interpreted, and so I’m happy with that.
Going back to Facebook and Google, the reason people
are worried is along the lines that our ability to communicate with Grandma is through only this one company.
That’s what we’re worried about. It’s not actually an
antitrust issue, though. The same with fake news: We
want companies to be more responsible, but I don’t think
the antitrust laws are a solution to that. That’s a place
where we should, as a society, look at what regulations are
appropriate.
A good way to arrive at what those regulations should
look like is by doing experiments. The fact that Europe

and California have adopted forms of data protection is a
good idea. It’s good for us to see some experiments.
The second place I’m worried about significant monopolization is Internet service. In many places, broadband
service is effectively monopolized. For instance, I have
only one company that can deliver what anyone would reasonably describe as broadband to my house. The FCC says
I have two, but one of these companies does not actually
come to my street.
I’m worried about that because I think broadband is
a utility. You can’t be an informed voter, you can’t shop
online, and you probably can’t get through high school
without decent Internet service today. So that’s become a
utility in the same way that electricity was in the 1950s. Our
response to electricity was we either did municipal electricity or we did regulation of private provision. Either one of
those works. That’s what we need to do for broadband.
EF: The notion of regulation or public provision
makes sense from your perspective in the broadband
market. Does it also make sense in the provision of,
let’s say, social media?
McAfee: I’d be pretty leery about government provision
of social media. Partly because it’s a scale play — you need
to run a pretty large network. With electricity and with
broadband, you can actually run a municipal-level service
and you can have local control and you can meet the needs
of the local community, but that doesn’t really work for a
phone system or a social media system. So I would tend to
look more toward regulation for that reason, to make sure
it serves the national interest.
EF: What was the most surprising part of your transition from being an academic economist to being an
economist in a high-tech corporate setting?
McAfee: There’s a school of thought that government is
inefficient because it can be, while firms, subject to markets, are forced to be efficient. The thing that shocked
me the most was how inefficient large firms can be. Sure,
there is government waste, but it is commensurate with
size and clarity of mission. In one sense, I already knew
that large firms could be inefficient — the failure of Kodak
and Blockbuster are examples — but it is another thing to
live through it.
I have a much deeper appreciation that slow optimization is a better model of human behavior than full optimization, and indeed, I’ve often used evolutionary models
rather than optimization models in my work. People do
respond to incentives, and they respond faster to stronger
incentives, but along the way there are lots of mistakes and
bad choices and hysteresis.
EF: What are the best and worst things about working
in a place like Microsoft or Google?

McAfee: The thing I liked best was access to real problems.
As a professor, I would dream up problems and solve them.
I tried to pick problems whose solutions were likely to be
valuable, and I had reasonable success at doing that. But it is
another thing entirely when a multibillion-dollar business is
measurably improved by a change your research suggested.
Indeed, one way of framing the answer is that, 300 years
ago, scientists wrote each other letters of their findings,
and these letters came to be reprinted in volumes for others to see. Eventually, these volumes become journals, and
universities start to hire people who wrote lots of these
letters. At that point, the writing of letters, as opposed to
the making of discoveries, becomes a way of advancing in
a scientific career, and you start to see “literature-driven”
contributions, which are often uninteresting or not useful
or both. As a corporate economist, in contrast, I and my
team would typically be handed an existing problem, and
if we made substantial progress in resolving it, we would
write something up for a journal. In that way, I felt much
more grounded in reality and actual success rather than
academic success.
The worst aspect was firing people. Universities fire a
lot of assistant professors, but the process is structured so
that committees make decisions and there is no individual
responsibility. Firing people is awful, even when it turns out
they needed the change and are ultimately better off for it.
EF: Who have been your main influences?
McAfee: I learned to be a modern economist from
John McMillan, my long-term co-author and author of
Reinventing the Bazaar, which I think is the best book on
market design. John made ideas operational and was a
fabulous expositor. I now spend a full third of my research
time on exposition: ideas will never persuade if not articulated well.
Paul Milgrom’s perspective on economic theory — his
relentless focus on high-value insights, his often uncanny
ability to simplify and get at the root cause, and his mastery of statistics underlying economic analysis and its role
in economics — continues to be a crucial influence. I
would be happy to produce even 1 percent of his theoretical insights.
And I learned a great deal from my boss at Yahoo, Prabhakar Raghavan, whom I followed to Google. Prabhakar now
leads advertising engineering at Google. Let me describe an
outstanding thing he taught me. A manager’s job is to make
his or her team successful. Full stop. It isn’t even to get a job
done, though the team’s success may require getting some
job done. By defining your job as making the team succeed,
you focus on what is blocking the team and how to remove
those blocks. You acknowledge and advertise the team’s
contributions within the company. You are no longer the
leader but the cheerleader. Upper management loves managers whose teams are successful, and I was well-rewarded
for the success of my teams.
EF

ECONOMIC HISTORY

When a South Carolina City Tried to Become Motor City
The Fifth District’s automotive entrepreneurs eventually lost out to the forces of agglomeration
By Jessie Romero

In the early 1900s, hundreds of entrepreneurs across
the United States tried to get into the car-making
business. Most of them produced only a few cars at
best — but buggy maker John Gary Anderson of Rock
Hill, S.C., thought he had a real shot at giving Henry Ford
a run for his money. “These [Detroit] factories are turning
out five thousand cars per annum,” he wrote in an appeal
to potential shareholders. “Why can’t this be done in the
South — even in Rock Hill? It can and we believe it will.”
The Anderson Automobile Co. did achieve national
distribution and produced more than 6,000 cars between
1916 and 1926, far more than any other Southern auto
manufacturer. It eventually failed due to faulty engines,
not to mention price competition from the Ford Motor
Co. But Anderson’s dream to turn Rock Hill into the car
capital of America — and the aspirations of many other
manufacturers — may have been doomed from the start,
as the forces that contributed to the concentration of the
auto industry in Detroit were well underway by the time
he entered the race.

[Photo: John Gary Anderson was a big proponent of advertising and designed his own ad campaigns. He manufactured a wide range of cars (including the 1919 Allen convertible roadster pictured here) and painted them any color a customer wanted. Image courtesy of Dr. J. Edward Lee]
Made in Dixie!
Anderson was born in 1861 in Lawsonville, N.C., and raised
by his grandparents after both his mother and father died
of tuberculosis. In his teens, Anderson relocated to Rock
Hill, then a town of fewer than 1,000 people just south of
the North Carolina state line. (Today, Rock Hill is considered part of the Charlotte metro area.) Anderson was
intent on climbing the economic ladder, and in 1881, with
only a few months of formal schooling, he managed to
purchase an interest in a grocery store. Three years later,
he married Alice Holler, the daughter of a prominent local
businessman, and started a successful buggy company with
his new father-in-law.
As historian J. Edward Lee describes in his 2007 book
John Gary Anderson and His Maverick Motor Company,
Anderson was an enthusiastic booster of his adopted city.
He formed its first chamber of commerce and played
a major role in convincing the Winthrop Normal and
Industrial College, today Winthrop University, to relocate
there from Columbia in 1895. He also advocated diversifying the South’s economy away from cotton — in no small
part because farmers dependent on the crop couldn’t
afford to buy buggies when crop prices fell. Transforming
Rock Hill would require “leaders of vision, courage and
enterprise that are rarely found in small towns,” Anderson
wrote in his autobiography. Not lacking in self-esteem, he
believed he was up to the task.
In 1910, two years after Ford launched the Model T,
Anderson and his sons started tinkering with gasoline
engines. At the turn of the century, many cars had electric
engines, but within a few years the internal combustion
engine dominated the market. (See “Car Wars,” Econ Focus,
Fourth Quarter 2014.) Six years later, they introduced the
Anderson Motor Co. to the world with a week-long open
house for prospective dealers and customers. The cars
received favorable reviews; Automobile magazine described
the “Anderson Six” as a “new car manufactured in a new
territory… a good unit assembled in a neat chassis with extra
lavish equipment.” It sold for $1,250.
Anderson emphasized that lavishness, hoping customers would choose quality over cost. A brochure proclaimed,
“You will find the upholstery deep and wide, stuffed with
real curled hair and carefully tailored in real leather. You
will find the finish of lasting luster, hand applied and hand
rubbed, involving twenty-one distinct operations in all.”
Anderson also appealed to regional pride, adopting the
slogan, “A little higher in price, but made in Dixie!”


For several years, the strategy appeared to be working;
investors were eager, and 200 workers produced as many
as 22 cars per day. The company wasn’t a match for Ford,
however, which had introduced the assembly line in late
1913 and by 1915 could produce between 50 and 250 cars
per day in a single plant. Across more than two dozen facilities (including one that opened in 1914 in Charlotte), Ford
was manufacturing more than 45,000 cars per month.
The U.S. economy entered a severe recession at the
beginning of 1920. Many automakers had invested heavily
in new equipment, anticipating a postwar surge in demand,
but found themselves with excess capacity and debts they
couldn’t pay when that demand dried up. General Motors
survived courtesy of an investment by the du Pont family;
Ford survived by cutting prices even further (and by forcing dealers to accept — and pay cash for — shipments they
hadn’t ordered).
Anderson didn’t have that kind of leverage, and he
“seemed perplexed about the problems facing the industry,”
according to Lee. He didn’t start lowering prices until 1921,
and even then, his cars cost two to four times more than a
Ford. It turned out most customers cared more about price
than quality. “To be sure, [the Model T] didn’t have many
of the extras one got with the local product, such as silver
fittings, satin-covered rope and twin vanity sets, but [it]
usually got passengers to their destinations,” Lee wrote.
Anderson persevered for several more years, urging
local consumers to “buy at home” and warning “what
a hole would be left in Rock Hill should the Motor
Company be taken away.” In 1922, he launched a cheaper
touring car called the “Light Aluminum Six,” which cost
$1,195. But a basic Ford touring car cost just $298, and the
new Anderson model turned out to have a major defect in
its engine. The company had to shut down production to
fix the problem and never recovered. Anderson appealed
to the city for help, but in 1926 the Anderson Motor Co.
and its assets were sold at auction for $53,000, just enough
to pay the back taxes. The Rock Hill Record reported the
news on Sept. 9, 1926: “And thus comes to an end the most
ambitious enterprise ever launched in Rock Hill.”
Why Not Richmond?
Anderson wasn’t the only automotive entrepreneur hoping to get in on the burgeoning car craze. By 1909,
there were around 270 automobile manufacturing companies across the United States — and hundreds of other experimenting enthusiasts who never managed to produce anything. Nor was Anderson the only person
optimistic about the South’s prospects. In 1910, a writer
for the Richmond Times-Dispatch gushed about the “vigorous and far-seeing young men” at the Richmond Iron
Works, a cooperative of several small foundries, who were
starting to manufacture cars in the city. “Why should not
Richmond make automobiles just as good as any that ever
came from the factories in Detroit or any other town?” he
wrote. He added a prediction: “The automobile industry is

going to be a big thing for Greater Richmond.”
The Richmond Iron Works ceased car production in
1912.
But it wasn’t the end for Virginia auto manufacturing.
Around the same time, a group of businessmen persuaded
James Kline to move his company from Pennsylvania to
Richmond. He set up a plant on the Boulevard —
today the site of a Greyhound bus station — where he
assembled around 3,700 cars between 1912 and 1923. A
little over 100 miles west, in Lynchburg, the Piedmont
Motor Co. started producing cars in 1917. It manufactured between 2,500 and 3,000 cars, most of which were
purchased by other companies and sold under other
names, before going bankrupt in the early 1920s.
Many automotive entrepreneurs were, like Anderson,
former buggy makers. In Baltimore, Charles and Jacob
Spoerer, the sons of carriage and wagon builder Carl
Spoerer, started making cars in 1907. Until deciding in
1914 to focus instead on tire and auto accessory sales,
they manufactured, among others, a roadster, a touring
car, and a landaulet, essentially a limousine with a convertible top. Richard Corbitt of Henderson, N.C., also
was a carriage builder; his company, Corbitt Automobile
Co., was the only North Carolina firm that managed to
build a production model, although he sold at most 100
vehicles between 1907 and 1912. Corbitt continued building trucks and farm equipment until the company was
liquidated in 1952.
Other manufacturers’ connection to the auto industry
was less clear. Baltimore’s Sinclair-Scott was known for
apple peelers and food-canning machines before it started
producing a roadster called the “Maryland” in 1907. (The
Maryland was originally manufactured in Boston under
the name Ariel; Sinclair-Scott acquired the rights when
Ariel went bankrupt.) Sinclair-Scott built close to 900 cars
before going back to food canning in 1910.
One source of publicity for these early manufacturers
was multiday driving tours, in which cars had to reach
checkpoints within specific timeframes and were penalized for repairs. In these, the “Washington” automobile,
manufactured in Hyattsville, Md., by the Washington,
D.C.-based Carter Motor Car Corp., performed quite well.
In the 1910 Munsey Historic Tour, a 12-day, 1,500-mile
race, two Washingtons finished with perfect scores. An
advertisement later that year proclaimed the Washington
the “Victor of Victors.” But Carter couldn’t scale up and
went bankrupt in 1912.
Automotive Agglomeration
Despite the flurry of activity in the Fifth District and
across the country, the American automotive industry was
highly concentrated nearly from the beginning. By most
accounts, the industry got its start in New England in
1895. Within 10 years, 68 percent of auto manufacturing
firms were located in just six cities: Detroit, New York,
Chicago, Indianapolis, Rochester, N.Y., and St. Louis.

Detroit had the highest share, with 25 percent, followed by
New York with 15 percent and Chicago with 10 percent.
Indianapolis, Rochester, and St. Louis each had between
2 percent and 8 percent of firms. Concentration increased
dramatically over the next four decades. Between the
mid-1910s and the mid-1920s, the number of firms fell
from around 200 to just 40, and Detroit’s share increased
substantially. By the 1940s, only eight auto manufacturers
remained and nearly all of them were in Detroit.
Broadly speaking, there are four factors that could contribute to such geographic clustering, or what economists
call “agglomeration.” The first is intra-industry spillovers,
which occur when firms located near other firms in the
same industry share knowledge and inputs. There may
also be inter-industry spillovers, when knowledge is shared
across firms in related industries. Agglomeration might
also occur when employees leave an incumbent firm and
start another firm in the same industry, known as “family
network” or “spinout” effects. Finally, a cluster might be
the result of a location’s unique attributes, such as natural
resources or a favorable regulatory environment.
What explains the agglomeration of the U.S. auto industry? That question was explored by Richmond Fed economist Zhu Wang, Luís Cabral of New York University,
and Daniel Yi Xu of Duke University in a 2018 article in
the Review of Economic Dynamics. The researchers ran a
“horse race” between the potential contributing factors
and concluded that in the short run, the most significant
were spinouts and inter-industry spillovers from local carriage and wagon manufacturers. Local inputs, such as iron
and lumber, played a smaller role. “This finding highlights
how human capital, accumulated at a location by working
in the same or a related industry, contributes to industry
agglomeration,” says Wang.
From a long-run perspective, however, the location
of the carriage and wagon industry in the first place was
determined by the availability of local inputs. In addition,
spinouts are influenced by the local regulatory environment; one reason there were so many spinouts in Detroit
was that Michigan had passed a law banning noncompete
clauses in 1905. In this sense, Wang says, “It is fair to say
that location-specific effects accounted for the lion’s share
of the auto industry’s agglomeration.”
Wang and his co-authors distinguished two different
phenomena: the agglomeration of the auto industry in a
few cities, particularly Detroit, which had already occurred
by the early 1900s, and the industry shakeout that led to
the marked decline in the number of firms by the 1940s.
“Before the assembly line, you needed a lot of producers to
meet the demand,” says Wang. “But the scale economies

created by the assembly line meant you only needed a
few firms. Detroit had already built up an advantage that
enabled it to capitalize on the new technology — and that
agglomeration occurred before the industry consolidated.”
Full Circle
After his company failed, Anderson spent most of his
time in Lakeland, Fla., with his wife until his death in
1937. He never forgave Rock Hill for “abandoning” his
company; he devoted nearly 100 pages of his 900-page autobiography to criticizing the leaders who hadn’t returned
his loyalty.
After the bankruptcy, Manhattan-based M. Lowenstein
and Sons Co. purchased the vacant car factory and built a
textile processing facility. Known locally as “the Bleachery,”
the Rock Hill Printing and Finishing Co. opened in 1930.
Residents viewed the opening as “proof that the ‘Good
Town’ [as Rock Hill was popularly known] was Getting
Better,” according to a 1953 history of Rock Hill by the late
historian Douglas Summers Brown. The facility eventually expanded to 31 buildings over more than 30 acres and
helped foster the economic growth Anderson had hoped
to provide. In 1952 and 1960, Rock Hill residents had the
highest per-capita income of any South Carolinians. At the
peak in the mid-1960s, nearly 5,000 people — 70 percent
of Rock Hill’s workforce — worked there. With another
33 textile factories in Rock Hill, the Bleachery was at the
center of an agglomeration of its own.
During the 1980s and 1990s, many textile manufacturers moved overseas. M. Lowenstein and Sons sold the
Rock Hill Printing and Finishing Co. in 1985, and the new
owners closed the facility in 1998. The building sat vacant
for more than a decade, subject to fires and vandalism. The
city purchased most of the site in 2011 and has partnered
with developers to create a new complex called University
Center, part of a broader revitalization effort known as
Knowledge Park. Scheduled to be completely open by
2020, the mixed-use center will feature restaurants, apartments, office space, a hotel, an indoor sports complex, and
housing for students at Winthrop University, the school
John Gary Anderson worked so hard to bring to the city.
Detroit’s “Big Three” auto manufacturers began to
face serious foreign competition themselves in the 1980s.
Today, eight of the top 10 automakers by U.S. market share are based overseas (including Chrysler, which
merged with Italy’s Fiat in 2014). And car and truck
manufacturers, including BMW, Mercedes, Toyota, and
the Japanese company Hino, operate plants in the Fifth
District. BMW’s plant is in Spartanburg, S.C., a little
more than an hour’s drive from Rock Hill.
EF

Readings
Cabral, Luís, Zhu Wang, and Daniel Yi Xu. “Competitors,
Complementors, Parents, and Places: Explaining Regional
Agglomeration in the U.S. Auto Industry.” Review of Economic
Dynamics, October 2018, vol. 30, pp. 1-29.
Lee, J. Edward. John Gary Anderson and His Maverick Motor
Company. Charleston, S.C.: The History Press, 2007.

BOOK REVIEW

A Welcome for the Talented
The Gift of Global Talent: How
Migration Shapes Business,
Economy & Society
By William R. Kerr, Stanford,
Calif.: Stanford University Press,
2019, 237 Pages
Reviewed by Aaron Steelman

Immigration skeptics argue that newcomers are taking
jobs Americans would otherwise fill and that immigration is having a divisive effect on the country’s
culture. Proponents argue that the net economic effects
of immigration are overwhelmingly positive and that it’s
not plain that immigrants are assimilating at a lesser rate
than in the past.
In The Gift of Global Talent, William Kerr, an economist at Harvard Business School, addresses these issues
— although exclusively through the lens of “high-skilled”
immigration. He doesn’t attempt to analyze effects of
“lower-skilled” immigration, which drives many, though
certainly not all, of the concerns of immigration skeptics.
Kerr favors more high-skilled immigration to the
United States. “Some may cheer at the prospect of
reduced inflows of talented immigrants, but they should
not,” he writes in the book’s preface. “Ceding U.S. talent
leadership would hurt Middle America as much as it
would harm Manhattan or Silicon Valley, as a result of
lost tax revenues, weakened colleges, and more. It would
diminish America, not make it whole again.”
The book builds on three propositions. First, talent
is the world’s most important resource. Second, talent
is a resource that is quite movable, unlike, say, a harbor
or coal mine. Third, talent is significantly shaped by the
environment around it. Some might quarrel with the first
proposition, and in some parts of the world this may not
yet be true, but as a general statement it seems quite sensible. The second proposition seems inarguable. It’s the
third proposition that may seem most dubious to some.
The notion that proximity is important to the development of talent in a world in which many people work
remotely and see their colleagues relatively infrequently
may seem outdated. But Kerr argues quite convincingly
that being close to those with complementary ideas
remains very important. Ideas tend to build on each
other, whether in a university setting or a commercial
one – and often those overlap. Think of tech clusters in
Northern California and Boston, for instance. But such
clusters can sprout up in less predictable areas as well. For
instance, Olathe, Kan., has become home to a thriving
tech community. (It also was the site of a 2017 shooting of

two Indian-born engineers, one fatal, who were targeted
because they were immigrants, demonstrating, tragically,
the anger that immigration can stir among some people,
especially those already disgruntled or prone to violence.)
Such clusters benefit greatly from high-skilled immigrant labor, particularly that from India and China. And
these clusters improve the well-being of not only the immigrants themselves and the companies they help to thrive
but of Americans as a whole. There are some people who
are made worse off, though, and Kerr argues for finding
ways to help buffer them from those shocks. Perhaps paradoxically, immigrant-fueled tech clusters also can benefit
the talent-sending countries themselves. Those countries
reap gains from the inventions and innovations produced
by such talent clusters, in the same way as Americans.
But the overseas workers also often “provide their home
countries with special insights and business linkages,”
Kerr writes. For instance, India has launched programs
to bring Indians working abroad (and who often received
their higher education abroad) back to India’s research and
development institutions for months at a time.
Kerr maintains that the United States will remain the destination of choice for the world’s most
talented workers, but as countries such as India and China
further develop, fewer people likely will opt to leave them.
Also, in order to continue to attract the type of skills that
have benefited the U.S. economy, policymakers will need to
consider changes to the H-1B visa program, the primary
entryway for high-skilled foreign workers. First, he argues,
the United States should raise its annual cap on H-1B visas
from its present level of 85,000 and then index future
increases to population growth or to the national employment growth for skilled workers. But it should also consider
reforms such as replacing the current lottery system for
selection with a wage ranking system: Applicants earning
the highest salaries from their sponsoring employers, and
therefore arguably demonstrating the greatest economic
value, would move up the queue. In addition, to “complement wage ranking and to preserve scarce visas for the
best uses,” he favors raising the H-1B minimum wage from
$60,000 to $100,000. This would leave some important but
lesser-paying professions, such as social work, at a disadvantage, and exceptions should be considered in those instances.
Kerr’s book is readable and his arguments are generally reasonable, but they are not as fully developed as one
might hope, a sacrifice often made to ensure accessibility.
It also would have benefited from more than just glancing
attention to lower-skilled immigration, the benefits of
which often are not clearly seen while the costs often are
widely lamented. Bringing insight to such cases is something economists are particularly well-positioned to do. EF

DISTRICT DIGEST

Economic Trends Across the Region

Understanding Recent Trends in Labor Market Participation
By R. Andrew Bauer

By many metrics, the labor market is very tight.
The national unemployment rate ended 2018
at a level not seen since the 1960s, while the
unemployment rate for the Fifth District reached its
lowest level since the first half of 2000. The number of job openings in the United States exceeds the
number of workers looking for jobs, and the level of
initial claims for unemployment insurance is near
a 50-year low. Businesses indicate that finding and
retaining workers is difficult. Yet the percentage of
working-age adults in the country who are active in the
labor market — the labor force participation rate — is
below where it was prior to the Great Recession. A similarly broad metric of the labor market that compares
the number of employed persons in the country to the
working-age population, the employment-to-population
ratio, also remains well below prerecession levels. Do
these metrics imply that the labor market is not as tight
as thought — that there is additional slack? Are there
workers who left the labor market and are available to
return should the right opportunity arise?
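For reference, both measures are ratios of the same household-survey aggregates. In the notation below (our shorthand, not official Bureau of Labor Statistics notation), E is the number of employed persons, U the number of unemployed persons actively searching for work, and P the civilian noninstitutional population aged 16 and over:

    \text{LFPR} = \frac{E + U}{P} \qquad\qquad \text{EPOP} = \frac{E}{P}

Because U counts only active job seekers, the participation rate can fall without anyone losing a job: a searcher who stops looking leaves both U and the labor force.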
Some point to the fact that wages have increased only
moderately and wage growth remains below rates during
other expansion periods as an indication that there is
some additional slack in the labor market. The lack of
wage growth has been unexpected — particularly given
the drop in the unemployment rate from 10 percent to
under 4 percent. When something becomes scarce or
less abundant, all other things being equal, the price
would be expected to rise. Perhaps what is muting the
price increase is the availability of labor that is currently
out of the labor force.

Another unexpected fact of the labor market in recent years has been the strength of the monthly job gains. Given population and labor force growth, the number of monthly job gains necessary to incorporate new entrants into the labor market is estimated to be between 50,000 and 110,000 jobs. Actual job growth in 2018 far surpassed this level at close to 225,000. In a tight labor market, with a low unemployment rate and labor scarcity, one would have expected to see greater moderation in the monthly job gains — but that has not happened. Perhaps the explanation, once again, is hidden slack: workers not in the labor market who are entering as opportunities arise.

In response to these questions, there has been a lot of research devoted to understanding movements in the labor force participation rate. The rate has been in decline since the late 1990s, and that decline accelerated during the Great Recession and afterward. Is the accelerated decline due to transitory factors associated with the business cycle, changing trends in the demand for labor, changes in the demographic composition of the labor force, or some combination thereof? This article will review some of the research that examines the decline in these metrics and then look to see if this research helps explain the trends in the Fifth District.

A Look at the Trends
In the latter half of the 20th century, the percentage of working-age adults engaged in the labor force rose considerably. The labor force participation rate increased by roughly 8 percentage points from the 1960s to 2000 — from just under 59 percent to just over 67 percent. The employment-to-population ratio experienced a similar increase over the same period. Underlying the increase in employment and the labor force were several factors: (1) a large demographic group entering the labor force — the baby boomers, (2) an increase in educational attainment, and (3) women entering the workforce in greater numbers. After peaking at 67.3 percent in early 2000, the labor force participation rate declined in two stages: gradually during the first half of the 2000s before leveling off just prior to the Great Recession, and then more sharply during and after the Great Recession until reaching a 40-year low of 62.5 percent in 2015. It is notable that in 2017, the U.S. labor force participation rate for prime-age workers (aged 25 to 54) ranked 40th out of 50 among countries in the Organization for Economic Co-operation and Development — a fact that would perhaps surprise some, as American culture is sometimes associated with a stronger emphasis on work and less on leisure than other cultures.

Underlying the overall decline are movements by various subgroups within the labor market. There are notable differences in trends by age group, gender, and educational attainment. The labor force participation rate for men has been in decline for many decades, while the rate for women rose consistently from 1960 to 1980 before slowing during the 1990s. (See chart.) The participation rate for women peaked at 60.3 percent in early 2000 before declining to 56.4 percent in 2015 and has edged slowly higher in recent years.

[Chart: Labor Force Participation Rates by Gender, 16 years and over, 1948-2017, percent, seasonally adjusted; series shown: overall, men, women. Source: Bureau of Labor Statistics]

The more educated a worker, the more likely he or she will be participating in the labor market. The labor force participation rate for workers with less than a high school diploma was 46.1 percent at the end of 2018, while the participation rate for workers with a bachelor’s degree or higher was 73.6 percent. (See chart.) Note that for workers with a high school diploma or higher, the participation rate has been steadily declining in recent decades. In contrast, the participation rate for workers who have not finished high school rose from 39 percent in 1995 to just over 48 percent in 2008. It then declined until 2014 and has moved higher in recent years but has not regained its previous high.

[Chart: Labor Force Participation Rates by Education, 25 years and over, 1992-2018, percent, seasonally adjusted; series shown: less than HS diploma, HS diploma, some college, bachelor’s degree or higher. Source: Bureau of Labor Statistics]

With respect to age, while there was a fairly steady decline for prime-age workers from 2000 to 2015 (with the exception of 2005 to 2008), there was a much larger decline for younger workers — particularly workers aged 16-19. (See chart.) In contrast, the participation of older workers (55 and older) increased from 1990 to 2010 and has held steady since.

[Chart: Labor Force Participation Rates by Age Group, 1948-2017, percent, seasonally adjusted; series shown: 16-19, 20-24, 25-54, 55+. Source: Bureau of Labor Statistics]

Explaining the Changes in Labor Force Participation
There has been a considerable amount of research looking at these trends. Much of the work concludes that longer-term secular trends are responsible for the decline, as opposed to temporary cyclical factors. One of the key drivers of the decline in the U.S. labor force participation rate is demographics. As mentioned above, a key trend in recent decades has been the increase in the share of older workers (55 and older). Not surprisingly, this is due to the population getting older — specifically, the aging of the baby boomer generation. Given that the labor force participation rate of older workers is considerably lower, their growing share lowers the overall participation rate. Researchers have found that this shift accounts for a sizeable portion of the overall decline.

Andreas Hornstein of the Richmond Fed, Marianna Kudlyak of the San Francisco Fed, and Annemarie Schweinert, formerly of the San Francisco Fed, constructed a hypothetical labor force participation rate by fixing the educational composition of the population and the participation rate of each group at their 2000 levels and using the actual age-gender population shares as weights. In a 2018 San Francisco Fed Economic Letter, they found that changes in the age-gender composition of the population caused about three-fourths of the decline in the overall rate. In a 2017 article in the Brookings Papers on Economic Activity, Alan Krueger of Princeton analyzed the participation rate using a similar methodology and found that the shift in population shares accounted for 65 percent of the decline in the participation rate between 1997 and 2017. Moreover, because the aging of the population is expected to continue, its downward effect on labor participation will most likely continue as well. In a 2017 article in Economic Insights, Michael Dotsey, Shigeru Fujita, and Leena Rudanko of the Philadelphia Fed projected that rising retirements will continue through the late 2020s, which would imply a roughly 4 percentage point decline in the participation rate over that period.
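A minimal sketch of that kind of shift-share counterfactual, which holds each group’s participation rate at its base-year level and reweights by a later year’s population shares, might look like the following (the groups, shares, and rates here are illustrative placeholders, not figures from these studies):

    # Counterfactual participation rate: fix each group's participation
    # rate at its 2000 level and reweight by the actual population shares
    # of a later year. All numbers below are made up for illustration.
    rates_2000 = {"16-24": 0.66, "25-54": 0.84, "55+": 0.32}

    def aggregate_lfpr(shares, rates):
        """Aggregate rate: population-share-weighted average of group rates."""
        return sum(shares[g] * rates[g] for g in shares)

    shares_2000 = {"16-24": 0.17, "25-54": 0.56, "55+": 0.27}
    shares_2018 = {"16-24": 0.15, "25-54": 0.49, "55+": 0.36}  # older age mix

    actual_2000 = aggregate_lfpr(shares_2000, rates_2000)
    counterfactual_2018 = aggregate_lfpr(shares_2018, rates_2000)

    # The gap is the decline attributable purely to the shifting age mix.
    print(f"Composition effect: {actual_2000 - counterfactual_2018:.3f}")

Subtracting this composition effect from the actual decline leaves the portion attributable to changes in the groups’ participation rates themselves.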

Other factors besides demographics are at work, however. The decline in the labor force participation rate among prime-age workers over the past two decades has been particularly pronounced for prime-age males, whose participation rate declined by 2.6 percentage points from 2000 to 2018. A number of explanations have been put forth for this decline.

John Coglianese, a Ph.D. candidate at Harvard University, argued that a change in how men are attached to the labor market is a factor. In his paper “The Rise of In-and-Outs: Declining Labor Force Participation of Prime Age Men,” he found that one-third of the decline in the labor force participation rate of prime-age males is due to an increase in occasional short breaks between jobs. He argued that despite these breaks, these individuals are highly attached to the labor force and work typical jobs but are notable in that they take brief breaks out of the labor force. He found that married or cohabitating men are taking more breaks and account for about half of the increase in “in-and-outs.” He attributed this rise to a wealth effect from their partners’ growing incomes. Young men increasingly living with their parents accounted for much of the rest of the increase.

An article by Didem Tuzemen, an economist at the Kansas City Fed, argued that a decline in the demand for middle-skill workers due to job polarization, along with increased international trade and weakened unions, accounted for most of the decline in participation among prime-age men. Tuzemen looked at the increase in the nonparticipation rate (the share out of the labor force) for prime-age males by education level and noted that while there was an increase across all education levels, the increase was largest for males with a high school degree and those with an associate’s degree or some college (middle-skill workers). Tuzemen also pointed out that at the same time that more middle-skill workers were not participating in the labor force, the share of employment in middle-skill occupations declined considerably over the past two decades, while the share of low-skill and high-skill occupations increased.

Research has also examined the impact of trade on employment and found that dislocations due to increased imports may have pushed down labor participation rates. In a 2016 article in the Journal of Labor Economics, “Import Competition and the Great U.S. Employment Sag of the 2000s,” Daron Acemoglu and David Autor of MIT, Brendan Price of the University of California, Davis, David Dorn of the University of Zurich, and Gordon Hanson of the University of California, San Diego argued that slow employment growth between 2000 and 2007 was due to greater import competition from China. They estimated the direct and indirect impact of Chinese imports on U.S. manufacturing and found sizeable negative effects on employment — for industries directly exposed to import competition as well as indirectly for upstream industries. In theory, the employment lost to import competition would be expected to be reallocated to other industries, but they found no evidence that this occurred. They argued that any reallocation into nonexposed industries was overwhelmed by an adverse demand effect. Prime-age males make up the majority of manufacturing employment, so the negative impact of trade could be a factor explaining the decline in participation by prime-age males.

Two other factors cited by research are the rise in disability and the opioid crisis. Dotsey, Fujita, and Rudanko noted that the decrease in the overall participation rate since 2000 has been due to roughly equal increases in the number of nonparticipants citing “in school,” “retired,” or “disabled.” Krueger analyzed the effect of the opioid crisis by looking at survey data and opioid prescription rates to see if the sharp rise in prescription rates had an impact on labor markets. His results suggest a link between the opioid crisis and depressed labor force participation. Still, the effects of the opioid crisis remain difficult to isolate; it could be that poor labor market outcomes result in opioid usage in some instances, while opioid use drives poor labor market outcomes in others. Or it could be that some other factor is related to both. (See “The Opioid Epidemic, the Fifth District, and the Labor Force,” Econ Focus, Second Quarter 2018.)

[Chart: Fifth District Labor Force Participation Rates, 16 years and over, 1976-2018, percent, seasonally adjusted; series shown: MD, DC, VA, WV, NC, SC. Source: Bureau of Labor Statistics]

Fifth District Trends
We see similar trends within Fifth District labor markets.
As in the national data, the labor force participation rate
declined in each of the district jurisdictions from 1997 to
2017 — with the exception of the District of Columbia,
where the rate increased sharply. The largest declines were
in the Carolinas, where the participation rate dropped
close to 7 percentage points; declines in other states were
much less severe — 3.6 percentage points in Maryland and 2.2 percentage points in Virginia and West Virginia. (See
chart.) The participation rate itself also varies considerably, from West Virginia’s 53.3 percent to the District of
Columbia’s 70.4 percent.
What is driving the differences among jurisdictions?
Not surprisingly, many of the same demographic factors
as on the national level are at work. One is education.
As noted earlier, workers with higher levels of education
are more likely to be in the labor force and employed. In
terms of education, West Virginia stands out in that the
percentage of the population aged 25 or older with less
than a high school education is the highest in the district,
although South Carolina is not far behind, and that the
percentage with only a high school degree is the highest
— and by a considerable margin (41 percent versus an
average of 25 percent for the other five jurisdictions). At
the same time, the percentage of workers with college or
advanced degrees is the lowest. Still, other factors must
be at work as well. Even when looking at participation
rates by education level, West Virginia is still lower
than the rest of the district, and this is true across all
education levels. Most notably, only 36 percent of West
Virginians with less than a high school diploma were in
the labor force versus an average of 60 percent for the
rest of the district. In contrast, the District of Columbia has the highest participation rate and the highest percentage of people with college and advanced degrees, as well as the lowest percentage of the population with high school diplomas or less.
Much like the national picture, changes in participation rates by age and gender, as well as the aging of the population, help to account for recent Fifth District trends. The aging of the baby boomer generation is at work within district jurisdictions, with one notable exception: the District of Columbia, which has been getting younger. From 2005 to 2017, the percentage of the population 55 or older increased by between 5.6 percentage points in Virginia and nearly 7 percentage points in South Carolina. Moreover, within the 55 and older age group, the larger increase has been for the population above the age of 64 — whose participation rate is considerably lower. In contrast, the median age in the District of Columbia fell by almost two years.
With regard to gender, too, the Fifth District’s economies largely parallel the nation’s. In the district, the
participation rate for males aged 20 to 64 declined by
2.7 percentage points from 2005 to 2017. This was partially
offset by an increase in the participation rate of females by
2.2 percentage points. The male participation rate remained
considerably higher than the female participation rate,

though the gap has narrowed — the average difference across district jurisdictions was 7.3 percentage points in 2017, down from 12.2 percentage points in 2005.
In addition to demographics, what other factors may be
influencing labor market outcomes in the Fifth District?
Job polarization within the district appears to be a factor behind the decline in the participation rate of males.
Richmond Fed research has found that with the exception
of the District of Columbia, the middle-salary occupation
group has grown more slowly than higher- and lower-salary
occupations — consistent with the notion that increases
in technology were displacing middle-skill employment.
(See “Post-Recession Labor Market Trends in the Fifth
District,” Econ Focus, Third Quarter 2015.)
Another factor cited earlier is the opioid crisis. The hardest-hit jurisdiction in the Fifth District, West Virginia, has
seen improvements. The usage rate there was exceedingly
high in the late 2000s, peaking at 146.9 prescriptions per
100 people in 2009 — 1.8 times greater than the national
average. It has since dropped sharply to 81.3 in 2017, which
is still significantly greater than the U.S. average, but the
gap has shrunk.
Did opioid usage contribute to a decline in the participation rate in the Fifth District? The high usage rates along
with anecdotal information from businesses, nonprofits,
and hospitals within the district suggest opioid usage did
hurt the supply of labor. As noted earlier, however, the scale
of this effect is difficult to assess.
Conclusion
The labor force participation rate peaked in the late
1990s and had been in decline until the last few years.
The labor market continues to tighten, with strong job growth and an unemployment rate nearing lows not seen since the early 2000s and the 1960s. Much of the explanation
for the changes in participation lies in long-term secular
trends, demographics in particular. An aging population has had an enormous impact, but the participation
rates of young workers and older workers have had a
noticeable impact as well. The long-term decline in the
participation rate of men is less well understood. Job
polarization, the impact of trade on manufacturing, the
rise in disability, and the opioid crisis have been looked
at as possible explanations. There is some suggestive evidence that job polarization and opioid usage are factors
affecting the district’s labor market. The changing age
profile of the Fifth District, changes in participation
rates by age and gender, and differences in educational
attainment are large factors underlying participation
rates across the district.
EF



State Data, Q2:18

                                            DC       MD       NC       SC       VA      WV
Nonfarm Employment (000s)                794.9  2,735.1  4,501.4  2,123.1  4,000.1   753.3
  Q/Q Percent Change                       0.1      0.1      0.8      0.3      0.5     0.3
  Y/Y Percent Change                       0.6      0.5      2.2      1.8      1.2     1.3

Manufacturing Employment (000s)            1.4    107.8    474.9    243.7    239.1    47.0
  Q/Q Percent Change                       5.1     -0.4      1.1      0.1      0.3    -0.6
  Y/Y Percent Change                       5.1      1.0      1.6      1.4      2.1     0.6

Professional/Business Services
Employment (000s)                        167.1    451.7    637.5    280.5    739.4    66.6
  Q/Q Percent Change                       0.1      1.0      0.8      2.2      0.4     0.7
  Y/Y Percent Change                       0.4      1.6      3.6      1.3      1.4    -0.1

Government Employment (000s)             237.9    501.6    737.4    367.9    714.2   156.6
  Q/Q Percent Change                      -0.2     -0.4      0.2      0.1     -0.2     1.6
  Y/Y Percent Change                      -1.3     -0.8      0.8      0.9     -0.4     1.8

Civilian Labor Force (000s)              406.9  3,231.5  4,988.9  2,318.2  4,339.4   785.4
  Q/Q Percent Change                       1.1      0.2      0.3     -0.3      0.4     0.1
  Y/Y Percent Change                       1.4      0.3      1.2      0.4      0.8     1.2

Unemployment Rate (%)                      5.6      4.3      4.3      4.0      3.2     5.4
  Q1:18                                    5.7      4.2      4.5      4.4      3.5     5.4
  Q2:17                                    6.2      4.1      4.5      4.2      3.8     5.0

Real Personal Income ($Bil)               53.1    351.7    437.2    200.3    448.1    66.7
  Q/Q Percent Change                       0.5      0.4      0.5      0.4      0.6     0.4
  Y/Y Percent Change                       1.8      1.4      2.3      1.7      2.2     1.8

New Housing Units                          974    4,280   18,536    9,729    8,227     900
  Q/Q Percent Change                      35.1     -2.8      2.6     11.1     -2.0    45.4
  Y/Y Percent Change                      -9.9    -10.9     26.9     12.2     -1.8    17.8

House Price Index (1980=100)             899.7    479.7    383.6    388.4    466.5   240.9
  Q/Q Percent Change                       2.5      1.2      3.1      1.6      1.8     2.0
  Y/Y Percent Change                       7.4      4.2      6.8      6.8      3.9     4.3
Notes:
1) FRB-Richmond survey indexes are diffusion indexes representing the percentage of responding firms reporting increase minus the percentage reporting decrease. The manufacturing composite index is a weighted average of the shipments, new orders, and employment indexes.
2) Building permits and house prices are not seasonally adjusted; all other series are seasonally adjusted.
3) Manufacturing employment for DC is not seasonally adjusted.

Sources:
Real Personal Income: Bureau of Economic Analysis/Haver Analytics
Unemployment Rate: LAUS Program, Bureau of Labor Statistics, U.S. Department of Labor/Haver Analytics
Employment: CES Survey, Bureau of Labor Statistics, U.S. Department of Labor/Haver Analytics
Building Permits: U.S. Census Bureau/Haver Analytics
House Prices: Federal Housing Finance Agency/Haver Analytics
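In symbols (our notation; the Bank’s published weights are not reproduced here), each survey diffusion index D and the manufacturing composite C described in note 1 take the form:

    D = \%\,\text{reporting increase} - \%\,\text{reporting decrease}
    C = w_1 D_{\text{shipments}} + w_2 D_{\text{new orders}} + w_3 D_{\text{employment}}, \qquad w_1 + w_2 + w_3 = 1

A positive index thus means more firms reported increases than decreases, not that the average increase was large.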

For more information, contact Akbar Naqvi at (804) 697-8437 or e-mail Akbar.Naqvi@rich.frb.org


[Chart: Nonfarm Employment, change from prior year, Second Quarter 2007 – Second Quarter 2018; Fifth District and United States]
[Chart: Unemployment Rate, Second Quarter 2007 – Second Quarter 2018; Fifth District and United States]
[Chart: Real Personal Income, change from prior year, Second Quarter 2007 – Second Quarter 2018; Fifth District and United States]
[Chart: Nonfarm Employment, Major Metro Areas, change from prior year, Second Quarter 2007 – Second Quarter 2018; Charlotte, Baltimore, and Washington]
[Chart: Unemployment Rate, Major Metro Areas, Second Quarter 2007 – Second Quarter 2018; Charlotte, Baltimore, and Washington]
[Chart: New Housing Units, change from prior year, Second Quarter 2007 – Second Quarter 2018; Fifth District and United States]
[Chart: House Prices, change from prior year, Second Quarter 2007 – Second Quarter 2018; Fifth District and United States]
[Chart: FRB—Richmond Services Revenues Index, Second Quarter 2007 – Second Quarter 2018]
[Chart: FRB—Richmond Manufacturing Composite Index, Second Quarter 2007 – Second Quarter 2018]

Metropolitan Area Data, Q2:18

                             Washington, DC   Baltimore, MD   Hagerstown-Martinsburg, MD-WV
Nonfarm Employment (000s)         2,726.9         1,426.2          106.0
  Q/Q Percent Change                  1.6             2.5            2.1
  Y/Y Percent Change                  1.4             1.8            0.8
Unemployment Rate (%)                 3.4             4.2            4.4
  Q1:18                               3.6             4.4            4.4
  Q2:17                               3.7             4.3            4.1
New Housing Units                   5,838           1,966            413
  Q/Q Percent Change                 -9.4            -8.3           60.7
  Y/Y Percent Change                -12.6             8.5           35.0

                             Asheville, NC   Charlotte, NC   Durham, NC
Nonfarm Employment (000s)           194.5        1,212.7        316.4
  Q/Q Percent Change                  1.8            1.7          1.9
  Y/Y Percent Change                  2.0            2.8          1.5
Unemployment Rate (%)                 3.3            3.8          3.6
  Q1:18                               3.4            4.2          3.9
  Q2:17                               3.6            4.2          3.9
New Housing Units                     814          5,988        2,023
  Q/Q Percent Change                  2.3          -15.7         78.6
  Y/Y Percent Change                  4.4           41.9         69.9

                             Greensboro-High Point, NC   Raleigh, NC   Wilmington, NC
Nonfarm Employment (000s)                       364.1         633.8          128.1
  Q/Q Percent Change                              1.5           2.1            3.1
  Y/Y Percent Change                              1.1           3.1            1.0
Unemployment Rate (%)                             4.3           3.5            3.9
  Q1:18                                           4.6           3.9            4.1
  Q2:17                                           4.7           3.9            4.2
New Housing Units                                 656         4,421            556
  Q/Q Percent Change                             15.3           1.3            5.7
  Y/Y Percent Change                            -19.2          20.5            8.6

                             Winston-Salem, NC   Charleston, SC   Columbia, SC
Nonfarm Employment (000s)               268.1            360.0          399.3
  Q/Q Percent Change                      1.5              2.1            1.3
  Y/Y Percent Change                      1.8              1.4            0.1
Unemployment Rate (%)                     3.9              2.8            3.2
  Q1:18                                   4.2              3.8            4.4
  Q2:17                                   4.3              3.5            3.9
New Housing Units                         626            2,072          1,301
  Q/Q Percent Change                      5.6             38.0            6.8
  Y/Y Percent Change                      2.5             25.0          -13.0

                             Greenville, SC   Richmond, VA   Roanoke, VA
Nonfarm Employment (000s)             424.2          680.3         161.8
  Q/Q Percent Change                    1.3            2.0           2.0
  Y/Y Percent Change                    2.2            1.0           0.8
Unemployment Rate (%)                   2.9            3.2           3.1
  Q1:18                                 4.0            3.4           3.3
  Q2:17                                 3.7            3.9           3.8
New Housing Units                     1,748          1,404           N/A
  Q/Q Percent Change                   21.9          -15.8           N/A
  Y/Y Percent Change                   32.3           -9.2           N/A

                             Virginia Beach-Norfolk, VA   Charleston, WV   Huntington, WV
Nonfarm Employment (000s)                        788.6            117.6           138.0
  Q/Q Percent Change                               2.7              1.9             1.9
  Y/Y Percent Change                               0.4              0.3            -0.6
Unemployment Rate (%)                              3.3              5.2             5.5
  Q1:18                                            3.5              5.4             5.4
  Q2:17                                            4.2              5.0             5.7
New Housing Units                                1,563               22              67
  Q/Q Percent Change                               8.5              0.0             0.0
  Y/Y Percent Change                              -6.1              0.0             0.0

Note: Nonfarm employment and new housing units are not seasonally adjusted. Unemployment rates are seasonally adjusted.
For more information, contact Akbar Naqvi at (804) 697-8437 or e-mail Akbar.Naqvi@rich.frb.org


Opinion

Does the Fed Need Room to Cut?
By John A. Weinberg

The U.S. economy has been growing steadily since
the end of the Great Recession, and during most
of that period, the target rate set by the Federal
Open Market Committee (FOMC) remained exceptionally low. It has only been in the past few years that the
FOMC has gradually raised the target rate to its current
range of 2.25 to 2.5 percent, which is still low by historical
standards.
Some have criticized those increases, arguing that
despite the unemployment rate falling to unusually low
levels, signs of incipient inflation are hard to find. Why
risk potentially dampening the recovery in the face of a
nonexistent threat, they have asked.
Recently, however, a different argument has been made
by some other critics of FOMC policy actions. The target
rate is too low, they claim. But not for the reason you
might initially expect — namely, that they do see inflation
on the horizon and believe the FOMC should act more
aggressively than it has. Rather, they say the FOMC effectively needs to put more ammunition into its toolkit than
it currently has to fight the next recession.
The argument goes something like this. When the
economy has contracted in the past, the target rate has
been substantially higher than it currently stands. As a
result, the FOMC had room to cut to help foster a recovery. Writing in the Wall Street Journal, Harvard economist
Martin Feldstein noted that the United States has experienced 11 recessions since 1945. With the exception of the
Great Recession, most of those have been short and shallow. The reason, according to Feldstein? “[B]ecause the
Federal Reserve historically has responded to downturns
by sharply reducing the fed-funds rate.”
Feldstein is correct that the Fed has in the past cut the
target rate substantially during recessions. For instance,
in response to the slowdown of the early 2000s, the Fed
cut the target rate from 6.5 percent in December 2000
to 1.75 percent by December 2001. The magnitude of
this reduction, about 5 percentage points, is roughly on
par with the Fed’s response to previous post-World War
II recessions. Such historical comparisons suggest that
the Fed is at risk of not being able to cut enough should
a recession occur in coming years.
But in the standard models used for assessing interest
rate policy, it is the level of the real rate that matters, not
the change in the rate per se. With inflation expectations
anchored around 2 percent and an effective zero lower
bound for the nominal rate, the lowest the real rate can be brought down to is about -2 percent — no matter how high the nominal rate is when the Fed begins to cut.
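The arithmetic here is a back-of-the-envelope application of the Fisher relation, in which the (ex ante) real rate r is the nominal rate i less expected inflation \pi^e:

    r = i - \pi^e \;\geq\; 0\% - 2\% = -2\%

With i stuck at its effective lower bound of zero and \pi^e anchored at 2 percent, the floor on r is the same regardless of how high i stood when the cutting began.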
It’s not plain that increasing the nominal rate would
be meaningful in the way that Feldstein and others
have suggested, because it’s not plain that a real rate of
-2 percent wouldn’t bolster the macroeconomy in the case
of a typical downturn. Furthermore, it’s true that rates are
low by historical standards for an economy that has been
expanding for nearly a decade. But relatively low rates are
consistent with relatively modest growth, and annual real
economic growth has been about 2 percent since the end
of the Great Recession, roughly 1 percentage point lower
than in the rest of the post-World War II period. In a lower
growth environment, it seems reasonable to believe that
the Fed would not have to lower rates as sharply as it has
in the past to achieve a real rate that would help bring the
economy out of recession.
In addition, research done by my Richmond Fed colleague Christian Matthes, in conjunction with Regis
Barnichon of the San Francisco Fed, tells me that we
should not underestimate the costs of raising the target
rate. Their research suggests that contractionary monetary policy shocks raise unemployment more strongly
than expansionary monetary policy shocks lower it. That
means, if anything, the cost of pushing rates in an expansion a little higher than would otherwise be expected
could be greater than any benefit of being able to take
rates down a little bit more in a recession.
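In stylized form (the notation is mine, not the authors'), with $u$ denoting unemployment and $\varepsilon^{+}$ and $\varepsilon^{-}$ denoting contractionary and expansionary policy shocks of equal size, their estimates imply

$$ \left| \Delta u(\varepsilon^{+}) \right| > \left| \Delta u(\varepsilon^{-}) \right| $$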
One objection proponents of the “room to cut” argument might raise is that the rate increases they advocate would not be shocks of the kind Matthes and Barnichon study, at least not in the way that term is generally used. That is, those increases would follow an expected path. But raising rates higher than current economic conditions and the near-term outlook would otherwise warrant, in order to create room to cut, could act as a shock.
All that said, we never really know with high precision what the “correct” target rate is for any given set of economic conditions, and small differences in rates appear to matter relatively little most of the time. Also, the
efficacy of monetary policy is strongly affected by whether
it instills confidence. So it’s possible that if the public
believes that having room to cut will be important in a
future downturn, there might be some benefit to a slightly
higher rate in the present at relatively little cost. But I
suspect that any such benefit wouldn’t be significant.
And, importantly, the types of increases that current
room-to-cut advocates favor are far from small and could
bring with them considerable costs.
EF
John A. Weinberg is a policy advisor at the Federal
Reserve Bank of Richmond.

Next Issue
Opportunity Zones

“Opportunity Zones,” which were created by the 2017 Tax Cuts
and Jobs Act, are intended to draw long-term investment to
distressed areas. More than 800 have been designated in the Fifth
District. Many policymakers and community leaders are excited
about their potential, but others are worried about unintended
consequences.

Initial Coin Offerings

In recent years, firms have raised billions of dollars in capital by
selling digital tokens or coins. These initial coin offerings, or ICOs,
may have some advantages over traditional corporate fundraising,
but they also raise new questions for regulators.

Rural Hospitals

Hospitals in rural areas across the country, especially in more
distressed rural areas, are closing at an increasing rate. What
challenges do rural hospitals face that are different from those
of hospitals elsewhere? And what do the closures mean for
access to health care, economic activity, and upward mobility in
the affected communities?

Federal Reserve
The Fed has a mandate to meet domestic
economic goals of maximum employment
and stable prices. But changes in U.S.
monetary policy can have effects on other
countries too. As financial markets become
increasingly global, should central banks
worry about monetary policy spillovers?

Economic History
The U.S. capital was originally Philadelphia,
but Congress fled when angry Continental
Army soldiers marched on Independence
Hall in 1783 to demand back pay. Eventually,
the capital was relocated to a special district
carved from Virginia and Maryland. The move
had massive long-run implications for the
economic development of those two states.

Interview
Enrico Moretti of the University of
California, Berkeley on why rich cities are
becoming richer, the role of universities
in a region’s development, Amazon’s HQ2
decision, and word-of-mouth about movies
as a case study in information sharing.

Visit us online:
www.richmondfed.org
•	To view each issue’s articles and Web-exclusive content
•	To view related Web links of additional readings and references
•	To subscribe to our magazine
•	To request an email alert of our online issue postings

Federal Reserve Bank
of Richmond
P.O. Box 27622
Richmond, VA 23261

Change Service Requested

To subscribe or make subscription changes, please email us at research.publications@rich.frb.org or call 800-322-0565.

Economic Brief
Economic Brief publishes an online essay each month about a current economic issue.

The Persistence of Financial Distress
By Kartik B. Athreya and Jessie Romero

Household financial distress is pervasive. Is this pattern
driven by a small share of individuals experiencing
persistent distress, by the majority facing more
occasional distress, or something in between? Recent
research indicates that over a lifetime, financial
distress is unlikely for most but very persistent for
some. Models that account for the uncertain evolution
of consumers’ earnings over time and the availability
of formal consumer bankruptcy cannot explain this
pattern by themselves, but a model that also allows for
informal default and variation in consumers’ willingness
to sacrifice future wealth for current spending can.

March 2019, EB19-03

At any point in time, many households in the United States are in precarious financial positions. According to a 2018 report from the Federal Reserve, four in ten adults would not be able to pay an unexpected expense of $400 or would cover it by selling something or borrowing money. The same report found that more than one-fifth of adults are not able to pay all of their current month’s bills in full.

In a 2017 working paper (revised in July 2018), Kartik B. Athreya of the Richmond Fed, José Mustre-del-Río of the Kansas City Fed, and Juan M. Sanchez of the St. Louis Fed provide a novel and detailed description of the incidence and concentration of financial distress among U.S. consumers. Importantly, they also develop a model that successfully reproduces these facts. A key element of the model’s success is allowing for variation in the rate at which households effectively prefer spending today to spending later. More broadly, their work adds to the progress economists have made in introducing different types of heterogeneity into macroeconomic models, and the effects of many fiscal policies depend on knowing who is constrained in their access to credit and by how much.

The researchers’ measure of severe financial distress is having an account 120 or more days past due. The data come from the Federal Reserve Bank of New York Consumer Credit Panel/Equifax Data; the sample includes individuals who were age twenty-five through fifty-five in the first quarter of 1999. (See Figure 1.)

Figure 1: Probability of Financial Distress (FD) Recurrence
[Line chart: probability (0 to 0.6) by age (25 to 55). Series: Unconditional; Two Years after FD; Four Years after FD; Six Years after FD; Eight Years after FD; Ten Years after FD.]
Notes: The figure displays the probability of experiencing financial distress, measured by having an account 120 or more days past due, conditional on having experienced financial distress in the past. The dashed line shows the unconditional probability.
Sources: Athreya, Mustre-del-Río, and Sanchez (2017); Federal Reserve Bank of New York Consumer Credit Panel/Equifax Data.

February 2019
Large Excess Reserves and the Relationship Between Money and Prices

January 2019
It’s a Wonderful Loan: A Short History of Building and Loan Associations

To access Economic Brief and other research publications, visit: www.richmondfed.org/publications/research/