Does Online Education Work?
Where the Newly Created Money Went
Baltimore Steel

VOLUME 17, NUMBER 1
FIRST QUARTER 2013

COVER STORY

12 Drawing the Line: New measures of poverty illustrate just how hard it is to define who is poor

FEATURES

17 To Be Clear: Muddy language can be costly

20 Taking College Out of the Classroom: Are online classes the future of higher education?

24 Red Skies and Blue Collars: Sparrows Point shaped Baltimore

28 Where the Newly Created Money Went: Monetary expansion has led banks to park huge excess reserves at the Fed — for now

DEPARTMENTS

1 President’s Message/Big Banks Need “Living Wills”
2 Upfront/Regional News at a Glance
6 Federal Reserve/When Talk Isn’t Cheap
10 Jargon Alert/Sticky Wages
11 Research Spotlight/From Housing Bust to Baby Bust?
30 Interview/Christopher Carroll
35 Economic History/Disney’s America
38 Policy Update/Under Pressure
39 Book Review/A Capitalism for the People
40 District Digest/Economic Trends Across the Region
48 Opinion/On Economic History and Humility

Econ Focus is the economics magazine of the Federal Reserve Bank of Richmond. It covers economic issues affecting the Fifth Federal Reserve District and the nation and is published on a quarterly basis by the Bank’s Research Department. The Fifth District consists of the District of Columbia, Maryland, North Carolina, South Carolina, Virginia, and most of West Virginia.

DIRECTOR OF RESEARCH
John A. Weinberg

EDITORIAL ADVISER
Kartik Athreya

EDITOR
Aaron Steelman

SENIOR EDITOR
David A. Price

MANAGING EDITOR/DESIGN LEAD
Kathy Constant

STAFF WRITERS
Renee Haltom
Betty Joyce Nash
Jessie Romero

EDITORIAL ASSOCIATE
Tim Sablik

CONTRIBUTORS
R. Andrew Bauer
Jamie Feik
Charles Gerena
Keith Goodwin
Karl Rhodes
Sonya Ravindranath Waddell

DESIGN
ShazDesign

Published quarterly by
the Federal Reserve Bank
of Richmond
P.O. Box 27622
Richmond, VA 23261
www.richmondfed.org
www.twitter.com/RichFedResearch

Subscriptions and additional
copies: Available free of
charge through our website at
www.richmondfed.org/publications
or by calling Research
Publications at (800) 322-0565.
Reprints: Text may be reprinted
with the disclaimer in italics below.
Permission from the editor is
required before reprinting photos,
charts, and tables. Credit Econ
Focus and send the editor a copy of
the publication in which the
reprinted material appears.
The views expressed in Econ Focus
are those of the contributors and not
necessarily those of the Federal Reserve Bank
of Richmond or the Federal Reserve System.
ISSN 2327-0241 (Print)
ISSN 2327-025X (Online)


PRESIDENT’S MESSAGE

Big Banks Need “Living Wills”
Economists and policymakers are still debating the
causes of and responses to the financial crisis of
2007-2008, but there is one clear point of consensus: We cannot continue to treat certain financial institutions as being “too big to fail.” Many provisions of the
Dodd-Frank Act of 2010 were written with this goal in
mind, and we have yet to see how effective they will be.
But I believe that the provision requiring large and complex
financial institutions to craft “living wills” offers the greatest potential for curtailing the ambiguous government
safety net for financial institutions and putting an end to
government bailouts.
Living wills are detailed plans that explain how a financial
institution could be wound down under U.S. bankruptcy
laws without threatening the rest of the financial system or
requiring a public bailout. The plans explain how to disentangle the numerous different legal entities — sometimes
numbering in the thousands — that make up a large financial
firm. Under the Dodd-Frank Act, large banks and other
“systemically important” firms are required to submit these
plans on an annual basis for review by the Fed and the
Federal Deposit Insurance Corporation (FDIC). The largest
banks submitted the first drafts of their plans last summer.
Regulators then pored over the thousands of pages of documents, focusing primarily on evaluating how well the firms
had identified potential obstacles to resolution and understanding the key assumptions in the plans.
Planning for the resolution of a large, complex firm is
difficult, painstaking work. But it is critical that regulators
invest the time and energy necessary to ensure that the
plans are workable and credible. Only if the plans are
credible will regulators and policymakers be willing to use
them in a future crisis. That willingness is essential to ending
investors’ expectations of government rescues, which
encouraged many firms to take on excessive risk prior to
the crisis.
The Dodd-Frank Act also created the Orderly
Liquidation Authority (OLA), which allows the FDIC to
wind down certain troubled institutions in cases where the
bankruptcy process is deemed to pose a great risk to the
financial system. This authority was intended as an alternative to government rescues. But instead, the OLA still
affords policymakers and regulators a great deal of discretion in determining how to treat different creditors, which
further weakens the market discipline that would prevent
institutions from taking on excessive risks. For this reason,
I believe the use of living wills within bankruptcy is the
better course. Should the creation of those plans reveal that
bankruptcy would pose a risk to the system as a whole, firms
may be subject to more stringent capital requirements or

required to change their
structure and operations such
that bankruptcy is workable.
An example of this would be
divesting certain subsidiaries.
Some have proposed that
the first step should be to
break up the banks — that the
way to prevent “too big to fail”
is simply to make sure that the
banks aren’t too big. But how
do we define “too big”? The process of having firms create
detailed resolution plans will enable us to map out the risks
and interdependencies, and determine whether or not an
institution’s size and complexity would prohibit an unassisted resolution. Living wills will provide us with an
actionable roadmap.
Skeptics also have argued that living wills are little more
than window dressing, an exercise that will be ignored
should an institution actually become distressed. This claim,
however, only reinforces the point that it is vitally important
to do the work necessary to ensure that the plans offer
attractive and realistic options for regulators.
The process for creating and having the Fed and FDIC
assess living wills is not intended to place inordinate restrictions on an institution’s ability to take appropriate risks, or
to try to make institutions perfectly safe from failure. Failures are
going to happen despite the best efforts of regulators.
With living wills, however, robust contingency planning
takes place to ensure that they can occur without major
disruptions to the financial system. Living wills are an
important tool to help us restore market discipline, rein in
the government safety net, and truly end the problem of
too big to fail.
EF

JEFFREY M. LACKER
PRESIDENT
FEDERAL RESERVE BANK OF RICHMOND

You’ll notice that this magazine, formerly Region Focus, is
now called Econ Focus, a name that we believe better reflects
the magazine’s mix of both national and regional coverage.
But only the name has changed — Econ Focus will continue to
bring you clear explanations of economic trends and important policy questions. Thank you for reading and for your
continued engagement with the work of the Richmond Fed.

UPFRONT

Regional News at a Glance

Skin in the Game

Redskins’ Complex Deal Involves Many Players


Last year, the Washington Redskins football team held its summer training camp at
Redskins Park in Ashburn, Va., while, 100 miles to the south, dozens of preschoolers
romped on the playground at Tot Lot Park in Richmond, Va.
The level of play was vastly different, but the pros
and the peewees were connected by a complex chain
of economic development incentives.
Beginning this summer, the Redskins are moving
their summer training camp from Ashburn (in
Northern Virginia’s Loudoun County) to Richmond
for at least the next eight years. The team selected
Richmond after the city agreed to build a $10 million complex that the Redskins will use for about
one month each summer. In an overlapping deal, the
team agreed to keep its headquarters and year-round
training facilities in Ashburn in exchange for
$4 million from the state and $2 million from
Loudoun County. The money will go toward renovating the existing facilities.
Richmond courted the Redskins primarily to
generate tax revenues, directly and indirectly, from
visitors who will come to watch the team practice.

[Photo: Redskins quarterback Robert Griffin III signs autographs on fan appreciation day during last year’s summer training camp.]


An economic impact study by Richmond-based
Chmura Economics and Analytics projected that
visitors to the training complex will spend
$4.3 million in the city during the 2013 camp.
The study assumed that 100,000 people will visit
the facility and that 40 percent of those people will
stay an average of two nights in the area.
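
The study’s headline figure can be unpacked with simple arithmetic. A quick sketch (Python; the inputs are the study figures quoted above, while the derived quantities are back-of-the-envelope implications, not numbers the study reports):

```python
# Back-of-the-envelope arithmetic from the Chmura study figures cited above.
# Inputs come from the article; derived values are simple implications.
visitors = 100_000               # assumed visitors to the 2013 camp
projected_spending = 4_300_000   # projected visitor spending in the city, dollars
overnight_share = 0.40           # share of visitors staying overnight
avg_nights = 2                   # average nights per overnight visitor

print(projected_spending / visitors)                 # 43.0 dollars per visitor
print(int(visitors * overnight_share * avg_nights))  # 80000 implied visitor-nights
```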
Richmond had little more than one year after the
Redskins’ announcement to develop facilities for
the training camp. The city quickly leased 16 acres
behind the Science Museum of Virginia from the
state and started looking for private sector partners
to help defray the cost of building the complex and
moving the team back and forth between Richmond
and Loudoun County each summer.
In October, the city unveiled plans for the Bon
Secours Washington Redskins Training Center, a
joint venture between the city’s economic development authority and Maryland-based Bon Secours, a
nonprofit health care company that owns four hospitals in the Richmond area. The training center will
include football fields, parking spaces, observation
areas, and a 40,000-square-foot building with a
field house for the Redskins and a sports medicine
and rehabilitation center for Bon Secours.
The city transferred $10 million to its economic
development authority to pay for the construction.
It plans to recoup its investment primarily via lease
revenues and sponsorships — including more than
$6.3 million from Bon Secours over 10 years.
In exchange for Bon Secours’ commitment, the
city agreed to grant the health care company a 60-year lease on a vacant school property — the location of Tot Lot Park — near Bon Secours St. Mary’s
Hospital. The city also agreed to help the company
develop a medical office building and fitness center
near Bon Secours Richmond Community Hospital
in a medically underserved area of the city.
On the vacant school property, Bon Secours
plans to spend $24 million to build a 75,000-square-foot medical building. The company will lease the
site for $33,000 per year, including $28,000 that the

city will use to maintain public playing fields on the
property. The company and the city will work together to
relocate Tot Lot Park and make it more accessible for
disabled children.
The mayor’s announcement of the deal said the money
Bon Secours would save by leasing the school site instead of
buying it would be “directed to the construction costs of
the Washington Redskins practice fields and the development of Richmond Community Hospital in the East End.”
The statement prompted criticism that the lease
arrangement was designed to circumvent the city’s policy
of returning money from the sale of school properties to

the school system. In response, the city agreed to dedicate
much of the projected tax revenue from Redskins-related
projects (nearly $5.4 million over 10 years) to the school
system. Bon Secours promised to contribute an additional
$1 million to Richmond schools over 10 years to fund
projects related to the company’s mission of promoting
health and fitness.
As of early March, the Bon Secours Washington
Redskins Training Center was taking shape quickly, and the
medical building on the school property was in the preliminary planning stage. The new location for Tot Lot Park was
still undetermined.
— KARL RHODES

This One Goes to 11

Eight More States Join the Court Battle Against Dodd-Frank
Another Fifth District state has joined the lawsuit
against the federal government challenging the constitutionality of the Dodd-Frank Act. On February 13, the
attorney general of West Virginia, along with attorneys
general from seven other states, signed on as plaintiffs.
They join South Carolina, Michigan, and Oklahoma, which
became plaintiffs last September. This latest action brings
the total number of states involved to 11.
The case began last June, when State National Bank of
Big Spring, a small Texas bank, and a pair of nonprofit advocacy groups, the 60 Plus Association and the Competitive
Enterprise Institute, filed suit in federal court in
Washington, D.C. These three plaintiffs contend that
Congress violated the U.S. Constitution’s separation of
powers by delegating too much power to the Bureau of
Consumer Financial Protection and the Financial Stability
Oversight Council, new regulatory bodies created by the
Dodd-Frank Act. The Council, for example, determines
which financial firms are so big or complex that they should
be designated “systemically important financial institutions,” or “SIFIs.” According to these plaintiffs, the SIFI
designation codifies “too big to fail,” the belief that regulators will never allow very large banks to fail for fear of
economic upheaval. Market participants who understand
this might lend to SIFIs more cheaply than they would to
small banks like State National Bank of Big Spring.
The states have limited their challenge to the Orderly
Liquidation Authority, the FDIC’s new power to unwind
systemically important financial firms on the brink of
failure, including those that had not previously received
SIFI designation. They contend that this new regime
allows the federal government to take over and dismantle
failing companies without any opportunity for interested
parties, such as creditors and shareholders, to object in
court. This, the attorneys general insist, is a government
taking of private property without due process of law,
a violation of the Fifth Amendment. They base their
takings argument on potential risk to their respective
states’ pension funds. If the FDIC were to liquidate a
financial firm through Orderly Liquidation, they argue,
the process would sacrifice the traditional creditor rights
and safeguards of the Bankruptcy Code. Pension fund
holdings would likely be “arbitrarily and discriminatorily
extinguished.”
The federal government has moved to dismiss the
lawsuit, noting that many of the injuries the plaintiffs have
described are either too indirect or too speculative. Some
observers say the states’ takings and due process arguments
might prove to be the plaintiffs’ strongest. If these
arguments withstand the motion to dismiss, the states
will need to show that the judicial review of the Orderly
Liquidation process is too limited. In the meantime, the
original plaintiffs reasserted their opposition to the
government’s motion, insisting that their injuries are
“concrete” and “imminent.” The government responded
on April 9 that the plaintiffs lack standing because their
injuries are “entirely hypothetical.” A decision is expected
later this year.
—KEITH GOODWIN


Employment After Deployment

Transferring Military Skills to the Private Sector
At the end of their enlistments, members
of the military leave the front lines of
war only to face the front lines of the labor
market. There are more than 11 million
veterans in the U.S. labor force, and 783,000
of them are without work.
The job search might get easier for some
veterans, thanks to a new training program
that seeks to connect 100,000 service men
and women to skilled manufacturing jobs
by 2015. The “Get Skills to Work” initiative,
a collaboration between the nonprofit
Manufacturing Institute and major manufacturers, is an attempt to solve two problems at
once: a shortage of skilled manufacturing
workers and high unemployment rates
among certain veteran groups. According
to the institute, 600,000 skilled manufacturing jobs are unfilled. Partners include
manufacturing giants General Electric,
Alcoa, Boeing, and Lockheed Martin, which
together already employ 64,000 veterans.
The training sessions are scheduled to occur
in 10 states in 2013, including at technical
and community colleges in Greenville
and Charleston, S.C., and Durham, N.C.
The Carolinas are home to almost 1.2 million
veterans.

[Chart: Young Vets Are Prone to Unemployment. Unemployment rate, in percent, for the total population, nonveterans, all veterans, Gulf War-era II veterans, nonveterans 18-24 years, and veterans 18-24 years. NOTE: Gulf War-era II veterans are those who have served since 2001. SOURCE: Bureau of Labor Statistics, March 2013 data]

“From a veteran’s perspective, the problem is being able to translate their skills into
civilian terms,” says Bryan Goettel of the
U.S. Chamber of Commerce. To address this
issue, Get Skills to Work is creating an online
badge program that equates the military’s
skills codes to manufacturing occupation
codes and matches veterans with employers.
While the overall veteran unemployment
rate is below that of the population as a
whole — 7.1 percent compared to 7.6 percent
as of March 2013 — certain subgroups are
more at risk. Members of the “Gulf War-era
II,” which includes veterans of Iraq and
Afghanistan, face a 9.2 percent unemployment rate. For veterans under 25, a group that
includes many of the Gulf War-era II vets,
32.9 percent are jobless. (See chart.) Many of
these veterans have not graduated from
college and have only a military career on
their resume.
Get Skills to Work joins several recent
public and private efforts to boost post-military hiring. In 2009, President Barack
Obama approved a hiring initiative for the
federal government, after which the share of
veterans as a percent of civilian hires went
from 24 percent in 2009 to more than 28 percent two years later. In 2011, Congress
approved tax credits of up to $9,600
for businesses that hire veterans. The
Chamber of Commerce’s “Hiring Our
Heroes” program hosts job fairs, trains veterans on skill marketing, and encourages
businesses to hire veterans. Within a year
of launching its hiring campaign in March
2012, the program garnered commitments
from businesses to hire more than 210,000
veterans. Separately, Wal-Mart, the world’s
largest private employer, pledged in
January 2013 to hire 100,000 veterans over
the next five years. Still, more than
160,000 people leave active military duty
each year — and many of them will be joining their fellow service members looking
for work.
— RENEE HALTOM

More Than a Campfire

The Fuel of the Future…Is Wood?
High-tech solar panels and wind turbines get most
of the attention as sources of renewable energy,
but relatively low-tech wood is gaining traction. Energy-dense wood pellets are made of wood scraps and
compressed sawdust, and burn more cleanly than firewood. Some power companies, including in the Fifth
District, have started converting plants to run on wood
pellets, while consumers continue to heat their homes by
burning pellets in specially designed stoves.
Bethesda, Md.-based Enviva is playing a big role in
meeting the demand for wood as a renewable fuel. The
company operates one wood pellet plant in Hertford
County, N.C., and is building two additional plants in
Northampton County, N.C., and Southampton County,
Va., that should be completed this year.
Enviva has a contract to supply wood chips to
Dominion Virginia Power, which is converting three of
its coal-fired power stations in Virginia to use wood by
the end of 2013. This will help Dominion meet the state’s
voluntary goal for 15 percent of the company’s electricity
sales (as of 2007) to come from renewable resources
by 2025.
Most of the production from Enviva’s new pellet
production plants will be sent to Europe, however.
In general, the demand for wood pellets is growing outside of the United States. American pellet producers
exported 1.96 million metric tons of their product in
2012, a 52 percent increase from 2011.
“Displacing coal with biomass power from wood
pellets is one of the most cost-effective ways to meet
renewable energy targets related to the European
Union’s [goals] for energy and climate change,” noted
economist Seth Walker in the February 2013 edition of
the RISI Wood Biomass Market Report. By 2020, the EU
aims to reduce greenhouse gas emissions by 20 percent
from 1990 levels, increase the share of renewable energy
to 20 percent, and increase energy efficiency by 20 percent. In contrast, the United States provides tax
incentives for the production of renewable energy but
has set no federal production goals for utilities.
Where do Enviva and other pellet producers get their raw material? A lot of it comes from harvesting hardwood trees. “Typically, the smaller parts of the tree — tops, limbs and branches — have been left in the woods as scrap,” says Ronnie James, a senior vice president at First Citizens Bank who regularly works with agribusiness firms in the Greenville, N.C., metropolitan area.

[Photo: Dominion’s power station in Altavista, Va., is one of three that the utility is converting to burn wood instead of coal.]
For wood pellet manufacturers, these scraps are just the
right size to be chipped and hauled away for their use.
Smaller trees that had been pushed down or run over
also are being snatched up by pellet manufacturers, adds
James. “This not only leaves a cleaner site, but also provides additional income for the landowner and logger.”
Normally, logging companies don’t sell to wood pellet
manufacturers. But when they were hurting from the
housing market slump, the demand from pellet manufacturers “came at an opportune time,” says Mary Ellen
Aronow, senior forest economist at Hancock Timber
Resource Group, which owns timberland in Virginia and
the Carolinas. “When you take the smaller trees out to
make room for the bigger trees, we need a market for
that thinning material.”
Recently, interest in converting to bioenergy has
slowed, according to Aronow. With natural gas production rising and prices falling, thanks to hydraulic
fracturing, power plants configured to use either wood
pellets or natural gas are increasingly choosing the latter,
noted the RISI Wood Biomass Report. As a result, there
was an oversupply of pellets and other wood-based
biomass as of the first quarter of 2013.
But analysts still predict that wood pellet demand
will at least double by 2020. If that happens, timber
producers aren’t the only ones who could benefit. “New
equipment is needed to chip wood in order to meet
processor demands,” explains James. “Transportation
also could be affected positively as more drivers for
road tractors may be needed to haul chips.”
— CHARLES GERENA


FEDERAL RESERVE

When Talk Isn’t Cheap
BY RENEE HALTOM

Can the Fed create economic
growth … just by talking?
For all the obsessive attention given to the fed funds
rate, the short-term interest rate that is the Fed’s
primary tool for influencing the economy, the rate
is relatively unimportant in the scheme of things. Just ask
Fed Chairman Ben Bernanke.
“Other than managers of bank reserves and some other
traders in short-term funds, few people in the private sector
have much interest in the funds rate per se,” he explained in
2004. Instead, he said, what drives the bulk of economic
activity is long-term interest rates, which are determined by
markets rather than directly by the Fed. Those range from
five-year car loans to 30-year mortgages, as well as corporate
bond rates and the prices of interest-sensitive long-term
assets such as housing and equities.
So how does the Fed have such powerful influence over
the economy if its main policy lever is not directly relevant
to most economic transactions? The answer is expectations.
Long-term interest rates are determined in part by what
financial markets expect monetary policy to do in the future,
since the interest rate on a long-term loan depends on the
short-term rates that are expected to prevail over the loan’s
life. That makes expectations for fed funds rates of the
future more relevant to economic activity than the rate’s
level in the present. That also means most of the effect of
changes to the fed funds rate comes before the decisions are
actually made, when private forecasters start to anticipate
them and build them into long-term rates.
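
That chain from expected short rates to long rates can be made concrete. Here is a minimal sketch under the simplifying expectations hypothesis, which treats a long rate as roughly the average of the short rates expected over the loan’s life (the numbers are hypothetical, not a Fed forecast, and the sketch ignores term premiums):

```python
# Stylized expectations-hypothesis pricing: a five-year rate as the average
# of the fed funds rates markets expect over the next five years.
# All numbers are hypothetical.
expected_fed_funds = [0.25, 0.25, 0.50, 1.00, 1.75]  # expected percent per year

five_year_rate = sum(expected_fed_funds) / len(expected_fed_funds)
print(f"Implied 5-year rate: {five_year_rate:.2f}%")  # 0.75%

# If markets come to expect earlier tightening, the long rate rises today,
# before the Fed actually changes anything.
expected_fed_funds[2:] = [1.00, 2.00, 3.00]
print(f"After revised expectations: {sum(expected_fed_funds) / 5:.2f}%")  # 1.30%
```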
As a result, the Fed is very careful about its communication with the public, providing as much information as
possible about future policy through speeches, policy statements, and press releases without unduly committing to a
course of action that could change and therefore disrupt
financial markets.
Lately, Fed communications have had an even more
important role. The target fed funds rate has been set essentially to zero since December 2008 in response to the Great
Recession. The Fed has limited scope to push the fed funds
rate lower; negative nominal interest rates are technically
possible, but some argue they would significantly disrupt
financial markets. Instead, with the economic recovery still
weak, the Fed has tried to keep long-term interest rates low
by creating the expectation that the fed funds rate will stay
at zero for a long time to come, through what’s known as
“forward guidance” about future policy. But communications are an inherently imprecise tool, so a central bank’s
words can hurt if policymakers are not careful.


Embracing Expectations
To speak clearly about policy, a central bank must have a
coherent framework for thinking about it. The lack of such
a framework kept monetary policymakers more or less silent
in the decades after the gold standard collapsed, according
to Fed history expert and former Richmond Fed director
of research Marvin Goodfriend, now at Carnegie Mellon
University. Many central banks engaged in virtually no communication with the public until the 1990s, giving the Fed a
reputation it is still trying to shake for running the economy
by pulling intentionally mysterious policy levers like the wizard in Oz. The Fed has fought that perception over the last
20 years by being increasingly open about its views on policy. Areas of disagreement used to include the root causes of
inflation and how much power policymakers had to manage
business cycles. What helped resolve these and other questions was a greater appreciation among economists for the
role of expectations in driving economic activity.
It wasn’t that economists didn’t believe expectations were important; it’s just that they are exceedingly
difficult to model mathematically. To model any decision
that spans time, as virtually all economic questions do, one
needs a theory of how expectations are formed. But expectations are unobservable and shaped by countless,
sometimes subtle bits of information. And then one has
to factor in the effects of policy on a person or a firm’s
behavior, which requires a way to capture the circularity in
which people’s knowledge of policy changes behavior, but
policy’s effect on behavior might in turn change policy.
Early economists wanted to deal with expectations but
didn’t know how. As a result, expectations didn’t appear in
the first formal theories of macroeconomic stabilization
policy, with economists figuring, as John Maynard Keynes
did, that the economy was beholden to “waves of optimism
and pessimism” that were important but undefinable. But
theories that didn’t deal with expectations sometimes led
economic policy astray. In the 1960s and 1970s, monetary
and fiscal policies were based on the Phillips curve, the
empirical regularity in that period where inflation and
unemployment usually moved in opposite directions. This
pattern in the data suggested to policymakers that they
could always achieve a lower rate of joblessness simply by
bumping up the rate of price increases. Unfortunately, those
policies only showed, contrary to the Phillips curve, that
inflation could rise without any beneficial effect on unemployment, in the 1960s as policymakers failed to anticipate
the inflationary effects of some combined efforts to stimulate the economy, and in the 1970s as the Fed failed to
adequately tighten policy in response to oil price shocks,
cementing inflation into the public’s expectations.

An impressive number of Nobel Prizes were awarded to
economists — Milton Friedman, Edmund Phelps, Robert
Lucas, Edward C. Prescott, Finn Kydland, and Thomas
Sargent — who developed theories of expectations in the
1960s, 1970s, and 1980s. This body of work provides some of
the best examples of how economic theory can improve
real-world policy. Expectations in models went from nonexistent to “adaptive” — people expecting what happened in
the past to continue — and from there to “rational.”
Rational expectations, still the dominant model today, suggests that people form expectations for some future variable
by looking at the relevant decisionmaker’s incentives. For
example, since the central bank is charged with managing
inflation, people form inflation expectations by considering
how the central bank will address that issue. (People might
not be able to do the same calculus that economists can, but
the theory says they act through intuition as if they do).
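
In textbook notation (my notation, not the article’s), the two approaches can be written side by side. Under adaptive expectations, the forecast is revised by a fraction of the latest miss:

\[
\pi^{e}_{t+1} \;=\; \pi^{e}_{t} + \lambda\,(\pi_{t} - \pi^{e}_{t}), \qquad 0 < \lambda \le 1,
\]

where \(\pi_{t}\) is actual inflation and \(\pi^{e}_{t}\) the expectation of it. Under rational expectations, the forecast is instead the mathematical expectation given everything known at time \(t\), including the policymaker’s incentives:

\[
\pi^{e}_{t+1} \;=\; E\left[\pi_{t+1} \mid \Omega_{t}\right],
\]

where \(\Omega_{t}\) denotes the available information set.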
For policy, the primary outcome of this work was the
realization that the Phillips curve was a temporary trade-off
at best; inflation would reduce unemployment only if it
came as a surprise, tricking people into thinking they were
getting paid higher real wages than they were, and thereby
leading them to consume more and spur employment. But
surprising people, especially repeatedly, is hard to do. Not
only do expectations for future inflation help determine the
inflation rate today — for example, people demand higher
wages if they expect prices to rise — but also people can
rationally anticipate when a central bank has an incentive to
create inflation. Therefore, the central bank can keep inflation expectations, and thus actual inflation, anchored only
by following an anti-inflation policy rule and making that
rule well known to the public.
This research suggested that central banks should reverse
their tradition of being opaque. Prior to the 1990s, the Fed
didn’t so much as announce its policy decisions to the
public, let alone explain those decisions or provide a hint of
future policy. But the public’s tolerance of secrecy was also
waning. The Fed was sued in the late 1970s for publication of
the policy directive, the marching orders of the Federal
Open Market Committee (FOMC) to the trading desk in
New York. The Fed eventually won in courtrooms, but
not in the eyes of Congress. In the early 1990s, Rep. Henry
Gonzalez (D-Tex.) led a charge to publicize details of the
Fed’s policy meetings. Many scholars made cases for transparency on democratic grounds, if not also on economic
ones.
The Fed’s reason for its eventual decision to announce
policy for the first time, in 1994, was more immediate: It
hadn’t raised rates in five years and feared the move would
disrupt markets. The Fed has become considerably more
transparent since then. The FOMC’s post-meeting press
release, known as the FOMC statement (see sidebar on
page 8), started including an assessment of the likely future
course of interest rates in 1999. A few years later, it began to
reveal how each member voted.
Fed communication now extends far outside the FOMC

statement. Meeting minutes
help markets anticipate what
will be done before the next
meeting. Individual FOMC
members give speeches to
explain how their views compare to the consensus. Four
times per year, the Fed publishes three-year projections
for GDP, unemployment,
and inflation created by the
staffs of each FOMC member. That’s a composite of 19
different forecasts if all the seats of the FOMC are filled,
indicating the extent to which there is uncertainty on the
economy’s health. Where Fed chairmen used to decline
interviews as a rule, Chairman Bernanke started holding regular press conferences in 2011 and has even appeared on the
television program 60 Minutes. Most recently, the Fed for the
first time provided quantitative information about its plans
by announcing in January 2012 a goal of 2 percent average
inflation and stating that it viewed an unemployment rate
between 5.2 percent and 6 percent as the best sustainable
rate the current structure of the economy could achieve.

Making Policy Predictable…
The Fed’s moves have become so predictable that markets
have a pretty good idea of what will happen by the time the
FOMC meets. A 2006 study by San Francisco Fed economist Eric Swanson found that financial markets and private
forecasters became less surprised by FOMC decisions after
the Fed started announcing them. Private forecasts of the
fed funds rate grew more precise even several months before
an FOMC meeting, and markets became more certain about
their forecasts as evidenced by the hedges made on them.
In contrast, forecasts of variables like GDP and inflation did
not grow more precise over the same period, suggesting that
the improvement was due to a better understanding of the
FOMC’s objectives and not more economic certainty in
general during that time.
Yet there are several reasons why central banks can’t be
entirely transparent about future policy. For one thing, the
economic forecast is uncertain. Central bankers must make
all statements contingent on future developments, which
accounts for the notorious imprecision and many terms of art
with which the Fed speaks. That has given central bankers a
reputation for being indecipherable, and sometimes for
good reason. Former Fed Chairman Alan Greenspan would
intentionally speak in riddles in his testimonies before
Congress, a venue in which he was obligated to respond to
questions that had no clear answer. “Every time I expressed
a view, I added or subtracted 10 basis points from the credit
market,” he told Bloomberg Businessweek in August 2012.
So when asked a nuanced question, “I would continue on
resolving the sentence in some obscure way which made it
incomprehensible.”
A perhaps clearer way for a central bank to provide

information about the future is to give markets an idea of
how it would react to different economic environments —
what economists call the central bank’s policy rule or
reaction function. This gives markets a sense of the central
bank’s overall strategy given several possible contingencies
— what rational expectations theory says people need to form accurate expectations about the future — rather than just the
near-term outcome of that strategy under present conditions, as a rate forecast alone would provide. Markets are
said to “do the work of the central bank” when they can infer
from incoming economic data how the Fed is likely to move,
pricing in policy changes before they actually take place and
allowing the Fed to stabilize the economy with fewer costs.
In a 2001 book, former Fed Vice Chairman Alan Blinder and
several coauthors argued that bond rates had begun moving
up and down according to the economic forecast, acting as a
macroeconomic stabilizer even when the fed funds rate
changed little. Donald Kohn, another former Fed Vice
Chairman, and economist Brian Sack, formerly of the
New York Fed, showed in 2003 that the Fed Chairman’s
semiannual testimonies before Congress, which tend to focus

on longer-term issues affecting monetary policy, affected
10-year Treasury yields, a signal that markets have more
clarity about how the Fed is likely to behave even far into
the future.
Central banks have come to appreciate that the public’s
awareness of monetary policy’s longer-term goals helps the
central bank to achieve them. For example, with the Fed’s
strong anti-inflation reputation, inflation expectations
remained low through events such as rising oil prices in 2005
and aggressive monetary policy since the recent recession.
In the past, such events might have spun inflation expectations out of control and driven inflation higher, so an
awareness of the Fed’s goals may have allowed the Fed to
avoid some costly rate increases. Of course, the Fed’s goals
have been credible only because they tend to prove accurate;
talk is followed up with action.

… In Unpredictable Times
It is, of course, harder to make policy predictable in extraordinary times. Today the Fed is contending with an inability to lower rates further — since the fed funds rate is at the so-called “zero bound” — and with doubts about whether monetary policy is the appropriate medicine for the economy’s weakness. At the dawn of the financial crisis, the Fed realized that “the FOMC could not simply rely on its record of systematic behavior as a substitute for communication,” Fed Vice Chair Janet Yellen said in an April 2013 speech.

The Voice of the FOMC

Lately, the FOMC’s policy announcements have included these key components. (Historical FOMC statements are available at http://www.federalreserve.gov/monetarypolicy/fomccalendars.htm)

Press Release. Release Date: March 20, 2013. For immediate release.

Factors Considered by the FOMC: “Information received since the Federal Open Market Committee met in January suggests a return to moderate economic growth… Inflation has been running somewhat below….”

Economic Outlook: “Consistent with its statutory mandate, the Committee seeks to foster maximum employment and price stability. The Committee expects that, with appropriate policy accommodation, economic growth will … the Committee continues to see downside risks to the economic outlook. The Committee also anticipates that inflation over the medium term likely will run at…”

Information About Other Actions: “To support a stronger economic recovery and to help ensure that inflation, over time, is at the rate most consistent with its dual mandate, the Committee decided to continue… Taken together, these actions should… The Committee will closely monitor incoming information on economic and financial developments in coming months. The Committee will… until such improvement is achieved in a context of price stability...”

New Policy Decision and Forward Guidance: “To support continued progress toward maximum employment and price stability, the Committee expects that a highly accommodative stance of monetary policy will remain appropriate for a considerable time after the asset purchase program ends and the economic recovery strengthens. In particular, the Committee decided to keep the target range for the federal funds rate at 0 to 1/4 percent and currently anticipates that this exceptionally low range for the federal funds rate will be appropriate at least as long as the unemployment rate remains above 6-1/2 percent, inflation between one and two years ahead is projected to be no more than a half percentage point above the Committee’s 2 percent longer-run goal, and longer-term inflation expectations continue to be well anchored. In determining how long to maintain a highly accommodative stance of monetary policy, the Committee will also consider other information, including additional measures of labor market conditions, indicators of inflation pressures and inflation expectations, and readings on financial developments. When the Committee decides to begin to remove policy accommodation, it will take a balanced approach consistent with its longer-run goals of maximum employment and inflation of 2 percent.”

Vote: “Voting for the FOMC monetary policy action were… : Voting against the action was… who was concerned that…”
Another challenge to making predictable policy is that,
since the crisis, there has been open disagreement within
the FOMC not only about the best policy rule to follow,
but also whether it makes sense to be operating under a
single rule to begin with. “The simple rules that perform well
under ordinary circumstances just won’t perform well with
persistently strong headwinds restraining recovery and with
the federal funds rate constrained by the zero bound,” said
Yellen in November 2012. That same month, Philadelphia
Fed President Charles Plosser, a longtime advocate of policy
rules, argued that, with the Fed’s powers of communication
as an aid, unusual times are no reason not to have a rule in
place. “I would argue that we use the rules as guides and then
explain why the zero lower bound might suggest deviating
from the prescriptions of those rules when appropriate.”
Some argue the zero bound calls for a particular kind of
deviation from the policy rule. The idea comes from a 2003
study that has recently garnered a lot of attention. Gauti
Eggertsson of the New York Fed and Michael Woodford of
Columbia University devised a model in which the central
bank can boost economic activity at the zero bound by
making a credible promise to keep rates at zero even after
the economy recovers — that is, for longer than the policy
rule would call for. The promise invites the private sector to
borrow and spend because they expect that their incomes
will recover before rates go back up. But essential to the
strategy is that markets believe the central bank will follow
through with making “too easy” policy in the future. That’s
not such an easy thing to convince the public of. After the
central bank has enjoyed the boost to economic activity
created by expectations, it’s obvious that it will want to raise
rates to contain inflation. Since the central bank can change
course later, the public may dismiss its statements as mere
“cheap talk.” Thanks to people’s ability to form expectations
rationally, this is a problem faced by any party that wishes for
inherently costless words to affect future outcomes, and an
entire class of game theory research — beginning with work
by Vincent Crawford and Joel Sobel in the early 1980s — was
geared toward understanding how parties can make “cheap
talk” credible.
One way a central bank might be able to overcome
cheap talk is by making strong public statements, since its
credibility would be damaged if it didn’t follow through. And
since the Fed has substantially ramped up its statements
about the future since hitting the zero bound, many people
suspect the Fed has been following the Eggertsson and
Woodford strategy, though it has not explicitly said as much.
Those announcements of forward guidance have appeared
primarily in the post-meeting FOMC statements, and
they have all but promised that rates will stay low for the

foreseeable future. (They are not an outright promise since
all policy decisions are contingent on future developments.)
In December 2008, the FOMC stated that rates were likely
to stay low “for some time,” changed to “an extended period”
in March 2009. In August 2011, the FOMC for the first time
provided a calendar date of likely future policy changes: The
statement said rates were likely to stay low at least through
mid-2013. In January 2012, the date was pushed to late 2014,
and in September 2012, it was pushed to mid-2015. Also in
September 2012, the Fed added that rates would likely
stay low even after the economy strengthened — precisely
the sort of commitment that Eggertsson and Woodford
prescribed — which the FOMC later suggested would
be after unemployment falls to 6.5 percent provided that
inflation doesn’t rise above 2.5 percent.
Preliminary studies have found that forward guidance
has initially been credible. Recent research by Swanson and
San Francisco Fed President John Williams found that when
the Fed hit the zero bound in December 2008, private
forecasts expected rates to stay there for only a few quarters.
But after the Fed introduced a calendar date in August 2011,
private sector forecasts pushed the date of monetary policy
“liftoff ” out to seven quarters or more. Yields on 10-year
Treasuries immediately dropped by about three-tenths of a
percentage point.
It is too soon to know how much this talk affected economic activity, but forward guidance appears to have been
successful in substantially pushing down long-term interest
rates, even when it was accompanied by no change in the fed
funds rate. At the same time, this type of forward guidance
presents two ironically opposing risks to the economy: First,
that forward guidance will signal that the Fed has backed off
from its inflation objectives, permanently upending inflation expectations. And second, that people will take the
Fed’s commitment to easy policy as a sign that the economy
is in worse shape than they thought, causing them to scale
back spending as a precaution. These risks are absent in
models, which assume the central bank’s true intentions are
perfectly clear.

Use Your Words
The FOMC statement continues to evolve at a rapid pace.
In December 2012, the FOMC dropped the reference
to a calendar date through which the fed funds rate was
expected to stay at zero. In place of the calendar date, the
FOMC tied the course of future policy to specific economic
thresholds. It stated that rates were likely to stay low until
unemployment fell below 6.5 percent (compared to today’s
rate of near 8 percent) as long as the market’s medium-term
inflation projections didn’t rise above 2.5 percent (compared
to its average of just under 2 percent since the recession).
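
Stated as a rule of thumb, that guidance maps two indicators into a policy stance. A toy encoding (my sketch of the thresholds as described here; the FOMC presented them as thresholds for reconsidering policy, not automatic triggers):

```python
# Toy encoding of the FOMC's December 2012 threshold guidance, as described
# in the article. Illustrative only: these were thresholds for revisiting
# policy, not a mechanical rule.
def rates_stay_near_zero(unemployment, projected_inflation):
    """True while unemployment stays above 6.5 percent and medium-term
    inflation projections stay at or below 2.5 percent."""
    return unemployment > 6.5 and projected_inflation <= 2.5

print(rates_stay_near_zero(7.9, 2.0))  # True: both conditions hold
print(rates_stay_near_zero(6.4, 2.0))  # False: unemployment threshold crossed
```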
These actions, too, have not been without criticism from
within the FOMC. Richmond Fed President Jeffrey Lacker
argued that the Fed has a limited ability to reduce unemployment for long, and a single indicator can’t provide a
continued on page 19

JARGON ALERT

Sticky Wages
BY RENEE HALTOM

No, sticky wages aren’t what happens when you do
the payroll while eating a honey bun. Rather,
sticky wages are when workers’ earnings don’t
adjust quickly to changes in labor market conditions. That
can slow the economy’s recovery from a recession.
When demand for a good drops, its price typically falls
too. That’s how markets adjust to ensure that the quantity of
willing suppliers equals the quantity of willing buyers. In
theory, things are no different when the good in question is
labor, the price of which is wages.
It is natural to think that wages should fall in a recession,
when demand falls for the goods and services that workers
produce. Assuming that the supply of labor does not change,
reduced demand for labor should translate
into lower wages, until everyone willing to
work at the going wage has found employment. Of course, what we tend to observe
in a recession instead is unemployment,
sometimes on a mass scale.
One possible explanation for why
unemployment occurs is that wages are
sticky; they are slow to produce equilibrium in the market for workers. The prices of
some goods, like gasoline, change daily. But
other prices appear to be sticky, perhaps
because of menu costs — the resources
it takes to change posted prices.
Wages are thought to be sticky on both the upside and
downside. But economists have long observed that wages
are especially unlikely ever to fall, even in very severe recessions, a phenomenon called “downward wage rigidity.”
The reasons for downward wage rigidity are unclear. The
prevalence of unions was once a common hypothesis — but
unions have since declined, yet rigidity is still with us. Some
economists thought employers might hold wages artificially
high to encourage productivity. Others suggested that existing “insider” employees prevent unemployed “outsiders”
from bidding down wages by threatening to disrupt the
productivity of the competing workers. Evidence for these
possible explanations is scant, however. In the 1999 book
Why Wages Don’t Fall During a Recession, Yale University
economist Truman Bewley concluded, after hundreds of
interviews with business insiders, that the key reason for
downward rigidity might simply be that pay cuts are too
damaging to morale, even more so than outright layoffs.
It’s hard to say just how sticky wages actually are since it
is impossible to know what the “correct” wage should be.
Stickiness can be estimated, however, by looking at the
number of workers who report no change in wages over the
course of a year. When there is an unusual spike in that number, especially if it occurs during a recession, a reasonable
conclusion is that many employers would like to give a pay
cut but are instead just keeping wages constant.
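
That proxy is easy to compute. A minimal sketch (hypothetical wage-change data; the estimates discussed below draw on the Current Population Survey):

```python
# Wage-stickiness proxy sketched above: the share of workers reporting
# exactly zero year-over-year wage change. Data here are made up.
wage_changes = [0.03, 0.0, 0.0, -0.01, 0.02, 0.0, 0.05, 0.0, 0.01, 0.0]

share_frozen = sum(1 for change in wage_changes if change == 0) / len(wage_changes)
print(f"Share with unchanged wages: {share_frozen:.0%}")  # 50%
# A recession-time spike in this share suggests employers who might have
# cut pay froze it instead.
```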
San Francisco Fed researchers Mary Daly, Bart Hobijn,
and Brian Lucking looked at this measure of wage stickiness
for 2011. They found that wage changes did not follow a
normal, bell-shaped distribution: Many workers experienced
modest wage increases, while only a handful experienced
wage declines. In addition, a large number of workers
experienced a wage change of precisely zero. The number of
workers with unchanged wages climbs in recessions; it
reached 16 percent in 2011, according to the Census’s
Current Population Survey, by far the
highest proportion in 30 years. And unlike
in previous recessions, the spike in downward wage rigidity occurred across a broad
range of skill levels, suggesting that downward wage rigidity is especially prevalent
today. (One caveat is that employers may
not consider the current wage to be
the true cost of labor. A 2009 study by
Richmond Fed economist Marianna
Kudlyak argued that the true cost of labor
incorporates the future path of wages
given the current state of the economy,
and found that this broader measure
of labor costs varies much more with
economic cycles than seemingly sticky wages.)
Today’s low rates of inflation exacerbate downward wage
rigidity. Modest inflation gradually erodes nominal wages,
and so is a way for employers to cut real wages without
really having to cut them. Therefore, inflation can help the
labor market achieve equilibrium. However, when inflation
is very low, an employer might have to actually cut wages
in dollar terms to reduce real wages. Since managers and
workers alike appear to dislike wage cuts, sticky wages in an
environment of low inflation mean the employment
recovery is likely to be slower. In fact, the recent recession’s
hardest-hit industries — manufacturing, finance, and especially construction — experienced the greatest increase in
wage rigidity, according to Daly, Hobijn, and Lucking.
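
A quick worked example of the real-wage erosion described above (a hypothetical wage and a steady 2 percent inflation rate):

```python
# How modest inflation cuts real wages without a nominal pay cut.
nominal_wage = 20.00   # dollars per hour, frozen for the year
inflation = 0.02       # 2 percent annual inflation

real_wage_after_year = nominal_wage / (1 + inflation)
print(f"Real wage after one year: ${real_wage_after_year:.2f}")  # $19.61
# Near zero inflation, achieving the same 2 percent real cut would require
# an explicit nominal cut, which managers and workers resist.
```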
Wage stickiness is one of numerous explanations for
unemployment. For example, economists believe there will
always be some minimum level of joblessness because it
takes time for workers to search for the best jobs. To the
extent that unemployment results from sticky wages, there
may be a role for policy to improve outcomes. That’s one of
the reasons why the degree of wage and price stickiness is an
important and charged empirical question.
EF


RESEARCH SPOTLIGHT

From Housing Bust to Baby Bust?
BY JESSIE ROMERO

Kids are expensive. Economists typically assume that children are a “normal” good — one for which an increase in income leads to an increase in demand. (Yes, economic theory treats even children as “goods.”) Thus, it seems logical that people with more wealth would tend to have more kids. But, in fact, many studies have found a strong negative correlation between income and fertility, at both the country and the family level.

The relationship between income and fertility is far from straightforward. For example, as women’s wages have increased, so has the opportunity cost of their time, making children more expensive. This could lead families to shift their spending to goods other than children (a so-called “negative substitution effect”) to an extent that outweighs the positive income effect of higher wages. People with high incomes also tend to live in places with a high cost of living, which could limit their disposable income or make child care and schooling very expensive. It’s also possible that people not planning to have children are more willing to move to places with a high cost of living (where they might earn commensurately higher incomes) because they expect to have relatively low expenses compared to couples planning for children.

Michael Lovenheim of Cornell University and Kevin Mumford of Purdue University explore the relationship between family housing wealth and fertility in their forthcoming article “Do Family Wealth Shocks Affect Fertility Choices? Evidence from the Housing Market.” Unlike income shocks such as a raise or job loss, changes in house prices do not affect the opportunity cost of a parent’s time or change the allocation of household work between parents. Any relationship between housing wealth and fertility is thus more likely to be causal, not just a correlation, according to the authors.

“Do Family Wealth Shocks Affect Fertility Choices? Evidence from the Housing Market.” Michael F. Lovenheim and Kevin J. Mumford. Review of Economics and Statistics, May 2013, vol. 95, no. 2 (forthcoming).

The authors’ data come from the University of Michigan’s Panel Study of Income Dynamics (PSID), a household survey that began in 1968. Lovenheim and Mumford look specifically at women aged 25-44 during the years 1985 through 2007; about 54 percent of women in the PSID own their own homes. The trend in housing price changes during this period is overwhelmingly positive.

To isolate the effect of housing wealth on fertility, the authors control for factors including age, education, marital status, family income, the number of other children, city, the state unemployment rate, and real income per capita.

Lovenheim and Mumford find that a $100,000 increase in the value of a woman’s home over the prior two years raises her likelihood of having a child by 17.8 percent. An increase of $100,000 over four years raises the likelihood by 16.4 percent. While these might seem like small marginal changes, the authors note that in the context of the early-2000s housing boom and the low baseline level of fertility, the increase in fertility is economically significant. They calculate that the run-up in house prices between 1999 and 2005 increased overall fertility by between 8.6 and 12.8 percent.

The change in fertility might actually reflect other economic conditions that are correlated with house price changes. The authors thus estimate their model for renters, who experience the same economic variation as homeowners without housing wealth changes. The effect on renters is small, which suggests that the link between house price changes and homeowners’ fertility is indeed real.

What if people planning to have children intentionally move to areas with amenities such as parks or good schools that make home values more likely to rise? To check if this phenomenon is skewing their results, the authors reestimate their equation using a method that restricts the price growth rate to be the same in all areas each year. They find that selective migration is not causing bias in their estimates.

Historically, housing wealth has not been especially liquid, which might lessen its impact on behavior. But Lovenheim and Mumford speculate that the increased availability of home equity loans and lines of credit in the 1990s and 2000s increased household responsiveness to price changes. As expected, the authors find that the fertility response more than tripled over the sample period.

If households responded to the housing boom by having more children, did the housing bust afterward lead them to have fewer? The authors’ data end in 2007, but the few price declines in their sample suggest that people are less responsive to falling prices. As they note, however, the declines in their sample were not accompanied by the large reductions in the liquidity of housing wealth that characterized the recent bust, so it’s likely that the effect would have been larger after 2007. A recent study by the Pew Research Center found that the U.S. birthrate fell 8 percent between 2007 and 2010, but sorting out the causes of that decline will be a matter for future research.
EF

COVER STORY

Drawing the Line

New measures of poverty illustrate just how hard it is to define who is poor

BY JESSIE ROMERO

[Photo: Since the recession, long lines are the norm at many agencies that provide assistance to low-income families.]

In 1964, President Lyndon Johnson launched a “War on Poverty” — an ambitious legislative agenda that created programs such as food stamps, Medicare, Medicaid, and Head Start, to name just a few. At the time, no official measure of poverty existed. But just one year earlier, a Social Security Administration economist named Mollie Orshansky had published an article titled “Children of the Poor,” in which she presented an income threshold based on a subsistence level of food spending. President Johnson’s new Office of Economic Opportunity adopted Orshansky’s threshold for statistical and planning purposes, and by 1969, the measure, with some slight revisions, had become the government’s official statistical definition of poverty.

Orshansky derived the threshold from the Department of Agriculture’s “economy” food plan, which detailed the bare minimum a family could spend on a nutritionally adequate diet. The average family in the 1960s spent about one-third of its income on food, so she multiplied the economy-plan level of spending by three to determine the poverty threshold — $3,165 for a family with two parents and two children in 1963. (Orshansky also calculated equivalent thresholds for dozens of subcategories of family types.)
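For readers who want to see the arithmetic, the rule fits in a few lines of Python. (The food-plan cost below is inferred by dividing the published $3,165 threshold by three; it is not taken from the original USDA tables.)

    # Orshansky's rule of thumb: the poverty threshold is the economy
    # food plan cost scaled up by three, since food took about one-third
    # of the average 1960s family budget. The food-plan figure is backed
    # out of the published threshold (3,165 / 3), not from USDA tables.

    FOOD_SHARE_OF_INCOME = 1 / 3
    ECONOMY_FOOD_PLAN_1963 = 1_055   # annual cost, two parents + two children (inferred)

    def poverty_threshold(annual_food_cost):
        """Scale a subsistence food budget up to a total-income threshold."""
        return annual_food_cost / FOOD_SHARE_OF_INCOME

    print(poverty_threshold(ECONOMY_FOOD_PLAN_1963))   # 3165.0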
But the progenitor of the official poverty measure never intended for what she called her “crude indexes” to become a general definition of poverty. Instead, Orshansky’s goal was to assess the ability of various demographic groups to provide for their children by linking family income to food costs. As she wrote in a 1988 retrospective, “The utility of the SSA poverty index owes much to an accident of timing: It appeared when needed. The link to nutritional economy and food-income consumption patterns endowed an arbitrary judgment with a quasi-scientific rationale it otherwise did not have.”

Yet Orshansky’s measure remains the official definition today, largely unchanged except for adjustments for inflation and family size. The current threshold for a two-parent, two-child household is $23,283.

For decades, the official poverty rate has been criticized by economists, policymakers, and activists from both the left and the right. A variety of incremental improvements and wholesale changes have been proposed by both federal and private sector researchers. What these research efforts show, however, is not that one definition of poverty is unequivocally correct, but rather how challenging poverty is to define.

The Official Poverty Threshold

The poverty rate is a widely cited gauge of the health of the economy, and trends in the rate are used to justify new policies and evaluate the effectiveness of existing policies. For example, in 1993 President Clinton used the rate as a marker for his proposed expansion of the Earned Income Tax Credit (EITC); he pledged that full-time work at minimum wage plus the EITC should be enough to lift a family above the poverty line. More recently, the poverty rate has been viewed as an alarming signal of the effects of the 2007-09 recession. In 2010, the rate reached 15.1 percent — comprising 46.2 million people — the highest rate in nearly two decades. In 2011, the most recent year for which there are data, the rate remained elevated at 15 percent.
States in the Fifth District are faring both better and
worse than the nation as a whole. Poverty rates in 2011 in
Maryland and Virginia were 9.3 percent and 11.4 percent,
respectively, and North Carolina was near the national
average, at 15.4 percent. But West Virginia, South Carolina,
and Washington, D.C., had some of the highest poverty rates
in the nation: 17.5 percent, 19.0 percent, and 19.9 percent,
respectively. (See chart.)
The official poverty thresholds also determine the eligibility for and allocation of funding across more than 80
federal programs, ranging from helping rural areas improve
their water and waste disposal systems to providing free
breakfast and lunch to low-income school children. (This
number includes many federal programs that determine
individual eligibility according to the poverty guidelines
developed by the Department of Health and Human
Services, simplified versions of the Census Bureau’s official
thresholds.) The largest program that uses the official
poverty threshold to determine individual eligibility is food
stamps, formally known as the Supplemental Nutrition
Assistance Program (SNAP), which paid out $74 billion in
benefits in 2012.

Picking the Target

In theory, measuring poverty is a simple task. “If your needs exceed your resources, you’re poor. If your resources exceed your needs, you’re not poor,” says Timothy Smeeding, an economist at the University of Wisconsin-Madison and director of the Institute for Research on Poverty. But in practice, “all those measures are subjective” — making the task far more complicated.

Researchers must make a number of decisions about how to measure resources: Should they count pre- or post-tax income? Should they include in-kind transfer benefits? What about assets? And how should they account for differences in family size or regional variations in the cost of living? Then they must decide where to set the threshold for need, a decision that is inherently arbitrary. “There’s nothing magic about [setting the threshold],” says Bruce Meyer, an economist at the University of Chicago. “It isn’t something that comes down on a tablet from Mt. Sinai.”

[Chart: Official Poverty Rates, U.S. and Fifth District — poverty rates (percent) for DC, SC, WV, NC, VA, MD, and the U.S., 1981-2011. Note: Shaded areas denote recessions. Source: U.S. Census Bureau.]

A fundamental question is whether the threshold should be absolute or relative. An absolute threshold is adjusted only for inflation; the real value of the threshold remains constant from year to year, making it useful for tracking the level of poverty over time. But an absolute measure will not reflect changes in the standard of living, or shifting attitudes about what it means to be poor. Televisions and cars were luxuries in 1963 — when the U.S. thresholds were established — but today are viewed by many as necessities, as Meyer and James Sullivan of the University of Notre Dame noted in a 2012 Journal of Economic Perspectives article.

A relative poverty measure addresses this concern by setting the threshold relative to a metric that changes with society’s standard of living. The United Kingdom, for example, sets the poverty threshold at 60 percent of the country’s median income. Such a measure better captures how the poor are faring compared to the rest of society.

While absolute measures are criticized for holding the level of need constant, relative measures are criticized for not really measuring need at all. Instead, some researchers contend, relative poverty measures actually are a measure of inequality. For example, a relative poverty measure could change dramatically with swings of the business cycle. During the 1990s and early 2000s, incomes were rising very rapidly in Ireland, but they rose more quickly in the middle of the distribution than at the bottom. As a result, the relative poverty rate increased even though people at the bottom actually were earning much more than they had just a few years earlier. In addition, a constantly moving target makes it difficult to assess the effects of anti-poverty policies over time. “If you’re continually changing the goal posts, it’s hard to know where you are relative to the goal line,” Meyer says.
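The difference between the two rules can be made concrete with a toy calculation. In the Python sketch below, every number is invented: the absolute line is updated only for inflation, while the relative line is set, UK-style, at 60 percent of the current median. When incomes in the middle grow faster than prices, the two rules diverge — the Ireland pattern described above.

    import statistics

    # Toy comparison of absolute and relative poverty lines; all figures invented.
    incomes_now = [20_000, 40_000, 60_000, 80_000, 120_000]

    # Absolute rule: a fixed real line, updated only for price inflation.
    line_then = 10_000            # hypothetical threshold in base-year dollars
    cumulative_inflation = 2.0    # invented: prices have doubled since then
    absolute_line = line_then * cumulative_inflation          # 20,000

    # Relative rule (UK-style): 60 percent of the current median income.
    relative_line = 0.6 * statistics.median(incomes_now)      # 36,000

    poor_absolute = [y for y in incomes_now if y < absolute_line]
    poor_relative = [y for y in incomes_now if y < relative_line]
    print(len(poor_absolute), len(poor_relative))   # 0 vs. 1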
The official U.S. poverty rate uses pre-tax money income as its resource measure, and the threshold is absolute, adjusted only for inflation since 1963. (Some economists believe that the threshold is effectively relative because it is tied to the Consumer Price Index, which might overstate inflation; the thresholds thus could be rising faster than actual inflation.) Both of these characteristics have been widely faulted for painting an inaccurate picture of poverty in the United States. Pre-tax money income, for example, doesn’t include expenses or in-kind benefits, and thus doesn’t reflect the actual disposable income available to a family. In addition, critics say that the official thresholds have “defined deprivation down.” The poor today are poorer relative to the rest of society than they were a half century ago: In 1963, the poverty threshold for a family of four was about 50 percent of U.S. family median income. Today, it’s closer to 30 percent.

At the same time, however, the official poverty rate doesn’t reflect that the poor appear to be better off in absolute terms than they were in 1963, according to Nicholas Eberstadt, an economist and political scientist at the conservative American Enterprise Institute. In his 2008 book The Poverty of “The Poverty Rate,” Eberstadt found that the trend in the poverty rate contradicted trends in other indicators of well-being. Since the early 1970s, for example, the poverty rate has increased while the infant mortality rate and the number of people who are nutritionally deprived have decreased. In addition, according to Eberstadt, a poverty-level household in 2001 was more likely to have central air conditioning or a television than a median-income family was in 1980.

[Charts: Official Poverty Rate Versus Supplemental Poverty Measure (SPM) — official and SPM poverty rates (percent) by age and race (all people; under 18; 65 and older; black; white; Asian; Hispanic), for the Fifth District states (NC, MD, SC, VA, DC, WV), and by region (Northeast, Midwest, South, West). Note: State poverty rates are a three-year average (2009-2011). Source: U.S. Census Bureau.]

The Supplemental Poverty Measure

The limitations of the official poverty rate have been recognized from the beginning. As Orshansky herself wrote in her
pioneering 1963 article, “There is need for considerable
refinement of the definition or standards by which poverty
is to be measured, if we are to trace its course with assurance.” Numerous economists, statisticians, and other
researchers have thus spent decades grappling with questions ranging from data collection to philosophy.
In 2011, the Census Bureau unveiled the Supplemental
Poverty Measure (SPM), which attempts to address many
critiques of the official poverty rate. The SPM will not
replace the official rate, but will be released alongside it each
fall. The first major difference is that instead of pre-tax
money income, the SPM counts cash income plus tax
credits and in-kind benefits such as food stamps, school
lunches, heating and housing assistance, and WIC, a nutrition program for women and children. It then subtracts
work expenses such as transportation or child care, out-of-pocket medical expenses including insurance premiums, and
child support paid to another household.
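In outline, the resource side of the SPM is simple addition and subtraction. The Python sketch below walks through it for a hypothetical family; every dollar figure is invented for illustration.

    # SPM resources for a hypothetical family, following the recipe
    # above; all dollar amounts are invented.
    cash_income        = 24_000
    tax_credits        =  3_000   # e.g., EITC
    in_kind_benefits   =  4_500   # food stamps, school lunches, housing aid, WIC
    work_expenses      =  2_400   # transportation, child care
    medical_oop        =  1_800   # out-of-pocket, including premiums
    child_support_paid =      0

    spm_resources = (cash_income + tax_credits + in_kind_benefits
                     - work_expenses - medical_oop - child_support_paid)
    print(spm_resources)   # 27300 -- compared against an FCSU-based threshold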
Another major change is to the threshold for need. The
new threshold is based on expenditures on food, clothing,
shelter, and utilities, or FCSU, by different types of family
groups. The line is drawn at the 33rd percentile of FCSU
spending, multiplied by 1.2 to account for additional basic
needs and adjusted for various family sizes. The thresholds
will be revised each year according to the five-year moving
average of FCSU expenditures; this method is designed to
ensure that the thresholds change with time, but more gradually than if they were pegged to annual data. The SPM also
includes regional adjustments for housing costs, so a family
living in New York City has a higher threshold than a family
in Oklahoma.
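The threshold side can be sketched the same way. The Python fragment below applies the 33rd-percentile-times-1.2 rule to made-up spending data; the actual Census Bureau procedure also smooths the base with a five-year moving average and adjusts for family size and local housing costs, as described above.

    def percentile(values, p):
        """Nearest-rank percentile of a list of numbers."""
        values = sorted(values)
        k = round((p / 100) * (len(values) - 1))
        return values[k]

    # Fake annual food-clothing-shelter-utilities (FCSU) outlays for
    # eleven families of the same type, in dollars.
    fcsu = [9_000, 10_500, 11_000, 12_000, 13_000, 14_000,
            15_500, 17_000, 19_000, 22_000, 26_000]

    base = percentile(fcsu, 33)   # 33rd percentile of FCSU spending
    threshold = 1.2 * base        # 1.2 covers additional basic needs
    print(threshold)              # 14400.0 for this invented sample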
The SPM poverty rate is 16.1 percent; about 3.1 million
more people are counted as poor than under the official
threshold. Underlying this increase are dramatic changes in
demographic groups. The poverty rate for children under 18
decreases from 22.3 percent to 18.1 percent, since many in-kind benefits are targeted toward children. But including
medical costs causes the poverty rate for the elderly to
nearly double, from 8.7 percent to 15.1 percent. The poverty
rates for white, Hispanic, and Asian people increase, while
the poverty rate for black people decreases. A number of
factors could contribute to these differences, including
different participation rates in benefit programs or the
likelihood of having health insurance. Hispanics, for example, have low rates of health insurance coverage, which

could increase their out-of-pocket medical spending.
There also are significant regional changes. Poverty
increases in the Northeast and West, reflecting the higher
cost of living in these regions, but decreases slightly in the
Midwest and South. In the Fifth District, SPM poverty is
higher than official poverty in Washington, D.C., Maryland,
and Virginia, and lower in North and South Carolina and
West Virginia. (See chart.)
Because the SPM includes in-kind benefits, it better
illustrates the effects of government anti-poverty programs.
The poverty rate without the EITC would rise to 18.9 percent; without food stamps it would be 17.6 percent. The
effects are especially noticeable for children. Child poverty
would be 24.4 percent without the EITC, 21 percent without food stamps, and 19 percent without the school lunch
program.
The SPM also underscores how many people have difficulty making ends meet, even if they aren’t officially poor.
The share of people with incomes between 100 and 150 percent of the poverty line increases from 10 percent to 17
percent under the SPM — to a total of 57 million people.
More than 10 million people were lifted out of poverty into
near-poor status, but more than 26 million people were
brought down by the inclusion of taxes and expenses. “The
programs that reduce poverty at the bottom are very well
targeted at the poor. They really help people at the bottom.
But if you move above the poverty line the benefits phase
out. And the higher up you go, the more of your income is
earnings, so you have more work and child care expenses,”
says Smeeding.

Challenges to the SPM
The SPM is not intended to replace the official poverty rate;
instead, it was designed as “an additional macroeconomic
statistic providing further understanding of economic conditions and trends,” according to the Census Bureau. Given
the many programs that make use of the official poverty
rate, replacing it with the SPM would be both administratively and technically challenging. Because the official rate
and the SPM have different standards of need and measures
of resources, a program that sets eligibility at, say, 130 percent of the official poverty line might have to determine a
new standard using the SPM. The SPM also could complicate funding allocation to states, for example by penalizing
states with low costs of living or generous benefits programs, which thus have lower poverty rates than under the official measure. “Are we going to penalize the states that do a
great job for the poor, and give them less money? Or should
we look at poverty before taxes and benefits, and see where
the need is?” asks Smeeding.
In addition, some critics of the SPM believe that the
measure both adds and subtracts the wrong people. For
example, child and elder poverty rates are about the same
under the SPM, but the Department of Agriculture’s food-insecurity index shows more than twice as many children
as elderly people at risk, notes Shawn Fremstad, a senior

research associate at the liberal Center
for Economic and Policy Research.
“Adding a child to your household costs a
lot more than adding another adult,”
Fremstad says, a fact that might not be
picked up by the SPM’s family-size conversions. Moreover, Fremstad asks, “Are
we really capturing the need kids have
for care, for development beyond subsistence needs?”
Conversely, the increase in elder
poverty relative to the official poverty
rate might not be all that it appears.
Much of the increase is driven by large
out-of-pocket medical expenses, which
lower disposable income. But it’s possible that the elderly have high medical
expenses in part because they choose to
allocate their resources toward health,
by purchasing expensive insurance plans
or having procedures that aren’t covered
by insurance. “It is difficult a priori to
determine whether most out-of-pocket
medical spending reflects those with lower health status or
those who have greater resources and make choices to
spend more on out-of-pocket health costs,” Meyer and
Sullivan wrote in their 2012 article. In fact, neither the official poverty rate nor the SPM might be suitable for
measuring elder poverty. “An income measure is particularly
poor at capturing the well-being of the elderly because many
older households are living off their savings, which don’t
count as income,” Meyer says. “And the vast majority own
their own home and have a car. They get a flow of services
from these resources that don’t require income or current
spending.”
Overall, Meyer and Sullivan’s analysis suggests that the
people newly counted as poor by the SPM are likely to have
a higher standard of living than those who are no longer
counted as poor. (A person could be officially poor but not
SPM poor if he has very low income, but receives many
in-kind benefits. A person could be SPM poor but not officially poor if she has income above the official poverty
threshold, but also has high medical or child-care expenses.)
For example, those newly counted by the SPM are
more likely than those no longer counted as poor to be a
homeowner, to own a car, and to live in a household headed
by a college graduate; they also tend to live in larger homes
and have more amenities such as air conditioning, dishwashers, and computers. This suggests that the SPM is not
accurately capturing those who are truly the worst off.

Alternative Poverty Measures
Both the official poverty rate and the SPM are income-based measures. But income is not the only way to measure
a person’s well-being. One option might be to use consumption, which takes account of the fact that some people
have savings or own durable goods such as houses or cars.
Consumption is thus a better reflection of lifetime
resources than income at a point in time. Or, as Meyer says, “The reason you care about income is because it allows you to consume, so you might as well look at consumption.”
Meyer and Sullivan constructed a measure based on consumer expenditures, including an annual value of home and
car ownership for households with these items. They found
that people who are consumption poor under their measure
but not officially poor (that is, people who have incomes
above the official poverty line but low consumption, perhaps because of high expenses) tend to score lower on many
measures of well-being than those in the opposite situation
— those who are officially poor but not consumption poor
(they have low incomes but high consumption). On average,
the consumption-poor live in smaller homes with fewer
amenities and are less likely to own their own homes, and
the head of the household is less likely to be a college
graduate. The consumption measure thus does a better job
of identifying people who are truly disadvantaged, according
to Meyer and Sullivan. They also found that consumption
poverty fell 8 percentage points between 1980 and 2010,
while the official poverty rate rose 2 percentage points; the
poor today tend to have a higher standard of living than the
poor of the past.
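The mechanics of such a measure can be sketched simply. In the Python illustration below, the imputed service-flow rates and the consumption line are invented placeholders, not figures from Meyer and Sullivan’s study.

    # Sketch of a consumption-based measure: spending plus an imputed
    # annual service flow from owned homes and cars. The flow rates
    # and the consumption line are invented placeholders.

    def annual_consumption(expenditures, home_value=0.0, car_value=0.0,
                           home_flow_rate=0.06, car_flow_rate=0.15):
        """Out-of-pocket spending plus the yearly value of services
        from an owned home and car (assumed rates)."""
        return expenditures + home_flow_rate * home_value + car_flow_rate * car_value

    CONSUMPTION_LINE = 25_000   # illustrative threshold

    # Low income but an owned home and car: income poor, yet not
    # consumption poor under this measure.
    total = annual_consumption(18_000, home_value=150_000, car_value=8_000)
    print(total, total < CONSUMPTION_LINE)   # 28200.0 False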
Just as an income-based measure might include families
who have low incomes but are able to smooth consumption
via savings, a consumption measure could exclude people
who have low incomes but are consuming via credit. It’s
likely that the measure would balance over time, however,
since people who are borrowing today will have to pay it
back tomorrow, leading to lower future consumption.
Moreover, people close to the poverty line tend to have very
little credit and debt.
Because savings allow a person to consume even with low
income, another gauge of poverty is assets. An asset-based
measure reveals a family or individual’s vulnerability to a
sudden loss of income. About half of U.S. households do not

have enough financial assets to maintain them above the
official poverty line for at least three months, according to
research by Smeeding and Andrea Brandolini and Silvia
Magri of the Bank of Italy.
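That three-month test is a straightforward calculation. The sketch below uses the official two-parent, two-child threshold quoted earlier in this article; the asset levels are invented.

    # Asset-based vulnerability: can financial assets keep a household
    # above the official poverty line for three months with no income?

    OFFICIAL_LINE_ANNUAL = 23_283              # threshold quoted above
    monthly_need = OFFICIAL_LINE_ANNUAL / 12   # about $1,940 a month

    def months_covered(financial_assets):
        return financial_assets / monthly_need

    print(months_covered(4_000) >= 3)   # False: asset poor by this test
    print(months_covered(9_000) >= 3)   # True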
Another way to measure poverty is not in terms of a
single number, but rather as the ability to maintain “a minimum decent standard of living. It’s a quality of life concept,”
says Fremstad. The European Union, for example, not only
counts the number of people below 60 percent of median
income, but also tracks measures of “material deprivation”
and “inclusion.” Material deprivation is the inability of individuals or households to afford the goods and activities that
are typical in a society at a given point in time; the purpose
of the measure is to reflect a consensus about what items
constitute necessities. According to the United Kingdom’s
Family Resources Survey, for example, necessities include a
warm winter coat, keeping the home in a “decent state of
decoration,” and having enough toys and games for a child’s
development. The concept of inclusion is even broader,
and refers to a person’s ability to participate in economic,
social, and cultural activities. Social inclusion is difficult to
measure — “I think it’s hard to get a handle on what it means
in a practical sense,” says Fremstad — but many researchers
believe that the United States would benefit from a more
holistic approach to poverty measurement.
The debate over poverty measurement highlights that no
single measure can be sufficient for all purposes. For example, both relative and absolute poverty are valuable. “It’s
important to know how the poor are doing relative to everyone else, but it’s also important to know if people are doing
better than they were,” Smeeding says. Similarly, income,
consumption, and assets all shed light on the multiple types
of hardship faced by different groups of people. In the end,
of course, changing the words doesn’t change the reality; a
new definition of poverty doesn’t alter the material circumstances of those who find themselves in a new category. But
the continuous effort to refine the measures is an important
step toward understanding who is poor and how they can
best be helped.
EF

READINGS

Citro, Constance F., and Robert T. Michael (eds.). Measuring Poverty: A New Approach. Washington, D.C.: The National Academies Press, 1995.

Fisher, Gordon M. “The Development and History of the Poverty Thresholds.” Social Security Bulletin, Winter 1992, vol. 55, no. 4, pp. 3-14.

Meyer, Bruce D., and James X. Sullivan. “Identifying the Disadvantaged: Official Poverty, Consumption Poverty, and the New Supplemental Poverty Measure.” Journal of Economic Perspectives, Summer 2012, vol. 26, no. 3, pp. 111-136.

Orshansky, Mollie. “Children of the Poor.” Social Security Bulletin, July 1963, vol. 26, no. 7, pp. 3-13.

____. “Commentary: The Poverty Measure.” Social Security Bulletin, October 1988, vol. 51, no. 10, pp. 22-24.

Smeeding, Timothy. “Poor People in Rich Nations: The United States in Comparative Perspective.” Journal of Economic Perspectives, Winter 2006, vol. 20, no. 1, pp. 69-90.

Muddy language can be costly
BY BETTY JOYCE NASH

Many multipage agreements, notices, and forms
are crowded with microscopic print and convoluted text, a powerful deterrent to readers.
Complexity is still the rule rather than the exception, but
readability may be on the rise.
In the United States, clear communication is the law
under the Plain Writing Act of 2010, which applies to public
letters, notices, and forms from federal agencies. Though
earlier legislation, such as the Paperwork Reduction Act of
1980, encouraged plain language in connection with other
goals, the 2010 law’s sole focus is requiring agencies to write
clearly. Some states — and nations — had attacked bureaucratic language even earlier. For example, Canada’s federal
and provincial “plain language” efforts date back to the
1970s; Sweden’s laws are all written in plain language.
New York State enacted a plain language law in 1978 for
consumer transactions.
The movement toward disclosures and plainer communication has waxed and waned in the United States since the
1970s era of consumer protection laws. Today, it’s waxing in
both the public and private sectors. This language transformation won’t happen overnight, however, says Annetta
Cheek, board chair and a founder of the Center for
Plain Language, which grew from a group of like-minded
government employees.
“Taking traditional bureaucratic stuff and issuing it in
plain language is hard work, and the government doesn’t
have a lot of people skilled at it,” she says.
It is hard. Try finding a shortcut for “default,” for
instance, a word with several meanings. But how can people
make decent decisions if writers bury critical content in jargon and tangled sentences?


Plain Benefits
Writing simple language is anything but simple. The range of
tools includes using active voice, succinct language, common
words, short sentences, headings, and tables and figures.
Even then, complicated concepts can remain elusive,
depending on the audience, without additional explanations
and examples.
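Some of those rules of thumb can even be checked mechanically. The Python sketch below is a rough illustration, not a validated readability formula: it counts words per sentence and the share of long words, two features that the research discussed next found to matter. The sample passages are adapted from the before-and-after privacy-statement example shown later in this article.

    import re

    def rough_readability(text):
        """Very crude plain-language check: average words per sentence
        and the share of long words."""
        sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
        words = re.findall(r"[A-Za-z']+", text)
        long_words = [w for w in words if len(w) >= 9]
        return (len(words) / max(len(sentences), 1),
                len(long_words) / max(len(words), 1))

    legalese = ("If you prefer that we not disclose nonpublic personal "
                "information about you to nonaffiliated third parties, you may "
                "opt out of those disclosures, that is, you may direct us not "
                "to make those disclosures.")
    plain = "We share personal information about you unless you tell us not to."

    print(rough_readability(legalese))   # one long sentence, many long words
    print(rough_readability(plain))      # short sentence, no long words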
Michael Masson and Mary Anne Waldron, University of
Victoria professors of psychology and law, respectively,
tested traditional notions of plain language. Their study,
published in Applied Cognitive Psychology in 1994, found that
simply removing archaic terms and “legalese” from contracts
had little value. But simplifying language and shortening
sentences did improve reader comprehension. “By using
more familiar words we made more concepts accessible to
readers, and by using shorter sentences fewer demands were
placed on working memory capacity,” the researchers noted.

Even so, some participants responded erroneously when
asked to answer questions and paraphrase material. The
results suggested that, “quite apart from the constraints of
language, nonexperts have difficulty understanding complex
legal concepts that sometimes conflict with prior knowledge
and beliefs,” Masson and Waldron concluded. This indicates
that plain language is not only challenging to write, but
also that it may not always solve comprehension problems,
especially if it is approached in a superficial way.
Simplifying legal language is a mission for Joseph Kimble,
a professor at the Thomas M. Cooley Law School. He has
worked for years, he jokes, to “poison the well of legalese at
its source.” He teaches legal writing and has written three
books about plain language. The latest, published last year,
is Writing for Dollars, Writing to Please: The Case for Plain
Language in Business, Government, and Law.
Measurable benefits of plain language are substantial,
according to Kimble. Simplified memos, agreements, and
notices take less time to understand, so they require
less staff time. Examples range from plainly written user’s
manuals to clearer memos for U.S. naval officers.
One such case is that of a U.S. Veterans Benefits
Administration letter that went to 320,000 veterans who
needed to update information about their life insurance
beneficiaries. The response rate for previous letters had
never exceeded 43 percent, but the plain language version,
with a revised structure and clean design, had a 66 percent
rate of response. Staff time saved (because the agency had
fewer beneficiaries to identify and locate) amounted to
$4.4 million in 1999 dollars.
Likewise, the Internal Revenue Service has trained
employees and revised more than 100 of its taxpayer notices
and guides, no small feat given the arcane U.S. tax code. For
example, a child care tax credit notice went from five to
three pages. The revision used bold type, clear, concise
language, and the pronouns “we” and “you” to clarify the
taxpayer’s responsibility. Overall, IRS results from improved
writing include reduced penalty and interest payments and
improved taxpayer compliance, according to Terry Lemons,
director of its office of communications. Taxpayers are less
frustrated and report higher levels of satisfaction because
they’re less confused and their cases are resolved sooner.

The Holy Grail
Private firms are looking harder at communications not only
for clarity’s sake but also to court consumer satisfaction.
Even if clear benefits to firms aren’t easily calculated, clear
communications promote customer loyalty and trust.
People feel cheated if they suffer financial penalties and consequences because they didn’t understand their obligations.
And that hurts business. “This isn’t just dollars and cents, it’s also a matter of looking at a document and saying, ‘That’s straightforward. Nobody’s trying to pull the wool over your eyes. That company deals straight with its clients and customers,’” Kimble says. “The benefit to the readers produces benefits for companies; obviously they are related.”

Cutting the number of customer service calls is a “holy grail” of plain language, says Deborah Bosley, an associate English professor at the University of North Carolina, Charlotte and a consultant on plain-language issues. Besides meeting regulatory requirements, a well-written document answers customer questions rather than raising more.

Private firms are also building plain language efforts into corporate cultures. For example, Chase Bank has revamped its credit card agreement. In 2010, the Center for Plain Language named Chase’s agreement as a finalist for a “WonderMark” award, which means the document was among the “least usable.” This unflattering distinction described six pages of what Joan Bassett, a senior director at Chase, calls “mice” type, typography slang for very small print.

Chase got the message and got to work. “If you look at the old agreement, with paper-thin, ‘mice’ type — it’s very legal-heavy,” Bassett says. The redesigned agreement comes as a booklet, organized with tabs for easy reference. Information is displayed in tables that use larger print.

Testing helps Chase tweak plain language communications. “Consumers found it [the new agreement] easier to navigate and they understand it better.” So far, customer satisfaction has improved, specifically with regard to the communications, she says, according to internal measures by J.D. Power and Associates, a marketing information services firm.

Chase worked with its legal team to make the new document as consumer friendly as possible while meeting regulatory requirements. “You really have to dig into what is driving confusion, what’s driving the lack of transparency,” she says. “You want to understand the whole process.” (The revised agreement was recognized with a “TurnAround” award from the Center in 2011.)

Financial documents are prime candidates for simplification. For example, the Pew Charitable Trusts has developed a plain-language model of checking account statements, adopted by Bank of America and other financial institutions.

[Before-and-After Example from a Financial Privacy Statement — BEFORE: “If you prefer that we not disclose nonpublic personal information about you to nonaffiliated third parties, you may opt out of those disclosures, that is, you may direct us not to make those disclosures (other than disclosures permitted by law).” AFTER: “We share personal, nonpublic information about you to third parties that are not affiliated with us unless you tell us not to (opt out).”]

Financial disclosures can be particularly complex. The Canadian government in 2009 amended its disclosure regulations for credit products to include a plain language provision, requiring that the language be “clear, concise, and not misleading.” Earlier, the Canadian Bankers Association in 2000 had adopted a voluntary plain language code of conduct — the Plain Language Mortgage Documents CBA Commitment — covering mortgages.

Clear and simple language, though, is only one step along the way to comprehension, especially in financial matters. The way information is disclosed and framed may influence whether and how much people borrow.

Context and Complexity

A study by two University of Chicago Booth School of Business professors, Marianne Bertrand and Adair Morse, evaluated the way that additional information and presentation affects payday borrowers’ decisions. Their paper appeared in November 2011 in the Journal of Finance.

The authors designed three types of disclosures based on behavioral principles from psychology and economics literature to investigate “possible cognitive lapses payday borrowers might be making,” according to the paper. For instance, the researchers placed interest rates in context by comparing those of payday lenders to rates on car loans, credit cards, and subprime mortgages.

They found that borrowers in all groups reduced borrowing amounts. Those who received information about fee accumulation compared to other types of loans over a four-month span were 5.9 percentage points less likely to borrow during subsequent pay cycles, an 11 percent decline relative to the control group. (Payday lenders may charge rates of 400 percent or more for these short-term, high-risk loans, which can provide needed liquidity to some households but also have the potential to lead to significant debt-to-income burdens.)
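The two figures are consistent with each other: a 5.9 percentage-point drop that amounts to an 11 percent relative decline implies that roughly 54 percent of the control group borrowed again.

    # Backing out the base rate implied by the two reported figures:
    # a 5.9 percentage-point drop equal to an 11 percent relative decline.
    drop_pp = 5.9
    relative_decline = 0.11
    print(drop_pp / relative_decline)   # ~53.6 (percent of control group)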
Clearly written, understandable, and organized content also can educate people about their health, maybe even save lives. People who don’t understand drug labels or a set of instructions — those with limited or poor health “literacy” — have worse health outcomes, according to Karen Baker, senior vice president at Boise, Idaho-based Healthwise. The nonprofit has produced, since 1975, health information, tools, and services for hospitals, clinics, insurers, government agencies, and medical practices.

The ultimate audience for Healthwise, though, is the patient who uses that content to make health decisions and change behavior.

“We know that people with low health literacy access health services more, have a hard time sticking to a treatment plan, and are readmitted more,” Baker says. “They are less likely to understand the need for screenings and preventive care. All that drives up costs.” The costs of low health literacy range from $106 billion to $238 billion a year, according to a 2007 report for which the lead author was the late finance economist John Vernon of the Department of Health Policy and Management at the University of North Carolina, Chapel Hill.

Healthwise has built plain language into its genes, Baker says. “If you walked in here tomorrow and asked about plain

language, you would get an answer from anybody on the
staff.” They write, design, and organize content and, finally,
test it extensively using a professional organization.
“We want to make sure that if we want feedback on
instructions for using an asthma inhaler that we are testing
people who have asthma,” Baker says. “Do we need to
change something? Is it informational? Do we think we have
it right? We consider all that feedback that helps make our
products and our assets better.”
Plain language helps people participate fully in decisions
that affect well-being, whether it’s physical health or financial health. More plain language efforts are under way, public
and private; a federal law covering the writing of agency
regulations, in committee since last January, may be next.

There’s a snowball effect along with a willingness to take
plain language seriously. “I think the people who have
embraced the idea of plain language are using what, to some,
is an onerous regulation, to their own advantage,” Bosley
says. “The smarter companies understand this is a marketing
opportunity for them because every piece of material that
comes out of an institution is a piece of marketing.”
Even though consumer finance and other regulations
have mandated clearer statements and disclosures, plenty of
dense text remains in a wide range of contracts and agreements — from construction contracts to cell phone agreements
to warranties. Until plain language dominates most documents, it’s not a bad idea to keep the magnifier handy and
read the fine print.
EF

READINGS

Bertrand, Marianne, and Adair Morse. “Information Disclosure, Cognitive Biases and Payday Borrowing.” Journal of Finance, December 2011, vol. 66, no. 6, pp. 1865-1893.

Kimble, Joseph. Writing for Dollars, Writing to Please: The Case for Plain Language in Business, Government, and Law. Durham, N.C.: Carolina Academic Press, 2012.

FEDERAL RESERVE
continued from page 9

complete picture of labor market conditions — so for both
reasons, the unemployment rate is an inappropriate basis for
policy changes. Plosser argued that, while the thresholds
provide a clear near-term forecast for the fed funds rate and
in that sense could improve transparency, thresholds do not
equip financial markets to understand how policy will
behave after the thresholds are met.
The debate reflects not only that communications are an
inherently imprecise policy tool, but also that monetary
policy is an imprecise science. In deciding how and what to
communicate, the Fed must balance the benefits of making
policy predictable with the risk that too much specificity,
like thresholds for a limited set of economic variables, will
obscure the fact that a complex array of data is behind
policy decisions. The recent FOMC minutes reveal that the
committee continues to discuss the risks and benefits of new

communication strategies, and Chairman Bernanke even
established a subcommittee headed by Yellen in 2010 to
analyze these very questions, because with limits on movements in the fed funds rate, “sometimes communication is
the policy,” she said in April.
Among the questions on the table: While the Fed has
become clearer about its thinking in the moment and has
adopted quantitative long-term goals, should it adopt an
explicit policy rule that defines how it will behave to achieve
those goals? Could it communicate a rule in a way that
reduces uncertainty but allows policymakers to deviate
from the rule when appropriate? And when is deviation
appropriate? While the Fed has made significant strides in communication over the last two decades, the past several years show that many questions remain unresolved.
EF

READINGS

Bernanke, Ben S. “Central Bank Talk and Monetary Policy.” Remarks at the Japan Society Corporate Luncheon, New York, N.Y., Oct. 7, 2004.

Crawford, Vincent P., and Joel Sobel. “Strategic Information Transmission.” Econometrica, November 1982, vol. 50, no. 6, pp. 1431-1451.

Goodfriend, Marvin. “How the World Achieved Consensus on Monetary Policy.” Journal of Economic Perspectives, Fall 2007, vol. 21, no. 4, pp. 47-68.

Gordon, Robert J. “The History of the Phillips Curve: Consensus and Bifurcation.” Economica, January 2011, vol. 78, no. 309, pp. 10-50.

Kohn, Donald L., and Brian P. Sack. “Central Bank Talk: Does It Matter and Why?” Federal Reserve Board of Governors Finance and Economics Discussion Series No. 2003-55, Aug. 25, 2003.

Swanson, Eric T. “Have Increases in Federal Reserve Transparency Improved Private Sector Interest Rate Forecasts?” Journal of Money, Credit, and Banking, April 2006, vol. 38, no. 3, pp. 791-819.

Swanson, Eric T., and John C. Williams. “Measuring the Effect of the Zero Lower Bound on Medium- and Longer-Term Interest Rates.” Federal Reserve Bank of San Francisco Working Paper No. 2012-02, January 2013.


[Photo caption: Many online professors follow Salman Khan’s lecture method of writing on the screen while they explain concepts, as shown in this video from Udacity.]

BY TIM SABLIK
The structure of college hasn’t changed much in the last century. Higher education is certainly more accessible today, allowing more graduates to earn a significant wage premium in the workplace. But tuition costs have increased steeply, prompting some students to search for alternatives. Over the last decade, the average published tuition at public four-year colleges rose by 5.2 percent per year after adjusting for inflation. Tuition at public four-year schools is now about 3.6 times higher in real terms than it was in 1982-1983; tuition at private four-year schools is about 2.7 times higher (see graph). Students have turned to loans to fund the growing expense of higher education, which has increased the debt burden new graduates carry with them into the workforce. (See “Debts and Default,” Region Focus, Fourth Quarter 2010.)

[Graph: Inflation-Adjusted Published Tuition and Fees Relative to 1982-83, indexed to 1982-83 = 100, for public two-year, public four-year, and private nonprofit four-year schools, 1982-83 through 2012-13. Tuition and fees for public four-year schools in 2012-13 stand at 357, or 3.57 times the 1982-83 level. Source: Trends in College Pricing, ©2012 The College Board. www.collegeboard.org]

Some educators have also begun to question whether this time-honored method of instruction is the most effective way to transfer knowledge. In their book Academically Adrift, Richard Arum and Josipa Roksa, professors of sociology at New York University and the University of Virginia, respectively, examined survey results from college students to quantify the amount of learning that goes on at American universities. The authors looked at data from the Collegiate Learning Assessment, which measures learning based on the results from a performance test and two analytical writing tests. They found very little improvement in critical thinking and writing skills in the first two years of students’ college careers. From the start of their freshman year to the end of their sophomore year, students improved an average of 7 percentile points on the assessment, meaning a student who started college in the 50th percentile might have only advanced to the 57th percentile by the end of their second year. For nearly half of the students surveyed, there were no statistically significant gains, making it unclear whether most students learned anything at all. And measured learning was not much higher by the time students graduated.

One new education platform purports to answer both concerns by offering top-notch training for a bargain price: the MOOC.

What is a MOOC?

The first online classes in the United States began appearing in the early 1980s, and even before that, researchers had discussed how computers might expand the reach of traditional teaching. Early efforts at online education tended to resemble physical classrooms transplanted to the digital world, but with less interaction. Teachers posted lectures in text form, and limitations in technology inhibited the sort of dialogue possible in a physical classroom.

Today’s MOOCs — massive open online courses — promise to overcome the limitations of early online education. Providers of MOOCs are less interested in replicating the existing classroom infrastructure and more interested in creating their own. One of the pioneers in the modern MOOC movement is Salman Khan, who founded the nonprofit Khan Academy in 2008. Through videos he records himself, Khan has taught more than 40 million students, making him one of the most wide-reaching teachers in history. The advancement of online video distribution and broadband internet access allows millions of students to virtually sit down one-on-one with Khan for brief lessons on a variety of topics. The price of admission? Zero.

Khan has inspired instructors in higher education to go online as well. In the fall semester of 2011, Sebastian Thrun, who at the time was a professor at Stanford University, taught a class on artificial intelligence with Peter Norvig, Google’s director of research. The class was simultaneously offered online for free to anyone who wanted to register. In total, 160,000 enrolled. Students at Stanford who signed up for the physical class were also given the option to watch the content online rather than attend class, and the majority did so. Those students scored a full letter grade higher on average than students who had taken the traditional class in the past. After the experience of teaching the course, Thrun left Stanford to start his own for-profit MOOC, Udacity Inc.

In the beginning, Udacity was a garage company of sorts.
“It was really just four people operating out of Sebastian’s
guest house,” recalls David Evans, a professor of computer
science at the University of Virginia who served as Udacity’s
vice president of education. The site now offers over a dozen
courses on topics ranging from computer science to building
a startup, all for free. Evans taught one of the inaugural classes, Computer Science 101, which enrolled 94,000 students.
“Until recently, most online education has been sort of a
pale substitute for in-person education,” says Evans. “It was
trying to replicate the classroom experience of watching a
long lecture and maybe having some synchronous discussion. And technology allows us to do more interesting things
than that now.”
Of course, online classes still present some unique
challenges. Providing individual attention to thousands of
students is an impossible task, even with advancements
in technology. Evans says the huge scope of MOOCs can
actually help in that regard.
“One nice thing about a class like this is the class scale
and the diversity of students means that almost all questions
get answered quickly by other students in the class, often
within 15 seconds of a question being posted in the discussion forums,” says Evans.
If the initial response is unsatisfactory, other students
will soon chime in with their own answers, says Evans. And
the professors can get involved as well. When Udacity first
launched the Computer Science 101 class, all of the students
took the same units together, allowing Evans and his teaching assistants to hold virtual office hours to answer questions
from students. Many students in MOOCs have also formed
in-person study groups at coffee shops and libraries with
classmates who live in their area.
Major universities have started to recognize the growing
demand for education alternatives too. Coursera partners
with 62 universities that offer free courses taught by their
faculty. The Massachusetts Institute of Technology and
Harvard University founded edX as an outlet for their free
online courses, and they have since been joined by nearly a
dozen other major universities.
Moreover, technology enables individual instructors to
create and host their own MOOCs at low cost. Tyler Cowen
and Alex Tabarrok, economists at George Mason University
who co-author the blog Marginal Revolution, started their
own online source for economics education called Marginal
Revolution University in September 2012.
“The main driver of blogging and Marginal Revolution
University is the desire to communicate ideas to a broader
audience,” says Cowen. “We think that if we have the
best economic content, sort of like the ‘Khan Academy of

economics,’ we’ll have a role in the future of education.”

The MOOC movement has captured a lot of attention, but will it transform traditional higher education?

Building a Better Classroom

To be competitive, online education must suit the needs of students seeking an alternative or supplement to traditional education. On this front, studies are promising. Ithaka S+R, a nonprofit consulting and research group focused on studying the use of technology in education, conducted a randomized experiment in which they assigned some students to traditional college classrooms and others to hybrid classes where students learned mostly online but also met for one hour each week face-to-face with the instructor. On average, the researchers found no significant differences in learning — in terms of course completion, grades, and performance on a standardized test of course material — between the two groups of students.

“This seemingly bland result is in fact very important,” noted the researchers. It suggests that transferring most classroom education online does not impair student learning. In another review of several studies on online education, the U.S. Department of Education (DOE) found that, on average, students in online classes performed as well or even slightly better than students in traditional classrooms. Students in hybrid classrooms, like the ones studied by Ithaka S+R, performed even better, according to the DOE.

These findings are encouraging for the future of MOOCs. Modern technology allows teachers like Thrun, Evans, or Cowen to reach many times the number of students they might see in a lifetime of classroom instruction. If these students can learn the same material just as successfully, then the upside to MOOCs could be very high.

While MOOCs have certainly proven that they can attract large audiences, the number of students who actually completed courses is another matter. Evans’ initial Udacity class had 94,000 students enroll, but only 9,700 finished.

“Certainly you would like to have much more than 10 percent of students starting a class successfully finish it,” says
Evans. “On the other hand, because the costs of getting
started were so low, that is a pretty reasonable rate.”
Evans notes that many students never even watched the
first video, since registering is free and requires only a
few minutes. He says that after the initial drop-off, most
students left after the fourth unit out of seven.
“That’s where it gets into some of the deeper computer
science ideas,” says Evans. “It may be something that is less
compelling for people who just wanted to understand a little
bit about programming.”
In that sense, the flexibility of online classes is a plus,
allowing students to learn as much or as little as they like
or dabble in new fields of study with little upfront cost or
commitment. On the other hand, some argue that the
economies of scale made possible by online education are a
mixed blessing. “It’s not the ‘O’ in MOOC that worries me,
it’s the ‘M,’” says economist Arnold Kling, who formerly
taught at George Mason University and served on the staff
of the Fed’s Board of Governors. He now teaches economics
and statistics at the Melvin J. Berman Hebrew Academy in
Rockville, Md. Kling wrote an article in The American in
which he was critical of placing too much faith in MOOCs
as the solution to improving higher education.
“I think online education has a role. But my basic view is
that students are different. The more an educational technology is adapting to the individual student, the more
productive it is,” says Kling.
Taking a one-size-fits-all approach of simply posting
videos online, as many MOOCs have, is not likely to yield
great results, argues Kling. He advocates a “many-to-one”
approach, in which courses are more adaptive and tailored to
the individual student. He notes that this is already starting
to happen, as Khan has championed a “flip the classroom”
approach, where students watch lecture videos at home and
then use class time to work on problem sets and receive help
where they are struggling.
“It’s great because you have students working on problems with each other,” says Kling, who has used the method
in his classes. “Their thought process is the thought process
of someone who’s learning, so that’s better than just
getting my thought process as someone who has done this
for years.”

Valuing Digital Degrees
Even if digital classrooms are effective, the for-profit
MOOCs still must find a way to translate free classes into a
sustainable business model. And even the nonprofit ones are
likely to thrive only if they create economic value for
students. But how does higher education create value?
It is clear that for the median graduate, college does pay.
According to a 2010 study from the College Board Advocacy
and Policy Center, the median college graduate with a
bachelor’s degree makes nearly $22,000 more per year working full-time than the median high school graduate, roughly
a 65 percent wage premium. What is less clear is where this
value comes from. One view is that college graduates learn
valuable knowledge and skills in the course of obtaining
their degree and this makes them more productive members
of the labor force. As a result, employers are willing to pay
them a higher wage for their higher productivity. If knowledge accounts for most of the value of a college degree, then
the evidence on the efficacy of online education is encouraging for the future of MOOCs. Online education appears to
provide a comparable level of learning for a fraction of the
cost, which would make it an attractive option for students.
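Those two figures pin down the underlying medians, as a quick back-of-the-envelope check shows:

    # Backing out the medians implied by the College Board figures above:
    # a $22,000 gap that represents a 65 percent premium over the
    # high school median.
    gap = 22_000
    premium = 0.65
    hs_median = gap / premium        # ~33,800
    ba_median = hs_median + gap      # ~55,800
    print(round(hs_median), round(ba_median))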
So what has prevented college students from leaving
campuses in droves to sign up for free MOOCs? Signaling
theory may provide the answer. That theory posits that the
value of college education comes primarily from the information it imparts to potential employers. Workers have
varying levels of ability, which potential employers cannot
easily discern. A college degree provides a signal for employers that a worker possesses certain abilities, and under
signaling theory it is the credential that matters most — and
an online education without a credential to validate the
student’s ability would have little value in the job market.
The jump in wage premium enjoyed by college graduates
versus those with only some college education suggests that
there is significant value in holding a degree (see chart). That
value also seems to diminish as workers are better able to
signal their ability through work experience. Andrew
Hussey, an economist at the University of Memphis, looked
at students in Master of Business Administration (MBA)
programs. Many of these programs require candidates to
have some work experience, but the amount of prior experience varies greatly among students. Hussey postulated that
if signaling does not matter, then the returns from the MBA
program should be roughly the same for students after
controlling for their years of work experience. He found that
the wage premium from an MBA diminishes for students
with more work experience, suggesting that students who
have worked longer have already signaled their abilities and
thus earn less value from the signaling provided by the MBA.
“I think if the credential is what matters and online
education cannot offer an equivalent credential, then it’s
just not going to go anywhere,” says Kling.
The evidence is not clear-cut, though, as other studies
have pointed to skills gained in college having a larger
impact on the wage premium. What seems more likely
is that the value from higher education stems from a
combination of learning and signaling. Thus, in addition to
looking at ways to improve how they teach, MOOCs have
begun to address the need for credentials.
In February, the American Council on Education (ACE)
approved five courses offered on Coursera as eligible for
college credit. Students pay fees to take a proctored exam
and receive a transcript from ACE that they can use to apply
for credit at a physical school. Two of those courses are
offered by Duke University, which has also announced that it
will collaborate with other schools on a new online platform

to offer courses that Duke students can take for college
credit. Udacity has partnered with Pearson VUE to allow
students to take independently proctored certification tests,
for a fee. It has also partnered with a number of companies
in the computer industry, such as Google, and offers a
service to share student resumes with potential employers.
“One potential business model is to have companies
sponsor classes,” says Evans. “They get a lot of value from
this in terms of students learning to use their technology.
The business model would have students take a set of
courses through us and build up something comparable to an
academic transcript. We will have a very detailed record of
the student’s performance in the class as well as their social
contributions, and employers would pay a recruiting fee for
these referrals.”
Credentialing could offer a path for MOOCs to be financially sustainable, but only if the credentials are accepted in
the marketplace. Although Cowen says MRU is financially
stable since it is small and its costs are low, he suspects some
form of accreditation will be necessary for larger-scale
operations to thrive.
“My guess is, if those courses are accredited by legitimate
schools, people will pay, and those companies will make
money,” says Cowen. “If not, people won’t pay enough to
keep that whole thing up and running.”

You Say You Want a Revolution?
If MOOCs can provide accepted credentials and develop
sustainable business models, then aren't brick-and-mortar
schools hurting themselves by partnering with them?
Most of those involved in the movement don’t see it
that way.
“I don’t think we’ll ever replace face-to-face education,”
says Cowen, noting that George Mason has been very
supportive of MRU. “I think smart schools will move toward
hybrid models. They’ll hire fewer instructors of particular
kinds, but I’m not sure the demand for instructors has to
go down. What the instructor does has to change. It will
be more about motivation and less about just repeating
lectures, which I think is actually how it should be.”
Evans agrees that online education is more likely to
augment brick-and-mortar schools than replace them.
“But at the same time, there’s a huge population that is
not being served by the traditional universities today,
and if the success of open education makes traditional
universities question some of the things they do, I think
that would be a great thing,” he says.

[Chart: Median Pretax Earnings of Full-Time Workers Ages 25 and Older, in dollars, by education level: not a high school graduate; high school graduate; some college, no degree; associate degree; bachelor's degree; master's degree; doctoral degree; professional degree. SOURCE: Education Pays, ©2010 The College Board. www.collegeboard.org]

The broad reach and open access of MOOCs have been
particularly valuable for students in developing countries.
For example, schools in India have turned to the videos on
Khan Academy to supplement their shortage of teachers and
textbooks.
“Online classes are gaining popularity in India,” says
Satyakam, a 31-year-old master’s student at the National
Institute of Technology in Kurukshetra, India, who is taking
a class on Udacity. “Recently my friend visited one of the
premier engineering colleges in India on a recruitment
drive for his company and some of the students used these
courses as a main source of learning.” Satyakam plans to use
the material from online courses to help him start an
Internet company.
For students with little access to quality education,
MOOCs could be a “lifeline,” says Cowen. But to get the
most out of education, he feels some in-person instruction is
needed. “The naive idea that you just put it up and then the
world soaks up all this knowledge, I don’t think is where
it’s at. There’s some percentage of people who do great with
that, but that’s a minority. There’s another much larger
group that needs this hybrid model, with good teachers and
care and motivation, and we cannot do that for the whole
world. Other educational institutions have to pick up the
ball and run with it.”
In the United States, online education seems more likely
to supplement traditional classrooms, which could be what
traditional universities are counting on. Even so, the hybrid
classrooms of the future may not so closely resemble those
of a century ago.
“We’re optimists,” Cowen says.
EF

READINGS
Bowen, William G., Matthew M. Chingos, Kelly A. Lack, and
Thomas I. Nygren. “Interactive Learning Online at Public
Universities: Evidence from Randomized Trials.” Ithaka S+R
Research Paper, May 22, 2012.
“Evaluation of Evidence-Based Practices in Online Learning: A Meta-Analysis and Review of Online Learning Studies.” U.S. Department of Education, September 2010.
Hussey, Andrew. “Human Capital Augmentation versus the
Signaling Value of MBA Education.” Economics of Education Review,
August 2012, vol. 31, no. 4, pp. 442-451.
Kling, Arnold. “Many-to-One vs. One-to-Many: An Opinionated
Guide to Educational Technology.” The American, Sept. 12, 2012.

BY BETTY JOYCE NASH

Kevin Nozeika lost his job of 16 years at Sparrows
Point last summer when the steel company’s fourth
owner in a decade, RG Steel, went bankrupt. His
father retired from another now-departed Baltimore steelmaker. Nozeika worked there, too, with 2,000 others, until
it closed. “I’m an old hand at shutting down steel mills in
the Baltimore area,” Nozeika jokes. He started working in
steel just before his 20th birthday. He’s now 44.
Sparrows Point’s flaring furnaces reddened the skies
above Baltimore Harbor for more than a century. Workers
made steel for rails, bridges, ships, cars, skyscraper skeletons, nails, wire, and tin cans.
It was a remarkable run. “If you look at Sparrows Point’s
history, there were times when the different generations of
blast furnaces, or the rolling mills, or the coke works were
the biggest in the world,” says David Hounshell, a professor
of industrial history at Carnegie Mellon University. Even as
late as 1995, the Point’s “Beast of the East,” its last big blast
furnace, reached record output — 3.4 million tons.
At its 1957 peak, the company made Maryland’s biggest
payroll, for 30,000 workers. Many of them lived in the
shadow of the hulking steel works — in the unincorporated
town of the same name — others in Baltimore City or the
nearby neighborhoods in the suburb of Dundalk. As steelmaking grew, matured, and declined in the United States,
the plant that once symbolized blue-collar Baltimore
changed the city — its footprint, population, and economy.
Today, smokestacks tower over rows of empty gray mills.
Rusting locomotives line chain link fences, and it’s the Johns
Hopkins Institutions — “eds and meds” — that comprise
Maryland’s biggest private payroll.
Sparrows Point grew into its own 3,000-plus acre
city-state, where raw iron ore was transformed into
finished steel, and nearby manufacturers like General Motors, Western Electric, Signode Strapping, Thompson
Steel, Continental Can, and Crown Cork and Seal employed
thousands.
The site was once nothing but desolate marshland jutting into the Chesapeake, but as historians often say, geography is destiny.

Forging an Industry
The Pennsylvania Steel Co. in 1887 sent its engineer,
Frederick Wood, to scout the East Coast for a site conveniently located to receive iron ore shipments from the firm’s
captive Cuban mines; there, they’d transform the iron
into steel for rails. Sparrows Point lay 100 miles closer to
Cuba than Philadelphia and 65 miles closer to western
Pennsylvania’s bituminous coal fields. The Point also had
deep water and its swamps were flat and easily filled.
Industrialists were nurturing their first infant — steel.
Wood built this subsidiary of Pennsylvania Steel, known
then as the Maryland Steel Co., for $7 million; his brother,
Rufus, built the town of Sparrows Point for $700,000.
(The brothers had grown up in a company town themselves,
in Lowell, Mass.) From the start, the company sought and
received political favors giving it government-like powers,
including the right to run its own police force and prohibit
alcohol consumption.
The company, backed by the Pennsylvania Railroad,
made steel rails and ships using the first steelmaking breakthrough, the Bessemer oven, which allowed large-scale steel
production — 20 tons of steel in 20 minutes. “By 1910, the
railway track rolled at the plant had spanned Mongolia,
climbed the Andes, breached the pampas of Argentina, and
descended into the tunnels of the London Underground,”
wrote Mark Reutter, in Making Steel: Sparrows Point and the
Rise and Ruin of American Industrial Might.

[Photo: A 1976 view of Sparrows Point's coke ovens, which were closed in 1992 for air-quality reasons. The ovens converted coal into coke, the fuel used to fire steelmaking furnaces. PHOTOGRAPHY: MARK REUTTER]

Skilled workers were imported from the steel towns in Pennsylvania — Johnstown, Steelton, Homestead, and others. But the plant also relied on
workers from eastern Baltimore,
already a hub for canneries, immigrants, and domestic migrants,
white and black. Families who
lived in the company town of
Sparrows Point had their own
schools, including the first kindergarten in the South, even a
separate one for black children, and their own dairy, bakery,
and slaughterhouse.
The work was perilous and backbreaking and dirty, but
the living in the company town was cheap. A company store
sold groceries and just about everything else until it closed
in 1944 when the plant expanded. In 1956, the company
flattened about a third of the town’s homes for the same
reason. In 1974, the company razed the rest of the town to
make room for the biggest blast furnace in the western
hemisphere, the “L” furnace.
Early on, black and white immigrant men lived in a
“shantytown” beside the steel furnaces, four to a shanty.
In the town proper, most workers lived in duplexes, with few
houses for black workers.
Blacks represented about a third of the workforce over
the plant’s history, according to Reutter. Blacks were segregated; their homes had no indoor plumbing. That beat life in
Georgia, the Carolinas, and Virginia, from where many of
them had migrated.
“It was certainly a way up for these guys who were working essentially as sharecroppers, where there was no future,”
Reutter says. “There was terribly institutionalized racism,
but there was that everywhere.”
Charles Mandy, who is 71 and black, started at Sparrows
Point in 1960 checking railcars loaded with various materials
and notifying the rail yard master when they needed hauling
off. Back then, blacks and whites used separate locker
rooms. His uncle worked there, too. “If you had a family
member working there, you had a better chance of getting
on.” He did well there and retired as a shipping supervisor
in 2005.
Inside the company town of Sparrows Point, white
workers lived in 500-square-foot brick and frame row
houses on sycamore-lined streets.
Elmer Hall was born and raised there. His father and his
father’s three brothers had migrated from Virginia to
Sparrows Point to get work during the Depression. “We all
had one thing in common,” he says of the families living so
close together. “We had the same boss.”
Superintendents lived in multistoried homes, with wide
porches and carved banisters inside. “Just to give you some
idea of the houses on any block of ‘B’ and ‘C’ streets, where
the superintendents lived, there were six houses on a block
there and 32 on a block on my street,” says Hall, who lived on

Beechwood Street. He can still
name every resident who lived
on Beechwood. Nobody moved
unless they died. His dad paid
$23 a month until the family
moved, displaced by the new blast
furnace. “If you never moved, they
never raised your rent,” he recalls.
The town of Sparrows Point,
by most company town standards,
was in some ways progressive,
providing education and leasing land to churches for $1, with
baseball and football teams, Boy Scouts, Christmas and
Halloween parades, playgrounds, and carnivals. Yet there
was no local government. Company towns were common in
that era: automotive suburbs in Detroit, the Pullman
factory town outside Chicago, New York’s satellite towns in
New Jersey, as well as steel towns in the Midwest and textile
towns in the North and South. Between 1880 and 1890,
about 100 manufacturing suburbs grew up with industrial
America, wrote Reutter in his book. The town of Sparrows
Point died only when the company flattened it.
Even now, the spirit of the town lives on the Facebook
page called “I grew up in Sparrows Point, Md.”
“For all the red soot and belching smokestacks, it was a
wonderful place for a kid to grow up,” Hall remembers.
“I have nothing but wonderful memories of the place as a
kid, fishing off piers, crabbing, swimming at the bathing
beach, riding my bike all over Sparrows Point. You couldn’t
do much without word getting home before you even arrived
because somebody was watching — in a good way.”
Hall grew up and left. In 1942, when Hall was born,
Sparrows Point was pouring steel and building ships for
wartime.

From Heyday to Doomsday
Sparrows Point got plenty of military business in both world
wars, in no small part because of its location near the
Atlantic. In the middle of World War I, in 1916, Bethlehem
Steel, led by Charles Schwab, bought Sparrows Point.
Schwab had built “Aunt Beth,” as the Sparrows Point
workers came to call it, into a power by making guns and
armor plate for Central and South American countries and
selling weapons to Britain and France before the United
States went to war. The company also defied neutrality by
arranging for a Canadian shipyard to make submarines for
Britain, wrote Reutter in his book. By war’s end in 1918, the
nation’s steel production had almost doubled, to 44.5 million
tons. Sparrows Point alone produced 366,000 tons of steel
that year.
By Dec. 8, 1941, when the United States officially entered
World War II, the Roosevelt administration had already
anticipated a steel shortage and had arranged two months
earlier to finance additional capacity of 10 million tons.
The government agreed to fund a million tons of steelmaking at Bethlehem Steel alone, half of which would expand
Sparrows Point, at a cost of more than $20 million.
Between 1941 and 1944, the Point produced about
2.4 million tons of steel a year. In 1941, the plant built a shipyard under a government contract on the west side of
Baltimore Harbor where workers would make 384 of 2,500
“Liberty” ships, a critical link in supplying overseas troops.
Employment at Bethlehem's Fairfield Shipyard reached 60,000 during the war.
Yet even amid wartime production — Bethlehem Steel
was the nation’s top war contractor — there were hints of
steel’s decline. “World War II was a war where steel was
hugely needed, but it was no longer the most advanced
material,” Reutter says. Emerging technologies, particularly
in flight, depended on lighter-weight materials and electronics. Aviation led wartime innovations, which grew from
startup flight companies formed in World War I — Grumman,
Lockheed, Boeing, and others. Steelmakers were largely left
behind, in terms of advancing new technologies and ideas,
though steel output was so great they had steel to spare —
2.5 times as much production as in World War I.
The domestic steel industry also was not forced to innovate in the way many companies abroad were: None of the
U.S. steel plants were bombed during the war; Japan and
Europe, meanwhile, rebuilt their steel industries from
scratch, with the latest available technology.
War changed life on the factory floor too. Women
had worked as crane operators and machine tenders during
World War II, but lost those jobs in 1945; the only jobs open
to them afterward were in the sheet and tin plate mills.
The Point’s first tin mill had opened around 1917, and served
thriving waterfront canneries that sprang up in the late 19th
century to preserve oysters, tomatoes, and corn from the
fertile fields of the Eastern Shore. The women’s job was to
check — by eye and touch — for defective tin. If they found
holes, spots, or pits, they quickly “flopped” the tin sheet into
the appropriate bin, sometimes at a rate of 30 a minute.
They were known as “tin floppers” for the sound the tin
made as it hit the bin. In the 1950s, opportunities for women
at the plant closed still further as the tin floppers were
automated out of their jobs.
Other wars, including the Cold War, required steel too.
“In the heyday of the 1950s, you have a strong program of
both guns and butter,” Hounshell says. The United States
was building domestic interstate highways, bridges, and skyscrapers, and the Cold War brought construction contracts
for armaments and infrastructure at overseas bases.
Production at Sparrows Point reached a record high of
6.6 million tons in 1968, during the Vietnam War, and
declined from there. Even as the plant and the industry as a
whole were prospering, executives were making decisions
that would cause the fortunes of Sparrows Point to take a
turn for the worse.
Bethlehem Steel became blind to threats to its future,
Reutter says, perhaps on account of its success. Substitute
products, imports, and declining demand were biting steelmakers, but the industry failed to respond. For example, steel production fell to less than 60 percent of capacity in
the 1958 recession, yet the company raised prices. Esso Standard Oil, a predecessor of today's
ExxonMobil, converted quart oil cans to aluminum in
1960, partly because of rising tin plate costs. Even the
beer industry converted from tin cans to aluminum.
Construction projects used less steel and more reinforced
concrete for highways, bridges, and buildings. “Engineers
found that great economies could be achieved by reducing
overall weight, for example, by using fewer structural girders
in a large building,” Reutter wrote in his book.
And then there was plastic. Carmakers switched to
plastic-coated stainless or other metal trim, known as
“plated plastic.”
It didn’t help that the firm was slow to adopt new technology. In the 1950s, Sparrows Point built another open
hearth furnace rather than a faster oxygen furnace. That
technology converts up to 350 tons of iron to steel in 40 minutes, compared to 10 to 12 hours in an open hearth furnace.
“The next innovation — and this is not rocket science —
was the electric arc furnace to replace the blast furnace,”
says Reutter. The major industry players passed on this
innovation, too, because at first the lower-grade steel those furnaces produced served only low-end product markets. But the
technology improved, opening the door to competition
from the smaller scale “mini-mills,” which melt scrap steel
from autos and appliances to make new steel.
Today, 60 percent of the world's steel comes from Asia. Half of the world's steel is made from scrap, in mini-mills, and the biggest steelmaker in the United States is
Nucor, based in Charlotte. Pittsburgh, once the steel capital
of the world, hung onto 4 percent of the nation’s steelmaking capacity, says economist Frank Giarratani of the
University of Pittsburgh, because of the region’s historically
healthy cadre of suppliers. “We had not just equipment
suppliers but engineering and materials suppliers. We’re
exporting stuff steelmakers use from this region to other
parts of the United States.”
That’s not true in Baltimore, once home to several steelmakers. And it doesn’t help Kevin Nozeika, who’s watched
three steelmakers exit Baltimore.

The New Baltimore?
“When I was a kid all three were still going and now all three
are gone,” Nozeika says. There’s no job that will pay him the
$25 an hour he was earning in steel. He’s had some college,
but not much. He knows that limits his options.
“I was looking for manufacturing jobs, but honestly, when
I look at them and see what they pay I know there is no way
I could live on that kind of money,” he says. Companies are
offering $8 or $10 an hour.
Some former Sparrows Point workers have found the
situation overwhelming. One, Robert Jennings, a 59-year-old welder, committed suicide in January, reported Reutter
in an article for Baltimore Brew, an online daily journal.
Though Baltimore, in many ways, is healthy, its shrinking

manufacturing base means it is often labeled a declining
industrial metropolitan area. Jennifer Vey of the Brookings
Institution, who has studied Baltimore’s labor market,
bristles a little at the notion. Heavy manufacturing work has
gone from the nation’s landscape, not just Baltimore’s.
“We’ve shifted away from an economy with a lot of blue-collar jobs that don’t require education toward one that is
more service oriented,” she says.
Manufacturing now represents only about 4.8 percent of
the metro Baltimore economy. “But that’s 62,000 people
working in manufacturing in Baltimore,” Vey says. The
diverse sector is dominated by small and midsized firms,
including computer and electronics firms that are driven by
the defense industry. (Older titans remain, though. Domino
Foods’ sugar refinery, with 500 employees, has occupied its
storied waterfront location for about 90 years.)
The reasons why those firms stayed are the same as
Frederick Wood’s were back in 1887 — ports and proximity
to markets. Location, location, location.
Still, there’s no question that the old blue-collar jobs are
dying. And that’s a problem. Though median household
income in the metro Baltimore area was $15,000 higher than
the national average in 2010, and its unemployment rate
lower than most of its metro peers, the jobs you can get
without post-secondary education aren’t easily replaced.
Three-fourths of low-income workers in Baltimore
work in the service end of health care, education, retail, and
the food and hospitality sectors.
At least Baltimore has cultivated economic variety.
Its advantages include world-class hospitals and universities.
“Baltimore has always had other things — now you have
the whole biomedical sector,” says Scott Holupka, senior
research associate at the Johns Hopkins Institute for Policy
Studies. In fact, the Johns Hopkins Institutions employ
more than 46,000 people, excluding the students who work
part-time, in Maryland.
But to compete in a global economy, Vey says, people
need skills to get jobs in the growing sectors. That takes
coordination among economic developers, high schools,
private training firms, the public workforce system, and
employers.

Vey suggests economic developers capitalize on exports
to take advantage of purchasing power of other nations; on
transportation — truck, air, and rail — which had expanded
employment leading up to the recession; on the sectors
of information technology, bioscience, and the “green”
economy; and on that new kind of manufacturing, the one
that’s clean and requires fewer, tech-savvier workers.
Getting jobs that pay well in those sectors, though, is
tough without an education. Overall, 28 percent of low-income residents have no high school diploma, much less
post-high school training.
No one, in short, wants a dirty factory like the old Point
except maybe the people who need those jobs. Ideas for the
Point’s re-use were recently floated by area architects and
planners and published in Baltimore’s weekly City Paper.
They were far removed from the Point’s industrial history:
biotech and amusement parks linked by hydrofoil to the
Eastern Shore, a port expansion with a cruise ship terminal
and luxury hotels, dense housing, with parks featuring pollution-eating plants.
The Point struggled — and failed — over the years to meet state and federal environmental standards, which are often blamed, along with legacy costs and foreign imports, for the industry's demise.
Nozeika still can’t believe Sparrows Point is probably
history. “Everybody assumed when it went to auction
another steel concern would buy it,” he says. Hilco
Industrial, a liquidator, bought it for $72.5 million, with a
brownfield redeveloper, Environmental Liability Transfer.
Hilco, which is selling all assets, offers the old “Beast of the
East” on its website. No one has bought that. But in mid-December, Nucor acquired the most profitable component,
its cold mill, not to crank up steelmaking at the Point, but to
upgrade and replace parts at Nucor’s own mills.
The sale killed off any hope of a revival. Nozeika
is making other plans, and so are his friends from his
steel days. Some, including Nozeika, are going into a
federal program for displaced workers. After six months
of classes, he may be back in manufacturing, only this
time, with the technical skills to operate computerized
machines.
EF

READINGS

Giarratani, Frank, Ravi Madhavan, and Gene Gruver. “Steel Industry Restructuring and Location.” In Frank Giarratani, Geoffrey Hewings, and Philip McCann (eds.), Handbook of Economic Geography and Industry Studies. Northampton, Mass.: Edward Elgar Publishing, forthcoming.

Lewis, Robert (ed.). Manufacturing Suburbs: Building Work and Home on the Metropolitan Fringe. Philadelphia: Temple University Press, 2004.

Muller, Edward K., and Paul A. Groves. “The Emergence of Industrial Districts in Mid-Nineteenth Century Baltimore.” Geographical Review, April 1979, vol. 69, no. 2, pp. 159-178.

Reutter, Mark. Making Steel: Sparrows Point and the Rise and Ruin of American Industrial Might. Urbana, Ill.: University of Illinois Press, 2004.

Vey, Jennifer S. “Building from Strength: Creating Opportunity in Baltimore’s Next Economy.” The Brookings Institution Metropolitan Policy Program, 2012.

Warren, Kenneth. Bethlehem Steel: Builder and Arsenal of America. Pittsburgh, Pa.: University of Pittsburgh Press, 2010.


Monetary expansion has led banks to park huge excess reserves at the Fed — for now

BY DAVID A. PRICE

In response to the financial crisis of 2007-2008 and the
recession of 2007-2009, the Fed has carried out an
unprecedented monetary expansion by purchasing a
variety of financial assets in large amounts, especially
Treasury bonds and mortgage-backed securities. The monetary base, the total of bank reserves and currency, has
more than tripled from June 2008 to March 2013.
Where, then, is the inflation? While the prices of some
goods have increased, the general level of prices has
remained stable; average inflation in 2012 was 2.1 percent,
according to the Bureau of Labor Statistics. Given the
magnitude of the monetary expansion that has taken place,
why are we not swimming around in money like Scrooge
McDuck?
The seemingly missing money can be found, for the time
being, in an accounting entry at the Fed known as “excess
reserves.” This figure refers to the amount of reserves that
banks and other depository institutions keep at the Fed
beyond the level of reserves that they are required to maintain there. Before the financial crisis and recession, banks
tended to hold a minimal amount of excess reserves. In the
time since, however, excess reserves have skyrocketed
850-fold: from about $2 billion in mid-2008 to about
$1.7 trillion in March of this year.
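The definition and the scale of that jump are easy to verify. The sketch below (Python) uses the article's rounded figures; the mid-2008 split between total and required reserves is an illustrative placeholder, not Fed data.

# Excess reserves are whatever banks hold at the Fed beyond the requirement.
total_mid_2008, required_mid_2008 = 45.0, 43.0    # $billions; illustrative split
excess_2008 = total_mid_2008 - required_mid_2008  # about $2 billion, per the article
excess_2013 = 1_700.0                             # $billions in March 2013, per the article
print(f"{excess_2013 / excess_2008:.0f}-fold increase")   # prints "850-fold increase"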
The significance of high excess reserves is that banks can
draw them down to make loans, which in turn creates
deposits — money — in the broader economy. Thus, if the Fed does not manage high excess reserves properly, they create the potential for high inflation. The $1.7 trillion question is, can the Fed do it?

[Chart: Excess Reserves, in $billions, January 1998 to January 2013. SOURCE: Federal Reserve Economic Data (Federal Reserve Bank of St. Louis)]

Interest on Reserves to the Rescue

The Fed gained the ability to control the outflow of excess
reserves in October 2008, when it received the authority to
pay interest on both excess reserves and required reserves.
Interest on reserves, or IOR, enables the Fed to make it
more attractive to banks to leave their reserves parked than
to lend them out. In effect, the Fed can use IOR to keep the
velocity of money low.
IOR is a MacGyver-like adaptation of a tool that had
been meant for other purposes. It was originally conceived
as a way to eliminate the implicit tax that banks paid
through maintaining required reserves without earning
interest, a tax that economists viewed as distortionary.
Milton Friedman had advocated it for this reason as early as
1959. IOR was also intended to free banks of the burden of
moving their excess reserves each day from noninterest-paying reserves into interest-paying sweep accounts.
In the Financial Services Regulatory Relief Act of 2006,
Congress authorized the Fed to begin paying IOR on Oct. 1,
2011. In May of 2008, however, in the midst of the financial
crisis, the Fed asked Congress to move up the effective date.
During the crisis, the Fed had been carrying out emergency
lending to financial institutions on a large scale. The Fed
neutralized this process in monetary terms by “sterilizing”
the money that it was creating; that is, as it created money, it
sold the same amount of Treasury bonds from its holdings to
absorb an equal amount of money. (Technically, the New
York Fed, acting on behalf of the Federal Reserve System,
would sell the bonds and the reserve account of the trading
counterparty would be debited, causing those reserves to, in
effect, disappear.) The Fed was selling off its supply of
Treasury securities quickly, however, and it was foreseeable
that it would run out of sufficient Treasuries with which to
sterilize its lending.
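A stylized balance-sheet sketch may make the mechanics concrete. This is a minimal illustration with invented dollar amounts and drastically simplified accounting (one consolidated Fed, no other balance-sheet items), not the Fed's actual bookkeeping:

# Emergency loans create bank reserves; selling Treasuries from the portfolio
# destroys an equal amount, leaving the monetary base unchanged until the
# stock of sellable Treasuries runs out.
fed = {"loans": 0.0, "treasuries": 400.0, "reserves": 800.0}   # $billions, made up

def lend_and_sterilize(amount):
    fed["loans"] += amount                   # the loan credits a bank's reserve account
    fed["reserves"] += amount
    sale = min(amount, fed["treasuries"])    # sell Treasuries to absorb the new reserves
    fed["treasuries"] -= sale
    fed["reserves"] -= sale
    if sale < amount:
        print(f"Only ${sale:.0f}B of Treasuries left to sell; "
              f"${amount - sale:.0f}B of new reserves goes unsterilized.")

lend_and_sterilize(300.0)   # fully sterilized: reserves end where they started
lend_and_sterilize(300.0)   # portfolio nearly exhausted: the base starts to grow
print(fed)                  # {'loans': 600.0, 'treasuries': 0.0, 'reserves': 1000.0}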
“The Fed had sold so many securities that most of those
left in its portfolio were encumbered in one way or another,”
says Alexander Wolman, a Richmond Fed economist who
co-authored a 2012 working paper on excess reserves with
colleague Huberto Ennis. “Given that the Fed wanted to
continue expanding its credit programs without lowering
market interest rates, the answer was to start paying interest
on reserves.”
Congress granted the Fed’s request in the Emergency
Economic Stabilization Act of 2008, allowing it to begin
paying IOR on October 1 of that year at its discretion. The
Fed announced on October 6 that it would start doing so a
few days later “in light of the current severe strains in financial markets.” In addition to the longstanding efficiency
rationales for IOR, the Fed explained, “Paying interest on
excess balances will permit the Federal Reserve to expand its
balance sheet as necessary to provide sufficient liquidity
to support financial stability while implementing the

monetary policy that is appropriate in light of the System’s
macroeconomic objectives of maximum employment and
price stability.”

A Question of Timing
If banks believe that they can earn more by reducing their
excess reserves, and if they appear likely to use their excess
reserves to expand their activities faster than the economy is
growing, the Fed can avoid the torrent of money simply by
raising the interest rate that it pays on reserves. That is why
high excess reserves do not necessarily set the stage for high
inflation.
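The bank's decision is a simple comparison at the margin, which is the lever IOR works on. The rates and costs below are invented for illustration; this is a sketch of the incentive, not a model of any actual bank:

def bank_parks_reserves(ior_rate, loan_rate, expected_loss, servicing_cost):
    # A bank leaves reserves parked whenever IOR beats the risk-adjusted
    # net return on new lending.
    net_loan_return = loan_rate - expected_loss - servicing_cost
    return ior_rate >= net_loan_return

# At a 0.25 percent IOR, lending at 4 percent with 2 percent expected losses
# and 1 percent servicing costs still nets 1 percent, so reserves flow out...
print(bank_parks_reserves(0.0025, 0.04, 0.02, 0.01))   # False: the bank lends
# ...but raising IOR to 2.25 percent flips the comparison.
print(bank_parks_reserves(0.0225, 0.04, 0.02, 0.01))   # True: the bank parks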
But is there a risk of the Fed getting the timing wrong?
If it doesn’t act quickly enough to raise IOR, or if it doesn’t
raise the rate enough, an unwanted rise in inflation or inflationary expectations could be the result.
For some economists, the likelihood of such a sequence
of events is remote. “The FOMC [Federal Open Market
Committee] meets every six weeks,” says Stephen
Williamson of Washington University in St. Louis. “You’re
not going to have a huge inflation instantaneously. They can
head it off if they’re willing to tighten at the appropriate
time.”
Ennis and Wolman of the Richmond Fed suggest, however, that high excess reserves create a greater timing
challenge for the Fed than it normally faces. “Absent the
excess reserves, banks would have to raise funds to make
new loans,” Wolman says. “People argue about whether the
large quantity of reserves materially changes the sensitivity
of the economy to the Fed messing up.”
The issue is that with high excess reserves on tap, banks
can increase lending quickly — “without having to sell
assets, raise deposits, or issue securities,” Ennis and Wolman
wrote. Thus, they suggested, high excess reserves mean that
an expansion can take place more quickly, perhaps before
the Fed is ready to act on signals that it is happening.
Philadelphia Fed President Charles Plosser has also
expressed reservations about the potential effect of high
excess reserves, together with the scale of the Fed’s balance
sheet, in a speech in November. “It is difficult to identify the
appropriate moment to begin tightening policy, even in the
best of times,” he said.
Indeed, the Fed's historical track record shows that in practice the timing of monetary policy
is an art as well as a science, and one that is conducted by
human beings. For example, in a 2010 working paper,
Andrew Levin of the Fed and John Taylor of Stanford
University looked at the Fed’s record in responding to

inflation from 1965 to 1980, and found that “policy fell
behind the curve by allowing a pickup in inflation before
tightening belatedly.” To be sure, however, the Fed today is
more watchful of inflation than during that era.
In addition to the question of whether the Fed would
know when to act, some see a question of whether the Fed
would have the will to do so — and whether Congress would
permit it. These observers are concerned that the Fed might
consider the effect that rising interest rates would have on
the cost of servicing the federal debt. Moreover, they are
concerned that the Fed might be reluctant to raise rates
when the time comes because as interest rates go up, the
prices of assets held by the Fed will go down; the Fed, in
turn, would experience significant losses.
“They’ve acquired long-maturity assets, and will acquire
more, at very high prices, so there will be a capital loss on
long-term bonds when the short-term interest rates go up,”
says Williamson. “That will not look good politically.”
Increasing IOR would also reduce the Fed’s remittances
to the Treasury. At the end of each fiscal year, the Fed in
effect turns over its unspent income to the taxpayers. The
more interest that the Fed pays to banks, the less it has left
over. A paper by five economists with the Fed’s Board of
Governors, released in January, found that the Fed’s remittances to the Treasury have grown along with the growth of
its assets, reaching nearly $90 billion in 2012, but projected
that those payments may fall to zero for several years when
the Fed increases interest rates and begins selling assets.
While the Fed is independent of Congress and the
Executive Branch in setting monetary policy, there is concern that losses on the Fed’s balance sheet or a temporary
halt in remittances could create political conditions in
which the Fed’s independence may be curtailed.
Finally, as the amounts of IOR payments increase, those
unappropriated payments to the banks might also be viewed
as politically problematic in their own right. If the Fed were
to raise the rate from its current 0.25 percent to 2.25 percent,
for example, then at the present level of reserves, it would be
paying the banks some $38.2 billion per year — up from zero
in September 2008, and a far cry from the $359 million that
the Congressional Budget Office forecast when Congress
first approved the payments in 2006.
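The arithmetic behind that estimate is a one-line check using the article's own figures:

\[ 0.0225 \times \$1.7\ \text{trillion} \approx \$38\ \text{billion per year,} \]

which matches the "some $38.2 billion" cited above; the exact product depends on the precise level of reserves.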
Yet the Fed’s political independence has been tested
before. Even those economists who are concerned about
the potential for an inflationary scenario from the management of excess reserves agree that it is far from a foregone
conclusion.
Wolman notes, “All we’re saying is, ‘Let’s be careful.’” EF

READINGS
Ennis, Huberto M., and Alexander L. Wolman. “Excess Reserves
and the New Challenges for Monetary Policy.” Federal Reserve
Bank of Richmond Economic Brief No. EB10-03, March 2010.
____. “Large Excess Reserves in the U.S.: A View from the
Cross-Section of Banks.” Federal Reserve Bank of Richmond

Working Paper No. 12-05, August 24, 2012.
Keister, Todd, and James J. McAndrews. “Why Are Banks Holding
So Many Excess Reserves?” Current Issues in Economics and Finance,
December 2009.


INTERVIEW

Christopher Carroll
Editor’s Note: This is an abbreviated version of EF’s conversation
with Christopher Carroll. For the full interview, go to our
website: www.richmondfed.org/publications

When the housing market took its precipitous negative turn in 2006, policymakers were plagued by a single nagging question: How much would a collapse in housing wealth drag consumer spending down with it? There were two schools of thought. One was based on the notion of wealth effects, that wealth makes people feel richer, such that a dollar change in wealth pushes spending in the same direction by a few cents. The more ominous school of thought said that the unprecedented growth in consumption during the housing boom years was not due to wealth alone, but also a relaxation of credit constraints that gave people an increased ability to use their housing wealth for consumption. Take that cash cow away, research suggested, and consumption was likely to fall by two or three times as much as suggested by the wealth effect alone. A threat, indeed, for an economy two-thirds of which is consumer spending.

Christopher Carroll, professor of economics at Johns Hopkins University, was one voice behind the more pessimistic estimates, and he says the evidence from the Great Recession has proved that view correct. Carroll is a long-time scholar of saving and consumption dynamics at the individual and aggregate level, studying questions that range from housing wealth effects to the consumption response of households to uncertainty, and from national saving patterns to the surprisingly modest spending of the wealthy. Much of Carroll's work came to the forefront of current events nearly simultaneously, leading to a second stint at the President's Council of Economic Advisers (CEA) that spanned the implementation of the historic 2009 American Recovery and Reinvestment Act — also known as the fiscal stimulus — an experience that Carroll describes as changing how he views public policy.

Carroll joined the faculty of Johns Hopkins University in 1995. In addition to serving twice on the CEA, he began his economics career as a staff economist at the Federal Reserve Board of Governors. Renee Haltom interviewed Carroll at his home in Columbia, Md., in February 2013.

PHOTOGRAPHY: LISA HELFERT

EF: How well did existing theories of the wealth effect hold up during the housing boom and crash? Did economists learn anything new?

Carroll: The theory was never particularly clear about how large wealth effects should be and what would be the channels. There was empirical evidence that when the value of some set of assets goes up, whether it's house values or stocks or total wealth, then there's subsequently a growth in consumption spending. You could interpret the change in spending as a consequence of wealth changing. An alternative interpretation is that everybody got more optimistic: They saw that the economy was improving, and that's why the stock market boomed — it was anticipating the movement in consumption. That's not really a causal story. So that was always an issue.

There was a substantial literature showing that subsequent movements in consumption after house price changes were bigger than those associated with the stock market. But it was never clear from that literature whether people spend more when their house value goes up because they feel richer, or whether a collateral constraint has been reduced. That is, when your house is worth more, you have a greater ability to get a home equity loan or a second mortgage or
refinance, and use your house as an ATM. Or you buy a new house and sell the old one, which bids up house prices further. There's a lot of research using local geographical data, especially by Atif Mian at Princeton and Amir Sufi at the University of Chicago, finding that in places where house prices went up a lot, debt and consumption spending went up.

So one thing that has become much clearer in the last couple of years is that a lot of the relationship between house price changes and subsequent spending really was the result of collateral constraints being loosened. That now seems to be a consensus while it was a speculative idea five years ago. And, of course, that's not sustainable, especially if house prices peak and then start going back down. Then you've got a real hangover afterward.

EF: Before the crisis, many economists were lamenting the long-term decline of the household saving rate, which went from about 10 percent of disposable income in the early 1980s to a low of 1.3 percent in 2005. You've noted that economists are hard to please: They worry about the current economy when consumers spend too little, and they worry about our long-term welfare when consumers spend too much. Is there a middle ground that would keep economists happy?

Carroll: What the saving rate is ultimately about is the aggregate capital stock and aggregate national wealth. You're not going to put much of a dent in that with two or three years of a low saving rate. But if a country's saving rate is low for 20 or 30 years, then you end up a lot poorer.

I do think that before the crisis our saving rate was lower than is wise or sustainable. There's an emerging consensus that the decline mostly reflected the fact that it was getting easier to borrow during that time. So I think that something did have to ultimately put an end to it, but exactly when was unclear. The problem from a macroeconomic point of view is when you try to reverse that all of a sudden. If we could have gradually inched the saving rate up 1 percentage point a year for 10 years, that would have been a healthy way to deal with the problem. But having it go up by 5 percentage points in the course of a year is a huge economic shock.

It would be very hard to come to an agreement about what an "equilibrium" saving rate should be. What's clearer is that when there is a really dramatic change in the saving rate, either an increase that we saw in the Great Recession or the drop that we saw in the mid-2000s, that ought to be a danger signal for policymakers. The economy really can't efficiently handle rapid changes in aggregate demand.

One way of saying a little bit more about that is to look at a longer history for countries that have been in a reasonably stable developed equilibrium for a long time. Most such countries tend to have personal saving rates somewhere in the 5 percent to 8 percent range. I think when our saving rate gets below that range for a sustained period of time, that's something that one ought to worry about. And it had for quite a while been below that range.

EF: What are the major unresolved puzzles in consumption theory? Are there areas where theory doesn't quite match up with reality?

Carroll: One is the research on default retirement contribution rates. There's an impressive body of new research that finds that people's retirement saving decisions are very much influenced by the default choices in their retirement saving plan. I recently discussed the latest paper in this literature at the National Bureau of Economic Research's Economic Fluctuations and Growth meeting in San Francisco. The authors had data that basically covered the entire population of Denmark; 45 million data points, and they could see people for 15 years. They found that if an employer has a default 401(k) contribution rate of 6 percent, 85 percent of people will just go with 6 percent, rather than changing the contribution rate or opting out. If the default is 10 percent, then 85 percent of people will go with 10 percent. I think the evidence for default contributions is just overwhelmingly persuasive.

That is a really big challenge to the economists' standard modeling approach, which is to say that people rationally figure out how much they need to have when they retire and they figure out a rational plan to get there. The problem is, now that we have discovered serious flaws in the rational optimizing model for how people make those decisions, we're kind of a bit at sea at being able to say, "Suppose we changed the tax rates on 401(k)s, or suppose we do this policy or that policy. What consequence would it have?" given that we don't know why people are making those decisions in the first place.

The explanation I proposed at the conference was to say that, within some range, people trust that their employer has figured this out for them. The job of the human resources department is to figure out what my default contribution ought to be, and it would be too hard to solve this problem myself, so I'm just going to trust that somebody else has done it. It's not different from when you take an airplane and you trust that the FAA has made sure that it's safe, or when you go to the doctor and you trust that the advice makes sense and is not going to poison you. Maybe people trust that the default option is going to be a reasonable choice for them.

That makes a little bit of progress in the sense that you
could think through under what circumstances one would
expect people to trust that decision has been made well.
Are people who are not very trusting less likely to go with
the default decision? What are the forces that reinforce
people’s trust in the employers to make a good decision?
What are the circumstances that encourage employers to
make a decision that deserves to be trusted? Maybe the
employer needs to have some fiduciary responsibility to
have made a good decision. If people are going to trust the
employer to make a good decision, we ought to make some
effort to give the employer the incentives to actually make
that good decision.
EF: What about puzzles at the macro level?
Carroll: I think there’s a really big one for which the profession has not reached a consensus or even come close. That is
the very strong relationship across countries going from
high growth to high saving. The theory in every textbook
says that if you know you’re going to be richer in the future
because you’re a fast-growing country, why in the world
would you save now, when you’re poor, making your future
rich self better off? It makes much more sense to borrow
now since it’ll be easy for you to pay off that debt in the
future when you’re richer.
The latest example that’s on everybody’s minds is, of
course, China, a country that has grown very fast for the last
20 years and has had a saving rate that just seems to get
higher every year. If China were the only example, then it
might be plausible to say that the phenomenon reflects
some unique aspect of China’s history or culture. There are
some papers that argue the one child policy has something
to do with it, or it’s the end of communism and the transition to capitalism, or that it’s Confucian values. But what
China is doing right now actually looks virtually identical to
Japan 30 years ago. Japan didn’t have a particularly high
saving rate in the 1950s, and by the 1970s it had the highest
saving rate in the world, and that was a period of high
growth in Japan. It’s also true in South Korea. It grew at a
very rapid rate starting from the early 1960s, and its saving
rate went up and up. We also see this in Taiwan, Singapore,
and Hong Kong. And it’s not just East Asian countries; the
same is true of Botswana and Mauritius. It’s also true in
the opposite direction for European countries, which were
growing pretty fast after World War II. That fast growth
came to an end in the early 1970s, and afterward the
saving rate declined, just as it declined in Japan after
Japan slowed down starting about 1990. So it seems to be
a pretty pervasive, large effect that is really very much the
opposite of what you'd expect from the standard off-the-shelf models.
I have a couple of papers proposing that habit formation
has something to do with it. There are a lot of Chinese
people whose idea of a good standard of living was formed
back in the 1960s and 1970s, when China was much poorer.
If you have this reference standard in your mind, you might
respond to rapid income growth by saving more because it’s
easier to save if you feel rich.
I have another paper that asks whether it’s really about a
precautionary motive. In that paper, a country makes a deal:
In order to get the rapid growth, everybody is going to have
to live with an economy that is constantly transforming
itself, experiencing churn and creative destruction. All of
the old ways of doing things have to be abandoned and
everyone has to live through lots of disruptions. Then maybe
the increases in saving reflect a precautionary motive.
In fact, what I really think is the right story is one that
combines habit formation and a precautionary motive, such
that they intensify each other. If I have these habits, then a
good reason to resist spending when my income goes up is
uncertainty over whether the factory that I’m working for
will close down and I’ll have to go back to my rural peasant
roots. But in the academic publishing context, it’s hard
enough to introduce one novel thing in a paper.
EF: Milton Friedman’s work in the 1950s on the “permanent income hypothesis,” the idea that people smooth
consumption over their lifetimes, was initially seen as
a very important contribution. Yet many economists
spent a lot of time in the 1970s and 1980s seemingly
disproving his main predictions. What does that debate
reveal about how economics is done?
Carroll: When Friedman wrote his famous book, the available mathematical tools were very primitive compared
to what we know how to do today. So he used his gifts as
a writer to lay out in good solid prose, of course supported
by data and charts, his vision of how he thought things
worked.
The book was very famous, so everybody wanted the
prestige of being the one to formalize the model’s main
predictions. When you have a rigorous mathematical model,
everyone can agree on what that model means. They might
not agree on whether it’s right as a description of how the
world works, but they can all agree on what it says. So a big
priority in the economics profession in the 25 years after
Friedman wrote was coming up with the mathematical tools
to analyze the optimal consumption choice problem that
Friedman described informally. Friedman himself wrote a
couple of papers trying to clarify his own views.
The first generation of those models had to make the
radical simplifying assumption of perfect foresight: no
uncertainty in the world, everyone knows what’s going to
happen for all of future history. There was a lot that those
models said which was directly contradictory to things that
Friedman said. For one thing, Friedman emphasizes the role
of uncertainty and precautionary buffers, and he presents
some data showing that people who face greater uncertainty
tend to hold larger buffers. That, of course, is completely
outside the cognizance of a perfect foresight model. Perfect
foresight models also predicted that your spending out
of a windfall shock to income — a 100-dollar bill on the sidewalk — would be about one-tenth of the size that Friedman predicted. One reason is that Friedman defined "permanent income" to mean roughly what you would expect your income to be on average over a three-year period, whereas the perfect foresight model's definition was your entire income stream from now to infinity. The marginal propensity to consume is so low in perfect foresight models because you're spreading your windfall over all of history.

In the subsequent 25 years, we learned how to incorporate uncertainty seriously into the models, so we don't have to have this silly perfect foresight assumption anymore. And we have learned how to incorporate financial constraints. In the perfect foresight models, if you know your income is going to be high in the future, you can borrow 100 percent of that future income to finance your spending today. The moment that you get admitted to medical school, your spending should triple because you're going to have a high doctor's salary. In the real world, maybe the bank is not willing to believe that you're going to repay them if you go on a big spending spree right now. We now have the mathematical tools and technology to build in these kinds of constraints on people's access to their future income.

The combination of uncertainty and borrowing constraints pretty radically changes the implications of the mathematical models. And the thing that's really striking is that what you get is something that corresponds remarkably well to the words that Friedman wrote in 1957. Arguably, he had a very good mathematical intuition. He didn't know how to formalize that math, but he could see the contours of what optimal behavior looked like.

It's an interesting story, not only because it makes you think, "Boy, that Friedman guy was pretty smart," but also because now it's very hard to get anything published until you have already worked out the fully specified rigorous mathematical formulation. You can't just say, "Well, my intuition tells me something works like such and such, and it would be nice if somebody could work out the math for that in the future." Friedman was able to get away with that before the profession got so hung up on rigorous mathematical proofs. Today, for example, we discussed that maybe the reason people go with their employer's default retirement contribution is that they're trusting the employer to have worked out the problem. I could never publish a paper making that claim. I would need to have the formal dynamic optimizing model of trust, and the formal set of beliefs that people have about the trustworthiness of their employer, and the equilibrium determination of trustworthiness. What you can do is publish empirical papers that reject a rigid mathematical model as a test of that model, but then we're left in the nihilistic position of saying, "We know that this benchmark model that everyone understands is wrong, but until the complete fully specified alternative is generated in someone's brain, we can't propose half-baked theories that may have a lot of truth to them like Friedman did in 1957." I wish the profession would back off on that degree of rigidity. And maybe we have backed off a little bit.

That's one of the reasons blogs are where some of the most interesting economics is being done these days. That is an outlet where you can say, "Here's how I think this is working," and people can criticize you and point out places where you've made factual errors, but there's not the counterproductively high barrier to having something to say that we have in formal academic publishing.

Christopher Carroll

➤ Present Position: Professor of Economics, Johns Hopkins University

➤ Other Positions: Senior Economist, President's Council of Economic Advisers (1997-1998 and 2009-2010); Staff Economist, Board of Governors of the Federal Reserve System (1990-1995)

➤ Education: Ph.D., Massachusetts Institute of Technology (1990)

➤ Selected Publications: Author of numerous articles in such journals as the American Economic Review, Quarterly Journal of Economics, Econometrica, Journal of Monetary Economics, and Journal of Economic Perspectives

EF: So, would you say the permanent income hypothesis is back in favor (if it was ever really out)?

Carroll: There's been a lot of evidence in the last 10 or 15 years confirming the basic dynamics that Friedman was talking about for how households make their year-to-year consumption saving choices. The term that is often used now for such models is "buffer stock saving" models, and I've written a number of papers on that topic. There are a lot of ways in which those models match our data reasonably well. So I suspect that a good description of the typical household's behavior is that they figure that their employer has got the retirement saving thing figured out, and they just go with whatever the default is, and then they do this buffer stock saving thing with respect to whatever money is left over. A lot of the data that we use to test these models have been really focused on the buffer stock aspect of things and has ignored the retirement saving part of things.

I think people who work in this area would say that the buffer stock model is a pretty good description of everything except for the retirement saving part of people's behavior. And the buffer stock saving model is essentially just providing the mathematical formalization of what Friedman was trying to say in 1957. So in that sense, I think the permanent income hypothesis has come into its own: We have a rigorous mathematical formulation of what Friedman was trying to say.

The terminology has changed somewhat. For a while, the
profession referred to the “permanent income hypothesis”
as being the perfect foresight formulation that was developed after Friedman, but that I think is really inconsistent
with what Friedman himself said. That’s why what I’m
speaking of tends to be called the buffer stock model today.
Although my name is associated with the buffer stock terminology because I wrote some of the early papers on it, my
own interpretation of it is that Friedman got it right and
we’ve finally just figured out the math.
EF: You were a senior economist at the Council of
Economic Advisers in 2009 and 2010. Was there a stark
juxtaposition of views about the 2009 fiscal stimulus
inside the CEA versus outside of it?
Carroll: I came on Aug. 1, 2009, so the stimulus had already
been passed by the time I got there. A lot of what we were
trying to do was monitor it, and figure out what effects it
was having and how to explain those effects to the public.
That was a difficult task. The public was not necessarily
going to be persuaded by regression equations and statistical
evidence. But it was a fascinating experience. When you’re
working at a job like that, of course, you read everything
that’s in the popular press and you see what’s on TV. Seeing
things from the two perspectives of being inside and the
outside was interesting.
There’s one particular point that I was struck by several
times. The CEA tends to vet speeches that the president
and sometimes other officials are going to make, and to help
set the priorities for what’s going to be in the speeches.
A number of times we would help to reshape the speech
to make sure that key points were highlighted, and the
arguments that we thought were the soundest economic
arguments were made. And then the president would go
out and give the speech, and I would later hear from economist friends, who would write to me complaining, “Why
didn’t the president say this obvious point in the speech
that he just made?” And that obvious point was the thing
that the CEA had deliberately made sure was actually a highlight of the speech! But, of course, what your friend actually
sees is the 15 seconds that gets excerpted on the news or
some blogger’s two-paragraph reaction to the president’s
speech.
So the narrowness of the communications channel is
something that you get a very different perspective on from
the inside. It has made me more circumspect in my own
criticisms of the White House and the communications
strategies they’ve pursued after I have come back to Johns
Hopkins, because now I understand they might well agree
with everything that I have to say on the subject and just not
be able to get the message through. The president has a
greater ability to express his point of view and get it heard
than any other single person. But I think the extent to which
even the president can’t penetrate through the fog of
information and the vast number of sources of data that
people pay attention to is underappreciated.

EF: You’ve been the Placement Director for new economics Ph.D.s at Johns Hopkins since 2002. Given what
you’ve described as an overemphasis on math relative to
concepts in the economics profession, what can Ph.D.
programs do to better prepare students to become
effective professional economists?
Carroll: It’s sort of an equilibrium problem. The profession
demands a high level of mathematical expertise, and so
nobody can responsibly back off of making sure that their
students have that training. To do so would endanger their
ability to get jobs.
I do think that the profession is much too insistent on
the proposition that the only good economics is highly
mathematical economics. For example, one of the most
insightful things that I have read about the current crisis in
Europe is not about the current crisis at all. It’s a book called
Lords of Finance by Liaquat Ahamed, about Europe in the
interwar period and the collapse of the gold standard. It’s a
brilliant book. It includes all sorts of fascinating and
compelling economics that I think really sheds light on the
problems of the eurozone today, and there’s not a single
equation in it.
The profession ought to be more eclectic, I think. We
ought to recognize that a much better knowledge of history,
the history of economic thought, and insights from evolutionary psychology and all sorts of other fields have a lot to
contribute. At present we, as a profession, are not willing to
tolerate that. Partly it’s an arms race problem in the sense
that mathematical tools are easy to judge and rank people
on. So we tend to focus on that.
I think most of my colleagues in the macro group at
Hopkins would agree with most of what I have just said.
What is a feasible choice for us in the current environment
is to focus preferentially on real world policy questions.
Of course, students need to have the ability to use the latest
statistical techniques and to understand and to manipulate
state-of-the-art models, but it’s a real talent to be able to
take those mathematical tools and use them to illuminate
practical policy questions that the International Monetary
Fund or the central bank or a fiscal policymaker might face.
A lot of macroeconomics doesn’t even try to address serious
real world policy questions. Our department, for a variety
of historical reasons, is full of people for whom I think
those are the most interesting and important questions to
study. That’s for us, I think, the sweet spot. They use the
full range of techniques that are available, but they use them
to a purpose and not as a goal in and of themselves, which
is often what they seem to become in the hands of many
academics.
So that has been the response of Johns Hopkins in partial
equilibrium. One consequence is that the students that we
train tend to be particularly attractive to policy institutions
like the IMF and the Fed and the European Central Bank
and places where you need some ability to grapple with the real world. EF

ECONOMIC HISTORY
Disney’s America
BY KARL RHODES

A local zoning issue ignited a national firestorm when Disney
tried to build a theme park in Virginia
Nick and Mary Lynn Kotz declared war on the Walt
Disney Co. in the spring of 1994. They were
sitting on the porch of their home near Broad
Run, Va., reading the Washington Post and savoring their
unspoiled view of the mountains that separate rural
Fauquier County from suburban Prince William County.
The afternoon sun warmed the porch and illuminated the
mountains, but the newspaper reminded them of what
Disney was plotting on the other side of those bright blue
ridges. The company had announced plans to develop 3,000
acres near the tiny town of Haymarket in the northwestern
reaches of Prince William County. Phase one featured
Disney’s America, a history theme park and recreation area.
Long-range plans called for a golf course, houses, hotels, and
other unspecified mixed-use development.
Nick looked up from the newspaper and gazed at
Thoroughfare Gap, the pass that Stonewall Jackson
traversed to raid the Union supply depot at Manassas
Junction in 1862.
The Disney invasion, Nick remarked, seemed like
“a surprise attack.”
“We’ve got to get the historians involved,” Mary Lynn
suggested. “Let’s get in touch with Shelby Foote right now!”
Foote was a Southern author and historian whose fame
had mushroomed after he appeared in Ken Burns’ popular
PBS documentary series about the Civil War. Mary Lynn did
not know Foote, but she knew someone who did, and by
afternoon’s end, she had Foote’s phone number. Networking
was second nature to Mary Lynn, an accomplished writer
and public relations pro. Nick, a Pulitzer Prize-winning
journalist, also maintained an impressive list of contacts.
Together, they marshaled a nationwide army of writers and
historians to fight Disney’s America at every tactical turn.
Despite their enthusiasm, the writers and historians —
joined by preservationists and environmentalists — seemed
no match for Disney, which enjoyed strong support from the
Virginia governor, the Virginia General Assembly, and
the Prince William Board of Supervisors. But in the final
analysis, the pen, amplified by media relations expertise,
proved mightier than the mouse.


The Debate Continues
Nearly two decades after Disney scrapped plans for Disney’s
America, the debate continues — both in Prince William
County and across the country. Who should influence land-use decisions? In addition to local residents, should people in adjacent counties have a say? What about state and
national governments?
One modern-day blog commenter described the 1990s
confrontation as a “battle between Disney and almost everybody living in Virginia.”
“Uh, no,” a second commenter shot back. “It was really a
battle between Disney and a very small, but very rich and
influential group of Virginians.”
The breadth and depth of opposition among Virginians
at the time is still being debated, but there is little doubt that
the anti-Disney movement emanated primarily from
Fauquier and Loudoun counties. Three years after Disney
backed down, Nick Kotz co-authored an article with fellow
journalist Rudy Abramson titled “The Battle to Stop
Disney’s America.” Their story focused primarily on Protect
Historic America (PHA), the nonprofit organization that
Nick and Mary Lynn started to fight Disney. The co-authors
listed the PHA’s initial organizers as residents of Fauquier
or Loudoun. None of them hailed from Prince William.
“I would say that about 80 percent of the opposition
came from outside the county,” recalls Kathleen Seefeldt,
who chaired the Prince William Board of Supervisors at the
time. Inside the county, most residents welcomed Disney,
and a clear majority of supervisors would have approved
the project, she says. “We thought we saw a very good opportunity for nonresidential growth.”
Prince William had been struggling for decades to
diversify its real estate tax base. In the 1970s, when Northern
Virginians started rezoning their dairy farms, “Fairfax
County got the cream, and Prince William got the skim
milk.” That’s how Virginia Business magazine described the
situation in 1988. Fairfax attracted upscale malls, hotels,
and office buildings, while Prince William became a bedroom community for people who could not afford to live
closer to their jobs in Washington, D.C. Prince William’s
tax base was further constrained by large swaths of land
that could never generate revenue: Nearly 19 percent of the
county’s acreage is owned by the federal government, most
notably Marine Corps Base Quantico, Prince William Forest
Park, and Manassas National Battlefield Park.
In the late 1980s, a developer acquired land adjacent to
the battlefield park and proposed William Center, a 542-acre
project that would have included a 1.2 million-square-foot
mall. The county enthusiastically supported William Center,
but local and national preservationists rallied against
building a mall near a Civil War battlefield. Eventually the federal government purchased the site, via “legislative taking,” for $134 million ($249 million in today’s dollars). The acquisition generated a huge profit for the developer, but it made Prince William’s tax base even smaller.
Seefeldt recalls discussing the William Center project with Sen. John Warner shortly before Congress purchased the land. “I believe it was he who said, ‘Can’t you just move this project five miles down the road? Then there probably wouldn’t be any problem.’”

[Illustration, ©The Walt Disney Co.: This rendering of Disney’s America shows replicas of the Monitor and Merrimac battling in the middle of a man-made lake flanked by a mocked-up Civil War fort and a building resembling the Ellis Island Immigration Station.]

Down the Road
Five years later and about four miles down the road, Disney
assembled 3,000 acres to build its first theme park in the
United States beyond the sunny climes of California and
Florida. Disney’s America would have been closer to
Disneyland than Disney World in size, but opponents of the
project associated Disney’s America with the sprawling
Disney World complex south of Orlando.
Disney’s think-big approach had worked well in Florida,
but in the early 1990s, the company was struggling to replicate that success with its new Euro Disney theme park in
Paris. “Chastened by the rising costs of Euro Disney, we
began to look for ways to develop smaller-scale theme
parks,” wrote CEO Michael Eisner in his memoir, Work in
Progress. (He declined to be interviewed for this story.) After
visiting Colonial Williamsburg and reading books about
John Smith and Pocahontas, he became passionate about
building a theme park based on American history.
When Eisner unveiled Disney’s America in 1993, he
said the company would “create a totally new concept using
the different strengths of our entertainment company …
to celebrate those unique American qualities that have
been our country’s strengths and that have made this nation
a beacon of hope to people everywhere.”
According to Disney’s press release, the theme park
would include a Civil War fort and village with nearby
re-enactments of the battle between the Monitor and the
Merrimac on a man-made lake. The release also named five
other themed sections: Presidents’ Square would celebrate
“the birth of democracy and those who fought to preserve
it.” A section called “We the People” would interpret the
American immigrant experience “inside a building resembling Ellis Island.” Native America would pay tribute to the
continent’s first inhabitants with “a harrowing Lewis
and Clark river expedition.” Enterprise would highlight
American ingenuity with a thrill ride called “Industrial
Revolution.” And Victory Field would let guests “experience
what America’s soldiers faced in the defense of freedom.”
Disney projected that the theme park and recreation
area alone would create nearly 3,000 jobs and generate
$500 million in tax revenues for Prince William over 30
years. State and local officials were thrilled. Steady streams
of tourists seemed like the perfect way to bolster the county’s tax base. Tourists would generate more traffic, but they
would spend their money and go home without requiring as
many local services as residents do. Most of the Prince
William supervisors enthusiastically supported the project,
as did Gov. George Allen. The Virginia General Assembly
even agreed to provide $163 million for transportation
improvements and worker training to help seal the deal.

National Outcry
Soon after Disney announced its plans, leaders of the
nonprofit Piedmont Environmental Council started looking
for ways to scuttle the project. Based in Warrenton, Va.,
the council works to protect the historic and rural character
of Virginia’s upper piedmont region. The council focuses
on nine counties, including Fauquier and Loudoun,
but Prince William is just outside the organization’s core
territory.
Even so, the council has been engaged in land-use issues
throughout Northern Virginia, including Prince William,
says Council President Chris Miller, who coordinated the
organization’s opposition to Disney’s America. The council
produced “an alternative location map” with more than
20 potential sites where the council would have supported
the theme park. Miller focused on land-use arguments
against the Disney’s America site, but he quickly realized
that the PHA was helping the council’s cause by portraying
the theme park as a bastardization of American history.
So the council helped raise some initial funding for PHA.
“Money just seemed to fall from the sky,” Mary Lynn recalls. More importantly, she and Nick enlisted the help of
many well-known academic historians — including C. Vann
Woodward of Yale, John Hope Franklin of Duke, and James
McPherson of Princeton. They also recruited many famous
writers, including Foote, David McCullough, William
Styron, Roger Wilkins, and Tom Wicker.
While the PHA founders had strong NIMBY (not in my
backyard) motives, some of the better-known historians and
writers they enlisted opposed the project’s theme as much as, if not more than, its location. They expressed grave concerns
about the “Disneyfication” of American history that they
had witnessed in the company’s movies and theme parks.
“Anything Disney has ever touched — whether it’s
fantasy or fact, whether it’s Davy Crockett or whoever —
has always been sentimentalized,” Foote said in a 1994
interview with Naval History Magazine. “And every good
historian, every great artist, knows that sentimentality is the
greatest enemy of truth.”
In his memoir, Eisner confessed that Disney’s “first
important misstep was the decision to call the park ‘Disney’s
America.’” The possessive name “implied ownership of the
country’s history, which only antagonized our critics,” he
explained. “That was unfortunate because we were never
interested in a park that merely reflected a Disneyesque
view of American history.”
Filmmaker Ken Burns had no objections to Disney’s
popular history theme. “I am in the same business,” he wrote
in the Potomac News. “Many in my generation have been
drawn to history in part through the films of Walt Disney.”
Burns opposed the project because “it is in the wrong place.
It will distract visitors from the real places of history, and it
will damage the beauty and character of the area.”

Edge City Averted?
So did Prince William County drop the brass ring of
economic development or dodge the brass knuckles of
suburban sprawl?
The project’s opponents and proponents agreed on one
thing. Disney’s America would have transformed the town
of Haymarket (population 460 at the time) and the county
of Prince William (population 239,000 at the time).
The PHA commissioned a study that predicted Disney’s
America would spur the development of a new edge city
“equivalent to 17 others in the Washington area combined.”
The study also estimated that the project would attract
230,000 new residents who would overwhelm the region’s
existing infrastructure.

The edge city prediction was vastly overstated, Seefeldt
says, and most of the projected population growth was going
to happen with or without Disney’s America. A theme park
would have generated more traffic, she admits, but Disney
would have contributed to building the necessary infrastructure. “All those transportation improvements have pretty
much been put in place without any private sector assistance, and what we have out there now is a sea of rooftops,”
which created demand for several new schools.
Prince William residents will never know which vision of
Disney’s America was more realistic because in September
1994, the company abruptly abandoned the project. In his
book, Eisner noted several issues that factored into the
decision. He was devastated by the death of Disney
President Frank Wells in April and distracted by the tumultuous departure of Disney Studios chief Jeffrey Katzenberg
in August. Eisner also underwent bypass surgery. While he
was recovering, the theme park’s projected startup costs
and revenues took a turn for the worse, but Eisner made it
clear that widespread opposition to the project — mustered
mostly by the PHA and the council — was the underlying
reason for pulling the plug.
“I still believed that it was possible to get Disney’s
America built, but the question now was at what cost,” he
wrote. “The issue was no longer who was right or wrong.
We had lost the perception game. Largely through our
own missteps, the Walt Disney Co. had been effectively
portrayed as an enemy of American history and a plunderer
of sacred ground.”
Prince William may have dropped the brass ring of
economic development, but it did not dodge suburban
sprawl. The sacred ground that preservationists saved from
Disney’s America instead became a golf course and country
club along with several upscale subdivisions and retirement
communities. Haymarket’s population nearly quadrupled to
1,800, and Prince William’s population grew to 419,000.
As for the real estate tax base, commercial and industrial
properties accounted for only 14.1 percent in 2011 — down
from 16.7 percent in 1993.
Fauquier County has grown as well, but the county
retains its rural and historic character. Nick and Mary Lynn
Kotz still enjoy their unspoiled view of Thoroughfare Gap,
and Mary Lynn bristles a bit when she recalls Washington Post
stories that portrayed opponents of Disney’s America as
wealthy NIMBYs protecting their horse farms in Fauquier
County. Nick and Mary Lynn don’t raise thoroughbreds,
they raise cows. “But mostly,” she says, “we raise Cain.” EF

READINGS
Abramson, Rudy, and Nick Kotz. “The Battle to Stop Disney’s
America.” Cosmos Journal, 1997.

Wallace, Michael. Mickey Mouse History and Other Essays on American
Memory. Philadelphia: Temple University Press, 1996.

Eisner, Michael D., and Tony Schwartz. Work in Progress. New York:
Random House, 1998.

Zenzen, Joan M. Battling for Manassas: The Fifty-Year Preservation
Struggle at Manassas National Battlefield Park. University Park, Pa.:
Penn State University Press, 1998.

Ginsberg, Steven. “Disney’s Defeat Didn’t Stop Growth — Or End
Debate — in Prince William.” Washington Post, Nov. 24, 2003, p. A1.


POLICY UPDATE

Under Pressure
BY TIM SABLIK

Economic sanctions, such as trade embargoes, have
a long history as a tool of foreign policy. In the
aftermath of World War I, economic sanctions were
increasingly considered an alternative to war. World leaders
hoped that placing economic pressure on nations by withholding access to goods or finances would allow individual
countries or groups like the League of Nations (and later
the United Nations) to resolve conflicts without bloodshed. Indeed, Woodrow Wilson optimistically remarked in
1919, “Apply this economic, peaceful, silent, deadly remedy
and there will be no need for force.”
Economic sanctions did not end armed conflict among
nations. Nevertheless, they have been used by individual
countries and coalitions to apply pressure short of military
force and to demand anything from humanitarian reform to
complete regime change. The United States has engaged in
sanctions more than any other nation, over 100 times in the
last century, according to data from the Peterson Institute
for International Economics. Some U.S. sanctions have
fallen short of the mark, such as the long-standing embargo
against Cuba. In other cases, the government declared
success and lifted sanctions, such as following the collapse of
apartheid in South Africa after the United States and other
nations imposed sanctions in the 1980s.
Most recently, the United States lifted restrictions
against investment and trade with Myanmar, also known as
Burma. Myanmar has been the subject of international
scrutiny and sanctions since 1990, when its military suppressed democratic elections and imprisoned opposition
party members, including Nobel Peace Prize winner Aung
San Suu Kyi. In 2003, President George W. Bush signed into
law a blanket ban on the importation of goods from
Myanmar to put additional financial pressure on the
nation’s military leaders to institute democratic reform.
In November 2010, Myanmar held its first election since
1990, and in December 2011, Secretary of State Hillary
Clinton visited the country, meeting with political leaders
and signaling a willingness to ease sanctions in response to
more democratic reform. The next spring, Suu Kyi and several members of her party won election to Parliament, and
Myanmar freed hundreds of political prisoners. The United
States lifted most of its import restrictions in November
2012, just before President Barack Obama became the first
sitting U.S. president to visit the country.
On the surface, the sanctions against Myanmar appear to
have been at least partly successful. But how successful are
economic sanctions generally, and how do social scientists
measure their success? Gary Hufbauer, Jeffrey Schott,
Kimberly Elliott, and Barbara Oegg of the Peterson Institute


sought to answer that question in their book Economic
Sanctions Reconsidered, a comprehensive study of sanction
cases over the last century, updated in 2008.
“We would say in 25 to 30 percent of cases, there has
been a resolution which we classified as successful,” says
Hufbauer. “Some people would say that’s low, but diplomacy
seeks a lot of objectives and they’re not always achieved.
So I think it is rather good for diplomacy.”
Hufbauer says they looked at whether the stated goals
of the sanctions were achieved and what role sanctions
specifically played in achieving those goals. Determining the
influence of sanctions on final outcomes is open to some
interpretation, however. Robert Pape, a political science
professor at the University of Chicago, has disputed many of
the cases deemed successes by the Peterson study. He argued
that the true success rate is actually lower than 5 percent,
painting a much less optimistic picture.
“Pervasive nationalism often makes states and societies
willing to endure considerable punishment rather than
abandon what are seen as the interests of the nation,” wrote
Pape. “Even when … ruling elites are unpopular, they can still
often protect themselves and their supporters by shifting
the economic burden of sanctions onto opponents or disenfranchised groups.”
Because of this and other factors, many critics have
argued that sanctions can actually slow the pace of regime
change. Hufbauer agrees that sanctions can reinforce the
power of regimes that already have a large degree of control
over the country. The sanctions that tend to be more
successful, he says, are ones with more modest goals.
In the case of Myanmar, Hufbauer says that the Obama
administration’s willingness to remove sanctions in exchange
for reforms short of regime change led to success.
“It was the withdrawal of the sanctions, the carrot aspect,
which was successful,” he says. “I would score it as success in
a modest goal case. That’s progress in this business.”
Ultimately, it is difficult to say for certain whether the
import sanctions (which the Peterson Institute estimated
affected 1.7 percent of Myanmar’s gross national product)
were the primary catalyst for change. It is also unclear
whether the reforms will last. Although Myanmar’s government has released many prisoners, opponents claim that
several hundred political prisoners are still incarcerated. In
March, Myanmar declared martial law in four townships in
response to sectarian violence between Muslim and
Buddhist groups that began late in 2012. Additionally,
reports of military strikes against rebels earlier in the year
have also contributed to doubts about Myanmar’s commitment to reform.
EF

BOOK REVIEW

Capitalism, Meet Politics
A CAPITALISM FOR THE PEOPLE
BY LUIGI ZINGALES
NEW YORK: BASIC BOOKS, 2012,
260 PAGES

REVIEWED BY DAVID A. PRICE

Economist Luigi Zingales of the University of
Chicago’s Booth School of Business is like a vocal
moviegoer watching a horror film for the second
time. He’s the audience member who calls out, “No. Not
the stairs. Don’t go up the stairs.”
Zingales, a native of Italy, argues in A Capitalism for the
People that American economic policy is moving in a dangerous direction, toward the “crony capitalism” that he says he
witnessed in his homeland. The term refers to an economy
in which companies prosper on the basis of their influence
with government officials rather than their ability to succeed in the marketplace. Companies with connections may
gain in the form of public contracts, subsidies, bailouts, or
regulatory protection against rivals. According to Zingales,
the system is well entrenched in Italy; in a survey to identify
the top factors in business success, managers there ranked
“knowledge of influential people” number one.
“It has robbed my home country of much of its potential
for economic growth,” Zingales writes. “I do not want it to
rob the United States as well.”
While crony capitalism in the United States is not
entirely new — Zingales cites congressional earmarks to
specific recipients, which became widespread beginning in
the 1980s, as an example — he contends that the movement
toward cronyism has quickened here in recent years. But
why was crony capitalism far slower to take root in the
United States than overseas in the first place? Zingales contends that in countries with significant Marxist political
parties, advocates of free-market policies had little choice
but to throw in their lot with large businesses that sought to
use government for their own ends. The United States did
not face that problem. In addition, he says the federal structure of the U.S. government, which grants sizable powers to
state and local governments, creates a check on cronyism by
forcing jurisdictions to compete with each other.
Zingales does not explicitly indicate when he believes
that this country’s resistance to crony capitalism started to
weaken. He regards the transition as having been well along
by the time of the 2007-2008 financial crisis and its aftermath, however. The crisis brought a series of policies that
Zingales regards as crony-capitalist in nature, including the
Troubled Asset Relief Program, or TARP, which he calls
“a pillage of defenseless taxpayers that benefited powerful


lobbies,” and parts of the Dodd-Frank Act of 2010.
The causes of the change, in his view, are many: Declining
real incomes, combined with the rise of a superstar or “winner-take-all” economy, bred a loss of faith in free markets. In
the financial sector, financial innovation made it easier to
hide implicit subsidies to institutions, while the growth of
the biggest banks made the concept of “too big to fail” more
plausible to policymakers. The federal government became
bigger, creating a greater incentive for businesses to loot it.
Indeed, echoing Gordon Tullock, a pioneer in “public
choice” analysis, Zingales says the rewards of rent-seeking
are so great that companies’ spending on lobbying and campaigns is, if anything, surprisingly low; the reason, he
suggests, may be that “they are still in their learning phase.”
The solutions he proposes center on curbing corporate
involvement in politics: not through increased regulation of
campaign finance, which he believes could not be effective
within the constraints of Supreme Court decisions, but
instead through measures that would curb lobbying. These
include enforcing antitrust laws with a view to restraining
not only the market power of firms, but also their political
power — for instance, by imposing limits on lobbying as a
condition of a merger that would create a politically powerful company. He favors a tax on lobbying. More ambitiously,
he wants a law banning public subsidies to businesses, with
enforcement through private class-action lawsuits. Looking
beyond law, Zingales wants business schools to imprint on
their students an aversion to “opportunistic actions that are
detrimental to society at large,” among them the practice of
lobbying for subsidies.
Although the economic benefits enjoyed by politically
connected firms have been known to economists at least
since George Stigler’s famous 1971 article “The Theory of
Economic Regulation,” A Capitalism for the People gives a
uniquely accessible and engaging account of the perils of
cronyism. If it has a weakness, it is in its less developed
policy prescriptions. What, for example, is limited by a limit
on “lobbying”? At times, Zingales seems to include any effort
to influence policy, even giving factual information to policymakers. Moreover, while teaching business students to
abhor subsidies may well be worth trying, it would seem that
most students have already made their ethical commitments
by the time they reach that point in their education.
Yet Zingales’ larger point is convincing: The most
durable defense against crony capitalism is not laws, but a
social consensus against it. If Zingales is right that cronyism
is on the rise, then such a consensus surely will be harder to
build once cronyism comes to be viewed, with resignation,
as simply business as usual.
EF

DISTRICT DIGEST

Economic Trends Across the Region

Leverage and the Fifth District Economy
BY R. ANDREW BAUER AND SONYA RAVINDRANATH WADDELL

The sluggish pace of the economic recovery in the past few years has been driven in part by unusually slow growth in consumption. Some economists have suggested that the rise in household indebtedness before the recession contributed to both the severity of the recession and the sluggishness of the recovery. These economists have analyzed the extent to which higher debt levels caused consumers to rein in spending during the recession and reduce consumption to lower outstanding debt (or “deleverage”) during the recovery, leading to a deeper recession and a more tepid economic recovery. Although many of these studies looking at national data have found that larger increases in leverage led to more severe deterioration in consumer spending and labor market conditions, the situation in the Fifth District economy during the recession seems to have been different.

Economic Theory and Consumer Debt
Standard economic models tell us that a household’s consumption is determined by its income, wealth, preferences, and return on savings. More complicated models will include a household’s ability to borrow or the economic uncertainty it faces. In the simplest models, debt does not exert an influence on consumption independent of other factors. Instead, all that matters in these models for the levels of consumption and savings at any point in time is the “permanent” lifetime wealth of a household. And for most of us, this is primarily the present discounted value of incomes over our lifetimes.
Yet there may be good reason to consider debt as an independent influence on household spending and saving decisions. For example, in a model where households try to keep their debt-to-income ratios under a target level, a decline in house prices that results in a fall in net worth could lead a household to reduce spending in order to pay down debt and move back to the leverage target. Indeed, this intuitive idea is captured in modern models of consumption and savings. Christopher Carroll of Johns Hopkins University outlined a model in a 1992 article in the Brookings Papers on Economic Activity in which households not only target a “buffer-stock” of wealth, but respond to increases in uncertainty (say, a greater risk of becoming unemployed) by attempting to move their financial wealth to higher target levels. (See his interview in this issue of Econ Focus, page 30.) In addition, the model suggests that consumers are reluctant to increase debt and that they become uncomfortable holding previously assumed debt when there is an increase in labor market uncertainty. As a result, when faced with greater uncertainty, consumers are more likely to reduce consumption in order to raise their level of wealth and/or reduce debt levels.
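The mechanics of that buffer-stock story can be sketched in a few lines of code. The Python toy below is an illustration only, not Carroll’s 1992 model; the target ratio, the adjustment speed, and the link from unemployment risk to the wealth target are all invented assumptions.

    import random

    def simulate_buffer_stock(periods=50, target_ratio=0.4, adjust_speed=0.25,
                              income=1.0, unemployment_risk=0.05, seed=1):
        """Toy buffer-stock saver. Each period the household plans to close a
        fraction of the gap between its wealth and a target buffer; a higher
        perceived unemployment risk raises the target itself (a stylized
        stand-in for the mechanism described in the text)."""
        random.seed(seed)
        wealth = 0.0
        path = []
        for t in range(periods):
            # Invented link: the wealth target scales up with unemployment risk.
            target = target_ratio * income * (1.0 + 10.0 * unemployment_risk)
            realized_income = 0.0 if random.random() < unemployment_risk else income
            cash_on_hand = wealth + realized_income
            planned_wealth = wealth + adjust_speed * (target - wealth)
            # Consume whatever cash on hand is not earmarked for the buffer
            # (floored at zero; this toy allows no borrowing).
            consumption = max(cash_on_hand - planned_wealth, 0.0)
            wealth = cash_on_hand - consumption
            path.append((t, round(wealth, 3), round(consumption, 3)))
        return path

    calm = simulate_buffer_stock(unemployment_risk=0.05)
    anxious = simulate_buffer_stock(unemployment_risk=0.10)
    # The anxious household targets a bigger buffer, so its early-period
    # consumption is lower while it rebuilds wealth, which is the
    # qualitative response the model implies.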

Leverage and Economic Outcomes


In the papers that have emerged from the housing crisis and
its aftermath, there has been evidence that increases in
household leverage were a driving factor in the consumption
decline from 2007-2009. Atif Mian of the University of
California, Berkeley and Amir Sufi of the University of
Chicago have written a number of papers documenting
the rise in debt to income in U.S. counties and analyzing
the relationship between leverage and other economic outcomes. In a 2010 working paper, they found that household
leverage predicts variation in mortgage default, house price
movements, unemployment, residential investment, and
durable goods consumption from 2007 to 2009.
The recession began earlier and became more severe in counties with high leverage growth than in counties with low leverage growth.

[Chart: Leverage Ratios in Select States. Total debt-to-income ratio, 1999-2012, for CA, WV, SC, NC, VA, MD, FL, and DC. Sources: Federal Reserve Bank of New York Consumer Credit Panel/Equifax, Bureau of Economic Analysis/Haver Analytics, with calculations by the Federal Reserve Bank of Richmond.]

In a 2011 paper with Kamalesh Rao of MasterCard Advisors, Mian and Sufi argued that households in high-leverage counties experienced a severe shock to their balance sheets in 2007 and 2008 as house prices in those areas declined, in aggregate, by almost 30 percent. This balance sheet shock was followed by a significant drop in consumption. They concluded that a one-standard-deviation increase in household leverage as of 2006 was associated, all else equal, with a 9 percent to 13 percent drop in durable goods consumption and a 5 percent to 8 percent drop in nondurable goods consumption.

[Chart: Total Debt Balance and Its Composition: Fifth District. Trillions of dollars, 1999-2012, by loan type (mortgage, HE revolving, auto loan, credit card, student loan, other). Source: Federal Reserve Bank of New York Consumer Credit Panel/Equifax, with calculations by the Federal Reserve Bank of Richmond.]

In a 2012 working paper, Karen Dynan of the Brookings Institution examined whether households with the greatest mortgage leverage reduced their spending the most. She found that following the collapse of real estate prices, highly leveraged households had larger declines in spending than their less-leveraged counterparts despite having smaller changes in net worth, suggesting that their mortgage leverage weighed on consumption above and beyond what would have been predicted by wealth effects alone.
Despite the findings of these papers, there is not unanimous agreement on the relationship between debt and consumption, independent of other variables. In a 2012 Public Policy Brief, Daniel Cooper of the Federal Reserve Bank of Boston defined deleveraging as a deliberate household balance sheet adjustment that lowers consumption beyond what would be predicted by changes in income and wealth. In his analysis, he found little evidence that deleveraging has had a sizeable effect on U.S. consumer spending. He wrote that consumption changes prior to, during, and following the Great Recession are consistent with those implied by fluctuations in household income and net worth using standard economic relationships. In fact, Cooper argued that households potentially underspent relative to income and net worth during the housing boom and overspent since the recession began.
The empirical literature, in short, provides somewhat diverging evidence concerning the role of leverage in the recession and recovery.

Leverage and Economic Outcomes in States
The trend in the ratio of household debt to disposable personal income preceding and during the recession of 2007-09 has been well documented. According to Federal Reserve Board Flow of Funds Accounts (FF) data, household debt peaked at the end of 2007 at almost 130 percent of disposable personal income, declined abruptly, then rebounded to almost that high in the first quarter of 2009 before beginning a steady decline. The Equifax data (FRBNY CCP/Equifax) indicate a slightly lower peak — at about 115 percent — but the same rebound in the second half of 2008 before a steady decline beginning in the first quarter of 2009.
The Fifth District experienced a similar rise in debt, with the highest-leverage states of Virginia and Maryland peaking in the first quarter of 2009 at just more than 120 percent of total personal income. (See chart.) California and Florida are included in the chart to contrast the experiences of these states with those of the Fifth District states. (Because of data availability, we use total personal income for states rather than disposable personal income, which is the standard for U.S. debt-to-income calculations. If we use total personal income from FF data as our denominator in the U.S. leverage calculation, the debt-to-income ratio peaks at about 103 percent.) In the end, states such as California, Florida, Nevada, and Arizona, which saw the largest real estate losses, have probably played a significant role in the findings of most of the empirical papers written on both household credit conditions and mortgage default in the past five years.
Not surprisingly, most of the rise in debt within the Fifth District was mortgage debt. (See chart.) From 1999 to the peak in the third quarter of 2008, total outstanding debt in the Fifth District rose by more than $800 billion, an increase of 166 percent. Nearly 80 percent of that increase was a rise in mortgage debt (either first mortgage or home equity installment loans). Although student loan debt rose more than fivefold, it still made up only 4.4 percent of the increase in debt and only just more than 3 percent of total debt at the end of 2008. These numbers are remarkably similar to those for the United States as a whole, where almost 77 percent of the $7.8 trillion debt increase from 1999 through the third quarter of 2008 was from rising mortgage debt.
Meanwhile, the decline in debt that occurred from the third quarter of 2008 to the first quarter of 2012 was also driven by a decline in mortgage debt. While total debt in the Fifth District fell by about $43 billion over the period (despite a $45 billion increase in student loan debt), outstanding mortgage debt dropped by over $52 billion. This decline was driven by the unprecedented number of foreclosures and mortgage write-downs that occurred over this period. In the United States as a whole, the decline in outstanding mortgage debt accounted for 98 percent of the $1 trillion net decline in outstanding debt from 2008 to 2012.
Because mortgage debt makes up most of total household debt, it makes sense to start with housing markets when analyzing the effect of leverage on the broader economy of a state or locality. In fact, at the state level, there was a strong relationship between leverage and housing outcomes in the recession. The correlation between all U.S. states’ debt to
income in the fourth quarter of 2006 and the change in
house prices from 2007 through 2009 was -0.72, reflecting a
strong and statistically significant negative relationship.
The higher the debt-to-income level in 2006, the sharper
the decline in house prices from 2007 through 2009. In
addition, the states with the sharpest declines from 2007 to
2009 were also those with the biggest increases prior to
2007. In other words, the states that saw the sharpest house
price growth in the years before 2007 (boom) were the states
where homebuyers took on considerably more debt to buy a
house and where house prices fell the most sharply (bust).
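The figure reported here is an ordinary correlation coefficient computed across states. As a minimal Python sketch, using invented numbers rather than the actual state data and assuming the standard Pearson measure:

    from statistics import correlation  # available in Python 3.10+

    # Invented numbers for illustration only (not the actual state data):
    # leverage = debt-to-income (percent) as of Q4 2006, one entry per state;
    # hp_change = percent change in house prices, 2007 through 2009.
    leverage = [180, 150, 120, 100, 90]
    hp_change = [-28, -15, -6, -2, 1]

    r = correlation(leverage, hp_change)
    print(f"Pearson correlation: {r:.2f}")
    # Prints a strongly negative value, the same sign and flavor as the
    # -0.72 reported for the full set of states.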
Leverage also appears to be correlated with labor market
conditions at the state level. With a statistically significant
correlation of -0.57, the data indicate that among states, the
higher the average leverage in 2006, the deeper the labor
market deterioration from 2007 to 2009. This is similar to
the county-level result found in the empirical work of Mian
and Sufi. Of course, correlation does not indicate causation;
it is reasonable to think that the housing boom in states like
California or Nevada resulted in both higher debt levels and
a housing crash that hurt labor markets. In other words, it is
possible that the only mechanism through which debt levels
affected employment was through the housing market and
that the driving force in the relationship between leverage
and labor markets was the few states that saw sharp booms
and busts. Indeed, the relationship between leverage and
employment is the strongest in California, Arizona, Nevada,
and Florida; excluding those four states alone pushes the
correlation between leverage in 2006 and change in employment from 2007 to 2009 to -0.40, in addition to a reduction
in statistical significance. This is not solely the result of a
change in sample size — the correlation does not decline to
the same extent with the exclusion of any other four states.
In fact, the relationship between leverage and labor markets
strengthens with the exclusion of four Fifth District states:
Virginia, Maryland, North Carolina, and South Carolina.
The relationship between leverage and house prices, however, is relatively consistent. Looking at state-level data, then,
it appears that the states with the largest housing busts
strongly influence the overall relationship between leverage
and employment.

Counties in the Fifth District
Household debt-to-income levels vary considerably across
counties in the Fifth District. Since Maryland and Virginia
experienced the greatest increase in household leverage
prior to the recession, it is not surprising that the majority
of highly leveraged counties in the District are in those
two states. The Carolinas and West Virginia had markedly
fewer highly leveraged counties than Virginia and Maryland.
In fact, in the fourth quarter of 2006, 18 of the 25 most
leveraged counties or cities were located in Virginia or
Maryland. (For a corresponding table, please see this article
on our website at www.richmondfed.org/publications.)
Leverage varied notably within both states, however. Prince
George’s and Charles counties in Maryland had the highest

debt-to-income levels, with 240 percent and 230 percent,
respectively. In contrast, the Maryland counties with the
lowest debt-to-income levels were Allegany County
(110 percent) and Garrett County (120 percent). Although
county debt-to-income levels were generally higher in
Maryland than in Virginia, there were a number of Northern
Virginia counties with extremely high levels. Prince William
and Loudoun counties, for example, had the highest percentages at 280 percent and 260 percent, respectively. At the
same time, there were a number of Virginia counties with
debt-to-income percentages below 100 percent.
So what are some explanations for the differences in
leverage across Fifth District counties? As already illustrated
in the state analysis, local housing market conditions play an
important role in household debt levels. Those areas of the
District where home prices rose the fastest also experienced
the greatest increase in leverage due to higher levels of
mortgage debt. This partly explains the higher leverage
percentages in Northern Virginia and Maryland counties
and cities. The average increase in home prices across
Maryland counties was 132 percent from the beginning of
2001 through the fourth quarter of 2006. In Northern
Virginia, home prices increased by 162 percent in Prince
William County and by 123 percent and 135 percent in
neighboring Loudoun and Stafford Counties, respectively.
In contrast, in those counties where there was less increase
in leverage, home price increases over the period were more
moderate. Home prices rose by roughly 40 percent across
counties in North Carolina, 49 percent in South Carolina,
and 73 percent in West Virginia, for example. For the entire
Fifth District, the correlation between home price changes
from the first quarter of 2001 to the fourth quarter 2006 and
increases in debt-to-income ratios over the same period was
fairly strong at 0.51.
A related factor that likely influenced household debt
levels was the strength of the local economy. In other words,
just as standard models suggest that increases in unemployment risk lead to attempts by households to reduce debt,
they imply that stable employment prospects allow
households to carry debt. Although Northern Virginia and
Maryland counties saw the sharpest housing boom and bust
in the Fifth District, these counties also have strong labor
markets, with lower unemployment rates, higher job growth,
and greater income growth than other areas. For example,
the unemployment rate in the fourth quarter of 2006 was
3.8 percent in Maryland and just 2.1 percent in Northern
Virginia — compared to 4.8 percent, 6.2 percent, and
4.4 percent in North Carolina, South Carolina and West
Virginia, respectively. Stronger labor markets and income
prospects may have helped households to assume greater
debt loads than those in areas with weaker labor markets and
income prospects. In fact, the correlation between debt-to-income levels and unemployment rates in the fourth quarter
of 2006 was -0.43, indicating a relatively strong negative
relationship between labor market conditions and household leverage.

Fifth District counties that experienced an increase in leverage from 2000 to 2006 were more likely to experience a sharp fall in home prices between 2006 and 2009. The correlation between the two was -0.54. This is a relationship found in previous empirical studies, such as in Mian and Sufi’s 2010 paper. Unlike in that study, however, increases in leverage in the Fifth District did not seem to have a negative impact on housing construction. The correlation between leverage increase from 2000 to 2006 and the change in housing permits from 2006 to 2009 was slightly positive (0.20). Once the recession began, the decline in construction activity was relatively widespread across the District with little distinction between counties with high or low leverage.
In fact, an initial look at household finances suggests that household leverage did not have a considerable impact on the Fifth District economy during the recession. Some of the studies cited earlier found that increases in leverage led to more severe declines in consumer spending and labor market conditions during the recession. Those effects were not readily evident in the data for the Fifth District, however. The correlation between an increase in leverage between 2000 and 2006 and the change in the unemployment rate from the end of 2006 through 2009 was negative — opposite the result found in some studies. There were a number of counties in the Fifth District that had large increases in leverage prior to the recession, yet relatively smaller increases in the unemployment rate during the recession. In addition, the relationship between increasing leverage and the change in employment between the end of 2006 and the end of 2009 was positive, suggesting that areas with greater increases in leverage prior to the recession also had stronger employment conditions. There seems to be no relationship between changes in leverage and the change in the number of establishments between 2006 and 2009. Increases in leverage within the Fifth District may have reflected stronger local economies and income prospects in addition to rising home values and increased mortgage debt associated with the housing boom. As a consequence, when the housing market collapsed and the recession began, the impact of higher levels of consumer indebtedness was partially buffered by a more resilient local economy and relatively stronger income prospects.
In fact, labor market conditions worsened to a greater extent in low-leverage counties (the bottom decile of counties, by debt to income) during the recession than in high-leverage counties (the top decile of counties, by debt to income). The average increase in the unemployment rate from the end of 2006 to the end of 2009 was roughly 6 percentage points in low-leverage counties, while in high-leverage counties, the increase was 4 percentage points. (See chart.)

[Chart: Change in Unemployment Rate for High- and Low-Leverage Counties. Percentage point change from Q4 2006, 2000-2012. The top 10 percent of counties by leverage ratio as of Q4 2006 are considered “high-leverage”; the bottom 10 percent are considered “low-leverage.” Sources: Bureau of Labor Statistics, Federal Reserve Bank of New York Consumer Credit Panel/Equifax, Bureau of Economic Analysis/Haver Analytics, with calculations by the Federal Reserve Bank of Richmond.]

Housing construction, as measured by housing permits, showed little difference between high- and low-leverage counties during the recession. Prior to 2006, permit activity was considerably higher in the high-leverage counties, reflecting the heightened activity during the housing boom, while for low-leverage counties the level of activity was only moderately higher in 2003 to 2005. During the recession, in contrast, the decline in permits for both high- and low-leverage counties was fairly similar in depth and duration. This is notable given the very different path of home prices. As expected, home prices rose more quickly and fell considerably faster and further in high-leverage counties than in low-leverage counties.

Conclusion
Debt-to-income levels varied considerably across the Fifth District during the recession and recovery, driven in part by changes in housing market conditions as well as by the strength of local economic conditions. Not surprisingly, those areas within the Fifth District that experienced large house price increases also experienced sharper increases in mortgage debt and leverage. Increases in leverage did not necessarily translate to a more severe downturn during the recession, however. In fact, some areas that experienced the largest increase in leverage were areas with relatively stronger economic performance. Further work will continue to investigate the robustness of these initial observations as well as contrast these observations with some of the previous empirical findings. EF




STATE DATA, Q3:12

                                                     DC         MD         NC         SC         VA         WV
Nonfarm Employment (000s)                         733.8    2,573.8    3,954.0    1,853.2    3,721.4      750.4
  Q/Q Percent Change                               -0.5       -0.1        0.0        0.1        0.0       -1.0
  Y/Y Percent Change                                0.9        0.9        0.8        0.9        1.1       -0.8
Manufacturing Employment (000s)                     1.0      110.6      438.1      223.4      226.9       47.1
  Q/Q Percent Change                               -6.3       -0.9        0.4        0.4       -1.1       -1.9
  Y/Y Percent Change                               -6.3       -2.6        0.7        2.3       -1.0       -5.0
Professional/Business Services Employment (000s)  148.5      412.6      522.4      233.5      667.0       63.3
  Q/Q Percent Change                               -1.7        1.3        0.8       -0.6        0.3       -0.2
  Y/Y Percent Change                               -0.8        3.9        1.9        1.7        1.1        1.0
Government Employment (000s)                      242.0      510.5      691.8      342.5      710.1      150.3
  Q/Q Percent Change                               -1.7        0.4       -1.4        0.5       -0.8       -1.5
  Y/Y Percent Change                               -0.9        0.4       -0.5        0.0        0.3       -1.5
Civilian Labor Force (000s)                       355.0    3,076.0    4,656.7    2,136.5    4,328.9      799.5
  Q/Q Percent Change                                0.9       -0.3       -0.1       -0.7       -0.2       -0.6
  Y/Y Percent Change                                3.7        0.2        0.0       -1.0        0.4        0.1
Unemployment Rate (%)                               8.8        7.0        9.6        9.5        5.9        7.5
  Q2:12                                             9.3        6.8        9.4        9.1        5.6        6.9
  Q3:11                                            10.5        7.2       10.7       10.4        6.4        8.1
Real Personal Income ($Mil)                    40,582.5  262,901.7  309,769.4  139,743.3  331,564.2   55,428.0
  Q/Q Percent Change                               -0.1        0.1        0.0        0.3        0.1       -0.2
  Y/Y Percent Change                                1.5        1.4        1.4        2.0        1.2        1.6
Building Permits                                  1,302      3,724     11,442      4,611      6,653        450
  Q/Q Percent Change                               30.7       12.1       -5.5      -15.9       -3.0      -22.9
  Y/Y Percent Change                               46.5       15.4       45.9       26.7        8.0      -13.0
House Price Index (1980=100)                      587.3      407.5      301.4      305.4      396.9      216.0
  Q/Q Percent Change                                1.3        1.2        0.9        0.7        1.0        0.9
  Y/Y Percent Change                                3.5       -1.1       -1.5       -0.7       -0.3        0.6

Nonfarm Employment

Unemployment Rate

Real Personal Income

Change From Prior Year

First Quarter 2002 - Third Quarter 2012

Change From Prior Year

First Quarter 2002 - Third Quarter 2012

First Quarter 2002 - Third Quarter 2012

8%
7%
6%
5%
4%
3%
2%
1%
0%
-1%
-2%
-3%
-4%
-5%
-6%

10%

4%
3%
2%
1%
0%
-1%
-2%
-3%
-4%
-5%
-6%

9%
8%
7%
6%
5%
4%
3%
02

03 04

05

06

07

08 09

10

11

02

12

03 04

05

06

07

08 09

10

11

Fifth District

12

02

03 04

05

06

07

08 09

Unemployment Rate
Metropolitan Areas

Building Permits

Change From Prior Year

Change From Prior Year

First Quarter 2002 - Third Quarter 2012

First Quarter 2002 - Third Quarter 2012

First Quarter 2002 - Third Quarter 2012

03 04

05

Charlotte

06

07

08 09

Baltimore

10

30%
20%
10%
0%
-10%
-20%
-30%
-40%
-50%

Washington

03 04

05

06

Charlotte

07

08 09

Baltimore

10

FRB—Richmond
Manufacturing Composite Index

First Quarter 2002 - Third Quarter 2012

First Quarter 2002 - Third Quarter 2012

30

30
20

20

10

11

12

0

-50
03 04

05

06

07

08 09

10

11

12

08 09

United States

First Quarter 2002 - Third Quarter 2012

-30

02

07

16%
14%
12%
10%
8%
6%
4%
2%
0%
-2%
-4%
-6%
-8%

-40

-30

06

Change From Prior Year

-20
-20

05

House Prices

-10

-10

03 04

Fifth District

0

10

02

Washington

FRB—Richmond
Services Revenues Index

40

11 12

40%

02

11 12

10

Change From Prior Year

13%
12%
11%
10%
9%
8%
7%
6%
5%
4%
3%
2%
1%
02

11 12

United States

Nonfarm Employment
Metropolitan Areas
7%
6%
5%
4%
3%
2%
1%
0%
-1%
-2%
-3%
-4%
-5%
-6%
-7%
-8%

10

02

03 04

05

06

07

08 09

10

11

12

02

03 04

05

Fifth District

06

07

08 09

10

11 12

United States

NOTES:
1) FRB-Richmond survey indexes are diffusion indexes representing the percentage of responding firms reporting increase minus the percentage reporting decrease. The manufacturing composite index is a weighted average of the shipments, new orders, and employment indexes. (A short illustrative sketch of this arithmetic appears after the sources below.)
2) Building permits and house prices are not seasonally adjusted; all other series are seasonally adjusted.

SOURCES:
Real Personal Income: Bureau of Economic Analysis/Haver Analytics.
Unemployment rate: LAUS Program, Bureau of Labor Statistics, U.S. Department of Labor, http://stats.bls.gov.
Employment: CES Survey, Bureau of Labor Statistics, U.S. Department of Labor, http://stats.bls.gov.
Building permits: U.S. Census Bureau, http://www.census.gov.
House prices: Federal Housing Finance Agency, http://www.fhfa.gov.
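To make the arithmetic in note 1 concrete, here is a minimal sketch in Python. The survey response counts and the equal component weights are hypothetical, chosen only for illustration; they are not the Richmond Fed's actual survey data or weights.

def diffusion_index(increase: int, decrease: int, no_change: int) -> float:
    """Percent of firms reporting increase minus percent reporting decrease."""
    total = increase + decrease + no_change
    return 100.0 * (increase - decrease) / total

# Hypothetical responses from 200 surveyed manufacturers.
shipments  = diffusion_index(increase=80, decrease=60, no_change=60)  # +10.0
new_orders = diffusion_index(increase=70, decrease=70, no_change=60)  #   0.0
employment = diffusion_index(increase=50, decrease=90, no_change=60)  # -20.0

# The composite is a weighted average of the three component indexes;
# equal weights are assumed here purely for illustration.
composite = (shipments + new_orders + employment) / 3.0
print(f"Manufacturing composite index: {composite:+.1f}")  # prints -3.3

A reading above zero means more firms reported increases than decreases; the hypothetical composite of -3.3 would indicate modest contraction on balance.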


METROPOLITAN AREA DATA, Q3:12

                              Washington, DC   Baltimore, MD   Hagerstown-Martinsburg, MD-WV
Nonfarm Employment (000s)          2,461.8         1,298.5          99.4
  Q/Q Percent Change                  -0.3            -0.6          -0.8
  Y/Y Percent Change                   1.2             0.3           1.8
Unemployment Rate (%)                  5.4             7.3           7.9
  Q2:12                                5.5             7.3           8.0
  Q3:11                                5.9             7.4           9.0
Building Permits                     5,534           1,742           199
  Q/Q Percent Change                  -4.4            11.5          30.1
  Y/Y Percent Change                  14.6            35.6          27.6

                              Asheville, NC    Charlotte, NC   Durham, NC
Nonfarm Employment (000s)            170.8           835.2         277.6
  Q/Q Percent Change                  -0.3            -0.6           0.2
  Y/Y Percent Change                   1.5             1.2           2.1
Unemployment Rate (%)                  7.8             9.7           7.5
  Q2:12                                7.8             9.5           7.6
  Q3:11                                8.5            11.0           8.2
Building Permits                       384           3,150         1,172
  Q/Q Percent Change                   4.1             0.9         149.4
  Y/Y Percent Change                  10.0            68.6         122.4

                              Greensboro-High Point, NC   Raleigh, NC   Wilmington, NC
Nonfarm Employment (000s)            345.9           522.6         134.7
  Q/Q Percent Change                  -0.7             0.4          -1.1
  Y/Y Percent Change                   2.1             2.8          -2.3
Unemployment Rate (%)                  9.9             7.6          10.0
  Q2:12                                9.8             7.8           9.8
  Q3:11                               11.1             8.7          10.9
Building Permits                       360           2,581           839
  Q/Q Percent Change                 -24.5           -14.8          25.0
  Y/Y Percent Change                 -20.2            81.9          62.9

                              Winston-Salem, NC   Charleston, SC   Columbia, SC
Nonfarm Employment (000s)            203.4           301.8         352.1
  Q/Q Percent Change                  -1.3             0.2          -0.4
  Y/Y Percent Change                  -1.1             2.1           2.3
Unemployment Rate (%)                  9.0             7.7           8.3
  Q2:12                                9.0             7.7           8.1
  Q3:11                                9.9             8.6           9.1
Building Permits                       174             978           895
  Q/Q Percent Change                 -65.5           -48.2         -22.8
  Y/Y Percent Change                 -49.6            26.5          25.2

                              Greenville, SC   Richmond, VA   Roanoke, VA
Nonfarm Employment (000s)            300.8           616.1         155.8
  Q/Q Percent Change                  -1.0            -0.4          -0.2
  Y/Y Percent Change                  -1.2             1.1           0.0
Unemployment Rate (%)                  7.9             6.4           6.1
  Q2:12                                7.7             6.2           6.0
  Q3:11                                8.6             7.0           6.6
Building Permits                       585           1,244            94
  Q/Q Percent Change                  -1.5            33.5         -22.3
  Y/Y Percent Change                  41.6            44.0          -4.1

                              Virginia Beach-Norfolk, VA   Charleston, WV   Huntington, WV
Nonfarm Employment (000s)            748.3           147.9         115.0
  Q/Q Percent Change                   0.1            -0.4          -0.9
  Y/Y Percent Change                   0.8            -0.5           2.3
Unemployment Rate (%)                  6.5             6.9           7.1
  Q2:12                                6.4             6.5           7.2
  Q3:11                                7.1             7.3           8.3
Building Permits                     1,475              39            12
  Q/Q Percent Change                  18.2           -18.8          33.3
  Y/Y Percent Change                   4.7            -9.3         -42.9

For more information, contact Jamie Feik at (804) 697-8927 or e-mail Jamie.Feik@rich.frb.org


OPINION

On Economic History and Humility
BY JOHN A. WEINBERG

Economists have been arguing about the ability of monetary policy to affect real economic activity when the Fed has already pushed short-term interest rates close to zero, a condition known as the “zero lower bound.” This debate remains important as the recovery from the Great Recession continues to yield disappointing results in terms of economic activity and, especially, employment — results that have persisted despite a lengthy period of accommodative monetary policy.

For many Fed leaders, including Federal Reserve Board Governors and presidents of Federal Reserve Banks, this state of affairs underscores the limits of our knowledge about the effects of unconventional monetary policy; moreover, it counsels in favor of a degree of humility when considering the course of future policy. For example, Chairman Bernanke, while supporting continued action by the Fed to attempt to stimulate the economy through monetary policy, noted in his December press conference that “we are now in the world of unconventional policy that has both uncertain costs and uncertain efficacy or uncertain benefits.”

“Disappointing results in terms of economic activity underscore the limits of our knowledge about the effects of unconventional monetary policy.”

I find this point of view persuasive — but it has generated controversy. Some observers, such as Christina and David Romer of the University of California, Berkeley, have pointed to past instances in which the Fed’s monetary policy was, in their view, the product of too much humility. By this, they mean that policy was not sufficiently aggressive in one direction or another, and Fed leaders justified their restraint on the basis of doubts about the likely costs and benefits of more ambitious moves. The first of these episodes is the early Great Depression period of 1929-1933, when the Fed rejected monetary expansion and, in fact, allowed the money stock to fall by 26 percent. The second is the inflationary 1970s (prior to the chairmanship of Paul Volcker), when Fed leaders believed that the rising price levels of the time could not be tamed through contractionary policy.

These critics of the Fed draw a line from the early Depression and the 1970s to Fed policy of the past several years and to the cautionary public statements of Fed policymakers during that period. They have cited, for example, Chairman Bernanke’s statement in October that “monetary policy is not a panacea,” and the statements of Reserve Bank presidents at various times, including Richmond Fed President Jeffrey Lacker, that further accelerating monetary expansion would increase the risk of inflation.

Comparisons across historical episodes can be instructive, and are in fact essential if policymaking is to improve over time. But in this regard, the differences between episodes are at least as important as the similarities. A key difference between the earlier episodes and our more recent experience is in the behavior of prices. In both of the earlier periods, the doubts expressed by some policymakers and other observers about the Fed’s ability to have an effect included doubts about its ability to affect the path of the price level — to stem the deflation of the early 1930s or the inflation of the 1970s.

In the early stages of the Great Depression, many saw the gold standard as taking the control of the price level entirely out of the hands of the Fed’s monetary policy. In the 1970s, inflation was seen as being driven by an array of nonmonetary forces, and many thought that monetary action to bring down inflation would have unacceptably high costs in terms of economic activity and employment. By contrast, the consensus today is that monetary policy most certainly can increase or decrease nominal price levels. Indeed, the Fed has since taken pains to maintain credibility regarding inflation, recognizing that only monetary policy can affect the general level of prices over time.

The question now, rather, is the extent to which the central bank can affect real activity — particularly employment — without putting its hard-won credibility for price stability at risk. Many observers favor continued monetary expansion on a large scale, on the belief that economic slack will restrain any incipient inflationary pressures. Others argue that, in view of the magnitude of the monetary and fiscal policy tools that have already been employed, it is not clear that the Fed can remedy the situation, while avoiding other hazards to the economy, by increasing what it has already been doing. Given that people on both sides of the issue are necessarily reaching their conclusions on the basis of limited information about the use of unconventional tools, it is appropriate that all of us do so with an awareness of the limitations of what we know. But such prudence does not reflect doubts about the ability of monetary policy to affect inflation.

Far from believing that monetary policy doesn’t matter, as critics have suggested, Chairman Bernanke and others involved with monetary policymaking have acted both with boldness and with circumspection precisely because they are mindful of the power of monetary policy — power that has led to both good and bad results in history. Responsible leaders owe the public nothing less.
EF

John A. Weinberg is senior vice president and director of research at the Federal Reserve Bank of Richmond.

NEXT ISSUE

Is China Cheating at Trade?
Many trade economists think China holds its currency artificially weak relative to the dollar to make its exports cheaper for the rest of the world. Economists agree less on the policy’s effect on U.S. trade, jobs, and consumers — and therefore what, if anything, should be done about it. Regardless, the policy may be brought to an end by the mounting imbalances it creates for China itself.

Economic History
Economic history isn’t always the history of progress. In the decades leading up to the Civil War, the domestic slave trade flourished in Virginia. Supply and demand for human assets facilitated the rapid transfer of slave labor to cotton-producing states in the Deep South and the continuation of the “peculiar institution.”

Caring for the Mentally Ill
Recent tragedies raise questions about how society cares for people with serious mental disorders. Significant changes in the provision of care over the last 50 years reveal that mental illness makes the economics of health care even more complex. Have we gotten closer to understanding how society should devote resources to mental health?

Federal Reserve
The Fed has relied on a variety of rules to guide its monetary policy over the years. Rules help set the market’s expectations of future monetary actions, but not all of them have resulted in optimal economic performance. In many ways, the history of the Fed can be viewed as the search for the best monetary rule — one that helps meet the central bank’s goals of “maximum employment, stable prices, and moderate long-term interest rates.”

Green Jobs
Policies to promote “green jobs” seem like an obvious solution to both the nation’s high unemployment rate and growing concerns about global climate change. But defining a green job is a difficult task, and it’s not certain that promoting such jobs is the most efficient way to help either the environment or the economy.

Pass Along Econ Focus
To subscribe to the magazine or request additional
copies, please call Research Publications at
(800) 322-0565 or email us at
research.publications@rich.frb.org

The Profession
Female economists have won high-profile awards in recent years, but they’re underrepresented in the field as a whole by some measures. Women hold only one in eight tenured professor positions, and earn only one-third of economics Ph.D.s, even though women earn the majority of U.S. doctoral degrees overall. In the first installment of this new department, we ask: Why do relatively few women enter the field?

Visit us online:
www.richmondfed.org
• To view each issue’s articles and Web-exclusive content
• To view related Web links of additional readings and references
• To add your name to our mailing list
• To request an email alert of our online issue posting


To subscribe or make subscription changes, please email us at research.publications@rich.frb.org or call 800-322-0565.