
THE FEDERAL RESERVE BANK OF RICHMOND
VOLUME 15, NUMBER 3
THIRD QUARTER 2011

COVER STORY

12 Why Aren’t We Creating More Jobs? Job growth usually rebounds quickly after a severe recession, but this time is different
The United States has gained a little more than 1 million jobs since the end of the most recent downturn — far from the number needed to put 14 million people back to work. Given the factors holding back job growth, traditional policy tools might not be able to offer a solution.

FEATURES

17 Don’t Know Much About Financial Literacy: In this classroom, the right choice may be (d) all of the above
The financial crisis revealed a widespread need for better financial education. What have we learned since then about how to improve financial literacy?

20 The End of Nowhere: What can ghost towns teach us about saving small communities?
With only seven residents, Thurmond, W.Va., is the smallest incorporated town in the Fifth District. Once bustling and prosperous, Thurmond has become practically a ghost town. Is the fate of places like Thurmond a tragedy or a “monument to American dynamism”?

24 The Fish Market: What happened when Virginia brought tradable quotas to the commons?
Resources without ownership — commons — are easily exploited, even wiped out. Individual fishermen have little incentive to conserve as long as others are busy catching. Evidence from a Virginia program suggests that if fishermen can buy and sell shares of the fishing quota, the fish and the fishermen may be better off.

28 A New Kind of Farm
“Server farms” run by Apple, Google, and Facebook won’t replace manufacturing jobs lost in western North Carolina. But these massive data centers may bring other benefits.

31 Toil and Trouble for Revenue Forecasters: Greater sensitivity to business cycles has made state tax revenues more difficult to predict
State revenue forecasts have become progressively less reliable following each of the past three recessions. As capital gains become more unpredictable, tax revenue experts advise states to build larger rainy-day funds.

34 Volatility at the Pump: Where do high gas prices come from?
Gas prices have fluctuated over the past few years. What drives prices at the pump? Two-thirds of the price is determined by the price crude oil brings on the world market.

DEPARTMENTS

1 President’s Message/Is Joblessness Now a Skills Problem?
2 Upfront/Regional News at a Glance
6 Federal Reserve/The Dodd-Frank Act and Insolvency 2.0
9 Policy Update/Incentives for Greener Transportation
10 Jargon Alert/Utility
11 Research Spotlight/Ties that Bind
36 Interview/Derek Neal
41 Around the Fed/The Uncertain Effects of Economic Uncertainty
42 Economic History/Wartime Wilmington
46 Book Review/A Great Leap Forward
47 District Digest/Economic Trends Across the Region
56 Opinion/A Focused Approach to Financial Literacy

PHOTOGRAPHY: GETTY IMAGES

Our mission is to provide authoritative information and analysis about the Fifth Federal Reserve District economy and the Federal Reserve System. The Fifth District consists of the District of Columbia, Maryland, North Carolina, South Carolina, Virginia, and most of West Virginia. The material appearing in Region Focus is collected and developed by the Research Department of the Federal Reserve Bank of Richmond.

DIRECTOR OF RESEARCH
John A. Weinberg

EDITOR
Aaron Steelman

SENIOR EDITOR
David A. Price

MANAGING EDITOR
Kathy Constant

STAFF WRITERS
Renee Haltom
Betty Joyce Nash
Jessie Romero

EDITORIAL ASSOCIATE
Tim Sablik

CONTRIBUTORS
R. Andrew Bauer
Jake Blackwood
Charles Gerena
Karl Rhodes
Louis Sears
Sonya Ravindranath Waddell

DESIGN
BIG (Beatley Gravitt, Inc.)

Published quarterly by the Federal Reserve Bank of Richmond
P.O. Box 27622
Richmond, VA 23261
www.richmondfed.org

Subscriptions and additional copies: Available free of charge through our Web site at www.richmondfed.org/publications or by calling Research Publications at (800) 322-0565.

Reprints: Text may be reprinted with the disclaimer in italics below. Permission from the editor is required before reprinting photos, charts, and tables. Credit Region Focus and send the editor a copy of the publication in which the reprinted material appears.

The views expressed in Region Focus are those of the contributors and not necessarily those of the Federal Reserve Bank of Richmond or the Federal Reserve System.

ISSN 1093-1767

PRESIDENT’S MESSAGE
Is Joblessness Now a Skills Problem?
Today, long-term unemployment — that is, unemployment lasting six months or longer — is at a
record high. The share of unemployed Americans
whose job searches have lasted this agonizingly long is
43.1 percent, a figure that is unprecedented since the
Bureau of Labor Statistics began keeping these records
in 1948.
A growing number of observers have argued that this
state of affairs is caused in significant part by a mismatch
between available jobs and available workers, especially a
mismatch in skills.
I agree that the long-term component of unemployment
has structural origins, including a substantial degree of skills
mismatch. I hear a fair number of stories from around our
District of hard-to-fill job vacancies in certain specialties.
Looking at the world around us, it is reasonable to assume
that employers need higher skill levels from their workers
today, on average, than they did a generation ago. Indeed,
the unemployment rate of college-educated workers lately
has been only around half that of workers without a high
school diploma. Economic research indicates that the
relationship between unemployment and the job vacancy
rate changed during the recession; we’re seeing more unemployment for a given rate of job vacancies — which suggests
matching problems.
But critics of the skills-mismatch story argue that the
empirical evidence does not fully support it. They point to
studies that have looked at vacancy and unemployment
rates according to industry and occupation, which have
estimated that the portion of unemployment attributable
to matching problems is between 0.6 percentage point and
1.7 percentage points.
In my view, such statistics do not disprove the mismatch
theory. The occupation-level and industry-level data on
which these studies rely can hide significant differences
within broad categories. There is a wide range of positions
within any given occupational or industry category, some of
them in high demand and some not; for example, “professional, scientific, and technical services” includes such
disparate businesses as law firms, advertising agencies, and
interior design firms. Not only do these data combine very
different categories of businesses, but they also combine
highly different jobs within a given business — both experienced patent attorneys, who may be in high demand, and
typists, whose demand has declined as lawyers have adapted
to the computer age. Aggregating such jobs together may
obscure the existence of scarcity, and skills mismatch, in
some of them. Moreover, even the estimates that are cited
by critics suggest a major role for mismatch: A percentage
point, or 1.5 percentage points, is significant even within the context of today’s unemployment rate of roughly 9 percent.
In short, I think it is quite
plausible that skills mismatch
is an important factor holding
back improvements in the
labor market. The question is
how important — and that’s
an issue that economists are
working to answer as precisely
as possible.
What are the policy implications of the mismatch issue?
One is that public programs to support job training can be a
good investment. A more-skilled worker typically has a
higher marginal product — he or she can contribute more to
the economy — which means training programs are potentially beneficial to both the worker and the economy. But
such programs can be costly and time-consuming, so it is
unrealistic to expect such policies to transform the current
job landscape overnight. Moreover, there are questions
about the ability of government-directed programs to
identify and target the appropriate skills. Community
colleges and other providers do so in a decentralized way by
responding to demand from individuals, as well as demand
from firms for in-house training. Such efforts equip unemployed workers with the tools needed to land jobs that
actually exist, and arguably are more effective than larger-scale, more centralized programs.
Another, more immediate implication is the extent to
which monetary policy can make a difference in getting more
Americans into jobs. To the extent that skills mismatch is
identified as a significant portion of the long-term unemployment problem, monetary policy will have difficulty
making meaningful inroads into the jobs problem without
increasing inflation. Monetary policy, after all, doesn’t train
people.
Labor-market mismatch is an example of the kind of
problems that have made policymaking so challenging
since the Great Recession — and that will likely be the
subject of vigorous yet collegial discussion at Federal Open
Market Committee meetings in the months ahead.

JEFFREY M. LACKER
PRESIDENT
FEDERAL RESERVE BANK OF RICHMOND


UPFRONT

Regional News at a Glance

Tapping the Crowd

The Web Widens Funding Options


As donors and lenders tightened purse strings during the financial crisis and afterward,
artists and entrepreneurs have tapped a less traditional source of funds: the Internet.
“Crowd funding” describes the efforts of people or groups to solicit donations and investments for projects via the Web.
These donations are typically small, but collectively
the potential can be significant. In Baltimore, Scott
Burkholder and artist Michael Owen started the
Baltimore Love Project, a nonprofit community art
project with the goal of painting 20 murals that depict
the word “love” — in hands — all across the city.
For their seventh mural, Burkholder, the project’s executive director, set up a donation page on Kickstarter, a
crowd-funding website that caters to the arts.
Kickstarter allows online visitors to donate toward a
stated funding goal. The site processes payments
through Amazon’s payment system, charging between
8 percent and 10 percent for marketing and credit card
processing. Transactions only go through if the project’s
goal is reached within a set time of up to 60 days.
The Baltimore Love Project exceeded its goal of
$5,000, raising more than $6,500 in May.
“It’s such a great means to connect with your community; it’s a great opportunity to make your project
accessible and help people realize that $5 can do something,” Burkholder says. “I think as we progress, we’re almost going back to the old community bank models,
where money stays much more local.”
Another Baltimore crowd-funding effort, GiveCorps, aims to build community interaction.
The website pairs Baltimore area nonprofits with local
businesses, which offer coupons and discounts to
contributing donors.
“It’s a way to both engage and encourage the generosity of a whole new generation of givers, who
collectively could have a big impact, even though individually their donations may be small,” says GiveCorps
Chief Executive Officer Jamie McDonald. GiveCorps
takes 10 percent for marketing and processing, and disburses donated funds regularly to nonprofits. There is
no minimum funding target or time constraint.
Business startups are also looking at the crowd as a
viable source of funds. In February 2010, Michael
Migliozzi II and Brian Flatow received more than
$200 million in pledges from 5 million would-be
investors for a plan to purchase Pabst Brewing Co.
The Securities and Exchange Commission (SEC)
ultimately shut down the effort because
the two failed to register the public
offering, but it demonstrates the potential of crowd funding.
Recognizing this demand, websites
like ProFounder help entrepreneurs
legally tap into potential investors.
ProFounder enables entrepreneurs to
retain ownership. In exchange for a
funding commitment, investors receive a
fixed revenue share — for example,
2 percent over four years.
This approach to business investment eventually may change regulations.
SEC Chairman Mary Schapiro has
spoken in favor of reevaluating rules,
and President Obama’s recent jobs
bill proposes loosening regulations on raising capital through crowd funding.

Kickstarter funded the Baltimore Love Project’s mural on Broadway East and North Avenue in East Baltimore. The online campaign had 101 backers, more than half of whom pledged $35 or more.

— TIM SABLIK

Crowds are Coming

Democrats Pick Charlotte for 2012 Convention
Charlotte will host the Democratic presidential
convention in September 2012, the first major-party
convention for a Fifth District city since Baltimore also
hosted the Democrats a century earlier. The Democratic
National Committee’s selection rewards the efforts of
Charlotte officials who have worked many years to
raise the profile of the Queen City. The benefits of such
“mega-events” may be hard to quantify, however.
Based on the experiences of cities that hosted the last
three Democratic and Republican conventions, the
Charlotte Regional Visitors Authority (CRVA) estimates
that as much as $200 million will be spent in the weeks
before, during, and after the five-day convention in
Charlotte. This figure includes direct spending by
delegates, reporters, and others participating in convention-related activities, as well as the “multiplier effect” of
any money that circulates through the economy as a result
of the initial round of spending.
But money spent in Charlotte may not stay in Charlotte,
according to Victor Matheson, an economist at the College
of the Holy Cross who has studied the economic impact of
mega-events.
“During the convention, hotels may double or triple
room prices, but they won’t double or triple the wages they
pay their desk clerks or room cleaners,” he notes. “The
extra money doesn’t go into the pockets of local workers.”
Second, says Matheson, any marginal increases in revenue that result from the Democratic convention may be
offset by a displacement or “crowding-out” effect — the
throngs of visitors may prevent locals from spending on
activities they normally do.
The Charlotte Arts and Science Council contacted
cultural venues in other convention cities to see how the
events affected them. Robert Bush, the council’s senior vice
president, noted in a recent Mecklenburg Times article that
venues should expect local visitation to drop as residents
avoid the crowds and traffic congestion.


William Miller, who leads the committee that organized
Charlotte’s bid for the Democratic convention, is more
optimistic. September is a slow time of the year for the
city’s cultural institutions, so “it’s a good time to have
people here. We won’t be eating into any tourism revenues.”
He also hopes that people who normally visit the Charlotte
region for Labor Day weekend will stick around for the
convention.
Finally, there are the direct costs of mega-events to consider, such as improvements to local transportation
networks. This should be less of an issue for Charlotte,
according to Tim Newman, the CRVA’s chief executive
officer. “Due to our history hosting large events such as the
Central Intercollegiate Athletic Association college basketball tournament — which drew 175,000 attendees in 2010
— [and] major NASCAR events, we have the infrastructure
to support the Democratic convention.”
There is one big difference between hosting a political
convention and hosting a NASCAR race: protection for
President Obama and Democratic politicians. “This event
will require greater security than anything we have ever
done,” says Miller. It will require additional local and state
police on top of the Secret Service. Convention cities
receive a federal grant that is supposed to cover security
costs, including purchases of specialized equipment and
training.
Apart from its uncertain economic impact, a mega-event may have less tangible benefits to consider. It may
raise the profile of the host city and, if nothing goes terribly
wrong, boost its image. This can signal to businesses that it
would be a good place in which to relocate or hold other
events.
Miller says the 2008 Republican convention in the
Minneapolis-St. Paul region generated free advertising that
would have cost millions of dollars. “Charlotte hasn't done
any international marketing, so any publicity would be a benefit.”

— CHARLES GERENA

Carolina Gold

Exploration at a Historic Mine in South Carolina Is Put on Hold
The Haile Gold Mine is nestled deep in the Carolina
slate belt, a region known for hosting one of the
nation’s first major gold rushes of the early 19th century.
The area may once again be a significant source of gold
extraction.
Romarco Minerals Inc., a Canadian exploration- and development-stage gold mining company, has broken
ground on a new facility. But construction at the Lancaster County, S.C., mine will have to wait an additional year.
The U.S. Army Corps of Engineers ruled in July that an
environmental impact statement (EIS) is necessary before
the mine can receive a wetlands permit. The mine’s construction will involve digging or filling 162 acres of
wetlands, and the use of the chemical cyanide. Both need
further evaluation, according to the Corps.
[CHART: The Price of Gold in Real Terms. Monthly average price of gold, 1980-2011, in 2011 dollars. NOTE: Gold’s price reached higher levels briefly in the early 1980s. SOURCES: KITCO.COM/BLS]

Romarco had hoped to complete construction at the mine and begin producing gold bars by early 2013,
with plans to employ roughly 800 workers, including
500 during construction, and about 300 once production
begins. The company estimates that the mine may hold 3.1 million ounces of gold, or about 40 percent of the 8.1 million ounces produced annually in the
United States.
“From a business perspective, there are some positives
to the [EIS] report process,” says Romarco Chief
Executive Officer Diane Garrett. “It gives the public an
opportunity to comment on what we’re doing, though it
delays spending and hiring and the company’s ability to
move forward.”
The county could use the jobs. Its unemployment
rate was 14.1 percent in May 2011, well above the state
average of 10 percent. Garrett says that there are no immediate plans to lay off any of the company’s 140
current employees in the county.
Gold mining as an industry is not new to the Carolinas
or to the Haile Mine, which dates back to 1827. Since then,
mining has stopped and started as prices fluctuated. The
United States currently ranks behind China and Australia
in gold production, with most U.S. gold mined on federal
lands in the West, primarily Nevada.
Indeed, gold mining in the Fifth District had been
abandoned since the early 1990s. Recent technological
advances in Australia have allowed miners to grind the soil
to a finer consistency and extract even tiny particles of
gold. This, along with advances in drilling and high gold
prices, has given new hope to mining companies. The
renewed interest has even started a wave of amateur
prospecting in South Carolina, according to Scott
Howard, the state’s chief geologist.
The sharp rise of gold’s price, from below $750 per ounce in late 2008 to roughly $1,750 at the end of October
2011, is commonly believed to be the result of higher inflation expectations, though economists differ on the causes
of price spikes of precious metals.
As Romarco prepares to complete its assessment
for the Corps, the gold mining industry is keeping an
eye on the results. As Garrett puts it, “People are waiting
and watching to see if you can produce gold in
South Carolina.”
— LOUIS SEARS

Boy Scout Bonanza

BSA Develops $400 Million Complex in West Virginia
More than 1,000 Boy Scouts converged on the New
River Gorge National River in Fayette County,
W.Va., in July to build miles of multipurpose trails and
remove acres of invasive vegetation. The project was one
of the largest youth service projects in the history of the
Boy Scouts of America (BSA), but it was only a small
sample of things to come.
In 2009, the BSA purchased 10,600 acres — mostly
in Fayette County — and announced plans to develop
a $400 million scouting complex adjacent to the national
park. The complex has been named the Summit Bechtel
Family National Scout Reserve in recognition of a
$50 million grant from the S.D. Bechtel Jr. Foundation.
(Engineering and construction magnate Stephen Bechtel
Jr. was an Eagle Scout.)
The Summit, as it has become known, will serve as the
permanent home of scouting’s National Jamboree beginning in July 2013. Every four years, about 40,000 scouts and 8,000 volunteers will spend 10 days at the complex.
The first phase of development will prepare the Summit
for the 2013 jamboree. Over the following six years, the
BSA plans to develop a year-round “high-adventure base”
similar to the Philmont Scout Ranch in New Mexico. By
2019, when the complex will host the International
Jamboree, BSA officials expect the Summit to serve at
least 30,000 Boy Scouts each year in addition to those
who attend jamborees. The complex will employ about 80
people year round and 1,000 additional workers each
summer.
To select a site for the Summit, the BSA considered
80 proposals from sites in 28 states. The organization
initially planned to hold the National Jamboree in
Rockbridge County, Va., and put the high-adventure base
in Fayette County, but the Rockbridge site was “simply
too restrictive from a land-utilization perspective,” says
Jack Furst, president of Arrow WV, a nonprofit subsidiary of the BSA that purchased the land. So the BSA decided
to combine both projects in Fayette County.
That was welcome news for a county with 21.6 percent of
its 46,000 residents living below the poverty line of $22,050
per year for a family of four. By August, the Summit had
generated 285 jobs — mostly construction — with 50 percent of the workers coming from Fayette and other nearby
counties. The project also had paid nearly $16 million to
West Virginia-based contractors and $9 million to local
consultants and service providers.
The Summit’s operation also will create 207 full-time
equivalent jobs, mostly seasonal camp workers, according
to an economic impact study commissioned by the BSA and
conducted by Syneva Economics, a consulting firm based in
Asheville, N.C. That projection increases to 486 full-time
equivalent jobs in later years as the complex becomes
substantially complete. Those numbers do not include the
impact of the National Jamboree, which is expected to
generate more than $41 million in local expenditures every
four years.

Boy Scouts build a trail at the New River Gorge National River.

The steady stream of scouts also will help the surrounding area in nonfinancial ways. The BSA plans to develop
service-learning programs with the Park Service and
neighboring communities. Every scout who attends a jamboree, an adventure camp, or a leadership school at the
Summit will be expected to provide at least six hours of
community service.
— KARL RHODES

Upwind, Downwind

Rule Restricts Interstate Pollution
The Environmental Protection Agency (EPA) has
issued a rule aimed at slashing emissions that drift
from state to state and dirty the air far from the pollution source. The rule will force the electric power sector
to ratchet down emissions levels from plants in 27 states
in the eastern half of the United States, where most of its
population lives.
Tall smokestacks limit local air pollution by thrusting
emissions high into the atmosphere. But wind and weather
spread the pollutants — precursors to soot and smog — far
and wide.
The Cross-State Air Pollution Rule (CSAPR), six years
in the making, will cut sulfur dioxide (SO2) levels from
power plants in affected states by 73 percent from 2005
levels, and nitrogen oxide (NOx) levels by 54 percent. The
rule will take effect Jan. 1, 2012. It replaces the 2005 Clean
Air Interstate Rule (CAIR), found to violate the Clean Air
Act following a 2008 lawsuit North Carolina brought
against the EPA.
North Carolina plants may be in better shape to meet
new limits because of standards established by the North
Carolina Clean Smokestacks Act passed in 2002. The law
required scrubbers. The state also sued the Tennessee Valley
Authority (TVA) in 2006 over downwind emissions from
TVA’s coal-fired plants. The action culminated in a settlement that will cut emissions across the TVA system.
SO2 and NOx react in the atmosphere and contribute to
fine-particle pollution. Particles are one-tenth the diameter
of a human hair and may impair lung function. Particle
pollution can also increase the risk of heart disease and lung
cancer, and the incidence of asthma attacks. SO2 is also a
chief culprit in acid rain, which has polluted air, streams,
and forests and has damaged buildings.
Such regulations both impose costs on and accrue benefits for society, and it’s sometimes hard to evaluate
trade-offs. Power companies bear the brunt of compliance
costs, which may entail plant closures and job losses. If electricity rates rise, that affects individual ratepayers and the
broader economy. The biggest benefit comes from savings
associated with lower mortality among children and the
elderly; the pollution is associated with an estimated 13,000
to 34,000 premature deaths.
Compliance costs vary according to the number and age
of coal-fired plants in electricity companies’ generating
fleets. Strategies such as added pollution controls, shutdowns of aging coal plants, or fuel-switching will cost
money — but some controls are in place already to meet
existing state regulations or CSAPR’s predecessor rule.
Charlotte-based Duke Energy has spent $5 billion on
continued on page 30


FEDERAL RESERVE

The Dodd-Frank Act and Insolvency 2.0

BY DAVID A. PRICE

A new tool lets regulators order troubled financial companies into receivership to avoid systemic risk

The financial crisis of 2008,
particularly the bailout of the
investment firm Bear Stearns
and the bankruptcy of Lehman
Brothers Holdings, led many policymakers to reach two conclusions: first,
that the bankruptcy process lacks the
expertise and agility needed to handle
the failure of systemically important
financial institutions, or SIFIs, such
as Bear Stearns and Lehman; and
second, that bailouts of financial institutions are unacceptable to voters and
are themselves a source of excessive
risk-taking.
In the wide-ranging Dodd-Frank
Wall Street Reform and Consumer
Protection Act of 2010, better known
as the Dodd-Frank Act, Congress
sought to address these issues by creating a new regime for handling the
failure or expected failure of a SIFI.
This regime, known as Orderly
Liquidation Authority, has significant
implications for the largest, most
interconnected financial companies —
both in death and in life.


How It Works
Orderly Liquidation Authority covers
a subset of nonbank financial
companies: bank holding companies,
brokers and dealers registered
with the Securities and Exchange
Commission, and nonbanks designated for supervision by the Fed on the
basis that they are systemically important. (On the latter category, see
“Sifting for SIFIs,” Region Focus,
Second Quarter 2011.) In addition,
Orderly Liquidation Authority covers
financial subsidiaries of bank holding
companies and designated nonbanks,
other than insured depository institutions and insurance companies.
Companies in these categories are
eligible to be placed into orderly liquidation if certain conditions are met.
The Treasury Department must determine that the company is “in default or in danger of default,” that its resolution under otherwise-applicable law
(normally bankruptcy law) “would
have serious adverse effects on the
financial stability of the United
States,” and that “no viable private
sector alternative” is available to prevent the company’s default, among
other requirements. (See box.) The
process for reaching this determination is a complex one: It begins with a
recommendation by both the Fed’s
Board of Governors and the board of
the Federal Deposit Insurance Corp.
(FDIC) — unless the company is a
broker-dealer or an insurance company, in which case the FDIC’s role in
the process is taken instead by the
Securities and Exchange Commission
or the newly created Federal Insurance
Office of the Treasury Department,
respectively. Once the designated
agencies have made a recommendation (supported by a detailed analysis),
the Secretary of the Treasury is
required to consult with the President
and decide whether the company
meets the statute’s requirements for
orderly liquidation.
When the Treasury makes such a
determination, the company’s board
has a limited opportunity to challenge
it in federal district court. The only
findings of the Treasury Secretary that
it can challenge are that it is indeed “in
default or in danger of default” and
that the company is a financial
company as defined by the Act.
The court can reject the Treasury
Secretary’s decisions on these issues
only if it finds them to have been
“arbitrary and capricious.” Moreover,
if the district court does not act within 24 hours, then the law deems the
Treasury Secretary’s determination to
have been upheld. (The U.S. District
Court for the District of Columbia,
the court that will hear companies’
objections, has issued a rule requiring
Treasury to give 48 hours’ advance notice to allow time to prepare; Treasury and the FDIC have
objected to this requirement.)
At that point, the company involved is placed into a
receivership, with the FDIC as receiver. In the FDIC’s
words, it is required to use its best efforts “to liquidate the
covered financial company in a manner that maximizes the
value of the company’s assets, minimizes losses, mitigates
risk, and minimizes moral hazard.” The FDIC’s powers over
the company as receiver are analogous to its powers over a
failing FDIC-insured depository institution. Among other
things, it may sell the company or any of its assets; it may
create a company to receive the failing company’s assets; and
it may repudiate contracts and leases to which the financial
company is a party.

An Expansion of Discretion
The Orderly Liquidation Authority provisions of the Act
attempt to place numerous limits on regulators’ discretion,
both at the stage of designating a company for orderly liquidation and during the receivership process. The two
agencies that recommend designation must agree with one
another and must follow criteria set out in the law; the
Secretary of the Treasury must also agree that designation is
warranted on the basis of the law’s criteria for him or her to
apply. In carrying out its receivership, the FDIC must
ensure that its actions conform to the criteria of maximizing
the value of the company’s assets, minimizing losses,
mitigating risk, and minimizing moral hazard.
Still, in comparison with the bankruptcy process, orderly
liquidation gives regulators more discretion in the triggering
of the process and in its administration. At least in the short
term, and perhaps in the longer term, this difference may
create a higher level of uncertainty in orderly liquidation
than in bankruptcy.
For example, the incentives facing regulators with political accountability are likely to differ from those facing
creditors, who have a distinct kind of accountability — their
own money is at stake. Creditors have an incentive to
provide more funding if they believe the company is a viable
going concern and if they believe doing so will be profitable.
But the motivations affecting regulators in deciding whether
to pursue orderly liquidation are not so clear. In the post-financial-crisis era, will regulators consider it anathema to
designate financial companies for orderly liquidation,
knowing that an arranged acquisition of a company will
almost certainly lead to a more concentrated industry in the
hands of the companies left standing? Or, alternatively, will
skittish regulators perceive dangers of default and systemic
risk everywhere they look?
When the bankruptcy process is underway, it is overseen
by trustees and bankruptcy judges with the involvement of
the company and its creditors. In orderly liquidation, in contrast, the FDIC is not required to allow any party to be
represented in the process. When the FDIC sells a financial
company to another entity, for example, it is not required to
consult or even give notice to the company’s shareholders or creditors. The same has long been true in the receivership of an insured depository institution.

Ordering Orderly Liquidation

If the Secretary of the Treasury receives recommendations from the Fed and another
designated regulatory agency that a nonbank financial company should be placed in
receivership, the Dodd-Frank Act requires him or her to determine whether certain
criteria are met. If so, he or she must pursue receivership for the company (with the
consent of the company’s directors, if they agree, or by court order, if not).
The statute sets out this list of determinations that the Secretary of the Treasury
must make for the process to go forward:

1. the financial company is in default or in danger of default;

2. the failure of the financial company and its resolution under otherwise applicable Federal or State law would have serious adverse effects on financial stability in the United States;

3. no viable private sector alternative is available to prevent the default of the financial company;

4. any effect on the claims or interests of creditors, counterparties, and shareholders of the financial company and other market participants as a result of actions to be taken under this title is appropriate, given the impact that any action taken under this title would have on financial stability in the United States;

5. any [orderly liquidation] would avoid or mitigate such adverse effects, taking into consideration the effectiveness of the action in mitigating potential adverse effects on the financial system, the cost to the general fund of the Treasury, and the potential to increase excessive risk taking on the part of creditors, counterparties, and shareholders in the financial company;

6. a Federal regulatory agency has ordered the financial company to convert all of its convertible debt instruments that are subject to the regulatory order; and

7. the company satisfies the definition of a financial company under section 201.

SOURCE: Dodd-Frank Wall Street Reform and Consumer Protection Act, sec. 203(b)

“Nobody has any standing except for the administrator
who makes all the decisions,” says Robert Bliss, a business
professor at Wake Forest University. “They’re going to be
making massive decisions by themselves. The Act will have
them doing it in a way that is not transparent and not subject
to any substantial right of appeal. So it’s an enormous
amount of power and conflicting objectives.”

Paying for Liquidation
Resolving a failing company typically requires an infusion of
capital. In the case of failing banks, this may mean the
government taking liabilities or bad assets off the institution’s balance sheet, or promising sweeteners to an acquirer
(such as loss-sharing agreements). When the FDIC resolves
a nonbank in the orderly liquidation process, where will this
money come from, if needed?
The statute emphatically states that it will not come
from taxpayers. (“Taxpayers shall bear no losses from the
exercise of any authority under this title,” it directs at one
point.) This mandate upholds one of the primary purposes
of Orderly Liquidation Authority: to put companies and markets on notice that there will be no government bailouts
of “too big to fail” institutions.
Instead, the first source of the funds needed to liquidate
a company will be the disposition of the company’s assets.
If those funds are not enough, the FDIC is to recover the
rest through assessments on other companies — initially on
creditors that received preferential treatment during
orderly liquidation, and then on other companies in the
financial sector (specifically, large bank holding companies
and Fed-supervised nonbanks).
During Congress’ consideration of the Act, FDIC
Chairman Sheila Bair argued for the creation of a reserve
funded by the industry in advance rather than after-the-fact
assessments on the industry. She contended that an advance
reserve would avoid the pro-cyclical effects of assessments
that would tend to hit other financial companies — and
perhaps weaken them — during a downturn. Congress opted
for the assessment approach, however.
Government money may flow into the process on an
interim basis, however. The Act allows the FDIC to borrow
from the Treasury Department in connection with a liquidation — for example, to make loans to the company (or a
bridge company formed from the company), to guarantee its
obligations, or to pay other costs of liquidation. Some
observers have questioned whether the use of taxpayer
funds to support the financial company, even if it is formally
required to be repaid, may signal to markets that the
government is likely to back the company further if necessary to get its money back and to avoid potential systemic
consequences.
“In a bankruptcy process, there is no presumption that
the court is going to put money into the insolvent company;
the court doesn’t have any money,” says Bliss. “In the administrative process, the FDIC does have access to funds.
There is, therefore, a presumption that the government is
going to back anything that they put into a bridge bank. You
have a potential for a much bigger commitment in the
administrative process.”

Treatment of Derivatives and Repos
One area where Orderly Liquidation Authority does draw
upon existing bankruptcy law is in its treatment of so-called
“qualified financial contracts” — primarily derivatives and
repos. Federal bankruptcy law gives counterparties to these
contracts special treatment; most notably, they are free to
close out the contracts with the bankrupt company,

overriding the automatic stay in bankruptcy and normal
bankruptcy preference rules. This special treatment has
been viewed as a means of averting the systemic risk that
could be created by the default of derivatives counterparties.
A counterparty in the context of an orderly liquidation
enjoys the same special treatment, but with an exception: It
cannot exercise those rights if the FDIC transfers the qualified financial contract to a private acquirer or a newly
created bridge company within one day from the start of the
receivership. This one-day automatic stay gives the FDIC
the opportunity to avoid close-outs of qualified financial
contracts if they would be problematic to the institution.
As a policy matter, the desirability of special treatment
for counterparties to qualified financial contracts — in both
bankruptcy and orderly liquidation — has been criticized by
some scholars. At a workshop on financial firm bankruptcy
in July, co-sponsored by the Richmond Fed and the
Philadelphia Fed, several business and law professors argued
against the special treatment. David Skeel of the University
of Pennsylvania Law School, Franklin Edwards of the
Columbia University Graduate School of Business, and
Douglas Diamond of the University of Chicago Booth
School of Business contended that it reduces counterparties’
incentives to investigate risks (since they have, in effect, a
priority claim on the company’s assets), that it leads to
excessive use of derivatives (by making them cheaper
relative to debt), and that it contributes to runs on the
troubled financial company.

Waiting For A Stress Test
The Treasury Secretary has not yet placed any nonbank
financial companies in orderly liquidation, so there is still
much to be learned about how it will operate in practice and
how effective it will be. Indeed, some of the regulations relevant to orderly liquidation are still being written. The ideal,
though perhaps unlikely, outcome is that it will never need
to be used. Some of its more severe provisions — from the
perspective of directors, management, creditors, and equity
investors — may have the beneficial effect of encouraging
systemically-important companies to seek additional capital
(even at highly dilutive terms) when they are facing trouble,
rather than risk entering the orderly liquidation process.
Almost certainly, sooner or later, a crisis in the finances of a
major nonbank will shed light on how the existence of
Orderly Liquidation Authority shapes the behavior of private parties and regulators alike. RF

READINGS
Bliss, Robert R., and George G. Kaufman. “Resolving Large
Complex Financial Institutions: The Case for Reorganization.”
Paper presented at the Federal Reserve Bank of Cleveland’s
“Conference on Resolving Insolvent Large and Complex
Financial Institutions,” April 14-15, 2011.


Federal Deposit Insurance Corp. “The Orderly Liquidation of
Lehman Brothers Holdings under the Dodd-Frank Act.”
FDIC Quarterly, 2011, vol. 5, no. 2, pp. 31-49.
U.S. Government Accountability Office. “Bankruptcy: Complex
Financial Institutions and International Coordination Pose
Challenges.” GAO-11-707, July 2011.

POLICY UPDATE

Incentives for Greener Transportation

BY RENEE HALTOM

North Carolina owners of cars with the newest
environmentally friendly technologies can now
drive on high-occupancy vehicle (HOV) lanes
regardless of the number of passengers they carry. The new
policies, signed into law in May and June of this year,
are intended to encourage car buyers to choose plug-in
electric vehicles and vehicles powered by certain alternative fuels. A small but growing number of states have similar
HOV incentives for these types of vehicles.
The policies are somewhat ahead of the curve since those
up-and-coming technologies are not yet widely available —
unlike hybrid vehicles, which comprise about 3 percent of
total vehicle sales (more than half of which are the popular
Toyota Prius). Hybrids combine battery power with a gas-fueled internal combustion engine. HOV access for hybrids
has been around in some areas since 2000, when Virginia
became the first state to offer the incentive. Such policies
are typically administered by a special marking — a sticker
or a special license plate — available for purchase that
authorities can watch for on vehicles traveling in HOV
lanes.
The evidence is mixed on whether HOV access has been
successful at spurring hybrid purchases. Of the five states
that introduced HOV hybrid exemptions between 2000 and
2006, only in Virginia was there a positive and significant
effect on hybrid purchases, according to economists Kelly
Sims Gallagher at Tufts University and Erich Muehlegger at
Harvard University.
Probably explaining Virginia’s success, Muehlegger says,
is that most of the area’s HOV lanes exist in Washington,
D.C. That area has what is commonly considered to be the
worst traffic in the nation, making HOV access particularly
valuable. (Los Angeles and San Francisco also tend to be
included in lists of worst traffic areas, but California’s
program allocated 60 percent of its 85,000 HOV access
stickers to people who already owned hybrids. That may
explain why research has been unable to link HOV access to
new purchases in that state.)
The lesson for policymakers might be that HOV access is
most likely to change consumer behavior near extremely
highly trafficked corridors. Usually, however, “HOV lanes
are not the trigger that get people to buy a hybrid relative to
a regular vehicle,” Muehlegger says.
But consumers do seem to find HOV access valuable —
and policymakers may be able to exploit that toward a
positive end. Economists Sharon Shewmake at Vanderbilt
University and Lovell Jarvis at the University of California,
Davis, looked at California’s program, which sold HOV
access stickers between August 2005 and February 2007.
They looked at the used car market for hybrids and found that ones adorned with HOV-access stickers, which conveyed with the cars, sold for a premium to the average tune
of $625 for each year the buyer knew the HOV policy would
be in effect. At that valuation, what the state sold for
$8 a pop could have generated $270 million in revenue,
they calculate.
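A rough back-of-the-envelope check of that figure (the roughly five-year horizon assumed here is an illustration, not a number reported in the article), using the 85,000 stickers and the $625-per-remaining-year premium:

\[
85{,}000 \times \$625 \times 5 \approx \$266 \text{ million},
\]

which lands in the neighborhood of the $270 million estimate.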
By auctioning HOV access to the highest value users with
any type of car, and using that revenue to encourage hybrid
purchases via sales tax waivers, policymakers could both
allocate HOV space to the people who value it most and
encourage a greater number of hybrid purchases than HOV
access has apparently been able to do, Shewmake and Jarvis
argue. Based on estimates of the effectiveness of tax waivers
to spur hybrid purchases, they further estimate that a sales
tax waiver between $1,000 and $2,000 would have encouraged the same number of hybrid purchases as Virginia’s
policy did, with revenue left to spare.
Harvard’s Muehlegger argues that the real hurdle to
green car adoption is getting people comfortable with the
technologies. For example, people were initially unsure
about the long-run performance of hybrids — would the
batteries eventually wear out? — but this problem receded
as people observed more and more of their peers having positive experiences with hybrids.
Both of the technologies being promoted by the new
North Carolina law are troubled by additional adoption
hurdles. For plug-in electrics, consumers are worried about
“range anxiety,” or how far the car can go before requiring a
charge-up. For alternative fuel vehicles, will fueling stations
be widely available?
Policies that encourage these technologies could help to
lower those adoption hurdles. But they also arbitrarily favor
certain types of green technology over others. Most economists would ask: Why pass a law subsidizing one type of
technology — say, hybrid vehicles or plug-in electrics —
instead of a law that encourages any technology that yields
environmental gains above a certain desired threshold?
And if the policy incentives don’t prove enough to boost
green technology, there’s always vanity. One reason people
might buy a Toyota Prius, with its unique, recognizable
design, is the signal it sends to the driver’s peers about that
person’s environmental enlightenment. Economists (and
siblings) Steven and Alison Sexton, graduate students at the
University of California, Berkeley, and the University of
Minnesota, respectively, estimated that for people living in
Colorado and Washington communities that are more
“green” (as identified by voting preferences), this “green
halo” was worth up to several thousand dollars per car,
helping private markets produce what some consider to be a
public good.
RF


JARGON ALERT

Utility

BY LOUIS SEARS
Diamonds are attractive gems but water is essential
to life. How can it be, then, that under most circumstances people are willing to pay far more for
diamonds than water? Economists have struggled with this
seeming paradox since Adam Smith famously proposed it
in 1776, and in the process have changed how we understand and assess utility.
Utility, broadly, represents how useful or satisfying a
good, service, or action is to an individual. Since economists
believe that people want to live as happy and fulfilling
lives as possible, understanding the utility that different
outcomes create for individuals can help in understanding
and predicting how they will behave. This tells businesses
which goods they should produce, lets politicians know
which policies they should enact, and
allows people to understand the motives
of those around them.
Much of the early theory of utility has
its roots in the 18th and 19th century utilitarian philosophy of Jeremy Bentham and
John Stuart Mill. Both authors believed
that society’s aim should be to promote
the greatest happiness for all involved
— “the greatest good to the greatest
number,” in Bentham’s phrase. Bentham
believed that this happiness was dependent on, and could be measured through,
the intensity of pleasure or pain that a
good or action produced for an individual,
as well as several other factors. In fact,
Bentham believed that by using these
measurements as well as 32 traits of each
person, society could measure and compare
the happiness of all individuals. While still believing that
maximizing mankind’s utility was the most moral approach
for governance, Mill argued that it was best to allow individuals to make their own choices, as long as this didn’t
interfere with the happiness of others.
While economists still look for ways of improving the
utility of society, their conception of the nature of utility and
how it should be measured has changed significantly since
the time of Mill and Bentham. In most contexts, economists
today generally reject the concept of trying to measure
numerically the utility that someone derives from an outcome (that is, its “cardinal utility”) and to compare different
people’s utility from different outcomes. Instead, they look
at the order in which an individual desires various outcomes,
that is, the person’s “ordinal utility.” To understand this
ordering, they observe the choices individuals make
between alternatives, and assign a higher utility value to the

D

10

Region Focus | Third Quarter | 2011

outcome which is eventually chosen. By keeping track of
these revealed preferences, economists are able to compare
the utility of all kinds of goods and actions to the individual.
Because it is impossible to compare the utility levels of
different people, modern utility theory does not allow the
economist to combine individual utilities into one number
for all of society. In other words, if building a bridge makes
some residents happy by improving their commute to and
from work, but angers an equal number of others who do not
own cars but must pay for the project, most economists
would say that it is impossible to judge whether the happiness of the first group outweighs the dissatisfaction of the
second group. Rather, in the tradition of Italian economist
Vilfredo Pareto, economists can only state whether or not
the decision improves the lot of some without hurting anyone else, or causes a Pareto
improvement.
Economists do recognize that the
utility a good brings an individual can vary
according to his or her current situation.
The idea that a good can bring different
amounts of happiness depending on the
current state of the individual leads economists to look at the effect of an
additional unit of a good on the individual
— that is, the good’s marginal utility. The
willingness of an individual to pay for a
good does not depend directly on how
costly it was to produce the item, or the
usefulness of the item on the whole, but
instead rests on the satisfaction that each
additional unit of the good provides. Since
individuals generally satisfy their most
important needs with the first few units of the good they
acquire, additional units are likely to have progressively less
value to the acquirer. Economists call this the principle of
diminishing marginal utility: The first unit of a desired good
holds more utility than the second one, and so on.
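A standard textbook utility function makes the idea concrete (this particular example is an illustration, not one used in the article):

\[
u(x) = \sqrt{x}, \qquad u'(x) = \frac{1}{2\sqrt{x}} > 0, \qquad u''(x) = -\frac{1}{4}x^{-3/2} < 0,
\]

so every additional unit of the good still raises utility, but by less than the unit that came before it.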
This brings us back to the matter of diamonds and water.
While the overall utility of water to an individual is much
higher than that of diamonds, the marginal utilities of the
two are a different story. At any given moment, most people
do not have a strong desire for more water (unless the person
happens to be crossing a desert, or, say, has just finished a
workout); for them, the marginal utility of additional water is
modest. On the other hand, most people are far from feeling
saturated with diamonds, and would derive considerable
utility from owning another one. But not everybody: As
economics teaches, utility — like beauty — is in the eye of the beholder. RF


RESEARCH SPOTLIGHT
Ties that Bind
BY TIM SABLIK

records of the drivers who were in-network in one period
dam Smith famously observed that people acting
and out of network in another. Presumably, driving ability
in their own self-interest are often “led by an
did not systematically change between the two periods.
invisible hand” to advance the interests of society
Although the sample of drivers who switched networks is
as a whole. There are cases, however, where an economic
relatively small (3.2 percent of the sample), the authors note
agent is protected against losses from his risk-taking,
that this subsample is representative of the whole in terms
allowing him to take risks that are personally optimal, but
of driving experience, age, and countries of birth.
which negatively affect a third party. This is referred to by
economists as moral hazard.

C. Kirabo Jackson and Henry S. Schneider. “Do Social Connections Reduce Moral Hazard? Evidence from the New York City Taxi Industry.” American Economic Journal: Applied Economics, July 2011, vol. 3, no. 3, pp. 244-267.

Moral hazard is commonly an issue in the context of insurance; for example, a homeowner might opt to spend less time and money protecting his property from theft because his insurance policy would cover the losses. Thus, insurance companies typically limit moral hazard by requiring deductibles and by increasing premiums following a pattern of losses.

Since moral hazard can lead to undesirable economic outcomes and hamper market transactions, economists are interested in learning how parties avoid this inefficient behavior. C. Kirabo Jackson of Northwestern University and Henry Schneider of Cornell University explore the issue of moral hazard in the New York City taxi industry. They look at taxi drivers who lease their vehicles from owners who pay the costs of maintenance and repair. The drivers keep all fares they earn, minus lease fees and an accident deposit, capped by the city’s Taxi and Limousine Commission (TLC) at $650 and $500, respectively. The costs of vehicle maintenance or accident repair can run much higher, and since the drivers are not liable for the majority of the downside, but reap the full profits of the upside, they are susceptible to significant moral hazard. They might drive more aggressively to pick up more passengers, unconcerned by the increased wear and damage this inflicts on the car.

In their article, Jackson and Schneider ask whether leasing to a driver from the same country of birth might reduce the moral hazard problem, and thereby reduce the owner’s losses. They theorize that social connections could provide the pressure that formal contracts lack in this situation. Using data on New York taxi drivers from 2005 and 2007, they find that 44 percent of drivers lease “in-network” (that is, from owners with the same country of origin). The authors then perform a series of tests to isolate the causal effects of driving in-network on driving behavior, which they measure using summonses for TLC violations.

In order to isolate the in-network driving effect, the authors must control for the effect of individual driver ability. They do this in two ways. First, they look at the data for drivers who switched networks. In addition, Jackson and Schneider test a second model incorporating the distance between residences of drivers and owners from the same country. They reason that drivers who happen to live close to owners from the same country will be more likely to lease from them. Proximity of owners and drivers from the same country of origin is therefore correlated with in-network driving, but not correlated with driver ability, allowing the researchers to isolate the in-network effect on driving outcome.

In the sample as a whole, approximately one in three drivers receives a summons in a six-month period, or an average of 0.39 summonses per driver. Testing the data without controlling for ability, the authors find that in-network drivers have 0.09 fewer summonses per six-month period, which is a fairly small improvement. When controlling for individual ability by using the data for drivers who switched networks, however, the researchers find that in-network drivers have 0.334 fewer summonses per six-month period, a statistically significant improvement. The results from the second test method are similar, though statistically somewhat weaker.

The authors then explore whether the strength of the owner’s and driver’s social network motivates this improvement in driver performance. The researchers measure social network strength as the density of residents from a particular country who live in a neighborhood. They posited that a greater density of residents from the same country leads to a stronger social network, and in-network drivers who are part of a strong social network will have greater incentive to perform better or risk being cut off from that network of support.

In fact, this is what the authors find: The interaction between leasing in-network and the owner’s network density had the greatest influence on driving outcome. This suggests that the ability of the owner to enforce social sanctions on the driver accounts for most of the improved behavior demonstrated by in-network drivers.

Jackson and Schneider conclude that even in developed economies, social ties can reduce the effect of moral hazard in cases where formal contracts might fall short.
RF

BY JESSIE ROMERO

For many Americans, the recession hasn’t ended.
The unemployment rate in the United States has
been above 9 percent for 28 of the past 30 months.
More than 6 million people — nearly half of all
those who are unemployed — have been out of work for
longer than 27 weeks. Most of the nearly 9 million jobs lost
due to the recession haven’t come back; the job creation
rate fell to a historic low in 2008 and is still well below its
rate for the previous two decades. At the same time,
corporate profits are above their prerecession levels
and companies are holding a record-high $1.9 trillion
in cash reserves. Policymakers — and households — are
asking what the government can do to encourage those
companies to start hiring.
Policymakers also are asking what has changed since the
last very severe recession, in 1981-82. Then, the overall
unemployment rate reached a higher level, but returned to
its prerecession level about 18 months after the end of the
recession. The share of long-term unemployed workers
peaked at 26 percent, compared to the recent high of 46 percent. GDP growth averaged almost 7 percent during 1983
and 1984, but only 2.5 percent during the past two years (see
chart on page 13). The divergence suggests broader changes
in the labor market and in the economy that may be less
amenable to traditional policy tools for job creation, such as
stimulus spending, tax rebates, and direct hiring subsidies.
The effectiveness of those tools, and economic growth in
general, might be hampered by an environment of considerable economic, fiscal, and regulatory uncertainty. If so, then
restoring the American economy to previous levels of
growth and employment is likely to be a long and challenging process, but one that could be fostered by stable,
credible policies that provide the private sector with the
tools and incentives to recover.

The Keynesian Approach
Since the mid-20th century, federal policymakers have often
taken what could broadly be called a “Keynesian” view of
economic downturns: When aggregate demand falls, the
government should move to fill the gap by increasing
government spending or, in some cases, cutting taxes. How
much of the gap it can fill depends on the size of the “fiscal
multiplier,” the amount by which a dollar of government
spending increases the economy’s output.
The size of the fiscal multiplier is not a static number.
Instead, it depends on what is already happening in the
economy, and how the economy’s characteristics are represented in a model. The multiplier also can vary with the type
of stimulus, whether it is in the form of government purchases or tax cuts. In some models, the multiplier is less than
one, meaning that one dollar of spending yields less than one
dollar of additional output, often because it is assumed that
households and firms expect higher taxes and interest rates
in the future. In other models, the multiplier is larger than
one, especially when interest rates are at the zero bound, as
at present.
The American Recovery and Reinvestment Act (ARRA),
enacted in 2009, was designed to stimulate the economy.
The Congressional Budget Office (CBO) estimates the act
will cost $830 billion, with about two-thirds of the cost coming from increased federal spending and one-third from tax
cuts and credits. Before the ARRA was passed, the White
House’s Council of Economic Advisers (CEA) projected that
it would save or create at least 3 million jobs by the end of
2010. The CEA used a model with a multiplier of about
1.5 on government purchases (that is, one dollar yields $1.50
in output).
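The arithmetic behind these multipliers is straightforward. As a rough illustration (a minimal sketch; the $100 billion figure below is hypothetical and is not drawn from the CEA or CBO estimates):

# Illustrative fiscal multiplier arithmetic.
def added_output(purchases_billions, multiplier):
    # Each dollar of government purchases is assumed to raise output by "multiplier" dollars.
    return purchases_billions * multiplier

for m in (0.8, 1.0, 1.5):
    print(f"multiplier {m}: $100 billion of purchases implies about "
          f"${added_output(100, m):.0f} billion of additional output")

The same hypothetical $100 billion of purchases implies $80 billion, $100 billion, or $150 billion of additional output depending on the multiplier assumed, which is why estimates of the stimulus's effect vary so widely.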
Actually measuring the number of jobs created by
stimulus spending is impossible because it requires knowing
for certain what would have happened without the stimulus
— what economists call the “counterfactual.” Physically
counting the number of jobs created doesn’t paint a
complete picture; recipients of ARRA funding are required
to report the number of jobs they create, but there is significant over- and under-reporting, and the reports don’t
capture any indirect effects such as increased demand by the
newly employed. As a result, official estimates of the impact
of ARRA are based on models, often the same models that
were used to predict its impact. The CBO estimates that the
ARRA saved or created between 1.2 million and 3.3 million
jobs in the first quarter of 2011, based on multipliers of
between one and 2.5.

[Chart: Recession and Recovery, Then and Now. Two panels plot the unemployment rate and the share of long-term unemployment for the 1981-82 and 2007-09 recessions; a third panel plots GDP growth for both recessions. Data in all panels are for the four quarters before and the eight quarters after the trough (T) of each recession; GDP growth is the quarterly percentage change, at an annual rate. Sources: Bureau of Labor Statistics, Haver Analytics; Region Focus calculations.]

An analysis by Daniel Wilson, an economist at the San
Francisco Fed, also suggests that the stimulus had a large
effect on employment. Rather than using a model with an
implied multiplier, Wilson compared state-level differences
in ARRA funding and employment levels and found that the
ARRA saved or created more than three million jobs by
March 2011.
Other economists, including John Cogan and John Taylor
of Stanford University, are skeptical about the ARRA’s
impact. They argue that very little of the money was actually used to increase government purchases. They found that
the increase in purchases due to the ARRA was only 0.1 percent of GDP during 2009 and 2010. A large portion of the
money was in the form of grants to states, which, instead of
increasing their own purchases, used the money primarily to
reduce borrowing, according to Cogan and Taylor. The
implication is that the stimulus had little effect on consumption, and thus on employment.
As Wilson notes, however, job creation was not the only
objective of the stimulus package. A significant portion of
ARRA funds went to extended unemployment benefits and
Medicaid reimbursements to states. “If the sole goal was to
create employment it probably could have been designed in
a different way,” he says. “But maintaining health spending
for low-income households was a goal, and so was
providing unemployment benefits and increasing the safety
net in general.”
Revised GDP numbers released by the Bureau of
Economic Analysis (BEA) in July show that the recession
was much more severe than originally thought — GDP actually declined at an average annual rate of 3.5 percent,
compared to the earlier estimate of 2.8 percent — so some
argue that the stimulus package was simply not large enough
to counter the magnitude of the downturn. Other commentators have noted that growth slowed in 2011, when the
stimulus started winding down, which suggests that it did
have an effect in boosting the economy.
On the other hand, this decline could also suggest that
the stimulus did not fix what truly ails the economy, given
that the effects do not appear to be sustainable. For example, more than 300,000 educators’ jobs were saved by the
stimulus, according to the Department of Education, but
now more than half of U.S. school districts are planning
layoffs. The Savannah River Site, a nuclear cleanup project in
South Carolina, was awarded $1.6 billion in ARRA funding,
which allowed the contractor to save the jobs of 800 full-time employees and hire 2,200 temporary workers. As of
August, only 1,200 stimulus-funded workers remained, and
their projects were scheduled to end in October, along with
the funding. About 1,000 full-time workers have been laid
off or accepted buyouts since the beginning of the year.

Temporary Tax Breaks
The government also can try to boost demand, and thus create jobs, via tax policy. That was the goal of the rebates issued to households in 2001 and 2008, and of the Job
Creation Act of 2010, which expanded several tax credits for
lower-income families and allowed businesses to expense all
of their investments in 2011. The ARRA included the
Making Work Pay tax credit, a $400 employee-side reduction in payroll taxes.
Forecasting how households will respond to a temporary
boost in income is difficult. Economic theory says that
short-term income changes shouldn’t have much effect on
households’ spending decisions, but people don’t always
respond as theory would predict. A study of the 2001 tax
rebates — cited by many policymakers in support of the
Job Creation Act credits — found that households, particularly lower-income households, actually spent a substantial
portion of the rebates. The authors, David Johnson of the
U.S. Census Bureau, Jonathan Parker of Princeton
University, and Nicholas Souleles of the University of
Pennsylvania, concluded that the rebates likely had a large
effect on aggregate consumption.
In a survey about the effects of the 2008 tax rebates,
however, Claudia Sahm of the Federal Reserve Board and
Matthew Shapiro and Joel Slemrod of the University of
Michigan found that most recipients used the rebate to save
or pay down debt, instead of spending it. The personal
savings rate at the end of 2008 reached 6 percent, and
remained around 5 percent until the summer of 2011, double
the rate prior to the recession, according to the BEA. Other
studies have found that the rebates did have a positive effect
on consumption. Overall, however, they were not enough to
stave off a more than 5 percent drop in spending in the
fourth quarter of 2008, and a continued decline throughout
the first half of 2009.
One reason for the different responses to the 2001 and
the 2008 tax rebates might be that the 2001 rebates were
perceived to be part of a longer-term tax cut, whereas the
2008 rebates were a one-time event. A study by Christina
Romer and David Romer of the University of California,
Berkeley, found that a tax cut equal to 1 percent of GDP
could raise output by as much as 3 percent, but that the effects
were highly dependent on the economic conditions at the
time. Countercyclical tax cuts — that is, those enacted in
response to an economic downturn — tend to have a much
smaller effect on the economy than tax cuts enacted to promote long-term growth.
A more significant factor in the divergence between 2001
and 2008, however, is likely that households’ wealth had
declined much more severely, and consumer confidence was
significantly lower, in 2008 than in 2001, making consumers
more cautious about spending. And that caution appears to
be persisting. A new payroll tax deduction for employees
went into effect at the beginning of 2011, but consumer
spending has stayed fairly flat, according to BEA data. This
suggests that the deductions are not having the desired
effect on consumption, and by extension on employment,
although they might be helping to offset a larger decline that
could have been caused by higher gas and food prices.

Subsidize Private Hiring
The government can also try to encourage firms to hire new
workers by offering direct hiring credits. Firms hire new
workers when they believe the marginal benefits of new
workers outweigh the marginal costs. Lower the cost of
labor, for example by offering a tax credit for new employees, and in theory firms should be more willing to expand
their payrolls. But estimates of how responsive firms are to
changes in the price of labor vary widely, and most studies
suggest that the effect of a hiring credit is relatively small.
Before 2010, the only broad-based national tax credit for
employment was the New Jobs Tax Credit (NJTC), which
was in effect from mid-1977 through 1978. (Other programs,
such as the Targeted Jobs Tax Credit and the Welfare-to-Work Tax Credit, were designed to aid specific groups of
disadvantaged workers. The NJTC was the first to target
unemployed workers in general.) Studies of the NJTC
showed it did create jobs, perhaps as many as 670,000,
although it’s possible that many of the companies who
claimed the credit would have created jobs anyway,
according to a 1996 review by Lawrence Katz of Harvard
University.
State-level programs have had modest results. Wilson of
the San Francisco Fed and Robert Chirinko of the University
of Illinois at Chicago have been conducting a study of two
dozen state-level hiring credits. Their preliminary results
point to a positive — but small and transitory — effect on
employment. A 2010 study of the MEGA tax credit
program in Michigan concluded that the credit might only
be decisive in 8 percent to 16 percent of hiring decisions
(meaning that the remaining credits are earned for jobs that
would have been created anyway), and that the program
could boost the state’s employment by one-third to two-thirds of a percent. The study was conducted by Timothy
Bartik and George Erickcek of the W.E. Upjohn Institute
for Employment Research. But as the authors noted, the
effects of the MEGA program might be small because the
program itself is relatively small.
A recent program at the federal level was the Hiring
Incentives to Restore Employment (HIRE) Act, which
became law in March 2010. The Act allows employers to
claim a payroll tax exemption for qualified workers (those
unemployed for eight weeks or longer) hired between
February and December of 2010. At the end of 2010, the
Treasury reported that 10.6 million workers who were hired
during that period were eligible for the HIRE exemption.
But that number represents only 11.6 percent of all the people who spent more than eight weeks unemployed during
that period, and it’s not certain how many of those workers
were actually hired as a direct result of the program.
David Neumark, an economist at the University of
California, Irvine, has studied the effectiveness of hiring
subsidies. While a new subsidy may have small employment
gains relative to the number of jobs needed, he considers it
one of the better options available. “If we’re really serious
about increasing hiring, then let’s focus on the things that
directly incentivize hiring,” he says.

What’s Behind the Jobless Recovery?
Following the recessions of 1990-91 and 2001, economists
asked whether the “jobless recoveries” that followed were a
function of those recessions being shallow, or if instead they
reflected more permanent changes in the economy. During
the 2007-09 recession, it was hoped that the answer was
shallow, and that the recovery would follow the same path as
the similarly severe recession in 1981-82, when the economy
rebounded quickly and sharply. Instead, it seems that there
has been “a change in how the labor market responds to
shocks,” says Jason Faberman, an economist at the Chicago
Fed. “Even though this was a bigger shock, we’re seeing the
same kind of response in the labor market that we saw in
1991 and 2001. We don’t see a sharp return to the labor
market like we used to.”
Faberman has documented a persistent decline in the
job creation rate over the last two decades, exemplified by
changing employment patterns after recessions. During and after the 1990-91 recession, the job creation rate was fairly constant, but the unemployment rate remained high because the job destruction rate was high. After the 2001 recession, in contrast, the job destruction rate quickly returned to prerecession levels, but the job creation rate continued to decline well into 2003.

The most recent recession has followed the pattern of the 2001 recession: Initially, the job destruction rate spiked and the job creation rate fell to historical lows. Now, the job destruction rate has subsided, but the job creation rate remains very low — far below the level needed to recoup the recession’s losses (see chart). The reasons for the decline in the job creation rate could be structural changes in how the labor market operates. Productivity increases and new technology have reduced the need for labor overall, and the availability of temporary workers enables companies to ramp up production without hiring new permanent workers. In addition, firms are more likely to use permanent rather than temporary layoffs, for reasons including changes in how unemployment insurance costs are charged to employers and the decrease in union contracts.

[Chart: Job Creation and Destruction Rates, 1990-2010. Data are for private non-farm employment; the job creation (destruction) rate in each quarter is gross job creation (destruction) divided by the average of total employment in that quarter and the preceding quarter. Shaded areas denote recessions. Sources: Bureau of Labor Statistics, Haver Analytics, Faberman (2008); Region Focus calculations.]

The Cost of Uncertainty
The effects of structural changes in the labor market are amplified by the considerable uncertainty facing business owners. While uncertainty is difficult to measure, a large body of theoretical and empirical research suggests that when businesses are uncertain — whether it’s about taxes, regulation, interest rates, future demand, or other factors — they delay investment decisions, which could include hiring. A 2009 study by Nicholas Bloom of Stanford University found that “uncertainty shocks,” caused by economic or political events, can lead to a 1 percent drop in employment and output in the months immediately after the shock. Bloom also found that output rebounds as uncertainty diminishes, but the sources of today’s uncertainty remain persistent. The largest source is the future of the economy: GDP grew at an annual rate of only 1 percent in the second quarter of 2011, and 0.4 percent in the first quarter. According to the BEA’s initial estimates, growth improved to 2.5 percent in the third quarter, still well below the rate needed for a robust recovery. Its July release showing that the recession was deeper than previously thought also showed that growth had been slower than estimated since the recession ended — leading some to dub the period a “recoveryless recovery.” The Federal Reserve recently lowered its growth and unemployment forecasts for 2012, and measures of consumer and business confidence remain very weak. The University of Michigan’s Index of Consumer Sentiment fell 10 percent between October 2010 and October 2011, and only 19 percent of CEOs surveyed by The Conference Board, a business research association, expected the economy to improve in the next six months.

Uncertainty about the country’s regulatory and fiscal environment could also be contributing to companies’ reticence. The pending implementation of new health care and environmental regulations, for example, has led many business owners to express concern about how their companies will be affected. In the Fifth District, respondents to Richmond Fed surveys report that potential regulatory changes make it difficult to plan new hires. A greater source of concern is the resolution of the nation’s debt and deficit problems. According to Dallas Fed surveys, for example, the lack of clarity from legislators about future economic policies contributes to business owners’ pessimism about the future. Monetary policymakers on the Federal Open Market Committee recently indicated that they are likely to keep interest rates low through 2013, but the fear of higher taxes and interest rates in the future might be enough to discourage businesses in the present.

How to resolve this uncertainty is a matter of debate. No one can predict for certain the jobs impact of specific regulations or the effect of various tax and spending policies. But there is broader agreement about creating economic conditions that promote long-term growth. Numerous cross-country comparisons have shown that countries with less regulated, more competitive markets have higher levels of business investment and faster growth. The United States generally has had less regulated product and labor markets than European countries, which could account for the fact that average GDP growth during much of the 1990s was about 2 percentage points higher in the United States than in France, Germany, and Italy, according to a 2005 paper by Alberto Alesina and Silvia Ardagna of Harvard University, Giuseppe Nicoletti of the Organization for Economic Cooperation and Development, and Fabio Schiantarelli of Boston College.

New business formation is especially affected by regulation. Multiple studies have shown that high barriers to entry
for new firms limit the “creative destruction” that is an
engine of economic growth. In the United States, new businesses account for almost 20 percent of gross job creation,
according to a 2010 paper by John Haltiwanger of the
University of Maryland and Ron Jarmin and Javier Miranda
of the U.S. Census Bureau. While startups are also more
likely to go out of business in a given year, those that survive
grow much more quickly than their older counterparts.
“The startups are a critical component of the experimentation process that contributes to restructuring and
growth in the United States on an ongoing basis,” the
authors concluded.
The United States also has one of the highest corporate
tax rates, and one of the most complex tax systems, in the
world. In addition, the United States is the only major developed country that taxes the foreign earnings of domestically
based companies when the earnings are repatriated, which
could encourage multinational companies to keep their
earnings offshore instead of investing them in the United
States. Many lawmakers and economists have suggested that
simplifying the tax code and making it more transparent,
among other reforms, would help lower costs for businesses
and increase the incentives to invest in productive activities
at home.
There is consensus that restoring fiscal balance is essential to the country’s long-term health. In a June speech, Fed
Chairman Ben Bernanke noted that “a large and increasing
level of government debt relative to national income risks
serious economic consequences…. [F]ailure to put our fiscal
house in order will erode the vitality of our economy.” Large
debts and deficits can hinder growth if they lead to higher
interest rates or taxes, and thus crowd out private investment. The United States currently has a national debt of
$14.8 trillion and a deficit of $1.3 trillion. Examining 200
years of data on 44 countries, Carmen Reinhart of the
Peterson Institute for International Economics and
Kenneth Rogoff of Harvard University have found that
countries with a debt-to-GDP ratio of more than 90 percent
have substantially slower growth. In the United States,
growth has averaged negative 1.8 percent when the ratio was
more than 90 percent and 4 percent when the ratio was

below 30 percent. Without substantial spending cuts and
increased tax revenue, the CBO projects that the U.S. debt-to-GDP ratio could exceed the historical peak of 109
percent (reached at the end of World War II) by 2023, and
approach 190 percent by 2035.
Whether lowering the deficit should be achieved via
spending cuts, revenue increases, or a mix thereof is a question being debated by economists and policymakers. But all
sides agree that committing to a credible, enforceable plan is
essential to restoring business and consumer confidence,
and thus to promoting the country’s short-term and
long-term economic health.

The Path Forward
The U.S. economy is facing a number of significant challenges. Long-term changes in the labor market, an
unsustainable fiscal situation, and the continued effects of
the most severe recession since the Great Depression
suggest that a near-term solution to the unemployment
problem is not at hand.
The considerable uncertainty facing business owners
makes these challenges more difficult to overcome. Work by
economist Robert Pindyck of the Massachusetts Institute of
Technology suggests that uncertainty not only decreases
investment but also makes companies less responsive to
stimulus. And Bloom’s results show that uncertainty makes
firms less likely to respond to an increase in demand.
Resolving the uncertainty surrounding so many political and
economic decisions thus could help spur businesses to
increase their payrolls once the economy picks up.
Policy tools such as tax incentives or stimulus spending
can help on the margin of the labor market. But broad-based
job creation, which so far has dragged more slowly than the
recovery generally, will require sustained improvement in
economic conditions — and that means policymakers must
credibly address the hardest problems rather than working
only on the edges. This does not help the millions of people
who need to find work now, but in the long run, the best
hope for growth in output and employment is to create
the conditions that allow new and existing businesses to
flourish. RF

READINGS
Alesina, Alberto, Silvia Ardagna, Giuseppe Nicoletti, and Fabio
Schiantarelli. “Regulation and Investment.” Journal of the European
Economic Association, June 2005, vol. 3, no. 4, pp. 791-825.
Bloom, Nick. “The Impact of Uncertainty Shocks.” Econometrica,
May 2009, vol. 77, no. 3, pp. 623-685.
Cogan, John, and John Taylor. “What the Government Purchases
Multiplier Actually Multiplied in the 2009 Stimulus Package.”
National Bureau of Economic Research Working Paper No. 16505,
October 2010, Revised January 2011.
Faberman, R. Jason. “Job Flows, Jobless Recoveries, and the
Great Moderation.” Federal Reserve Bank of Philadelphia Working
Paper No. 08-11, June 2008.
Haltiwanger, John, Ron Jarmin, and Javier Miranda. “Who Creates
Jobs? Small vs. Large vs. Young.” National Bureau of Economic
Research Working Paper No. 16300, August 2010.
Neumark, David. “How Can California Spur Job Creation?” Public
Policy Institute of California, February 2011.
Wilson, Daniel. “Fiscal Spending Jobs Multipliers: Evidence from
the 2009 American Recovery and Reinvestment Act.” Federal
Reserve Bank of San Francisco Working Paper No. 2010-17,
May 2011.

In this classroom, the right choice may be (d) all of the above
BY TIM SABLIK

Suppose you had $100 in a savings account with an
interest rate of 2 percent. After five years, how much
would you have in the account if you left the money
to grow?
This was one of three questions asked of adults participating in a recent financial literacy study. They were given
three answers to choose from: more than $102, exactly $102,
or less than $102. The good news: About 68 percent of
respondents answered the question correctly. The depressing news: The other 32 percent either answered wrong or
could not answer the question.
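The question is a simple test of compound interest. A minimal illustration of the arithmetic, using only the figures in the survey question:

# $100 earning 2 percent interest, compounded annually for five years.
balance = 100.0
for year in range(5):
    balance *= 1.02
print(f"Balance after five years: ${balance:.2f}")  # about $110.41, so more than $102

Even without compounding, five years of 2 percent simple interest would bring the balance to $110; compounding adds another 41 cents. Either way, the correct answer is "more than $102."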
Troubling evidence about Americans’ financial literacy
abounds. In a 2003 survey of investors administered by the
National Association of Securities Dealers (now merged into
the Financial Industry Regulatory Authority), only 35 percent of participants received a passing grade. Many thought
that stock market losses were insured. Among high school
students surveyed by Jump$tart.org, a nonprofit organization that promotes financial literacy training for students,
about half believed either that sales tax was set nationally at
6 percent or that the federal government deducted it from
paychecks.
The events of the 2007-09 recession and its aftermath
have brought the need for financial knowledge sharply into
focus. Even prior to the crisis, the Federal Deposit
Insurance Corporation observed that “the extraordinary
transformation of consumer financial markets over the past
decade has made financial literacy nothing less than an
essential survival tool.”
Financial literacy can encompass many different traits,
but the National Financial Educators Council defines it as
“possessing the skills and knowledge on financial matters to
confidently take effective action that best fulfills an individual’s personal, family, and global community goals.” As
surveys indicate, financial skills and knowledge among many
Americans seem to be lacking.
In response, federal, state, and private organizations have
put a number of initiatives in place to improve financial
literacy levels. In 2010, President Barack Obama declared
April as National Financial Literacy Month and announced
the creation of MyMoney.gov, a website designed to provide
free financial resources and guidance. The Federal Reserve
banks, including the Richmond Fed, also conduct economic
and financial education programs, both independently and
in cooperation with other nonprofit organizations.

Students participating in Junior Achievement’s Finance Park
simulation budget their assigned salary around typical household
expenses, taxes, mortgages, insurance, and savings.

According to the 2009 “Survey of the States” by the Council
for Economic Education, 13 states mandate a course in
personal finance as a high school graduation requirement, up
from just seven states in 2007. In the Fifth District, Virginia
signed into law a requirement that high school students take
a class in economics or financial literacy in order to graduate.
Maryland passed a similar requirement in May.

Starting Early: Can High School Classes Shape
Future Behavior?
Financial literacy programs aren’t free, however. The
Maryland high school program, for example, is expected to
cost $15.6 million in salaries, textbooks, and other materials.
With a significant amount of money being invested in financial education for young students, there are important
questions to ask: Does it work? Do students who participate
in financial education programs end up with better financial
skills? Although research on the topic of financial literacy is
still in early stages, a few studies provide some clues.
Researchers Bruce Ian Carlin of the University of
California at Los Angeles and David Robinson of Duke
University studied data from a Junior Achievement Finance
Park activity in California. Students between the ages of 13
and 19 were asked to make a budget by visiting various stations that handled house loan decisions, health insurance
purchases, and savings accounts, among other things. Prior
to the event, some of the students took part in 19 hours of
classroom instruction on financial literacy.
According to John Box, Junior Achievement’s senior vice
president of education, the curriculum is designed to teach
students how to budget around four key categories: spending, saving, investing, and giving.
“We’re really trying to get young people to understand
that there is a balance,” says Box. “Regardless of how much
money they make, they should be thinking about, certainly,
spending some, because they need to pay their bills and
cover their expenses. But they also need to be saving for
more immediate needs, they ought to be investing for more
long-term needs, and they ought to be giving back.”
Carlin and Robinson found that both the students who
received training and those who did not allocated roughly
the same portions of income at the different stations and
had comparable completion rates. Seemingly, then, the
training did not make a difference. The authors noted, however, that the schools that opted out of financial literacy
training had better academic performance records and
served less economically challenged populations. After controlling for these school effects, the authors found that the
students who received the training were about 35 percent
more likely to complete the activity with a balanced budget.
There were some signs that the training did not always
affect students’ behavior in the way the instructors might
have hoped. When it came to choosing health insurance,
students with the classroom training were more likely to
economize on monthly costs by choosing insurance plans
with lower premiums, even if those plans left them open to
higher and more volatile future costs in the event of an
emergency.
This illustrates one of the problems in identifying effective financial training: The soundness of many financial
decisions is highly specific to the individual. Although lower-premium health care packages are not necessarily a bad
financial choice depending on individual circumstances, the
researchers classified some of these students as “underinsured” based on the family size they had been told to assume.
Grey areas like these make it that much more difficult to
teach a general set of “best practices.”
Box says Junior Achievement’s programs do encourage
students not to spend more than they make. However,
he recognizes that in reality students will need to weigh
financial decisions based on individual circumstances, and
financial education can help them at least make informed
decisions. “Regardless of what decision you’re making, you
ought to at least have that baseline of understanding.
Whether it’s loans, insurance, buying a car, or buying a
house, there’s a body of knowledge that’s economically and
financially solid that kids should understand.”
The students themselves recognize that they lack this
baseline knowledge. Annamaria Lusardi of George
Washington University and Olivia Mitchell of the University
of Pennsylvania found in a recent working paper that
although high school and college-age consumers performed
poorly on an objective test of financial literacy (which
included the question on interest rates with which this
article started), they were also more aware of their limitations than any other age group. In self-assessment tests, they
rated their level of financial knowledge the lowest on average of all surveyed age groups.
The key to approaching high school financial literacy,
says Lusardi, is laying a foundation. “I always make this analogy: Financial literacy is no different than English,” says
Lusardi. “We don’t teach English so that you can go and
write War and Peace. We teach English so that you can appreciate a good book. And the same should be done for financial
literacy. You learn the basics: demand and supply, interest
compounding, risk diversification. This is something upon
which you can build.”

Financial Literacy and Retirement Planning
While laying a foundation for kids today may help them
navigate financial waters in the future, what about working
adults?
One of the arguments made by advocates of financial
literacy is that workers approaching retirement are woefully
underprepared. In the same study that surveyed high school
and college-age consumers, Lusardi and Mitchell found that
the disparity between self-described financial literacy and
actual financial literacy among retirement-age persons is
striking. Those over age 50 rated themselves, on average,
well above the median in terms of financial literacy.
Yet less than half of those between 51 and 65 correctly
answered three simple financial literacy questions; for those
older than 65, that amount dropped to slightly more than a
quarter.
Lusardi and Mitchell found that after controlling for a
variety of factors, such as education and income, financial
literacy did have a significantly positive effect on retirement
planning. Those who could correctly answer the three literacy questions were about 10 percent more likely to plan for
retirement. The researchers noted, “those who do not plan
reach retirement with half the wealth of those who do.”
Interestingly, they also found that those who had suffered a
significant income shock had a similar boost to their planning behavior. It could be that learning from the “school of
hard knocks” is as effective in changing behavior as taking
formal classes in financial literacy. Or perhaps the traditional classroom is not the best way to reach adults. Lusardi says
it’s unrealistic to expect adults to take time from their busy
schedules to attend financial literacy classes.
“We have to add financial literacy where it matters for
people,” she says. “Adults are not in the classroom, and it’s
very hard to bring them to the classroom, and it’s not obvious that that’s the best way that people really want to receive
the education.”
She argues that educators must be more creative in how
they reach out to working adults. Part of the problem with
current studies, Lusardi says, is that some researchers are
testing the effectiveness of training regimens which are not
likely to work from the start.
The challenge of finding evidence-based approaches to
financial literacy training is recognized by other researchers
as well. In a summary of existing studies of financial literacy
education, William Gale and Ruth Levine of the Brookings
Institution wrote in October 2010 that no approach “has
generated unambiguous evidence that financial literacy
efforts have had positive and substantial impacts.”
Nevertheless, there are studies that point to areas of
knowledge which have a measurable impact on financial
decisionmaking. A study by Kristopher Gerardi of the
Federal Reserve Bank of Atlanta, Lorenz Goette of the
University of Lausanne, and Stephan Meier of Columbia
University surveyed subprime mortgage borrowers to find
out if their level of financial literacy influenced their loan
decisions or likelihood of delinquency and default. They
found that the aspect of financial literacy that had the most
significant relationship with delinquency rates was numerical ability — the person’s ability to perform math
calculations. Those in the bottom quartile of numerical
ability were about 18 percent more likely to suffer foreclosure than those in the top quartile.
An experiment under way in India adds evidence to the
claim that mathematical ability, not just financial knowledge, plays a major role in financial behavior. Researchers
Fenella Carpena and Bilal Zia with the World Bank, Shawn
Cole of Harvard University, and Jeremy Shapiro of Yale
University presented their preliminary findings of an experiment in which they randomly selected participants to take
part in a video financial literacy curriculum.
They found that not only is mathematical ability positively correlated with financial literacy, but also those with
higher mathematical ability are much more likely to contribute to a savings program. Of course, this doesn’t
necessarily mean that greater financial knowledge is imparted by training in mathematics, but perhaps that those who
are more comfortable with math have an easier time calculating the results of financial decisions or at least looking at
them methodically.
Such findings bolster the importance of teaching fundamentals — early and often. Lusardi says many people who
think about teaching financial literacy are thinking about
teaching the wrong things. Rather than teaching the finer
points of mortgages or other financial instruments, which
are always changing, instructors should be providing students with a framework to make sound financial decisions.
“For example, take interest compounding,” she says. “It’s
very hard to make financial decisions if you don’t know
interest and interest compounding. People understand what
the law of gravity is. And interest compounding is the same
as the law of gravity — it applies everywhere. If you borrow
at a high interest rate, you’re going to pretty quickly double
your debt — it’s a law. And people need to know this law
when they are making financial decisions.”
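Her point about doubling can be made concrete with a standard doubling-time calculation (an illustrative sketch; the interest rates below are arbitrary examples, not figures cited in this article):

import math

# Years for a debt to double at a given annual interest rate, compounded annually.
def years_to_double(rate):
    return math.log(2) / math.log(1 + rate)

for rate in (0.05, 0.10, 0.20):
    print(f"At {rate:.0%} interest, a debt doubles in about {years_to_double(rate):.1f} years")

At 20 percent, roughly the rate charged on some credit cards, an unpaid balance doubles in less than four years, which is the kind of "law" Lusardi wants borrowers to internalize before they make financial decisions.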

Where Should Financial Literacy Go from Here?
There is a general consensus that a substantial number of
Americans have limited financial knowledge, but the best
way to increase an understanding of financial issues and
decisions is a point still widely debated. Reaching out to
students in the classroom, while not without some shortcomings, at least targets the younger population where they
spend much of their time.
With regard to adults, the evidence suggests that financial literacy training does raise awareness of opportunities to
save and invest for the future. For example, the researchers
in India found that adult financial literacy training had a
large and positive effect on basic financial knowledge and
also made participants significantly more aware of some of
the financial options available to them and more likely to
suggest them to co-workers. While it is probably unwise to
attempt to impose one “correct” model of savings and
spending, financial literacy training seems likely to benefit
adults as they make some of the most significant decisions of
their lifetimes.
“We are not going to go back to a world of defined benefit pensions,” Lusardi says. “Every country is facing this
problem of shifting responsibility to the individual, and
everyone is facing the problem of making decisions in a
world that is more complex.” RF

READINGS
Carlin, Bruce Ian, and David T. Robinson. “What Does Financial
Literacy Training Teach Us?” National Bureau of Economic
Research Working Paper No. 16271, August 2010.
Carpena, Fenella, Shawn Cole, Jeremy Shapiro, and Bilal Zia.
“The ABCs of Financial Literacy — Experimental Evidence on
Attitudes, Behavior and Cognitive Biases.” Paper presented
at the annual meeting of the Allied Social Science Associations,
Denver, Co., January 2011.

Gerardi, Kristopher, Lorenz Goette, and Stephan Meier. “Financial
Literacy and Subprime Mortgage Delinquency: Evidence from a
Survey Matched to Administrative Data.” Federal Reserve Bank of
Atlanta Working Paper 2010-10, April 2010.
Lusardi, Annamaria, and Olivia S. Mitchell. “Financial Literacy and
Retirement Planning in the United States.” National Bureau of
Economic Research Working Paper No. 17108, June 2011.


A freight train shatters the solitude of Thurmond’s abandoned commercial row. Thurmond was beginning to slow down when this photo (below) was taken around 1930.

What can ghost towns teach us about saving small communities?
BY KARL RHODES

The tree-canopied road to Thurmond, W.Va., winds
along the banks of Dunloup Creek past waterfalls,
wildflowers, and herds of grazing deer. The final
approach to town spans the New River Gorge with a
one-lane road cantilevered off a rusty railroad bridge.
Most people would call Thurmond a ghost town.
Abandoned commercial buildings, three and four stories
tall, loom over an empty rail yard. A passing freight train
shatters the solitude, but as its warning chords fade in the
distance, the loudest sound is once again the rushing waters
of the New River.
Thurmond was not always such a peaceful place.
A century ago, locomotives constantly jammed the rail yard
— belching steam, smoke, and hot cinders. Twenty passenger trains arrived and departed daily, bringing hundreds of
visitors to the area’s hotels, boarding houses, and saloons.
Drinking, gambling, and prostitution were 24/7 pursuits
across the river in the Ballyhack district, where the Dunglen
Hotel is said to have hosted a poker game that lasted
14 years.
At one point, Thurmond was called the “Dodge City of
West Virginia,” an image the town’s marshal promoted by
wearing a broad-brimmed Stetson hat and wielding a
notched gun. Estimates of his “official killings” ranged from
seven to 18, according to a book about the town by historian
Ken Sullivan, executive director of the West Virginia
Humanities Council. Thurmond’s Wild West reputation has
been cultivated and embellished, but Thurmond’s other
nickname — “Biggest Little Town” — was well-deserved.
Trains, coal, people, and money flowed through this tiny
town in copious quantities. In 1910, Thurmond generated
nearly $5 million in revenue for the Chesapeake and
Ohio (C&O) Railroad, approximately $110 million in today’s
dollars.
“For years this little city, without a highway leading into
or out of it, was known as the greatest banking center in the
country in proportion to population,” wrote newspaperman
Eugene Lewis Scott in 1943. Scott was hyperbole-prone, but
other sources confirm that Thurmond’s two banks were
among the most prosperous in West Virginia.
The town’s census population peaked in 1930, but coal
mining already was declining in the surrounding county, and
the Great Depression was hitting the town hard. The
National Bank of Thurmond closed, the Dunglen Hotel
burned down, and, according to local legend, the fire ended
the 14-year poker game. The biggest blow, however, came in
the 1950s, when the C&O switched from steam locomotives
to diesel engines, making Thurmond’s rail yard obsolete.
Today, with only seven residents remaining, Thurmond is
the smallest incorporated town in the Fifth District, but its
story raises big questions about what should be done — if
anything — to save small towns that no longer seem
economically viable.
Throughout the Midwest, across the Great Plains, and in
swaths of Appalachia, many small towns are losing population rapidly. Sociologists Patrick Carr of Rutgers University
and Maria Kefalas of St. Joseph’s University document this
dramatic trend in their 2009 book, Hollowing Out the Middle:
The Rural Brain Drain and What It Means for America. They
plead passionately for saving small towns, but economists
note that mobility of resources is essential to economic
growth. The national economy benefits greatly from people
moving easily to places where their talents can be put to
better use.
“It’s just economics 101,” says Mario Polèse, an economist
and geographer at the Institut National de la Recherche
Scientifique in Montreal. Polèse explains that people have
migrated from rural communities to big cities since the
beginning of the Industrial Revolution, primarily because
nations need fewer farmers and intermediaries to produce
and transport agricultural commodities.
“All industrialized countries basically went from 80 percent rural to 80 percent urban,” he says. “We’ve all gone
through that stage — England, France, Canada, the United
States, Japan — and now China and India are going through
the same thing.”
Ghost towns will develop only in extreme cases, Polèse
predicts. “What is much more the rule is an increasing number of towns that are going to fall to a population level that
is commensurate with what is economically reasonable.”
Most of these declining towns occupy a funnel-shaped
region that begins at the Texas-Mexico border and extends
across the Great Plains states and into the prairie provinces
of Manitoba and Saskatchewan.
“Hollowing out” accurately describes this continental
migration, but then again, “one shouldn’t overdramatize,”
Polèse cautions. Migration levels have “started to level out,”
he notes. But in North America — especially in the United
States — the population remains quite mobile. And that is a
good thing. In a 2007 article, The Economist took this view
one step further: “Ghost towns are sad places,” it said, “but
also monuments to American dynamism.”

Ghost Town Model
From the 1880s to the 1980s, Thurmond completed the
boom-bust cycle that produced thousands of ghost towns in
the western reaches of the United States and Canada. In a
2009 article in the Journal of Regional Analysis & Policy, economists Philip Graves and Emily Elizabeth Tynon, of the
University of Colorado, and Stephan Weiler of Colorado

State University analyzed data from two Colorado examples
— Cripple Creek and Leadville — to test a model they developed to study the economics of ghost towns.
“Most of these towns were based on intensive mining
booms typifying the extractive industries of the 19th and
early 20th centuries,” they noted. The mines often were
located in harsh, isolated areas. They generated quick
profits for owners, high wages for workers, and short-term
perspectives for everyone involved. The resulting economies
had little diversification, rapid cycles, and fixed investment
that was limited and disposable. These boomtowns also
attracted the Wild West’s most mobile population — single
men — in disproportionate numbers.
Cripple Creek and Leadville certainly fit the model, as
did many West Virginia towns with coal-based economies.
But the Colorado researchers note that “coal towns have
declined more slowly than would be expected based on the
histories of their Rocky Mountain brethren,” due primarily
to higher levels of homeownership and the deeper socioeconomic roots that come with it.
Thurmond’s early residents rented houses from the
town’s founder, Capt. William Dabney Thurmond. But by all
accounts, he did not intend to build a temporary town. His
homes and commercial buildings were well-constructed, and
the town’s residents developed a strong sense of community.
The Colorado researchers encountered those same
factors when they applied their ghost town model to struggling agricultural communities in the Midwest. Farm
employment has decreased steadily in the United States
since World War I, but the people who built Midwestern
towns invested considerably more in residential and commercial structures than their counterparts in Colorado.
These substantial investments eventually attracted residents with higher mobility costs who now seek to reverse
their towns’ economic fortunes. Typically they pursue economic diversification, seek government assistance, and
promote tourism and historic preservation. Yet the
Colorado researchers are pessimistic about the future of
these small towns.
“To some extent, the ubiquitous nature of Midwestern
disamenities (e.g., gray winters, humid summers) implies
that such programs may be fruitless in the long run,” they
conclude. “More slowly decaying extractive regions, such as
coal mining in Appalachia and farming in the rural
Midwestern United States, seemingly face similar difficulties in the late 20th century and early 21st centuries.”

Worthwhile Canadian Initiatives?
Small towns are battling more than bad weather, Polèse says.
They are competing against the enormous economic advantages that big cities enjoy.
“Trying to stop people from moving to cities will not
work, certainly not as long as opportunities are more plentiful in the city than in the countryside,” he writes in his 2009
book, The Wealth & Poverty of Regions: Why Cities Matter. And
some rural development initiatives can accelerate migration
to cities, he contends. Programs to boost education levels or
agricultural productivity are prime examples. These are
laudable goals, but more productive farms require fewer
workers, and better-educated young people become even
more likely to move to cities where their education will help
them earn higher salaries.
Polèse lives in Canada, a nation with many ghost towns —
economic casualties of declines in mining, fishing, forestry,
and farming. In some extreme cases, Canada’s provincial
governments have offered relocation incentives to persuade
residents to abandon remote towns that have become too
expensive to maintain. Residents of Newfoundland, for
example, started abandoning small fishing villages in 1945.
The provincial government encouraged this trend by offering modest relocation incentives that emptied out many
isolated towns. Some of these economic refugees moved to
the coastal town of Great Harbour Deep, but the town’s cod
fishing industry collapsed in the early 1990s, and in 2002,
residents abandoned the town in exchange for resettlement
payments of CAD$100,000 per family (about $63,000 in
U.S. currency at the time).
The deal must have caught the attention of residents of
Murdochville, a remote mining town about 370 miles northeast of Quebec City. The town was struggling to survive after
Noranda Inc. closed its copper mine in 1999 and its copper
smelting operation in 2002. About 5,000 people lived in
Murdochville in the 1970s, but by 2003, its population had
dwindled to 734. Nearly two-thirds of the town’s remaining
electorate voted to abandon Murdochville if the provincial
government would compensate them for relocation expenses and loss of property values.
Quebec officials refused. Instead, they offered
Murdochville a CAD$17.5 million (about $12.6 million U.S.)
relief package, including funding to balance the municipal
budget, establish an auto insurance call center, and recruit
new industries. The town has attracted two wind-turbine
farms, built with $180 million in private investment, and the
population has stabilized at about 800, but the town’s unemployment rate remains precariously high.
Canadian policymakers are more likely to subsidize dying
towns than their counterparts in the United States, Polèse
says. “That is definitely part of the Canadian tradition.
It’s not as strong as in Europe, but it has always been part of
the fabric of our country. Even in the Constitution, you have
what we call ‘equalization payments’ ” — essentially a redistribution of federal tax revenue from richer provinces to
poorer ones. “Implicitly, that means that you are going to
keep certain small communities alive that otherwise would
not survive.”

Small Towns, Big Ideas
Some small communities will become ghost towns, but that doesn’t mean every small town losing population should stop fighting for survival. That’s the philosophy of Will Lambe, director of the Community and Economic Development Program at the University of North Carolina at Chapel Hill.

In his 2008 book, Small Towns, Big Ideas, Lambe highlights 22 towns in North Carolina and 23 towns in other
states that have experienced some success with a variety of
economic development initiatives. Chimney Rock, N.C.,
has “figured out a way to capture tourists flowing into a nearby park” by sprucing up its downtown and building a river
walk. Colquitt, Ga., has attracted tens of thousands of visitors to a theatrical phenomenon called Swamp Gravy, and
Siler City, N.C., has “triggered a minor renaissance” with its
North Carolina Arts Incubator.
These success stories are encouraging, but many other
small towns are pursuing similar strategies, and there has to
be a saturation point for arts-based projects and small-town
tourism. Reynolds, Ind., is betting instead on America’s
seemingly insatiable appetite for energy.
Reynolds is “a one-stoplight town with 550 people and
150,000 pigs,” Lambe writes. In 2005, the governor of
Indiana proposed turning the struggling town into a demonstration project for producing alternative energy. According
to a state study, “hog manure and other organic waste in and
around Reynolds could produce 74 times the town’s energy
needs.”
The town embraced the idea of building a biomass plant,
purchasing flex-fuel vehicles, even adopting the nickname
BioTown, USA. Charlie Van Voorst, president of the town
council, sums up the town’s response this way: “We thought,
‘Gosh, there’s not much going on here in Reynolds, so we’ll
try anything.’ ”
Call it “reverse NIMBYism.” Enthusiasm for projects
that other localities would find unacceptable — at least “not
in my backyard” — can be an important comparative advantage for small towns that are struggling to survive. Many
communities have opposed wind-turbine farms, for example, but the residents of Murdochville were used to some
industrial noise, so the constant humming of the turbines
didn’t bother them. They also didn’t see the huge windmills
as eyesores. In fact, they incorporated turbine blades into
the town’s logo.
Another good example of reverse NIMBYism unfolded
in Chillicothe, Mo., where residents waged a successful
campaign to keep the state from closing a nearby women’s
prison. The town put forward an innovative proposal that
preserved the prison’s 200 jobs and created 250 additional
employment opportunities.
“While some rural communities may view prisons as an
industry of last resort, officials and residents in Chillicothe
have come to value the corrections industry,” Lambe writes.
They see it as “an antidote for the slowly collapsing farm
economy and a century of declining population.”

Thurmond’s Future

Thurmond’s 14-year poker game may have ended in 1930,
but the town never folded, not even when the National Park
Service started buying out the remaining residents and consolidating the town into the park system of the New River
Gorge National River.


In 1995, the Park Service renovated Thurmond’s 1904 passenger
depot and turned it into a museum
and visitors’ center. The Park Service
also stabilized the abandoned commercial buildings overlooking the
rail yard and proposed a historic site
similar to Steamtown in
Scranton, Pa. The next step
was to renovate the old C&O
engine house and turn it
into a railroad museum. But
Thurmond’s luck took a familiar turn for the worse: The
engine house burned down,
and the Park Service put its
more ambitious preservation
plans on hold.
Now it appears that
Thurmond — or at least its
surrounding county — might
be making a comeback. The
Boy Scouts of America is developing a $400 million complex
near the town that will become the permanent home for the
National Scout Jamboree. (See related story on page 4.)
Every four years, about 48,000 scouts and volunteers will
spend 10 days at the Summit Bechtel Family National Scout
Reserve. The complex also will become a high-adventure
base that will serve at least 30,000 Boy Scouts every year.
It’s easy to imagine the retail potential of thousands of
Boy Scouts exploring a ghost town, but the Park Service’s
historical architect says it would cost $3.5 million to prepare
the ground floors of the abandoned commercial buildings
for retail tenants. The Park Service already has invested
more than $7 million to preserve the town, and the
prospects of more federal funding are fading fast.
One Thurmond resident suggests that the Park Service
has become gun-shy after the restored passenger depot
made a brief appearance on “The Fleecing of America.” The
NBC Nightly News segment asked why the federal government would spend $3 million to renovate a train station in a
town with only seven residents.
Thurmond is still a flag stop for Amtrak, but the town’s mayor, Melanie Dragan, points out that the passenger depot has become more of a museum and visitor center than a working train station. It welcomes thousands of visitors each year, but the underlying question of how much money the federal government should invest to preserve a ghost town remains relevant.

[Photo caption: The National Bank of Thurmond was a charter member of the Federal Reserve Bank of Richmond. During Thurmond’s heyday, national banks issued their own currency. Mark Hotz, who purchased this 1907 specimen on eBay, says Thurmond bank notes are quite rare.]
“It depends on why you want to save it,” Polèse replies.
“If you are from the small town, and you are really attached
to it, you agree. But if you are the federal government, and
you are looking at the debt going up, you may not agree.
That’s really what democracy is all about.”
Leah Perkowski Sisk, who grew up in nearby Beckley,
W.Va., casts her vote for preserving Thurmond. As an education technician at the New River Gorge National River,
she says this “Biggest Little Town” has much to teach about
economic history.
“Thurmond is a great example of many things,” she says.
“To me it’s an example of changing technology, such as the
railroad’s conversion from steam to diesel, and how when job
markets shift, so do the people. Today, Thurmond seems like
the end of nowhere, but 100 years ago, it was the beginning
of everywhere.” RF

READINGS
Carr, Patrick J., and Maria J. Kefalas. Hollowing Out the Middle:
The Rural Brain Drain and What It Means for America. Boston:
Beacon Press, 2009.
Fisher, Terri, and Kirsten Sparenborg. Lost Communities of Virginia.
Charlottesville, Va.: University of Virginia Press, 2011.
Graves, Philip E., Stephan Weiler, and Emily Elizabeth Tynon.
“The Economics of Ghost Towns.” The Journal of Regional
Analysis & Policy, 2009, vol. 39, no. 2, pp. 131-140.

Lambe, Will. Small Towns, Big Ideas: Case Studies in Small Town
Community Economic Development. Chapel Hill, N.C.: UNC School
of Government and the North Carolina Rural Economic
Development Center, 2008.
Polèse, Mario. The Wealth & Poverty of Regions: Why Cities Matter.
Chicago: University of Chicago Press, 2009.
Sullivan, Ken. Thurmond: A New River Community.
Fort Washington, Pa.: Eastern National, 1989.



What happened when Virginia brought tradable quotas to the commons
BY BETTY JOYCE NASH

Capt. Joe DelCampo of Virginia Beach started fishing
for striped bass in the early 1990s. Harvests had
only just begun to rebound after historically low
catches over the prior two decades. By 1995, this once-plentiful sportfish had recovered, thanks to a temporary fishing
ban in state waters followed by annual quotas on catches.
But the quotas created a problem: They set off a race to
catch as many stripers as possible before the cap was
reached. That led to a surplus of stripers dockside and drove
down prices. So in 1998, the Virginia Marine Resources
Commission (VMRC) modified the quota system by allocating shares of the fishery to commercial fishermen, based on
historical landings. Individual shares are tradable. Quota
holders can sell shares outright or lease them; leases can be
long-term or just for a season.
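For illustration only, the short Python sketch below mimics the mechanics just described: shares allocated in proportion to hypothetical historical landings, followed by an outright sale and a one-season lease. The boat names, landings, and quota total are invented; this is a toy model, not the VMRC’s actual system.

```python
# Toy illustration of ITQ mechanics: initial allocation in proportion to
# historical landings, then an outright sale and a one-season lease.
# Names, landings, and the 100,000-pound quota are hypothetical.

historical_landings = {"Boat A": 60_000, "Boat B": 30_000, "Boat C": 10_000}  # pounds
total_quota = 100_000  # pounds allowed this season (hypothetical)

total_history = sum(historical_landings.values())
shares = {boat: landed / total_history for boat, landed in historical_landings.items()}

# Outright sale: Boat C sells half of its share to Boat A permanently
transfer = shares["Boat C"] / 2
shares["Boat C"] -= transfer
shares["Boat A"] += transfer

# One-season lease: Boat B leases a 5-percent share to Boat A for this year only
season_shares = dict(shares)
season_shares["Boat B"] -= 0.05
season_shares["Boat A"] += 0.05

for boat, share in season_shares.items():
    print(f"{boat}: may land {share * total_quota:,.0f} pounds this season")
```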
This individual transferable quota, or ITQ, enables
DelCampo to time fishing trips. “In the springtime when
the fish are here and I can catch them easily, I catch as many
as I can,” he says. “After the fish leave, I will lease whatever
quota I haven’t caught to others.”
“It’s worked well,” says Ernie Bowden, president of the
Eastern Shore Watermen’s Association. “Before, we had a
‘rodeo’ fishery where everyone was fishing at one time. If the
quota was caught, you had to quit even if you hadn’t caught
your fish.” Bowden wasn’t particularly happy about his initial
quota allocation, but his prices have since gone up, thanks in
part to the new system. “Now I can catch my fish in April
and May when the prices are $4 to $4.50 a pound,” he says.
ITQs are a subset of management tools known as “catch
shares,” in which fishermen own a share of total allowable
catch. But an ITQ confers property rights; owners buy and
sell quota among themselves. Evidence suggests that this
rights-based management, used in some fisheries since the 1970s, has raised the value and quality of the stock, sometimes at lower regulatory costs. That ensures future generations will have something to fish for and make decent money doing it.

Fish Swim in a Common Pool

Resources without ownership — a “commons” — are easily
exploited, even wiped out. Individual fishermen have little
incentive to conserve while others are busy catching. As fish
stocks fall, the cost of catching them rises, and fishermen
overinvest in bigger, faster boats and better detection
devices. This “input stuffing” contributes to commercial
extinction.
“You have this resource that can yield substantial economic profits on a continual, sustained basis,” says Ragnar
Arnason, a fisheries economist at the University of Iceland.
Iceland’s fishing industry has operated under ITQs since
1976. “Under an ITQ, you can realize these potential gains
because you’re no longer racing for the fish, no longer competing. You can catch your allocated share.”
The Atlantic Ocean surf clam and ocean quahog fisheries
became the first seafood ITQ, in 1990. These types of clams
lie within federally-managed waters, three to 200 miles
offshore, from New England to Cape Hatteras. Hydraulic
dredges, among other innovations, had fostered a clam
industry along the Atlantic Coast that, by the 1970s, had
nearly bankrupted the stock.
“The catch rates were very, very high,” industry consultant Dave Wallace recalls. A former vessel and processing
plant owner, Wallace lives on Maryland’s Eastern Shore.
“We would catch all the clams and, because the market was
flooded, get low prices. Then that bed would collapse and
the price would shoot up but we had no supply.”
The cycle was unsustainable. In response, from 1979
through 1989, a federal plan dictated vessel numbers and
narrowed fishing times, a conventional regulatory approach.
“You could work Sunday through Thursday, 12 hours a day,
two or three trips a week,” Wallace recalls. Then the regulations kept tightening “to the point where we were fishing
about six hours every other week. And we still had hard
times.”
It took manpower to enforce these rules, including
patrols on the water. Worse, the Coast Guard had to rescue,
when they could, vessels that risked hazardous weather in
the rush to harvest.
Something had to give. The Mid-Atlantic Fishery
Management Council (MAFMC) considered an ITQ system
similar to those already under way in Scandinavia and
Iceland. Federal ITQs were permitted by the Magnuson Act
of 1976. (Wisconsin has managed certain Great Lakes
species under ITQs since 1971.) The MAFMC adopted an ITQ, and monitoring costs fell, along with the number of Coast Guard rescues, says Jose Montanez, an economist at the MAFMC.

State Managed Catch Shares in Ocean Fisheries (up to 3 miles offshore)

Fishery*                           First Year
VA Commercial Striped Bass         1998
MD Black Sea Bass                  2004
DEL Commercial Black Sea Bass      2004
MD Summer Flounder                 2005
RI Summer Flounder                 2009
VA Black Sea Bass                  2009

*NOTE: Some fisheries may have limited or no trading.
SOURCE: Environmental Defense

Federally Managed Catch Shares in Ocean Fisheries (3 to 200 miles offshore)

Fishery*                                                   First Year
Atlantic Surf Clam, Ocean Quahog                           1990
Wreckfish (FL)                                             1992
Alaska Halibut, Sablefish                                  1995
Bering Sea, Aleutian Islands Pollock Co-ops                1999
Pacific Sablefish                                          2002
Bering Sea, Aleutian Islands Crab                          2005
Gulf of Mexico Red Snapper                                 2007
Central Gulf of Alaska Rockfish                            2007
Bering Sea, Aleutian Islands Groundfish Trawl Sector       2008
Mid-Atlantic Golden Tilefish                               2009
Gulf of Mexico Grouper, Tilefish                           2010
Scallop (Mid-Atlantic, Gulf of Maine, U.S. Georges Bank)   2010
Multispecies, Northeast                                    2010
Pacific Groundfish Trawl (90 species)                      2011

*NOTE: Some fisheries may have limited or no trading.
SOURCE: National Marine Fisheries Service

With transferable quotas, fishermen can schedule harvests in the year-round enterprise. “This also minimizes inventory storage cost,” Montanez says. Before, with the six-hour window, you had to store — freeze — excess fish. This is one of the ways an ITQ enhances product value as it raises productivity: Fresh fish often command higher prices than frozen.
The ITQ has smoothed harvests. Before, everyone worked the fishery in mid-February. “It wasn’t atypical to have several thousand pounds landing in a week,” says Bowden, the watermen’s association president. The fish would bring $3 a pound by end of the first week, $2 per pound the second, and then, by the end of the third week, the price would hit a dollar. Though ITQs may not be a panacea, they may outperform more traditional regulations. By comparing fisheries of similar sizes from 1950 to 2003, researchers have found that ITQ fisheries appear less likely to collapse than non-ITQ fisheries, according to a 2010 study in the Annual Review of Resource Economics by Christopher Costello, Steven Gaines, and Sarah Lester of the University of California at Santa Barbara, and John Lynham of the University of Hawaii.

[Figure: Virginia’s ITQ Stabilized Striped Bass Landings and Prices. Virginia landings (millions of pounds) and price per pound (2011 dollars), 1970-2010; the vertical line indicates the starting year for Individual Transferable Quotas (ITQs) as 1998. NOTES: Commercial striped bass fisheries were closed in coastal states at various times between 1986 and 1990. Virginia’s fishery closed from June 1989 to November 1990. In 1995, the stock was declared restored. Price data averaged from quarterly surveys of Virginia seafood dealers. Mandatory harvest reporting started in 1993; prior values provided by the National Marine Fisheries Service. SOURCE: Virginia Marine Resources Commission]

[Figure: Striped Bass Landings, Stock Assessment. Coastwide and Virginia landings (pounds) and coastwide stock (metric tons), 1970-2009; the vertical line indicates the starting year for Individual Transferable Quotas (ITQs) as 1998. SOURCE: Atlantic States Marine Fisheries Commission]

It can be hard to see improvement in short-term data, and many ITQs are recent, but, according to Arnason, positive effects likely have been underestimated. “If you look at ITQ systems over 10, 20, or 30 years, you see a bigger stock improvement.”
Trading also attracts the most efficient fishermen as quota prices guide the activity toward the common good, Arnason says. That is, if the property rights are stable and strong.

Enforcement Matters

Arnason outlines features necessary to the success of this market: rights that are secure, exclusive, tradable, and as permanent as possible. “If there’s a likelihood someone will take your asset away, then that acts as a short time horizon and you will operate as if everything ends at the time of expiration of your asset,” he says. Enforcement is also critical. “If other people subtract some of your rights by exceeding their catch, then it will be more expensive for you to catch your share.
“There is still a vestige of the common pool problem. If you exceed your quota you expand your benefits,” he says. “You realize everyone will be hurt, but if you think everyone will do it, then you must do it.”
Enforcement issues have led to tweaks in Virginia’s striped bass ITQ program over its 13-year life. In 2007, the program began measuring quota by fish weight rather than by numbers. That mitigated a practice known as high grading, according to Joe Grist of the VMRC, in which fishermen discard smaller fish in favor of larger ones.
Still, fishermen can falsify records or under-report catches. But even with 421 participants in the Chesapeake Bay and its tributaries’ fishery, and 32 in the ocean striper fishery, Grist says, “We know these watermen.” Fishermen report catches monthly and so do dealers. The Virginia Marine Police spot-check. In 2010, a poaching case went to federal district court, and resulted in fines, license revocations, and even prison time.
Monitoring does cost taxpayers time and money, and effort varies from fishery to fishery. In the clam fishery, Montanez says, some enforcement costs fell because “we didn’t need to micromanage.” The fishery is managed
dockside today through dealer records and vessel logbooks,
cheaper than Coast Guard and National Marine Fisheries
Service (NMFS) manpower. But because ITQs are relatively
new and because fisheries differ in species, size, and scale,
cost data are often scarce and inconclusive.

Angles and Obstacles
An ITQ adds value to the fish and the fishery, but there are
reasons why some watermen object to the ITQ concept.
First and foremost, a fisherman who wants quota may not
get all he wants. (In the striped bass fishery, however, you
can buy quota even today, if you have a commercial fishing
license and find a willing seller.)
To reduce fishing effort and keep stock healthy, VMRC
limited the commercial fishery in 1996 to those earning at
least half their annual income in seafood sales. That had
reduced fishermen’s numbers even before the ITQ started in
1998, says Rob O’Reilly of the VMRC. That, in turn, made
granting shares easier.
Though DelCampo of Virginia Beach didn’t do the
paperwork to get quota the first year of the ITQ, he entered
a lottery the following year and eventually bought up to the
limit, 2 percent of the Chesapeake Bay striped bass quota.
(Ocean striper quota is capped at 11 percent per owner.)
Ownership caps prevent domination by a few firms.
Naturally, distribution of initial allocations is a touchy
issue. Shares in an ITQ are almost always granted according
to historical participation in the fishery, known as grandfathering. Arnason says that grandfathering of rights
promotes stewardship and long-term investment among
fishermen. In a 2010 National Bureau of Economic Research
working paper, Arnason and co-authors Terry Anderson and
Gary Libecap, economists at Stanford University’s Hoover
Institution, argued that grandfathering increases the fishery’s net value because it rewards efficient investments and
encourages owners to work together for the fishery’s future
productivity.
But systems need careful design from the start.
Economist Sylvia Brandt of the University of Massachusetts
at Amherst compared outcomes of the surf clam ITQ with

those of the ocean quahog in a Regulation magazine article.
She found that because the initial allocations in the surf
clam ITQ were based partly on vessel numbers, the surf
clam fishery participants put more boats out on the water
during the transition period so they could get more quota.
She cautioned researchers to consider such possible strategic behavior when designing and evaluating ITQ policy.
The prospect of job losses also creates tension over ITQs
in fishing communities; lawsuits against share-based systems
in the 1990s prompted a federal moratorium in 1996, which
expired in 2002. The Magnuson Act, reauthorized in 2007,
requires NMFS to end overfishing and includes the tool of
market-based management. Today opposition is back: A
rider on the 2011 congressional budget bill cut funding for
any new share-based systems in 2011.
At the state level, commercial fishermen in North
Carolina generally oppose legislation that would allow
ITQs, says Louis Daniel, the state’s director of marine fisheries. In Maryland’s striped bass fishery, the harvest using a
specific gear-type is managed through an ITQ, but with limited rights. If fishermen want to lease or sell quota, they
must transfer the whole lot, says Michael Luisi, of the
Maryland Department of Natural Resources Fisheries
Service. But Maryland is working with fishermen to develop
alternatives to the arcane rules that currently govern the
striped bass fishery. Maryland is also exploring the idea, with
watermen, for the blue crab fishery.
Some fishermen philosophically oppose the restriction
of the open ocean only to historical participants and to
those who are able to buy their way in. Many also fear young
people won’t have money to buy into a fishery. But fishermen
already may pay tens of thousands of dollars to obtain the
required fishing permit, notes fisheries economist Kate
Quigley of the consulting firm CapLog in Charleston, S.C.

Still, the individual quotas limit a catch, while permits don’t.
Fishermen also worry that a few big operators may buy
all the shares, though some states, such as Virginia, limit
ownership to a percentage of the total allowable catch to
avoid this. Those who lack sufficient historical landings to
get a healthy initial allocation also are likely to oppose ITQs;
they might fare better under traditional management. “They
can race to fish — and maybe do all right,” Quigley notes.
“Under catch shares, they are cut out, unless they buy more
shares, which can be expensive.”
DelCampo initially opposed the idea. He realized,
though, that the old quota system was not only dangerous,
because he was forced to fish regardless of weather, but it also
glutted the market. Now, he chooses when to fish based on
weather, price, and availability. The ITQ also stabilizes his
profession. “If you are a fantastic fisherman, and fill your box
each and every trip, you make a living,” he says. “But when
you are ready to retire, all you are left with is your boat and
your gear — that’s it. When I retire, I can’t sell a fish I’ve
already caught, but I can sell the quota I’ve accumulated
over the years, or lease it to other people for my retirement
income.”
But the quota is good only as long as the stock remains.
And that’s the point — to sustain the resource.
The world’s appetite for fish seems insatiable and could
haul the resource all the way to extinction. Varying types of
share systems — especially those with tradable permits —
may offer a buoy to the fish and the fishermen.
“We need property rights in the ocean in the same way we
needed them on land, to make fishing more efficient,”
Arnason says. “ITQs are one step along the way.” These
systems that codify property rights have potential to manage
the resources within the places — air, public lands, waters —
that are common to all, but owned by none. RF

READINGS
Anderson, Terry L., Ragnar Arnason, and Gary D. Libecap.
“Efficiency Advantages of Grandfathering in Rights-Based
Fisheries Management.” National Bureau of Economic
Research Working Paper No. 16519, November 2010.
Arnason, Ragnar, and Hannes H. Gissurarson, ed. Individual
Transferable Quotas in Theory and Practice. Reykjavik: University of
Iceland Press, 1999.

Brandt, Sylvia. “A Tale of Two Clams.” Regulation, Spring 2005, pp. 18-21.
Costello, Christopher, John Lynham, Sarah E. Lester, and Steven D. Gaines. “Economic Incentives and Global Fisheries Sustainability.” Annual Review of Resource Economics, October 2010, vol. 2, no. 2, pp. 299-318.
Johnson, J. Michael, Joseph D. Grist, and Robert O’Reilly. “Striped Bass Management in Virginia.” Fisheries Management Division, Virginia Marine Resources Commission, 2010.
“Sharing the Fish: Toward a National Policy on Individual Fishing Quotas.” Committee to Review Individual Fishing Quotas, National Research Council, Washington, D.C.: National Academies Press, 1999.


BY CHARLES GERENA

Western North Carolina once depended on
tobacco and cotton production to make money.
When agriculture gave way to the Industrial
Revolution, communities turned to textile and furniture
production to utilize their natural resources.
Now another kind of farm is drawing upon the region’s
comparative advantages. Massive data centers called “server
farms” — large enough to hold several football fields —
house room after room stacked with computers. They draw
about 20 times the power of a midsized office building yet
require only a few dozen workers to operate.
Economic development officials in western North
Carolina have been promoting the region to companies
looking to build data centers, offering generous tax breaks
to compete with rural towns like Quincy, Wash., that also
have ample land, power, and water. Officials have had a
number of wins.
Google has operated a data center in Caldwell County
since 2008 and is in the process of expanding it. Apple began
operating a center in Catawba County last spring, while
Facebook plans to complete its center in Rutherford County
next year. In addition, a subsidiary of the Walt Disney Co. is
eyeing Cleveland County, where Wipro Technologies, a
provider of IT services and infrastructure, is already converting part of a boat manufacturing plant into a data center.
Would these recruitment efforts stand up to a formal
cost-benefit analysis? It depends on how you define
“benefits.” Data centers don’t generate enough employment
to make up for the thousands of manufacturing jobs that
have been eliminated through automation or relocated
overseas.
But for western North Carolina, some new jobs are better than no jobs in the face of double-digit unemployment
that has persisted for more than two years. Data centers also
generate significant tax revenue without requiring a lot of
additional services like roads and schools. Finally, they hold
the promise of generating new economic activity in the
future.
“We feel like we still have to recruit companies that make
stuff,” notes Kristin Fletcher, executive vice president of
economic development for the Cleveland County Economic
Development Partnership. “But data centers give our rural
county a foothold in the technology world. We’re looking for
diversification and stability.”


The Demand for Virtual Real Estate
Data centers have been around in various forms since at
least the 1960s. Companies have created these centralized
locations to house a variety of computer equipment,
from servers hosting enterprise-level applications and websites to routers and other networking equipment to data
storage and backup facilities.
The demand for this virtual real estate has steadily
increased over the years, driving the need for ever-larger
data centers located far from the urban canyons of corporate
America. For one thing, there are technical advantages and
cost savings from consolidating smaller centers. “It’s the
economies of scale of having a certain amount of server
capacity under one roof,” explains John Lenio, an economist
and managing director of the Economic Incentives Group at
CB Richard Ellis, a real estate services firm.
Secondly, says Lenio, companies want their data centers
located away from their headquarters in case of an emergency. Finally, taxes are higher in the big cities and mature
suburbs where companies have usually been based.
Technical factors are also driving the growth in the size and
number of data centers. Like a river harnessed by dams and
channels, torrents of data that have to be manipulated and stored constantly flow through our economy. Every picture
posted on Facebook, every download of a song from iTunes
adds to the torrent.
“The more people that have Internet access, the more
people start using services” that require additional computing power, notes Peter Marin, president of T5 Partners,
which has been developing data centers in western North
Carolina since 2008. Then there’s the emergence of cloud
computing, where data processing and storage are taken out
of PCs in people’s homes and offices and placed in a more
efficient environment. “All a data center does is provide
power, space, and cooling on a larger scale.”

Go West
To the east, the Research Triangle has been where data centers traditionally clustered in North Carolina. IBM, Fidelity
Investments, and other companies have taken advantage of
the region’s nexus of IT professionals and telecommunications infrastructure.
What does the western part of the Tar Heel State offer
for Google, Apple, and Facebook? Often, it is the same
infrastructure that has supported manufacturers for more
than a century.
At the top of the list is access to abundant, reliable, and
relatively inexpensive electricity to run a large data center’s
computers and keep them cool. “The electrical power [sold
by Duke Energy in western North Carolina] is 4.3 cents per
kilowatt hour, which is one of the lowest rates in the
country,” says Marin. More than half of that power comes
from nuclear plants, which are stable sources of electricity
from a pricing perspective. Also, Duke has a history of serving the power needs of textile and furniture manufacturers.
“Those industries have left the region and left significant
capacity on Duke’s system.”

Server Farm Roundup
North Carolina lawmakers got the ball rolling when they approved the construction of a state-run data center in western North Carolina in 2008. Private industry
quickly followed suit, taking advantage of the region’s power, water, and telecommunications infrastructure.

Owner | Location | Size (sq. ft.) | Projected Capital Investment | Maximum Projected Employment | Past Use | Completion Date
NC Government | Forest City, Rutherford County | 50,000 | $25 million | 50 | None (new construction) | 2008
Google | Lenoir, Caldwell County | 100,000 | $600 million | 210* | Lumberyard operated by Bernhardt Furniture, single-family homes | 2008
Apple | Maiden, Catawba County | 500,000 | $1 billion | 50 | None (new construction) | 2011
Facebook | Forest City, Rutherford County | 390,000 | $450 million | 45 | Textile mill operated by Burlington Industries | 2012**
Wipro Technologies | Kings Mountain, Cleveland County | 215,000 | $75 million | 17 | Boat manufacturing facility operated by Chris-Craft | N/A

* Includes contract workers
** Estimated
SOURCES: News reports and information supplied by Google and Facebook

There is also lots of water that mills no longer use. At
many data centers, the excess heat coming from computers
is absorbed by water, which is taken to a tower where
air quickly cools it off. Also, some of the water evaporates
and some is drained to remove sediment before circulating
back into the building. Both cause large losses of water. As a
result, large data centers typically use between 500,000 and
a million gallons of water a day, according to Lenio. To put
that number in perspective, a knit-fabric textile plant operated by Hanesbrands in Forest City used 3 million gallons of
water daily before it closed in 2008.
Finally, there is access to long-haul, high-speed communications lines. Some of them have been built by state and
local government, while others are operated by telecom
providers like AT&T and Verizon.
Western North Carolina has environmental factors in its
favor as well. The region tends to have a mild climate and a
low risk of natural disasters. It also has lots of land that’s
undeveloped or can be redeveloped.
On top of these factors, local economic development
officials are aggressive recruiters. To assemble the 200-plus
acres required for Google’s data center, for example, the
Caldwell County Economic Development Commission
(EDC) acquired dozens of homesites in Lenoir that were
next to a former lumberyard and an undeveloped parcel
owned by Duke Energy.
John Howard, the EDC’s former executive director,
recalls how he dealt with a railroad that passed right through
the site being developed for Google. “We had to negotiate
with the owner of the railroad and the manager to stop the
tracks prior to the site and create an off-load station.”
(Google did pay $3 million to help fund the railroad’s reconfiguration.)
Data center projects also benefit from a slew of tax incentives. For example, Caldwell County gave Google a break on
100 percent of its business property taxes and 80 percent of
its real property taxes for 30 years. The city of Lenoir, where
the center is located, offered a similar deal.
Furthermore, the state enacted a sales-tax exemption in
2006 for purchases of electricity and business property by

“Internet data centers.” These are defined as facilities
located in economically distressed counties and operated by
software publishing and Internet services firms that have
invested at least $250 million over five years.
Such incentives are important — data centers are very
capital-intensive, so companies care a lot about the tax bills
they’ll have to pay. There’s the sales and use tax on all of the
computers, electrical equipment, fire suppression systems,
and cooling systems that a data center needs. Then, business
property taxes are due on all of that equipment on an
ongoing basis. Finally, there are the real property taxes paid
on the land and buildings.

You’ve Got to Give to Get
Lenio hopes that communities don’t give up all of their
potential tax revenue to recruit a data center. But he can see
why they might do so. “There will be some counties that may
not see good economic development projects every day
[and] might be willing to use the big data center as a loss
leader to build a cluster,” he notes. “For some policymakers,
they’d rather have something than nothing.”
What is that “something?” In the short term, the construction of a large data center can provide an economic
boost, if the required expertise is available locally.
“It’s very specialized construction,” says Robert
McFarlane, who designs data centers for New York-based
Shen Milsom & Wilke. “The equipment that is used to
power and cool these facilities have to be built the right way,
and the majority of trades people are not used to building
them.” Further, “they might use a local architect, but probably not a local engineering firm. Only a few engineers in the
country know how to do these designs.”
As of October 2011, more than 1,500 people have helped
build Facebook’s data center in Rutherford County.
“Most of them live here; they eat at the restaurants and buy
food and clothes here,” says Thomas Johnson, executive
director of the county’s Economic Development
Commission. Once a month, a local restaurant caters lunch
for the construction crew.
In the long term, however, the employment impact is
minimal. None of the companies with data centers will
say exactly how many people they employ full time, but
estimates range from 45 for Facebook and 50 for Apple to
210 for Google. The higher estimate for Google likely
includes security personnel and service technicians who
work on a contract basis.
While the number of jobs is small, they include a few
well-paying positions such as software engineers and operations managers. Again, the challenge is whether local
workers, many of whom were laid off from their factory jobs
years ago, have the requisite skills.
“If a person is a computer jockey at home, it’s very possible they could learn what they need to know on the job and
fill many of the positions at a server farm,” says McFarlane.
That’s because all the servers in a data center are identical.
“When you learn to reload [the software on] one of them or
replace it, you know what to do.” Still, “that doesn’t mean
you can train a monkey to do it. You’ve got to have people
who are very agile with computers because they are going to
be expected to react quickly.”
Economic development officials in western North Carolina are well aware that a data center isn’t a manufacturing plant and won’t single-handedly make up for the job losses in the region’s manufacturing sector. But they see other benefits for their communities in the long term besides job creation.
Despite the generous tax breaks provided to data centers, county governments stand to receive a significant inflow of tax revenue from property that had been sitting vacant. According to Johnson, Facebook will probably become the fifth- or sixth-largest taxpayer in Rutherford County, paying about $109,000 annually for the next 10 years. Google was expected to yield $130,000 to $160,000 in tax revenues annually for Lenoir and Caldwell County.
For Kristin Fletcher in Cleveland County, and her competitors in neighboring counties, the huge amount of capital invested in data centers makes them worth the pursuit.
“It’s a positive contribution to our tax base, which is going to allow us to progress in so many other ways — building schools, stabilizing our local government financially,” says Fletcher. “It is a fairly easy target with a fairly large, immediate return on investment.” RF

UPFRONT

continued from page 5

scrubbers and pollution controls over the past few years, and is likely to spend $5 billion to $6 billion more over the next decade, according to spokeswoman Erin Culbert of Duke, now the nation’s sixth-largest electricity provider.
(If Duke merges with Raleigh, N.C.-based Progress Energy, as planned, the combined firm will be the nation’s biggest utility.)
“We are planning retirements of coal-fired plants between now and 2015 that will total around 3,200 megawatts,” Culbert says. That’s the amount of electricity required to power about 2.56 million homes. Affected plants are in the Midwest and the Carolinas. The plant closings are not only in response to Duke’s modernization of its 50-year-old coal fleet but also to future emissions cuts. “With anticipation of multiple environmental regulations, we do see an upward pressure on rates nationally.”
According to American Electric Power, based in Columbus, Ohio, costs will range from $6 billion to $8 billion through the end of this decade. The company plans to close three plants in West Virginia and one in Virginia, among others; 600 jobs overall may be lost.
The new rule limits market-based emissions trading. This ability to buy and sell pollution allowances gave power plants flexibility to meet emissions standards in the past. (Older, dirtier plants could buy allowances from newer, cleaner plants.) CSAPR allows no carryover of SO2 or NOx banked emissions from previous programs. EPA says the large number of old allowances would have made it more likely for states to exceed levels and for
power plants to incur penalties.
CSAPR establishes four new trading programs, two of
which are for SO2. One applies to states that require deep
cuts and one to states needing fewer emissions cuts.
Another program was set up for annual NOx emissions, and
still another for ozone-season (summer) NOx. The new rule
allows intrastate trading of pollution allowances, along with
limited interstate transactions among certain groups of
plants. A strict emissions cap in each state is designed to
prevent pollution “hotspots.”
Environmental benefits may be hard to quantify.
Existing cost-benefit studies of the rule and its predecessor
rule find, however, that benefits outweigh costs, often by a
wide margin, write Richard Schmalensee and Robert Stavins
in a March 2011 paper analyzing the pollution transport rule.
The authors are economists at the Massachusetts Institute
of Technology and Harvard University, respectively.
Estimates of benefits vary across studies, from a low of
$20 billion annually to a high of $310 billion, with most of
the variation coming from assumptions about the value of a
statistical life, estimated by the EPA at about $7.3 million in
2006 dollars. Annual costs to utilities are estimated at
$2.4 billion, including capital investments already in
progress under the old rule.
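As a rough illustration of how the value-of-a-statistical-life assumption drives such benefit ranges, the sketch below multiplies a hypothetical count of avoided premature deaths by two different VSL assumptions; only the $7.3 million EPA figure and the $2.4 billion annual cost come from the studies cited above.

```python
# Stylized cost-benefit arithmetic for an air-quality rule.
# The avoided-deaths count and the low VSL are hypothetical; the $7.3 million
# EPA VSL (2006 dollars) and the $2.4 billion annual cost are from the text.
avoided_premature_deaths = 13_000   # hypothetical annual estimate
vsl_epa = 7.3e6                     # EPA value of a statistical life
vsl_low = 1.5e6                     # hypothetical lower assumption

annual_cost = 2.4e9                 # estimated annual cost to utilities

for label, vsl in [("EPA VSL", vsl_epa), ("low VSL", vsl_low)]:
    benefits = avoided_premature_deaths * vsl
    print(f"{label}: benefits about ${benefits / 1e9:.0f} billion "
          f"vs. costs ${annual_cost / 1e9:.1f} billion")
```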
The cross-state rule is one of several proposed that will
affect the electric power sector. Others include regulations
of greenhouse gas emissions, mercury and other hazardous
air pollutants, cooling water intake structures, and fly-ash
disposal at combustion sites.
— BETTY JOYCE NASH

Greater sensitivity to business cycles has made state tax revenues more difficult to predict
BY KARL RHODES

As the economy receded in March 2008, Barry
Boardman was putting together his forecast of
North Carolina tax revenues for fiscal year
2008-09. He knew that housing prices were declining
sharply in sunshine states like California, Arizona, and
Florida, but things didn’t seem so bad in North Carolina.
The housing boom there had been driven mostly by solid
population growth — not wild speculation.
Boardman, the state’s senior fiscal analyst, predicted a
mild recession in North Carolina, but by October 2008, tax
revenues were down 3 percent, then 9 percent, then 15 percent by the end of the fiscal year. North Carolina’s exposure
to the housing bubble may have been minimal, but its exposure to the ensuing contraction was substantial. “We weren’t
tying that together back in March of 2008,” he says.
Forecasting state tax revenues is tricky, especially when
the economy veers into a deep and prolonged recession, but
state revenue forecasting errors have become increasingly
large and pervasive during the past three recessions, according to a report by the Nelson A. Rockefeller Institute of
Government at the State University of New York at Albany
and the Pew Center on the States. The study, “States’
Revenue Estimating: Cracks in the Crystal Ball,” analyzed
states’ ability to forecast their tax revenues from 1987
through 2009.
“Errors in revenue estimates have worsened progressively during the fiscal crises that have followed the past three
economic downturns,” according to the report. “During the
1990-92 revenue crisis, 25 percent of all state forecasts fell
short by 5 percent or more. During the 2001-03 revenue
downturn, 45 percent of all state forecasts were off by 5 percent or more. And in 2009, fully 70 percent of all forecasts
overestimated revenues by 5 percent or more.”
Accurate revenue forecasts are important. They help
states plan ahead, carefully consider the merits of individual
budget decisions, and avoid massive across-the-board cuts
like those that became necessary in 2009.
“It’s been a constant challenge,” Maryland Gov. Martin
O’Malley told the Pew researchers. “No sooner do you make
$200 million in tough and painful cuts than the guys in green
eyeshades come into your office and tell you that revenues
have eroded further and you need to find another couple
hundred. It’s like trying to keep your nose above the waves
while the riptide is pulling you under.”
Revenue forecasters throughout the Fifth District tell
similar stories. “The fall forecast of 2008 was the start of the
downward revisions that continued until February of this
year,” says Norton Francis, director of revenue estimation


for Washington, D.C. “We became persona non grata,
because every time we came around, we had bad news.”

Revenue Forecasting 101
Revenue forecasters in Fifth District states and D.C. start
with national forecasts purchased from IHS Global Insight
and/or Moody’s Analytics. They adapt the national projections to the unique economies within their states. Then they
feed the data into estimating models that predict various
categories of revenue based on how their states’ tax structures capture portions of economic activity.
In the Fifth District, the most popular tools of the trade
include simple trend analysis (projections based on the trajectory of past performance such as revenue from court fines
and fees); time-series modeling (projections based on
sequential data that reveal underlying factors such as
seasonal differences in employment); and linear regression
modeling (projections based on mathematical correlations
between different types of data such as the relationship
between personal income and sales tax revenue).
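As a stylized example of the last of these approaches, the short Python sketch below fits sales tax receipts to personal income with ordinary least squares and projects receipts from an income forecast. The dollar figures and the linear form are hypothetical; no state’s actual model is this simple.

```python
# Illustrative only: a toy linear-regression revenue model of the kind
# described above. The dollar figures are hypothetical, not actual state data.
import numpy as np

# Hypothetical history: state personal income and sales tax receipts ($ billions)
personal_income = np.array([310.0, 322.0, 335.0, 348.0, 360.0])
sales_tax_revenue = np.array([4.65, 4.83, 5.01, 5.22, 5.40])

# Fit revenue = a + b * income by ordinary least squares
b, a = np.polyfit(personal_income, sales_tax_revenue, 1)

# Project next year's receipts from a (hypothetical) forecast of income
income_forecast = 371.0
revenue_forecast = a + b * income_forecast
print(f"Slope: {b:.4f} dollars of revenue per dollar of income")
print(f"Projected sales tax revenue: ${revenue_forecast:.2f} billion")
```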
In addition to these statistical tools, all of the Fifth
District jurisdictions, except D.C. and West Virginia, use
some form of consensus forecasting, which brings officials
from both the legislative and executive branches into the
process, often joined by advisory groups of business leaders
and external economists.
The Rockefeller/Pew study analyzed the effectiveness of
each of these approaches and found that none of the methods was “significantly linked to the size of the errors.” Other
research indicates that combining multiple forecasts can
lead to somewhat greater accuracy. North Carolina, for
example, now considers two forecasts, one developed by the
legislature’s analysts and another prepared separately by the
Office of State Budget and Management.
“We get together and kind of haggle back and forth,”
Boardman says. “It’s an informal process. Then the governor
and the legislature pick numbers — usually the consensus
number.”
Technological advancements also can improve revenue
estimation. Forecasters in West Virginia, for example, have
benefitted greatly from a new integrated tax information
system that replaced a mainframe system that was installed
in 1972 and never upgraded significantly. The new system,
which became fully operational in 2009, allows forecasters
to quickly access and analyze tax data in ways that were previously impractical or impossible.
“We were in the dark for many years,” says Mark
Muchow, West Virginia’s deputy secretary of revenue.


Degree of Difficulty
The tools are getting better, but the task is getting harder.
State revenues have become more sensitive to economic
swings, according to Richard Mattoon and Leslie
McGranahan, economists at the Federal Reserve Bank of
Chicago. In a 2008 working paper, they conclude that from
1998 to 2007, state revenues were more sensitive to economic conditions than they were during the preceding two
decades.
“While a one percentage point change in economic conditions led to a 0.9 percentage point change in income tax
revenues prior to 1998, it corresponds to a 1.6 percentage
point change during the 1998-2007 period,” they write. This
trend appeared in 36 of the 43 states that collect income
taxes, and it was statistically significant in 10 states, including North Carolina and Virginia. The authors attributed
nearly all of this heightened sensitivity to states’ growing
exposure to increasingly volatile capital gains revenue.
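A back-of-the-envelope calculation shows what that higher sensitivity implies. The 3-percentage-point downturn and the $10 billion collection base below are hypothetical; only the 0.9 and 1.6 sensitivities come from the study.

```python
# Back-of-the-envelope use of the sensitivities quoted above. The downturn
# size and the $10 billion collection base are hypothetical scenarios.
sensitivity_pre_1998 = 0.9    # pp change in income tax revenue per pp change in economy
sensitivity_1998_2007 = 1.6

downturn_pp = 3.0             # hypothetical 3-percentage-point deterioration
income_tax_base = 10e9        # hypothetical annual income tax collections, dollars

for label, s in [("pre-1998", sensitivity_pre_1998), ("1998-2007", sensitivity_1998_2007)]:
    swing_pp = s * downturn_pp
    swing_dollars = income_tax_base * swing_pp / 100
    print(f"{label}: revenue swings {swing_pp:.1f} pp, "
          f"about ${swing_dollars / 1e9:.2f} billion")
```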
From 1970 to 2000, capital income — including capital
gains — was “more than five times more volatile than wages
and salaries or consumption,” according to a 2003 article in
State Tax Notes by David Sjoquist and Sally Wallace, economists at Georgia State University.
Their analysis of Internal Revenue Service data also
shows that capital gains have become a much larger component of state income tax proceeds. Capital gains, as a
percentage of federal adjusted gross income, increased in
every state and D.C. from 1990 to 2000. In the Fifth
District, capital gains were up 211 percent in Maryland,
171 percent in D.C., 170 percent in Virginia, 152 percent in
North Carolina, 150 percent in South Carolina, and 124 percent in West Virginia.
“About 20 to 25 percent of our general fund — that’s a big
chunk — now comes from business income and capital
gains,” Boardman notes. “Those are the sources of income
that have been shown to swing from anywhere from plus 30
percent to minus 30 percent — even more — in any given
year.” Corporate income tax proceeds always have been difficult to predict, but capital gains volatility is the worrisome
wrinkle that has emerged during the past 12 years — caused
primarily by wild swings in the stock market.
Revenue forecasters often say, “If I could predict the
stock market, I wouldn’t need this job.” But predicting stock
market gyrations is just the first step toward estimating capital gains revenue. Forecasters also have to consider how an
increasingly diverse group of investors might respond to
market performance and to tax policy changes — both real
and anticipated.
The federal capital gains tax rate is expected to remain at
15 percent through 2012, says John Layman, chief economist
and director of revenue forecasting for Virginia’s
Department of Taxation. But then what? Antsy investors
might be thinking: “The bracket is going up. I’m going to
start culling my winnings and know that I am only paying
15 percent,” Layman says. (Most states treat capital gains as
regular income.)


Forecasting capital gains revenue might be the ultimate
challenge, but other growing components of personal
income tax revenues also are difficult to predict. “We have
seen a big shift over the past 20-some-odd years to a lot
fewer corporate taxpayers,” Boardman notes. “With subchapter S (corporations) and LLCs and so forth, a lot of that
income now comes through the personal income tax, which
is making that a far more volatile source of revenue.”

Sales Tax Erosion
Most states have three main sources of revenue: sales tax,
personal income tax, and corporate income tax. Sales tax
revenue is the most predictable category, but sales taxes have
been shrinking in many states as a percent of overall taxes.
Expenditures on services increased from 47.4 percent of
consumption in 1979 to 58.8 percent in 2002, notes William
Fox, an economist at the University of Tennessee, in a
2003 article in State Tax Notes. The corresponding decline in
expenditures on goods relative to expenditures on services
erodes the sales tax base because most goods are taxed by
states while most services are not. In other words, state sales
tax structures are still based on a manufacturing economy,
Boardman says. “For most states, their sales tax bases were
constructed back in the 1930s.”
More recently, technological advancements have chipped
away at the sales tax base, Fox notes. The Internet has been
the primary factor — facilitating tax-free transactions and
blurring the lines between goods and services with downloadable books, music, and software. States, however, are
starting to reclaim this sales tax territory.
“Virginia passed a law a few years ago that said, ‘If you
want to bid on a contract in the Commonwealth, you have to
be a registered sales tax dealer,’” Layman recalls. “Think
about all the computer manufacturers out there that want to
do business with Virginia. That made a big difference.”
Some states have raised their sales tax rates to offset the
shrinking base, but some revenue erosion is self-inflicted.
During the economy’s so-called “Great Moderation” from
the mid-1980s to 2007, many states exempted food and
nonprescription drugs from sales tax. This practice made
many taxpayers happy, but it eliminated two of the most
predictable sources of sales tax revenues.

The Perfect Storm
Sales tax shortfalls were not a major problem for most states
during the revenue downturn of 2002 because consumer
spending remained relatively strong during and following
the 2001 recession. But the shrinking sales tax base has
caused states to become more reliant on the more volatile
personal income tax.
The rise in personal income tax proceeds that occurred in
the 1990s — driven mostly by higher capital gains — more
than offset erosion of sales tax bases. But after the dot-com
crash, states that had become heavily dependent on capital
gains found themselves in a bind.
Mindful of voters’ concerns about taxes, state policymakers may have missed an opportunity to
shore up underlying tax structures. Many
state leaders balanced their budgets by
cutting costs, tapping rainy-day funds, and
securitizing tobacco settlement payments
instead of raising taxes as they had during
previous recessions. These politically attractive alternatives “may have papered over structural
imbalances in the state revenue and expenditure systems,”
according to the Chicago Fed study. “While this one-time
money (reserve balances and tobacco money) could balance
their budgets in the short run, it did not force states to
examine whether their revenue structure was in fact productive enough to meet expenditure demands.”
The 2002 revenue downturn highlighted this imbalance,
but at the time, many experts viewed the severe shortfall as
a dot-com aberration instead of a bellwether event.
The National Governors’ Association called it “the perfect
storm.” Fox called it a “100-year flood,” the worst revenue
disturbance since at least 1970.
If state officials believed that the dot-com crash caused a
perfect storm of revenue-forecasting errors, then it was
reasonable for them to assume that nothing so disruptive
would happen again anytime soon. Certainly, the stock
market floodwaters receded slowly. The S&P 500 Index
declined steadily for three years, but then it resumed a
growth trajectory that was nearly identical to its rate of
increase in the early 1990s. By mid-2007, the S&P 500 Index
again was approaching its all-time high, but it would not stay
there long.
The financial crisis that began to show itself in the
second half of 2007 caused the stock market to fall even
faster and further than it did during and following the
dot-com calamity. Compared to the 2007-09 recession, the
2002 disturbance was more like a thunder shower than a
perfect storm.
“The 2002 revenue crisis gave us the experience to handle
2008 and 2009,” says Layman in Virginia. “But now we had a
financial recession and a housing recession together, and it
was a killer.”
The housing market monster dragged sales tax revenues
down with it as expenditures on home furnishings, fixtures,
and building materials plunged. The unemployment rate
peaked above 10 percent, much higher than the 6.3 percent
mark following the 2001 recession. As housing prices
continued to fall, many consumers ended up living in homes
that were worth less than the balances on their mortgages.
“The level of wealth that you have is a big determinant of
the amount you will spend,” Francis notes. “So losing the
value of your house or losing your house will rein in your
spending a lot.”
As the economy struggles to recover from the recession
of 2007-09, uncertainty still clouds the crystal ball. The
unemployment rate hovers above 9 percent. Nations struggle to pay their bills, and stock market volatility persists.
After bottoming out in March 2009, the S&P 500 doubled

by February 2011. Then it plunged 18.2 percent in late July
and early August. In this environment, it doesn’t appear that
forecasting state revenues is going to get easier.

What Can States Do?
State revenue forecasters will never master the vagaries of
capital gains, but they are improving their models based on
lessons learned from each business cycle.
After the 1990-91 recession, for example, Virginia
replaced its statewide economic model with regional economic models. “The three largest metropolitan areas, and
now the balance of the state, have their own equations to
forecast professional business services, hospitality, education, and health care, because one size doesn’t fit all,”
Layman says. Northern Virginia, in particular, has become
the state’s economic engine during the past four decades
with dramatic wealth creation from government contractors
and information technology services.
On the other side of the Potomac River, D.C. is revisiting
its economic models in light of new data from the most
recent financial crisis and recession. “It’s going to be a challenge,” Francis says. “You don’t want to zero out that whole
period, but you have to make decisions about whether it is
an anomaly or whether it represents a new cycle.”
States could restructure taxes to place more emphasis on
stable sources of revenue, such as the sales tax. Virginia
raised its sales tax to 5 percent in 2004, and Iowa raised its
sales tax to 6 percent in 2008. In addition to taxing goods,
Iowa now taxes 94 types of services, but raising taxes or
adding new taxes would be difficult during a weak recovery.
A more politically appealing strategy might be to smooth
out the benefits of revenue windfalls. In this regard, states
with high exposure to volatile capital gains might learn
something from states with high exposure to highly variable
energy prices.
West Virginia, for example, collected more than $500
million in severance taxes from coal and natural gas companies in its most recent fiscal year — up sharply from about
$200 million in 2003. (Companies pay severance taxes based
on the value of coal or natural gas they extract — or “sever”
— from the state.) Anticipating the inevitable swing the
other way, West Virginia has built one of the largest rainy-day funds (as a percentage of total budget) in the nation.
When the price of coal goes up enough to create a huge
budget surplus, as it did last year, West Virginia puts more
money into the fund.
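A minimal sketch of that kind of smoothing rule appears below, assuming a hypothetical policy that banks collections above a trailing-average baseline and draws on the fund in lean years; the figures and the rule itself are illustrative, not West Virginia’s actual formula.

```python
# Illustrative smoothing rule: bank severance collections above a trailing
# baseline in a rainy-day fund, draw on the fund when collections fall short.
# Figures are hypothetical; this is not West Virginia's actual formula.

severance_by_year = [200, 260, 340, 520, 410, 240]  # $ millions, hypothetical

fund = 0.0
baseline_window = 3
for i, collected in enumerate(severance_by_year):
    window = severance_by_year[max(0, i - baseline_window):i] or [collected]
    baseline = sum(window) / len(window)
    surplus = collected - baseline
    if surplus > 0:
        fund += surplus             # windfall years: deposit the excess
        spendable = baseline
    else:
        draw = min(fund, -surplus)  # lean years: draw down the fund
        fund -= draw
        spendable = collected + draw
    print(f"Year {i}: collected {collected}, spendable {spendable:.0f}, fund {fund:.0f}")
```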
Would a similar approach work for the capital gains
component of personal income tax revenue? Massachusetts,
continued on page 45


Volatility at the Pump
Where do high gas prices come from?
BY BETTY JOYCE NASH

Drivers curse high gas prices. Over the summer of
2010, prices hovered around $2.80 per gallon,
climbed in late autumn, and then spiked in March
2011 as revolutions in the Middle East disrupted supplies.
Gas prices remained above $3.50 for 29 weeks in a row by
late October.
A $10 per barrel increase in crude oil raises pump prices
by about 25 cents per gallon, according to Jeff Lenard of the
National Association of Convenience Stores (NACS). Those
stores retail about 80 percent of the gasoline sold in the
United States.
The extra quarter adds up to $35 billion annually. That
causes people to spend less on other goods or save less when
they need to pay more for gas; the $10 more per barrel not
only further subtracts from GDP, since almost half of U.S. oil
comes from abroad, but diminishes estimated multiplier
effects from spending on domestic goods and services. This
lowers net consumer spending by about the same $35 billion, roughly 0.2 percent of GDP in a $15 trillion economy,
according to economist James Hamilton of the University of
California at San Diego.
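Those figures can be checked with simple arithmetic. The 42-gallon barrel is a standard conversion, and roughly 140 billion gallons is an approximate figure for annual U.S. gasoline consumption around this period; the rest comes from the passage above.

```python
# Rough check of the pass-through and spending figures quoted above.
# 42 gallons per barrel is standard; ~140 billion gallons per year is an
# approximate figure for U.S. gasoline consumption around 2010-2011.
GALLONS_PER_BARREL = 42

crude_increase_per_barrel = 10.0                  # dollars
pass_through = crude_increase_per_barrel / GALLONS_PER_BARREL
print(f"Pass-through: about ${pass_through:.2f} per gallon")          # roughly $0.24

annual_gallons = 140e9                            # approximate U.S. consumption
extra_spending = 0.25 * annual_gallons            # the 'extra quarter' per gallon
print(f"Added spending on gas: about ${extra_spending / 1e9:.0f} billion per year")

gdp = 15e12
print(f"Share of GDP: about {extra_spending / gdp:.1%}")              # roughly 0.2%
```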
Refinery costs, distribution and marketing costs, varying
fuel specifications, and taxes also influence prices, along
with the weak dollar. Other influences: seasonal variations in
consumption, production, imports and inventories, along
with trading speculation. But the biggest determinant by far
is the price of crude oil, roughly 68 percent of the retail price
at current price levels, according to the U.S. Energy
Information Administration (EIA).
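Those shares imply a rough dollar breakdown of a $3.80 gallon. The sketch below applies only the two percentages quoted in the article, crude oil at about 68 percent and refining at about 13 percent; the article does not break out taxes or distribution and marketing, so they are left together as a residual.

# Decompose a $3.80 gallon of regular gas using the shares cited from the EIA.
retail_price = 3.80      # dollars per gallon
crude_share = 0.68       # crude oil's share of the retail price at 2011 price levels
refining_share = 0.13    # refining's approximate share of the per-gallon price

crude = crude_share * retail_price          # about $2.58
refining = refining_share * retail_price    # about $0.49
residual = retail_price - crude - refining  # taxes plus distribution and marketing

print(f"Crude oil: ${crude:.2f}, refining: ${refining:.2f}, "
      f"taxes and distribution/marketing: ${residual:.2f}")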


Supply Matters
The current run-up in oil prices began as world economies
started to emerge from the latest recession. Their energy
needs grew. Between 2003 and 2008, crude prices also escalated because of intensified demand and stagnant
production. That sent prices to $147 per barrel in 2008
before dropping to less than $40 per barrel as the global
recession reduced demand.
Today, in the developing nations of Brazil, India, and
especially China, demand has fueled the price hikes. In
China, energy needs rose by 12 percent in 2010. U.S. demand
rose 1.9 percent over the first six months of 2011 compared
to the same period in 2010. Summer also accelerates demand
as people take more vacation trips.
Risks and reductions create uncertainty about supply,
and that affects prices. Turmoil in Libya and other Mideast
nations has influenced supply, and an April price spike also
reflected events closer to home — Mississippi River flooding and resulting refinery outages.
[Chart: What’s Included in a $3.80 Per-Gallon Price of Regular Gas? SOURCE: U.S. Energy Information Administration]

The United States produced the most oil, worldwide, until the early 1970s when Texas oil production started to
decline. The United States imported 49 percent of its oil in
2010, according to the American Petroleum Institute (API):
25 percent from Canada, 12 percent from Saudi Arabia, and
smaller percentages from Mexico, Venezuela, Algeria,
Nigeria, Iraq, Russia, and elsewhere. Instability in oil-producing nations threatens supply and raises world market
prices. The Arab oil embargo in 1973, the Iranian revolution
in 1978, the Iran-Iraq war in 1980, and the Persian Gulf
conflict in 1990 all boosted U.S. prices.

Refinery to Retail
The nation’s 148 refineries heat crude oil to change it into a
gas, and then condense it into liquid gasoline and other
petroleum products. The process accounts for about 13 percent of the per-gallon gasoline price. Refinery capacity in
January 2011 was at a 29-year high, despite many refineries
having shut down since the 1970s. Existing facilities have
expanded, according to Tim Hogan of the National
Petrochemical and Refiners Association. “In some cases,
[shutdowns occurred] because that facility was not economical and did not want to make investments in new
equipment,” he says.
It’s unclear to what extent refiners will continue to expand, Hogan says, because biofuel mandates have reduced production at refineries. Gasoline-ethanol blends account for more than 90 percent of the gasoline sold in the United States; blends of up to 15 percent ethanol have recently been approved for some later-model vehicles, though that gas is not yet sold at the pump. Increased volumes of ethanol and other renewable fuels displace the need for more refinery capacity. (Ethanol refineries are typically located in the Midwest, near corn producers.)
From the refinery, about 168,000 pipeline-miles send various fuel blends to product depots around the nation. Most
states in the Fifth District use gas piped from the Gulf

Coast, including Texas and Louisiana; in 2010 about 26 percent of the gasoline produced in the United States came
from the Gulf Coast, according to the EIA. Other ingredients may be blended into gasoline at the refinery.
Gas prices at the pump also vary according to “boutique”
fuel formulas that federal regulations require in certain
regions and metro areas. Hot-weather blends reduce the
volatile organic compounds that contribute to smog — the
incremental cost of production is estimated at roughly
5 cents per gallon or less. Fuel requirements vary by location.
The District of Columbia, for example, uses reformulated
gasoline, or RFG. Pipelines carry the different blends, and
computer controls route fuels that may bypass Virginia, for
instance, and go all the way to New Jersey.

At the Pump
It takes about seven to 10 days for an oil price increase to
reach the pump, according to Lenard of the NACS.
Competition for customers determines the timing and
amount. Distance from gas depots, for instance, affects
prices because deliveries cost more with distance. The
markup on gas is about 15 cents per gallon at convenience
stores, and about 3 cents of that is profit to the retailer.
People consider price first when shopping for gas,
according to consumer research published by NACS, with
nearly one-third of respondents in consumer surveys saying
they will reroute trips to save as little as 3 cents per gallon.
That knowledge makes retailers cautious. “Retailers know
they can’t pass on the whole increase,” says Lenard.
Convenience stores use gas to generate traffic inside the
store, he explains, and so they first assess the local competition. If the retailer raises gas prices by 10 cents and the
competition doesn’t, traffic plunges, not just at the pump
but inside the store. “People won’t say, ‘They have a terrible
gas price but I’m still getting a sandwich.’ ”
Pump prices also depend on the gas source and the stores’
supplier contracts. Branded gasoline offers fewer highs and
lows, especially if the store is affiliated with an oil company.
Contracts ensure the branded stores get served first.
Unbranded stores may sell cheaper gas when supplies are plentiful, but may charge more when supplies tighten because they are forced to pay more on the spot market.

[Chart: Oil Prices and Retail Gasoline Prices, 2001-2011, in 2011 dollars per gallon. SOURCE: U.S. Energy Information Administration]
The demand for gasoline responds to price, although not very much. From 2007 to 2010, vehicle miles traveled in the United States stayed roughly constant at around 3 trillion miles.
Demand for gasoline may react less to price today than in
the 1970s, according to Jonathan Hughes, Christopher
Knittel, and Daniel Sperling of the University of California
at Davis. They estimated the average per capita demand
for gasoline in the United States from 1975 to 1980 and
from 2001 to 2006, both periods of similarly high prices.
The results suggest drivers in the later period responded less
to increases. People may depend more on cars for daily
transportation, because of long commutes, today than in the
1970s and 1980s. Drivers also may respond more slowly to
short-term price increases because they now drive more
fuel-efficient cars.
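The responsiveness the researchers estimate is a price elasticity of demand: the percentage change in gasoline consumption for a given percentage change in price. A minimal sketch of the calculation, using hypothetical prices and quantities rather than the study’s data, is below.

import math

# Hypothetical observations (not from the study): price per gallon and gallons per capita per month.
p0, q0 = 3.00, 45.0   # before a price increase
p1, q1 = 3.50, 44.0   # after a price increase

# Log elasticity: percent change in quantity consumed per percent change in price.
elasticity = math.log(q1 / q0) / math.log(p1 / p0)
print(f"Price elasticity of demand: {elasticity:.2f}")   # about -0.15

An estimate close to zero means demand is inelastic: drivers cut consumption only slightly when prices rise, which is consistent with vehicle miles traveled staying near 3 trillion even as prices climbed.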

Elasticity of Supply
As oil prices have escalated to nearly $100 per barrel, so have
drilling and calls for more exploration. While drilling in the
western Gulf of Mexico will continue under new regulatory
safeguards, a year after the BP oil spill, the eastern Gulf and
the Atlantic coast’s outer continental shelf remain under a
drilling moratorium.
In August, the Department of the Interior granted Royal
Dutch Shell conditional approval to drill four exploratory
wells next summer in the Beaufort Sea off the Arctic
National Wildlife Refuge.
North Dakota and Montana are cashing in on oil in an
area known as the Bakken Formation. Recent horizontal drilling and hydraulic fracturing technologies crack the shale and release the oil.
Oil’s high prices have revived prospects for oil leases in an
area 50 miles off Virginia’s coast, despite the moratorium.
No one knows yet how much oil to expect there; the potential for oil in that location is based on old estimates from the
1970s, says Mike Ward of API’s Virginia office. A seismic
investigation can’t happen before a “lease sale,” which allows
energy companies to bid for rights to explore.
“Estimating undiscovered resources in areas with little
previous drilling is as much art as science,” says energy
consultant Michael Lynch of Strategic Energy & Economic
Research Inc.
Even if oil were discovered there today, it could take
years to enter the supply chain. In Prudhoe Bay, Alaska,
production of oil found in 1968 did not begin until 1977.
Exxon’s recent discovery of two reserves in the Gulf of
Mexico must be studied and verified, and permits must be authorized.
Saudi Arabia holds a fifth of the world’s oil reserves and
production capacity. It’s also the world’s largest exporter.
U.S. supply represents only about 2 percent of world supply,
too small to affect global prices very much. But U.S. drilling
has expanded, reversing a 30-year decline. By 2009, proven
oil reserves had gone up about 8.6 percent over 2008 for a total of about 22 billion barrels. RF


INTERVIEW

Derek Neal
Nearly everyone has an opinion on what should be done
to reform the American educational system. Among the
more popular ideas over the last few decades has been to
hold teachers and schools accountable by tracking
student performance through standardized assessment
exams. Such exams were at the center of the No Child
Left Behind Act of 2002. Like many policies, though, it
may have had consequences that were both unintended
and undesirable, argues Derek Neal, an economist at the
University of Chicago. Teachers have incentives to
“teach to the test” rather than impart critical thinking
skills that help students reason through issues. And
school administrators have incentives to issue tests that
help students achieve the scores needed for their
schools to continue to receive full public funding.
Neal, whose work on education issues draws on his
training as a labor economist, has also examined the
factors behind the increase in wage and wealth
inequality in the United States. He argues that trend is
largely due to increasing returns to skill — that is,
higher-skilled workers tend to earn a substantial premium relative to their lower-skilled counterparts.
To help bridge that divide and improve educational
opportunities for people in economically disadvantaged
neighborhoods, particularly those in large cities, Neal
argues that providing school vouchers is an idea worth
trying. Vouchers, he says, may also help to narrow the
skills gap that exists, on average, between black and
white Americans.
Neal earned his Ph.D. in economics at the University
of Virginia. In addition to his appointment in the
Department of Economics at Chicago, he also is a
professor with the university’s Committee on
Education. Neal has served as the co-editor of the
Journal of Human Resources, the editor-in-chief of the
Journal of Labor Economics, and is the current editor
of the Journal of Political Economy. Aaron Steelman
interviewed Neal in August 2011.

RF: What is your view of subject matter assessments
administered to students? Do they, on balance, have the
intended effect of increasing knowledge in certain key
areas and, as a result, students’ human capital?
Neal: I don’t think we have precise answers to those
questions. I do believe there are reasons to suspect that the
entire assessment-based accountability movement has produced rather small gains in terms of the true subject mastery
that students possess, their true command of the curriculum. It is very easy to find evidence that increases in test
scores on a particular high-stakes assessment that are tied to
accountability or performance pay often don’t show up when
the same kids are taking other tests that are supposed to
cover the same topics but happen to be low-stakes tests.
That is not definitive proof that the source of the gains on
the high-stakes tests is entirely coaching or cheating or some
other activity that has little lasting value for the students.
But there’s enough evidence of that flavor in the literature
now that we should be very cautious when advocates of
these programs point to movements in scores of high-stakes
tests as evidence that something important is happening.
There is a paper by Daniel Koretz in the Journal of Human
Resources. In that paper he looks at a Kentucky school
district that had a moderate-stakes assessment program and
he noticed that when they changed test vendors — not the
curriculum, just the company that made the exams — he saw
drops in the scores, and then the scores went back up over
three or four years. So he took the old test and gave it to a
random sample of students, and it turned out that on the old
test, the students who were now doing so well on the new
test did just as poorly as the first year the new test was introduced. So it appears that all of the improvements on the new
tests over a three- or four-year period were improvements in
performance that were completely specific to a particular
type of exam. I don’t believe that is the type of performance
we are interested in when we are evaluating whether our
schools are doing a good job or not.


RF: So this might suggest that
assessment-based programs do
little to help a student improve
his ability to reason his way
through a question or set of questions on a particular subject.


Neal: The big thing you have to
realize is that these tests were not
designed to be used to gather information for accountability systems.
These tests were designed so that you could track the performance of kids over time — for instance, that you could
have a score for a kid in 2005 and a score for a kid in 2008
and make the claim that both scores are well placed on the
same scale, so that a score for a third-grader in math in 2008
is comparable to a score for a third-grader in math in 2005.
The idea was that this would permit us to say meaningful
things about how the distribution of student achievement
for third-graders is changing in a state or a district or a
school over a period of time.
If you want that type of stability and the ability to make
comparisons over time, you are going to need a testing
system where a lot of questions are repeated, the tests follow
a common format, and the tests are very regular in a way that
allows the psychometricians to have links between the exams
to place them all on a common scale. So the very regularity,
the repetition of items, the features that provide the
opportunity to create the constant scale also create the
opportunity for coaching and manipulation and drilling on
answers to specific questions. When you attach stakes to the
exams, there can be a response by teachers that undermines
the integrity of the scale because now it is no longer measuring student aptitude but measuring how well students were
coached for the exam. The same features that make consistent scaling possible over time in theory will guarantee that
the scale becomes corrupted over time in practice if you use
the tests for accountability and performance pay rather than
just as a source for gathering information. So the best way to
understand this is that they are trying to have a twofer —
they are trying to have a set of tests designed for one thing
and use them for something different, and that’s often
problematic.
RF: If assessment-based systems provide an incentive
for teachers to coach students for a test rather than help
them gain mastery over a subject, how do you alter that
incentive?
Neal: The first thing you need to do is have two sets of tests.
You need the current tests and those tests have to be low
stakes, and you have to have commitments that no one will
ever know how the students in one particular school did on
this test, that these scores will be reported on a district level
or above, and they will be for the education department to
track how things are going. And then you need a second set

of tests for accountability and incentives that are designed for those
purposes. Those tests would look
very different. They would never
repeat questions; they wouldn’t have
fixed formats; they would have a lot
of essay and short answer questions;
and they would not be multiple-choice tests, where there are
optimal strategies for when you
guess and when you don’t guess and
opportunities for people to coach students on test-taking
strategies. It would be a process of developing an entirely
new type of exam that would not be predictable and would
have the property that the best thing teachers could do to
raise the scores of their students would be to teach them in
ways that build a deep mastery of the curriculum.
The way that mechanism design works in economics is
you figure out what you want people to do and then you
build a system so that if they just try to maximize their own
take-home pay or maximize their take-home pay net of
effort — maximize their own well-being — then in response
to the system you designed, they will do what you want them
to do. So if we want teachers to teach well and build subject
mastery in students, we need to design a system such that
the best response of the teachers to the system is to teach
that way.
RF: On that second test, even if you give short-answer
rather than multiple-choice questions, there is still
opportunity and incentive for administrators to grade
those answers favorably. It seems that you need a third
party to grade the exams.
Neal: You need third-party everything. You need third-party
administration. You need third-party development of the
grading rubrics. The thing that is silly about No Child Left
Behind is that the impetus for it was the allegation that
there were local school districts wasting the money the federal government, and in some cases state governments, were
giving them, so we needed a system for making sure that
people were accountable, especially for the federal money
they received. Now, if the whole problem is that the states
aren’t holding the local school districts accountable for their
performance, then why in the world does it make sense to
develop a system where the states make up the exams, the
states define what proficient means, the states let individual
teachers administer the exams to their own students, and
then the state education office scores the exams and decides
what score is needed each year to meet the proficiency standard? It’s very much analogous to me telling my kids that
they have to clean their rooms but letting them define the
standards of cleanliness and then letting them inspect their
rooms to determine if they have met their own standards.
There are many things about the way No Child Left
Behind was implemented that just make you scratch your


head in terms of whether this was ever a serious effort to
build an accountability system that could work. People roll
their eyes and scream and yell about the recent events in
Atlanta — where teachers got together, sometimes with
their principals, and changed the answers on exams before
they turned them in to be graded. But if you read the literature, it’s not a new story.
People in the private sector do not have situations where
their supervisors say, “You come up with your own evaluation form, fill it out, and turn it in to HR yourself.” Any
system that is going to be worth having is going to be one
that is designed to induce the behavior we want and to eliminate obvious opportunities for corruption.
RF: While many of the reforms you have discussed seem
relatively straightforward, it seems that it may be useful
to have economists who understand a little about mechanism design sitting on the committees that determine
how the programs are structured. Is that the case?
Neal: If you look at all the stuff I have been doing recently,
that idea has been the thrust of much of my research. One
way to understand why performance-pay and accountability
systems have been less than successful in public education is
that these are human resource policies that are typically
designed by people in education or maybe public policy
schools who have no background in the design of incentive
systems, who never took a class in the design of contracts,
who never took a class in personnel economics. And so far
what has happened, I believe, is that economists have gone
and done empirical work to show that poorly designed
incentive systems have had less than desirable outcomes
that were completely predictable if you analyzed the systems
from the outset. That’s been valuable but it’s time for economists to do more than that. So a big theme of my writing
of the last year or two is that people who work in the economics of education need to do more than just come up with
sophisticated methods for evaluating poorly designed programs. Economists need to start weighing in on how
programs should be designed, the same way that economists
weigh in on the design of regulations, the design of environmental policies, the design of government auctions, the
design of the tax code. They need to be involved from the
outset in building models that tell us how accountability
policies in education should be designed.
RF: What does your research on Catholic high
schools tell us about their performance? And are
there certain characteristics that are particularly
important to the results they achieve in some cases?
Neal: The second question is for someone who knows how
to run a school. As an economist, I can say that it appears
that the greatest gain from getting access to a Catholic
school is for economically disadvantaged kids who have bad
options in the public system. If you have a great public


school you can attend, it’s not a big deal. If you live in an
urban area and you have a bright kid and you can get him
into a really good magnet school, who cares whether you
have vouchers, who cares whether your kid won a scholarship to a private school. You have made sure that the public
system gave you yours. I think the one thing that is most
clear from the public-private schooling literature is that if
there is a group who benefits greatly from having the private
sector more involved in providing alternatives for them, it’s
politically and economically disadvantaged people who live
in large cities where there is a large monopoly school system.
I believe what the literature shows is that Catholic
schools, as a rule, are not super schools. If you go to the
northern suburbs of Chicago, you are likely to find that both
the local Catholic school and the public school offer good,
comparable educations. But if you go to the inner-city neighborhoods of Chicago, you will find some Catholic schools
where the students are doing quite well but the public school
down the street is just a disaster. It’s not that Catholic
schools are better than all public schools. It’s that in certain
settings, Catholic schools are a lot better than their public
school neighbors.
RF: Much has been made about the importance of early
childhood education in building human capital over the
course of a student’s educational career and then later in
life. What is your view on this issue?
Neal: I am not an expert on this issue, but I am predisposed
to believe that there is something to it, that if you improve
children’s health and emotional well-being and cognitive
development early in life, you give them things they can
build on for the rest of their lives. That logic of making
investments that can grow over time is quite compelling.
The details of how you do it and how you could be confident
that it would pass a cost-benefit test, that’s something for
others who know more about the issue to determine.
RF: How large is the black-white skill gap and how
has it changed over time? And what might be done to
narrow that gap?
Neal: The gap is smaller than it used to be in terms of basic
reading and math skills. But it’s still quite large and it’s still
quite important as a determinant of overall labor market
inequality between blacks and whites. Jim Heckman and
others have pointed out in recent years that there are gaps in
noncognitive skills — persistence, work habits, personality
traits — that are also important. I think that it is fairly
obvious that strong basic reading and math skills are more
essential requirements for labor market success in many
different areas of the labor market than they were 50 years
ago and, therefore, even though the black-white skill gap
may be smaller than it was 50 years ago, the gap that remains
may be more important.
As for narrowing the gap, I think one of the things we should try — and I am not promising that it would necessarily work — is to give economically disadvantaged families who live in disadvantaged areas access to something like education vouchers that would allow the United Way or the Catholic Church or the local Edison Schools company — whoever it might be — to move into the neighborhood, open up a brand-new school, and compete for the public funding that has been allocated to these students. I think there’s very little evidence that people who are wealthy or upper-middle class benefit greatly from expanded access to private schooling options, because they are usually politically powerful enough and geographically mobile enough to make sure that they get good services for their children, either by living in a good school district or by sending them to a good private school. The place where we have the most compelling evidence that there would be significant benefit from enhancing private alternatives is with disadvantaged minority populations, especially in large cities. If you have neighborhoods where the potholes aren’t always fixed, and the police and ambulances don’t always come when you call, and the trash isn’t picked up regularly because the people living in the community are poor and disenfranchised and do not have a lot of political clout in the city at large, it should not be a great surprise that those same individuals do not receive great public schooling.

Derek Neal
➤ Present Position
Professor, Department of Economics and the Committee on Education, University of Chicago
➤ Previous Faculty Appointment
University of Chicago (1991-1998 and 2001-present); University of Wisconsin (1998-2001)
➤ Education
B.S. (1985), Shorter College; M.A. (1987) and Ph.D. (1992), University of Virginia
➤ Selected Publications
Author or co-author of papers in such journals as the American Economic Review, Journal of Political Economy, Journal of Economic Perspectives, Review of Economics and Statistics, Journal of Labor Economics, and Journal of Human Resources

RF: Opinion polls suggest that vouchers are, in fact, relatively popular with lower-income people and have been for some time, yet there has been little progress on that issue. How do those families gain the type of political support necessary to implement such programs?

Neal: I don’t know. Every time someone says it will never change, though, I always think about the time I said that 15 or 20 years ago at lunch and Gary Becker said, “When I was your age, they said we would never deregulate the airlines or trucking.” So I don’t understand how these things change or why these things change when they do, but we do have historical cases where there was an entrenched group of special interests that either had government monopolies in terms of providing some good or service or had a regulatory environment that stifled competition, cases that were clearly wasteful and went on for some time, but at some point policy changed and we got rid of them. Does that mean it will happen here? I don’t know. But I do think it’s hasty to adopt a this-will-never-change attitude simply because the political actors involved are very powerful. Which will we see first: real school choice for disadvantaged families in large cities or the elimination of wasteful farm subsidies? I don’t know, but both may happen.

RF: Broadly measured, what has been the trajectory of returns to skill over, say, the post-war period — and has that changed recently?

Neal: It depends on where you look in the skill distribution. The return to a college degree has been very significant for a long time, but it has not grown in the last 10 years. The return to graduate and professional degrees has grown during that period. And you have more people going on to graduate and professional education at the same time that you do not have more people graduating from high school, and you might actually have fewer if you count things correctly. So I think what we are seeing is a great polarization in terms of the skills and capacities that people have and also the lifetime earnings that people can expect given their skills. We have a growing number of people who are becoming very well educated and highly trained by historical standards and another group that is poorly educated even by the standards of several decades ago.

RF: We have seen a lot of stories in the popular press about the growing amount of debt that many college students are incurring and whether that is a wise decision from a simple pecuniary standpoint. What is your view?

Neal: I think that’s mostly silliness. The vast majority of people have an option, or at least people who live in urban areas have an option, of some state university where if they go and pay in-state tuition, and they work hard, and they get a degree that is marketable, the difference in what they will make over their lifetime as opposed to what they could have made if they went to work at, say, a retail store out of high school and tried to work their way up the ladder with no additional formal education is very large. The return on the time and money they spent on the college education is really impressive.

RF: There have also been some claims that a growing number of people are now going to college who are simply not well suited for it.

Neal: People aren’t born as college material or not college material. There is a whole sequence that happens in terms of


how their parents interact with them, how their teachers
interact with them, and how their parents interact with the
schools that determine whether they will have the cognitive
skills, work habits, and emotional stability to function well
in college. I think the real question is, given that the returns
to college have remained so high for so long, why has there
been such a tepid response in terms of the number of young
men — and it is more true among males than females — who
are being shaped and urged to become prepared to succeed
in college?
RF: Why do you think the unemployment rate,
especially the long-term unemployment rate, has
remained so persistently high following the recession?
Neal: I don’t have any favored theories that I would offer as
explanations for large components of what we have seen. I
do believe one reason that unemployment remains at 9 percent or more is that we have extended people’s eligibility for
unemployment insurance benefits in ways that were never
even dreamed possible decades ago. And I think we
have fairly clear evidence from many different states in this
country and many different countries around the world that
when people’s benefits exhaust, they look much harder for a
job and they become less picky about the jobs they are willing to take. I am not claiming that this is a huge portion of
why unemployment is 9 percent rather than 6 percent, but I find
it inconceivable that the policies we have adopted with
respect to unemployment insurance haven’t played at least
some modest role in keeping unemployment high.
RF: What are the big unanswered — or understudied —
questions in labor economics, in your view?
Neal: I think the biggest question is one we have already
talked about. The returns to formal education have been
very high in the United States for a long time — at least from
the 1990s through the present — but there have been very
small changes in the number of males, in particular, who
graduate from high school and finish college. So the question of why we live in a world where skills appear to be so
valuable in terms of lifetime income and we still have
roughly the same high school graduation rate among men
that we had 30 or more years ago and college graduation
rates among those who have graduated high school that have
trickled up only a little bit is really puzzling. Why is that
happening? Why aren’t people responding to market signals
that skills are really valuable and, as a result, acquiring more
skills? A related question is: Why do the girls appear to get
it? Unlike with males, there have been noteworthy
changes in terms of educational attainment and skill
development among young women over the same time
period. I don’t think we have answers to those questions,
but I think they are key to understanding why we see
the type of inequality that exists in the United States and
what we can expect in the future.


RF: As the editor of the Journal of Political Economy,
how would you assess the overall health of the publication process in economics? Are there things that could
be done to improve its efficiency and more generally the
dissemination of research?
Neal: This is a completely organic market. We see new
journals start all the time and we see old journals fold; we see
some journals that make you pay submission fees and we see
some journals that don’t; we see some journals that have very
high standards and publish one out of 15 papers submitted
and other journals that publish one out of three papers
submitted.
So, overall, I think this organic publication process with
no central governing body works pretty well. I believe there
are very few papers worth reading that aren’t published
somewhere. If there is any inefficiency on that dimension it
may be that papers are often published years after they
should be because some editors allow the perfect to become
the enemy of the good and waste months and sometimes
years on revisions that have marginal value. So it may be the
case that there are publication delays due to socially inefficient editorial behavior. But there are so many journals and
so many different outlets that I find it very hard to believe
there are good papers out there that don’t see the light
of day.
RF: Which economists have been most influential in
shaping your research agenda and your thinking about
economic policy issues?
Neal: I would say that the most important person who I
ever had the privilege to interact with on a daily basis was
Sherwin Rosen, who was a senior labor economist at
Chicago when I was an assistant professor. He was both a
mentor and a dear friend and was very willing and eager to
sit and discuss ideas and help me learn about so many
different areas of economics that I hadn’t been exposed to.
He also took the time to give me a pat on the back when
I needed it and to give me a kick in the rear when I needed
that.
I also learned a great deal from Bill Johnson and Steve
Stern, my thesis committee chairs at the University of
Virginia. Bill, especially, was very much like Sherwin in wanting to see how labor economics fit within the big picture of
economics generally and how to always be aware of the
opportunities to take ways of analyzing markets that were
maybe more prevalent in other areas of economics and bring
those into labor economics.
Even though I was on the faculty at the University of
Wisconsin for only three years, John Kennan, whose office
was next to mine there, also was in many ways like Sherwin,
in that he knew a great deal about many fields of economics
outside labor and was very willing to help me learn things
that have improved my work and have made me a better economist. RF

AROUND THE FED

The Uncertain Effects of Economic Uncertainty
BY CHARLES GERENA

“How Do Households Respond to Uncertainty Shocks?”
Edward S. Knotek II and Shujaat Khan, Federal Reserve
Bank of Kansas City Economic Review, Second Quarter 2011,
pp. 63-92.

Question marks hang over many things lately — from
the pace of the nation’s recovery to the implementation of health care and financial market reforms. As a result,
many believe businesses and consumers are holding back
on spending decisions that would be costly to reverse.
But do sudden changes in uncertainty actually lead to
significant changes in economic activity? Economists have
tried to factor varying levels of uncertainty into their
models. But, note Edward Knotek II and Shujaat Khan of
the Kansas City Fed, “the results have been mixed thus far,
with some authors finding that fluctuations in uncertainty
are a key factor in the business cycle, while others have
found little such evidence.”
Knotek and Khan tackled this question by looking at
how households respond to changes in two measures of
uncertainty: first, the monthly appearance of the words
“uncertainty” or “uncertain” in New York Times articles about
the economy and second, an index of stock market volatility.
Knotek and Khan found that increases in uncertainty
don’t necessarily lead to sharp pullbacks in household purchases and economic weakness. Rather, reductions in
spending overall seem to be modest and take some time to
occur. “In addition, movements in uncertainty account for
only a small portion of the total fluctuations in household
spending,” the authors write.
They concede that, overall, recessions tend to coincide
with spikes in uncertainty and that expansions tend to coincide with low or declining uncertainty. However, they cite
prominent exceptions to those correlations.


“How Much of the Decline in Unemployment is Due to the
Exhaustion of Unemployment Benefits?” Luojia Hu and
Shani Schechter, Federal Reserve Bank of Chicago, Chicago
Fed Letter No. 288, July 2011.

Although the pace of the recovery in labor markets has
been painfully slow, the national unemployment rate
has been generally falling since its peak in October 2009.
Some of that decline, however, may be due to a rise in the
number of people dropping out of the workforce as they
lose their unemployment insurance (UI) benefits, which
are paid out only to those actively seeking a job.
Luojia Hu and Shani Schechter, two economists at the
Chicago Fed, used data on unemployment duration to identify how many workers were within five weeks of exhausting


their UI benefits. Then the researchers tracked these
so-called “exhausters” over time to see whether they found
work, continued their job search, or gave up and left the
workforce. They compared this group to “nonexhausters” —
longtime unemployed who still had more than five weeks of
eligibility for benefits.
“From 2008 onward, nonexhausters became increasingly
likely to stay unemployed,” Hu and Schechter write in their
paper. Meanwhile, exhausters followed a similar trend until
mid-2009, when they started becoming less likely to stay
unemployed. “This visible split between the two groups’
tendencies to leave unemployment, especially from
September to December 2010, is not so much due to
exhausters being more likely to find a job…but due to
exhausters’ higher likelihood of leaving the labor force.”
“Adopting, Using, and Discarding Paper and Electronic
Payment Instruments: Variation by Age and Race.” Ronald J.
Mann, Federal Reserve Bank of Boston, Public Policy
Discussion Paper No. 11-2, May 2011.

Can’t remember the last time you pulled a dollar bill out of your wallet? You’re probably among the millions of Americans who almost exclusively use cards or electronic forms of payment for their convenience and security.
The Boston Fed decided to find out more about this shift
in behavior by conducting a survey in fall 2008. A recent
paper details some of the survey’s findings, including demographic differences in how Americans have adopted new
types of payment instruments.
For example, older adults (ages 45 years and older)
appear much less likely to use debit cards or online bill
payment, even when controlling for variables like income,
education, and race. Also, a much higher percentage of
blacks than whites use money orders. In general, “blacks are
significantly less likely to use all of the products summarized
here [paper, cards, electronic payments] except for debit
cards,” notes Ronald Mann, an electronic commerce expert
at Columbia University who wrote the paper.
Mann reran his model and controlled for various characteristics of the survey respondents to attempt to account for
this difference. Each time, the model yielded a similar racial
divide — except when he narrowed his focus to blacks and
whites with checking accounts. There, he found no significant differences in adoption rates of payment instruments.
“The analysis sheds relatively little light on precisely
what is causing [many] blacks to shy away from noncash payment instruments,” concludes Mann, “but it does suggest
that it is closely related to whatever is keeping them from
using mainstream institutions like checking accounts.” RF


ECONOMIC HISTORY
Wartime Wilmington
BY RENEE HALTOM

World War II shipyards brought a short-lived economic boom to the North Carolina port city

[Photo: Wilmington residents celebrate the christening of the S.S. Zebulon B. Vance — the first ship launched out of the North Carolina Shipbuilding Company, on Dec. 6, 1941 — with no idea the Pearl Harbor attack that drew the United States into the war would take place the following day.]

Those on the homefront during
World War II remember
exciting times and a thriving
economy, but also the shortage of
some basic necessities. Wilbur Jones
recalls finding a creative solution:
Walking across the street to trade
with German prisoners of war.
Jones was just 5 years old when
Europe went to war in 1939. His hometown is Wilmington, N.C., located on
the Cape Fear River, less than 30 miles
from the mouth of the Atlantic Ocean.
The city hosted 551 POWs at three
camps spread across the city toward
the end of the war. “We couldn’t
always get bubble gum and candy at
our ‘mom and pop’ neighborhood
stores,” due to wartime sugar rations,
he remembers. “The only place to get
it was the German prisoners.” He and
his friends got sweets in exchange
for paper for the prisoners to write
letters home. “You
could just walk up to
the fence.”
In addition to
hosting three of
the nation’s roughly
500 POW camps,
Wilmington was a
hotbed of defense
activity during the
war. The metropolitan area housed bases
for all five branches
of the military,
including 50,000 soldiers at Camp Davis, an Army training
facility. The state’s largest port
shipped materials to allies and imported scarce petroleum from the Gulf of
Mexico and Brazil. The Atlantic Coast
Line Railroad, then headquartered in
Wilmington, transported equipment,
defense workers, and troops. Most
important of all to Wilmington’s economy was a privately run shipyard that
became the largest employer, and the largest defense producer, in the state
that housed more servicemen than any
other in the country.
The North Carolina Shipbuilding
Company (NCSC) opened in 1941 as
one of a handful of shipyards constructed nationally in an emergency
effort to expand the nation’s cargo
shipping fleet for the war. The NCSC
employed an estimated 21,000 people
at its peak, many of whom brought
their families to Wilmington. The
city’s population surged from 34,000
before the war to perhaps more than
100,000, all within a span of two or
three years. The locals coped with the
population explosion, massive construction projects, overcrowding and
food shortages, and even the threat of
enemy attack.

Prophetic Production
“Let me give you a picture of what
Wilmington was like in 1940,” says
Jones, speaking as a resident, historian, and military veteran. “It was the
hub of southeastern North Carolina.
It still is, but the area then was
extremely rural.” The downtown area,
he recalls, “was probably no more than
one half mile by one half mile.
This was where all the financial
institutions, theaters, restaurants,
department stores, doctors, and
dentists were located. Anytime someone needed something, they’d have to
go downtown.”
Wilmington’s small-town institutions were totally unprepared for the
economic boom brought by the war,
but the shipping industry was not.
Congress had the “prophetic foresight” to pass the Merchant Marine
Act in 1936, as described by Admiral
Emory Land, head of the newly established Maritime Commission. The act
authorized a massive shipbuilding program to restore and modernize the
nation’s aging and outdated merchant fleet, comprised of privately owned cargo ships that would
become a naval auxiliary in times of war.
The Maritime Commission’s objective was to build
50 ships per year over 10 years — a lofty goal considering
that the nation produced a grand total of two dry cargo
freighters in the 15 years prior. Nearly all of the nation’s 1,375
merchant ships before World War II were two decades old
and obsolete. New shipyards and technologies were desperately needed.
The increasing war threat upped the ante. By mid-1940,
less than one year into the conflict, Britain — which had by
far the largest fleet in the world — had lost 10 percent of its
shipping capacity, mostly as a result of the devastating
German U-boat submarine campaign. France had already
fallen. Germany controlled the coast of Europe and threatened to strangle Britain’s resources. Without an adequate
ship supply to transport weaponry, equipment, and soldiers
to the front lines, the war was starting to look dismal for the
Allied forces, which the United States would eventually join.
The shipbuilding program was accelerated as the war
threat mounted, and production targets expanded considerably in 1941. Part of the impetus was Congress’ decision that
year to allow President Roosevelt to supply ships to Britain
and other Allied powers under the “Lend-Lease” program,
despite the United States being technically still neutral. The
strategic headway made between the wars helped make possible what war historians view as one of the most remarkable
feats of engineering and production in human history: The
United States built a total of 5,777 cargo vessels under the
Maritime Commission between 1939 and 1945.
An early task was to choose cities to host additional shipyards, which would be privately run with the help of federal
subsidies. The offices of the Maritime Commission were
flooded with letters from politicians and other local
interests throughout the country lobbying to be one of the
chosen locations.
Wilmington offered an ideal site. The Cape Fear River’s
estuary boasted deep water, and the region’s temperate
climate limited the harshness of year-round outdoor production. The Atlantic Coast Line Railroad was headquartered
there, which would allow the 250,000 parts of each ship to
be prefabricated in 250-ton sections offsite, traveling by railcar for final construction at the yard. The shipyard was
located several miles up the Cape Fear River, which allowed
ships to be launched without immediately becoming vulnerable to enemy attack. Perhaps most important was the large,
cheap labor supply that Wilmington offered. Swathes of
unemployed, lower-skilled men were seen as adaptable for
training in various ship-related trades. Without much other
industry around, a Wilmington shipyard would face little
competition for trade laborers compared to the larger cities
that were vying for yards.
The Wilmington yard was announced on Jan. 10, 1941,
and ground broke on February 3. Within three months —
under the shadow of increasing Allied ship losses and likelihood of U.S. involvement in the conflict — enough progress

had been made to lay keels for the first two vessels.
The NCSC’s first vessel, the S.S. Zebulon B. Vance, sailed
on Dec. 6, 1941 — the day before the horrific attack on Pearl
Harbor that officially drew the United States into war.

High Marks for Wilmington
The NCSC produced 243 ships in its five years of operation.
Half were the famed Liberty ships, designed for quick
assembly line construction, not for aesthetics. Upon first
sight, President Roosevelt declared them “dreadful looking
objects,” and they became known as Ugly Ducklings, even in
official correspondence. Liberties were designed to carry
10,000 tons of cargo — such as 2,840 jeeps, 440 light tanks,
or 234 million rounds of rifle ammunition.
Once up and running, the shipyards’ productivity
improvements were astounding. In early 1942, a Liberty ship
took an average of 241.6 days to complete. In December of
that year, 82 Liberty ships were completed nationally in an
average of 55 days. The structure of the federal subsidy was
designed to reward productivity and encourage friendly
rivalry between the yards. One California yard produced a
Liberty ship in barely more than four days as a publicity
stunt.
Stunts aside, the Wilmington shipyard had one of the top
production records. The NCSC was one of five yards to earn
consistently high marks from the government’s Truman
Committee, created to ensure efficient defense production.
The western yards excelled in speed, while Wilmington’s had
the lowest dollar cost per ship of all the yards building
Liberty ships — partly because southeastern wages were low
— and also ranked second in productivity.
Margaret Rogers, a young child during the war, used to
cross the Cape Fear River Bridge to check out the ships.
“There were so many stockpiled there that they ran from the
river, from the highway all the way back to the state port and
you could literally step from one ship to the other without
touching the water for miles,” she remembers. (Rogers relays
her experience in “World War II: Through the Eyes of
Cape Fear,” a commemorative website created jointly by the
University of North Carolina Wilmington and the Cape Fear
Museum. It’s at http://library.uncw.edu/capefearww2/.)

Boomtown
Beyond the walls of the shipyard, Wilmington embodied the
wartime incongruity of profit and economic boom juxtaposed with shortage, sacrifice, and discomfort.
All of Wilmington, it seemed, found profit. Retailers
providing clothing, food, and entertainment formed the
nucleus of the social scene. Banks and real estate agents
served the new residents. Truckers hauled supplies between
Camp Davis and Camp Lejeune, both newly opened in early
1941. The city became a madhouse on weekends when
soldiers flooded downtown for recreation.
“You stood in line everywhere,” Helen Dobson told
Wilmington Magazine in 1995. She was a schoolteacher who
took a summer job organizing housing for shipyard workers.


“If you were lucky enough to get a [restaurant] booth or
table, you had to keep your hand on your coffee cup because,
I’ll tell you, they would grab it up and take it! They want to
get more people in there and move you out!”
As with a lot of wartime boomtowns, the city’s housing
stock couldn’t quite keep up. One in five Americans relocated during the war, many more than once, and most of those
who relocated did not return to their original hometowns.
One in eight Americans left farm life for good. Cities like
Wilmington were their destination. Right away, tiny
Wilmington was short 3,000 housing units — even though
half the shipyard workers commuted up to 95 miles a day
from their homes outside the city. The shipyard leased
eleven 100-person trailer buses to transport workers, many
of whom continued to work on local farms. Shipyard managers turned a blind eye to summertime absenteeism so
workers could tend to their crops as necessary. Such workers
earned the pride of supporting two wartime necessities:
defense and food production.
Eventually the federal and local governments would build
more than 6,000 new housing units, and private groups
another 1,400, all within walking distance from the shipyard. But the housing shortages persisted — and since ships
couldn’t be produced without workers, families were urged
to rent out rooms in their houses as a patriotic gesture.
Everyone went along, if a bit grudgingly at first (“Southern
hospitality only went so far,” Wilbur Jones writes in one of
his memoirs of the war experience.) Few people were willing
to rent to single women, which posed a serious problem for
the teachers, nurses, and shipyard women who filled the
city’s labor gaps.
When housing couldn’t be found, residents simply
doubled up. It was common for men to rotate the use of a
single bed according to shipyard shift; when one’s shift
started another turned in for sleep. Building codes were
sometimes cast aside. One local shipyard worker reported
dividing his house into five separate apartments, finding
immediate takers for the cramped quarters. The Wilmington
Morning Star reported an instance of 40 shipyard workers
sharing a home with one toilet and a single bathing area.
Food was another serious problem. Meat, butter, sugar —
all the meal staples were rationed. Not all cities experienced
the shortages felt in Wilmington; the Office of Price
Administration (OPA) had determined food and ration allocations based on the city’s lower prewar population. Families
waited in store lines for hours on mere rumors of a new beef
shipment. The city’s handful of restaurants had to close or
serve scraps after running through their ration points. The
OPA couldn’t prevent black markets from arising for virtually all rationed goods; the Raleigh OPA office deemed
Wilmington the state’s worst violator of price controls.
Despite the tough times, community prevailed.
“Everybody in the neighborhood took care of everybody
else. We didn’t have homeless people in our neighborhood
unless the person decided that was what they wanted to do.
We didn’t have people who were hungry unless they were


just so proud that no one knew they were hungry,” Rogers
remembers. Her dad worked for the railroad, and she recalls
that cargo filled with basic necessities was sometimes
“accidentally” spilled, workers sharing the contents with the
neighborhood.
A byproduct of all the war activity that flocked to
Wilmington was that it displaced the labor supply of
surrounding farms. This might have exacerbated the food
shortage, except the POWs again proved helpful. Most were
held in the main camp, located in a large park right in the
middle of an old established neighborhood. The location
was chosen so farmers could easily bus the prisoners out to
work on local dairy farms. The prisoners labored there without shackles, but never tried to flee; after all, imprisonment
meant freedom from combat. Many formed relationships
with the farmers and corresponded with them for years after
the war.

War Comes to Wilmington
Wilmingtonians had more than shortages and overcrowding
to worry about. As soon as the United States entered war,
residents became acutely aware of how vulnerable they
stood as a militarily strategic town on the coast of the
Atlantic. The city was at the southern end of so-called
Torpedo Junction, a stretch of waters infested with German
U-boats. In the first six months of 1942 alone, 397 ships were
sunk off the East Coast, including more than 80 off the
coast of North Carolina.
“Our government put us on pins and needles to anticipate that at any time, the Germans could attack by land, sea,
or air,” Jones recalls. “We were put on constant footing with
air raid drills and blackout drills [since city glare could illuminate American ships patrolling the coast], looking for
German planes. Goodness knows where they were supposed
to be coming from, as they didn’t have any aircraft carriers.
But we didn’t know that,” he says. “What we did know was
that submarine warfare was going on because we had evidence washing up on the beaches.”
The threat of attack led to some local lore that is disputed to this day. Many Wilmington residents recall the night of
July 24, 1943, when a German U-boat reportedly fired several shells at the Ethyl-Dow chemical plant, located on Kure
Beach, 15 miles south of Wilmington. The plant produced a
special compound for aviation gasoline. The shots missed,
the story goes, and landed on the other side of the Cape Fear
River. During the blackout that night, even the round-theclock shipyard went dark, which residents knew meant it
was more than a drill; attack — or further attack — could be
imminent. Thankfully, none followed.
But the alleged U-boat incident was never proven, and a
critic — another Wilmington-based veteran, David Carnell,
who died in April — has argued that German records establish U-boat activity in the region had ended before then.
“Some people think the attack is a myth,” Jones says,
“but I’ve accumulated enough evidence to say it’s not. It
happened.” If true, it apparently would be the only German

attack (apart from failed sabotage missions) to have taken
place on American soil during the war.

“Back to a Sleepy Little Town”
Just as remarkable as the boom’s magnitude was how quickly it evaporated with the war’s end in 1945. The Army closed
shop at Camp Davis and Fort Fisher the year before, and the
Air Force followed in 1945. The last vessel was launched
from the North Carolina Shipbuilding Company on April 16,
1946. Wilmington’s economy collapsed, and its six-digit population plunged to roughly 50,000. Thousands of veterans returned to the city, but there were few jobs for them.
Wilmington’s somewhat parochial culture wasn’t always
welcoming to would-be transplants, and many eventually
moved on.
That’s what makes Wilmington’s World War II experience unique, Jones says. “The thing about [other war
boomtowns] like Philadelphia, Long Beach, San Pedro,
Norfolk, Newport News, is that after the war, they
continued to thrive. There was no bell curve for them,”
Jones says. “Everything went back to Wilmington being a
sleepy little town.”
The $20 million shipyard that had once employed up to
21,000 people became the center of a tug of war between the
Maritime Commission and local interests looking to regenerate Wilmington’s economy. Nearby shipyards weren’t
eager to welcome peacetime competition, and with perhaps
a little nudging, the Maritime Commission decided to place
the NCSC, along with three West Coast shipyards, in a

dormant reserve status while international tensions subsided. This prevented the shipyard from being sold or
commissioned for alternative use. (It wasn’t until the end of
1949, after five years of negotiations, that the Maritime
Commission finally leased the facility to the state of North
Carolina to become the site of the state ports authority.
The land itself was locked in legal battle until 1971.)
Another major blow to the economy came when the
Atlantic Coast Line Railroad abandoned Wilmington as its
headquarters in 1960. Wilmington wouldn’t see another
boom until the completion of the I-40 highway in 1990,
which provided a vital link from the ports to the inland
Mid-Atlantic population, once again turning the city’s
prospects around. Along with the North Carolina State
Ports Authority, today Wilmington hosts a campus of the
University of North Carolina, a tourism industry that pumps
nearly $400 million into the local economy each year, and
the largest television and movie production studio outside
of California. Its population is about 106,000 — roughly
equal to its size during the war.
Though the economic boom belied sometimes painful
conditions, many Wilmington residents remember World
War II as one of the most exciting times of their lives. The
city bustled with an energy and purpose it had never experienced. Though the boom faded, the city’s wartime heritage
has remained. Many of the buildings — even some of the
hastily constructed housing projects — are still in use today.
Even more potent for the war’s witnesses is the memory of a
handful of years when the city took on new life.
RF

READINGS
Jones, Wilbur. A Sentimental Journey: Memoirs of a Wartime
Boomtown. Shippensburg, Pa.: White Mane Books, 2002.
____. The Journey Continues: The World War II Home Front.
Shippensburg, Pa.: White Mane Books, 2005.

Lane, Frederic. Ships For Victory: A History of Shipbuilding Under the
U.S. Maritime Commission in World War II. Baltimore: Johns
Hopkins University Press, 2001 (second edition).
Scott, Ralph. The Wilmington Shipyard: Welding a Fleet for Victory in
World War II. Charleston, S.C.: The History Press, 2007.

REVENUES — continued from page 33

a state with high exposure to capital gains, is about to find
out. Last year, the state passed a law limiting the amount
of capital gains revenue the state can include in its
operating budget. Anything over $1 billion will go into a
rainy-day fund.

The Massachusetts strategy does nothing to improve
revenue forecasting, but bigger rainy-day funds (especially
in states that experience dramatic revenue swings) appear to
be the best defense against inevitable forecasting errors in
an increasingly unpredictable economic environment. RF

READINGS
Fox, William F. “Three Characteristics of Tax Structures Have
Contributed to the Current State Fiscal Crises.” State Tax Notes,
Aug. 4, 2003, pp. 375-383.
Kaglic, Richard. “How the Recession Affects State and Local Tax
Shortfalls — and How Those Shortfalls Affect the Recovery.”
Region Focus, Fourth Quarter 2010, pp. 36-39.
Mattoon, Richard, and Leslie McGranahan. “Revenue Bubbles and Structural Deficits: What’s a State to Do?” Federal Reserve Bank of Chicago Working Paper No. 2008-15, July 2008.
Sjoquist, David L., and Sally Wallace. “Capital Gains: Its Recent,
Varied, and Growing (?) Impact on State Revenues.” State Tax Notes,
Aug. 18, 2003, pp. 497-506.
“States’ Revenue Estimating: Cracks in the Crystal Ball.” The Pew
Center on the States and The Nelson A. Rockefeller Institute of
Government at the State University of New York at Albany,
March 2011.


BOOK REVIEW
Depression and Innovation
A GREAT LEAP FORWARD:
1930S DEPRESSION AND
U.S. ECONOMIC GROWTH
BY ALEXANDER J. FIELD
NEW HAVEN: YALE UNIVERSITY PRESS,
2011, 387 PAGES
REVIEWED BY DAVID A. PRICE

The 20 years following World War II saw an extraordinary period of prosperity in the United States.
While the business cycle had not disappeared —
there were occasional brief recessions — the period is
remembered today for its burgeoning middle class, rapidly
rising output, and modest inflation. When did the leaps in
productivity occur that laid the foundations for this prosperity? During the war? Or perhaps during the boom years
of the 1920s?
Economic historian Alexander Field of Santa Clara
University argues in A Great Leap Forward that the answer is
“none of the above.” For Field, the Depression-era decade of
the 1930s — despite its financial crisis and unemployment —
was a period of greater technological and organizational
innovation than either the 1920s or the war years, and one
that made a greater contribution to America’s economic
development. The 1930s represent a “golden age,” Field says,
that “experienced the fastest sustained growth in the
material standard of living in U.S. economic history.”
Field draws this conclusion based primarily on rates of
total factor productivity (TFP) growth; TFP, a measure of
productivity in relation to the supply of all inputs, can
be understood (with some exceptions) as a measure of
innovation. The numbers are clear: TFP grew faster during
the period of 1929-1941 than in other 20th-century periods.
Although inputs increased only very slightly, if at all, from
1929 to 1941, real output grew at a rate between 2.3 percent
and 2.8 percent annually. Not only was TFP growth higher in
the 1930s, it was also broader-based; while TFP growth in
the 1920s was almost entirely within manufacturing, in the
1930s it also gained strongly in other sectors, including
wholesale and retail, transportation, and public utilities.
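The arithmetic behind this measure helps make the claim concrete. In standard growth accounting — a general sketch of the method, not Field's exact specification — TFP growth is the residual left after subtracting the growth of inputs, weighted by their income shares, from output growth:

\[
\Delta \ln A \;=\; \Delta \ln Y \;-\; \alpha\,\Delta \ln K \;-\; (1-\alpha)\,\Delta \ln L
\]

Here Y is real output, K the capital stock, L labor input, alpha capital's share of income, and A the TFP residual. If inputs barely grew from 1929 to 1941, as Field argues, then nearly all of the 2.3 percent to 2.8 percent annual output growth shows up as TFP growth.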
No one area of innovation was responsible for the 1930s
advance in productivity. A major cause, in Field’s view, was
public infrastructure spending, especially the building-out
of the highway network; this, in turn, led to a transforming
of transportation and distribution through the integration
of railroad shipping and trucking. In addition, the decade


brought significant innovations in chemistry and materials
that improved equipment and structures and extended
their lives. Finally, employment in private research and
development in manufacturing more than quadrupled.
Field’s account of the course of progress between the
wars is closely argued and firmly grounded in statistics. It is
a valuable reminder that the 1930s, although ruinous in
terms of unemployment, were far from bleak in terms of
technological and business innovation.
At the same time, a closer analysis indicates that much of
the TFP growth took place in one year, 1941. Some 30 percent of TFP gain from 1929 to 1941, and 22 percent of TFP
gain from its 1933 trough, show up in that single year. While
it’s true that the United States did not enter World War II
until the last weeks of 1941, the question remains:
To what extent was the concentration of TFP growth in that
year a product of President Roosevelt’s prewar buildup, how
much of it was due to highway spending and the other phenomena that Field catalogs, and how much of it came from
other, unexamined influences emerging in the early 1940s?
Field rejects any influence from the buildup on innovation at
that point on the basis that “only a small fraction” of total
military spending for the war had already been spent.
With regard to the war years themselves, Field concedes
that some advances came about through the war effort, such
as radar, penicillin production, and atomic energy, but holds
that “there is relatively limited evidence of beneficial feedback from wartime production to civilian activity in the
postwar period.” Even with regard to the wartime spinoffs,
he believes the war may have done no more than accelerate
developments that were already on course to happen
regardless.
Such an assessment, however, seemingly would require a
micro-level study of the development of these technologies
and their prewar trajectories, a type of analysis that Field
eschews here. The counterfactual question — what would
have happened without the war? — is, of course, impossible
to resolve conclusively. But it does appear likely that at least
some important innovations would have come about much
later. Atomic power is one. Another is the commercial
production of penicillin, stymied until rescue came from a
citric-acid manufacturer, Charles Pfizer & Co. of Brooklyn,
which applied its unique fermentation expertise to the
problem — a cross-disciplinary breakthrough that would
have been unlikely without the exigencies of war.
It goes to show that innovation does not yield easily to
quantitative analysis. Nonetheless, A Great Leap Forward
will no doubt stimulate scholars of the subject for years to come. RF

DISTRICT DIGEST

Economic Trends Across the Region

The Federal Presence in the Fifth District
BY R. ANDREW BAUER AND JAKE BLACKWOOD

The Fifth District has a very diverse economy with strong manufacturing, trade, and service sectors. The District economy also benefits from the presence of the federal government, from the capital in Washington, D.C., to the numerous civilian and military facilities located throughout the District. Government employment and spending is an important source of demand, attracting businesses to the region to provide goods and services to various government agencies. In many cases, these goods and services are technical in nature and require highly skilled or educated workers and sometimes also include capital-intensive production processes. These additional resources add to the District's productive capacity and support higher rates of economic growth. In addition, government employment and spending has traditionally brought a source of stability to the District economy, acting as a buffer during economic downturns.
Yet with the recent focus on the budget deficit — both
the short-term deficit as well as long-run fiscal imbalances
— the federal government's presence and influence in the economy may become a potential source of uncertainty. The impact of budget cuts would
vary across the Fifth District as the influence of the federal
government varies in each jurisdiction, both in terms of
employment and contract spending.


Civilian Employment
A primary conduit through which the government influences the economy is the civilian job market. By hiring and
laying off federal employees, the government can tangibly
boost or dampen a location’s economy. This is particularly
true in the Fifth District. In March 2011, more than 500,000
people in the Fifth District were employed directly by the
federal government, many of them concentrated in the Washington, D.C., metro area. As the District's largest employer, the federal government could greatly affect the regional job market through future budget cuts.

[Chart: Federal Employment — federal workers as a percent of total employment, March 2011: D.C. 23.7 percent, Maryland 5.2 percent, Virginia 4.0 percent, the Fifth District 4.0 percent, West Virginia 2.2 percent, the United States 1.6 percent, and the Carolinas just over 1 percent. SOURCE: U.S. Office of Personnel Management]
The influence of government employment can be further
quantified by some other measures. One approach is to
describe its presence in terms of the government share of
total civilian employment. Four percent of Fifth District
citizens were employed by the federal government (excluding the postal service and military) in March 2011, while
federal employees made up only 1.6 percent of workers in
the United States as a whole. At the state level, federal government shares of employment were as high as 23.7 percent
in D.C., while Maryland and Virginia also posted high shares
of 5.2 percent and 4.0 percent, respectively (see chart).
Government shares of employment were notably lower in
West Virginia and the Carolinas, although West Virginia’s
share (2.2 percent) was still higher than the national average.
Perhaps the geographic concentration of federal government jobs in the Fifth District helps explain the strength of
the relationship between this region and the federal government. In March 2011, one-quarter of all federal government
workers were employed in the Fifth District. D.C. alone
accounted for 8 percent of federal employment, with
Virginia (6.9 percent) and Maryland (6.2 percent) also contributing a notable number of workers.
Fifth District citizens not only make up a disproportionate share of federal government payrolls, they also take
home larger paychecks. In the United States as a whole, 27.8
percent of federal government workers received a salary of
less than $50,000 in March 2011, whereas only 16.3 percent
of all District federal employees earned less than $50,000.
District employment was also more heavily concentrated in
higher-paying jobs, with 46.8 percent of federal employees
making more than $90,000 per year, while in the United
States as a whole, only 30.2 percent of federal employees
took home $90,000 or more in salary (see chart on page 48).
As one would expect, not all states in the Fifth District
reap the same benefits from the presence of the federal government. Indeed, government influence differs greatly in the states farthest from the seat of government. For example,
the Carolinas have smaller shares of federal workers than
both the Fifth District and the nation as a whole. Moreover,
the salary distribution of federal government employees
suggests that federal workers in West Virginia and the
Carolinas earn less than those in D.C., Maryland, Virginia,
and the nation as a whole, although some of this difference
may be offset by cost of living adjustments. More than 36 percent of federal workers in the Carolinas are paid less than $50,000 per annum, and less than 17 percent make more than $90,000 a year. According to these data, federal employment in these states underperforms both in terms of quantity and quality in comparison with the rest of the Fifth District.

[Chart: Federal Government Salary Distribution by State — share of federal employees by salary band ($0-$29,999, $30,000-$49,999, $50,000-$69,999, $70,000-$89,999, $90,000-$109,999, $110,000-$129,999, $130,000-$149,999, $150,000+) for D.C., Maryland, North Carolina, South Carolina, Virginia, West Virginia, the Fifth District, and the United States. SOURCE: U.S. Office of Personnel Management]

Defense Employment
Although budget reductions typically carry implications for
most departments and agencies, a common thread among
the various proposed federal budgets this year is revision to
defense spending. Most proposals address the rate of growth
in total military spending, calling for tighter caps on spending rather than broad cuts. Nonetheless, many plans require
absolute cuts to certain defense programs, which could have
a more immediate effect on employment and the economy.
This carries a good deal of weight in the Fifth District,
where the Departments of the Army, the Navy, and Defense
employ more than 30 percent of civilian federal government
employees.
Furthermore, the employment statistics above understate the effect of defense budget cuts on Fifth District
employment because they do not cover military personnel.
For many citizens, the numerous military bases located in
the Fifth District are the most visible representations of the
federal government’s influence on employment. From
Fayetteville, N.C., home of Fort Bragg, to the Beltway area
around Washington to the U.S. Navy installations of
Hampton Roads, the military’s presence is especially constant and vital to the economy. For these places, the military
is an important engine of local employment. (See also
“The Benefits and Burdens of Expanded Military Bases,”
Region Focus, First Quarter 2011.)
According to 2009 data, more than 250,000 military personnel were stationed in the Fifth District, making it home
to 23.5 percent of the nation’s military. While having a low
share of civilian government employment, North Carolina
accounted for 10.3 percent of all military personnel in the
nation — the highest share in the Fifth District and the
third highest in the nation. Virginia had the next highest
share (5.8 percent), followed by South Carolina (3.0 percent),
Maryland (2.8 percent), D.C. (1.2 percent), and West Virginia


(0.1 percent). Such high concentrations of military personnel in the District would make military cuts particularly
significant to the region.
Budget cuts may also vary in their effect on different
branches of the military, making the composition of the
military in the District a notable factor. Within the Fifth
District, the largest share of active military were Army personnel (41.3 percent), followed by the Marine Corps
(30.0 percent), and the Air Force (16.2 percent). Notably, 30.1
percent of all Marine Corps personnel in the United States
are located in North Carolina, home of Camp Lejeune, the
largest Marine Corps base on the East Coast. Also, the Navy
has stationed 36.4 percent of its personnel in the Fifth
District, although naval personnel account for only 7.4 percent of the military in the region.
These figures and percentages are bound to shift not only
in response to budgetary actions, but also in response to the
shifting structure of the military. The Base Realignment and
Closure plan from 2005, or BRAC, details the shifting of
personnel across various institutions and, in some cases, the
closure and expansion of installations. By the time of the
scheduled completion date in mid-September, the Fifth
District will have ultimately gained military jobs through the
BRAC plan, adding 1,368 net jobs in the process, despite the
closure of 11 installations in the District. These gains will not
be shared equally, however, as four of the six jurisdictions will
lose military personnel due to the plan. Though Virginia will
gain 5,101 military jobs, and South Carolina is set to gain 1,464
jobs, D.C. will lose almost 3,000 military personnel and
Maryland will lose more than 1,500 defense jobs. North
Carolina and West Virginia will both lose less than 1,000
military jobs due to the realignments and closures.
Overall, both military and civilian employees in the Fifth
District are likely to be affected by federal budget cuts.
Even if budget cuts do not lead to outright eliminations of
military or civilian positions, they could yield further pay
freezes or reductions. Pay cuts would almost certainly affect
the Fifth District more than some other areas, as more
highly paid government workers generally shoulder a disproportionate amount of the burden when pay is cut. Whether
through job loss or salary reduction, the potential impact of
budget cuts causes uncertainty in the Fifth District economy via the labor market.

Federal Contract Spending
In addition to employing workers, the federal government
influences the economy through fiscal expenditures. There
are many forms of government expenditures: contracts,
grants, loans and guarantees, direct payments, and insurance, among others. The government most directly interacts
with the economy by purchasing goods and services through
contracts with private sector businesses. Since the nation’s
capital is located within the Fifth District, a sizeable number
of those contracts are with businesses located in the
District. Indeed, looking at federal contract spending for
fiscal year 2010, three of the Fifth District’s jurisdictions

ranked in the top 10 recipients among all states.
The federal government’s demand for goods and services
within the District impacts the economy in a number of
ways. The types of goods and services that the government
purchases will affect the region’s industry and the location
decisions of businesses. In many cases, businesses will move
to be closer to federal departments and installations, and as
a consequence, there is often a clustering of contractors
around these facilities and installations. In addition, the
types of goods and services demanded by government are
sometimes highly technical in nature and involve a longer
production cycle. Defense spending, which is the second-largest expenditure in the federal budget after health care, is
a good example. Many defense goods and services are highly
technical and can require years of research, development,
and production. The firms that enter this market, defense
contractors, employ a large number of highly skilled and
educated workers and have longer time horizons as their
contracts often stretch over several years. For a local economy, this provides the benefit of attracting highly paid workers to an area as well as providing stability given the
longer-term nature of the projects.
At the same time, reliance on government contracts
brings risks of its own. For some of these goods and services,
especially defense and basic research, the government is the
only market. Should the contract be canceled due to shifting
priorities or budget cuts, it is unlikely that these businesses
would be able to find a purchaser in the private sector for
their good or service.

Federal Agency Spending Within the District

While the Fifth District receives a large amount of federal
contracting each year, the location, source, and type of
spending vary considerably across the District. Not surprisingly, Virginia, Maryland, and D.C. are the jurisdictions that
receive the most federal contract dollars each year (see map).
In fiscal year 2010, Virginia received nearly $58 billion in
federal contracts, second only to California, while Maryland
received nearly $26 billion, fifth highest among all states.
D.C. received roughly $21 billion, seventh among all states.
South Carolina, North Carolina, and West Virginia, on
the other hand, received much less (roughly $8 billion,
$5 billion, and $2 billion, respectively).
To gauge the impact of contract spending on a state, it is
useful to scale the spending to get an idea of the size of the
expenditures in proportion to the regional economy. As such,
contract spending as a percentage of gross state product is
greatest for D.C. (2.0 percent), with Virginia (1.4 percent)
second and Maryland (0.9 percent) third. Across the entire
Fifth District, federal contract spending represents 0.8 percent of the District’s economy. Overall federal contract
spending for fiscal year 2010 was $537 billion, representing
3.7 percent of gross domestic product for the United States
in 2010. The percentage has increased for all of the jurisdictions within the Fifth District over the past 10 years with
the greatest increases in D.C., Virginia, and Maryland.
The increase is related to the expansion in government over
the past decade — a significant amount resulting from the
creation of the Department of Homeland Security as well as
an increase in defense-related spending due to ongoing
military operations.
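As a rough sketch of the scaling exercise, here is the national calculation written as a minimal Python snippet; the figures come from the article, the GDP denominator is the roughly $14.5 trillion implied by the 3.7 percent figure, and the function name is a hypothetical label rather than part of any official data source:

# Scale federal contract spending by the size of the economy receiving it.
def spending_share(contract_dollars, economy_size):
    """Contract spending as a percentage of the economy (both in dollars)."""
    return 100.0 * contract_dollars / economy_size

us_contracts_fy2010 = 537e9    # total federal contract spending, FY2010 (from the article)
us_gdp_2010 = 14.5e12          # approximate U.S. GDP in 2010, implied by the 3.7 percent figure

print(round(spending_share(us_contracts_fy2010, us_gdp_2010), 1))   # -> 3.7

The same ratio, computed with a jurisdiction's contract receipts and its gross state product, gives the state-level shares discussed above.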
[Map: Federal Contract Spending in Fiscal Year 2010 — action obligations by locality across the Fifth District, from under $70,000 to more than $24 billion, with military installations and GSA owned or leased properties marked. NOTE: Military installations include Army, Marine Corps, Navy, Air Force, Coast Guard, and Defense Logistics Agency. SOURCES: Federal Procurement Data System (FPDS); Department of Defense; U.S. General Services Administration]

Federal spending is viewed as an economic stabilizer because it is less responsive to downturns in the economy than private spending. While federal spending on an aggregate level is somewhat stable over time, federal contract spending at the state level can vary considerably. For example, the average nominal year-over-year growth in federal contract spending for North Carolina was 9.7 percent from 2000 through 2010, but spending declined in three of those years and increased at very modest rates in two other years. Similarly, in D.C., contract spending fell in 2003, increased at very modest rates in 2004 and 2007, and had much stronger growth in other years; it averaged 10.9 percent over the 11-year period. For the entire Fifth District, however, federal contract spending averaged 10.3 percent, increasing every year in nominal terms except for 2010 when spending was unchanged from 2009.

[Chart: Federal Contract Spending by Agency — top three funding agencies as a percent of each jurisdiction's total contract spending, 2000-2010. Virginia: Defense 70.1, GSA 6.1, Homeland Security 5.8; Maryland: Defense 49.4, Health and Human Services 17.4, NASA 6.8; D.C.: Defense 22.8, GSA 17.6, Homeland Security 11.1; South Carolina: Defense 49.5, Energy 35.3, Veterans Affairs 7.2; North Carolina: Defense 65.3, Health and Human Services 8.9, Veterans Affairs 4.1; West Virginia: Defense 28.1, Justice 12.9, Energy 11.1. SOURCE: USASpending.gov]

Federal Spending by Type
The type of federal spending also influences the local economy through the type of industry and businesses that it attracts. The chart above shows federal contract spending in each District jurisdiction by the top three funding agencies from 2000 to 2010. Some interesting patterns emerge when looking at spending through this lens. Not surprisingly, the Department of Defense is the primary source of contracting dollars in each of the jurisdictions in the District. Research and development on defense-related technology is a strong component of all contract spending within the District, as is spending on defense-related goods such as aircraft carriers, drones, camouflage, ammunition, and combat vehicles. The Defense Department also contracts for technical support services for its data systems and logistical support for its many programs.

As the chart indicates, there are notable differences in the proportion of defense spending relative to total spending. In Virginia and North Carolina, Department of Defense contracts accounted for 70 percent and 65 percent, respectively, of all contract dollars from 2000 to 2010. A number of factors account for the high percentage of defense spending in both states — notably, the Pentagon, the Norfolk shipyard, and the various military bases in Virginia and North Carolina. In Maryland and South Carolina, defense contracts accounted for nearly half of all contract dollars over the past 11 years; in D.C. and West Virginia, it was considerably less, closer to one-quarter of all spending. Overall, it is clear that the Fifth District has benefited from federal spending on defense.

Civilian contract spending by agency varies considerably across the District. Not surprisingly, Homeland Security spending is strong in Virginia and D.C. Health and Human Services (HHS) contract spending is strong in Maryland, in part due to institutions such as the National Institutes of Health, the Centers for Medicare and Medicaid Services, and the Food and Drug Administration there; HHS spending is also strong in North Carolina. In South Carolina and West Virginia, the Department of Energy (DOE) is one of the largest contracting agencies, also partly as a result of having significant installations in those states.

As one would expect, the goods and services being provided to these agencies vary considerably across the District. The chart below shows the top three services that have been contracted from 2000 to 2011. There are some commonalities, however. In most jurisdictions, agencies contract with private businesses for professional, administrative, and management support services. These services provide agencies with program management, logistical support, technical assistance, and systems engineering.

Agencies also frequently contract with businesses for information technology services such as data storage, systems development, telecommunications network management, and systems analysis. Along with these services are purchases of data processing equipment, software, supplies, and support equipment — the category of goods most purchased by the government in the Fifth District over the past 11 years, representing nearly one-quarter of all goods purchases. Purchases of communication, detection, and coherent radiation equipment were the second-highest over that period — roughly 10 percent of all goods contract spending.

Spending on research and development is also strong within the Fifth District and the type of research conducted is very broad, ranging from research in defense-related systems and applications to energy research to biomedical research. In Maryland, research and development includes defense services; defense electronics and communication equipment; space science, applications and operations; biomedical; and defense missile and space systems. In Virginia, contract spending on research and development focuses on the defense industry, with “other defense” and defense services receiving the greatest amount of research and development contracts. In addition, R&D spending in Virginia includes defense electronics and communication equipment, defense missile and space systems, and tank and automotive systems, among other research types. In North Carolina, the other Fifth District jurisdiction with a relatively high percentage of research and government contracting, contracts for research are focused on biomedical; basic research, including basic research in biomedical, AIDS, and defense services; defense missile and space systems; and other defense and health-related work.

Agencies also contract private businesses for building construction. This is true across the District, but as a percentage of total contracts, construction spending is considerably more substantial in the Carolinas and West Virginia.

[Chart: Federal Contract Spending by Type of Service — top three service categories as a percent of each jurisdiction's total contract spending, 2000-2010. Virginia: professional, administrative and management support services 29.6, automatic data processing and telecommunication services 28.0, research and development 11.5; Maryland: professional, administrative and management support services 30.2, automatic data processing and telecommunication services 20.6, research and development 17.5; D.C.: professional, administrative and management support services 36.9, automatic data processing and telecommunication services 27.1, research and development 5.4; South Carolina: operation of government-owned facility 44.2, maintenance, repair, and rebuilding of equipment 15.9, construction of structures and facilities 10.3; North Carolina: construction of structures and facilities 25.8, professional, administrative and management support services 11.4, research and development 11.3; West Virginia: professional, administrative and management support services 33.3, construction of structures and facilities 19.6, automatic data processing and telecommunication services 14.1. SOURCE: USASpending.gov]

Conclusion
In conclusion, the federal government has a strong influence on the Fifth District through its hiring and its purchasing of goods and services from the private sector. The District has a much higher percentage of federal workers than other areas of the United States, and those workers, on average, receive higher salaries than federal workers in other parts of the country.

The District benefits from the presence of numerous military installations. Federal contract spending attracts businesses to the region to provide services and goods to the various government agencies at these installations. In addition, government contracting attracts workers, often with specific skills or advanced degrees, to the region. As a consequence, the District labor market is stronger both in the underlying demand for workers as well as the quality of the supply of workers.

With the possibility of budget cuts in response to the federal deficit and longer-term fiscal imbalances, there is concern about the impact of those cuts on the Fifth District economy. Reducing the deficit and putting the federal government on a fiscally sustainable path is a long-term positive for the United States and the District economy. Yet it also creates uncertainty for workers and businesses within the District who have, in the past, benefited from the federal government's influence on the regional economy. RF

The Richmond Fed introduces Regional Update, an analysis of
labor market conditions in the Fifth District. Written by our regional
economists based in Richmond, Charlotte, and Baltimore, this new
feature delves into state-level unemployment data from the Bureau
of Labor Statistics as well as information from the Richmond Fed’s
surveys of business activity.
It includes a podcast and written report for each part of the
Fifth District, which includes the District of Columbia, Maryland,
Virginia, North Carolina, South Carolina, and most of West Virginia.


State Data, Q1:11

                                           DC         MD         NC         SC         VA         WV
Nonfarm Employment (000s)               714.6    2,511.3    3,877.6    1,814.8    3,647.2      748.6
  Q/Q Percent Change                      0.0       -0.2        0.5        0.4        0.4        0.1
  Y/Y Percent Change                      1.3        0.7        0.6        1.1        1.1        1.0

Manufacturing Employment (000s)           1.2      112.1      435.0      210.1      231.1       49.3
  Q/Q Percent Change                      0.0       -1.3        0.9        0.5        0.7        0.2
  Y/Y Percent Change                     -7.7       -2.6        1.0        2.0       -0.2        1.1

Professional/Business Services
Employment (000s)                       149.3      389.4      494.5      221.6      660.0       62.4
  Q/Q Percent Change                     -0.5        0.2        1.0       -0.5        0.5        2.1
  Y/Y Percent Change                      1.4        2.0        4.6        9.8        3.1        3.9

Government Employment (000s)            249.5      499.6      695.2      334.1      703.0      149.9
  Q/Q Percent Change                      0.8        0.4        0.3       -0.1        0.6       -0.7
  Y/Y Percent Change                      2.1        0.6       -1.3       -3.8        0.2       -1.3

Civilian Labor Force (000s)             333.7    2,977.5    4,469.8    2,155.4    4,188.1      781.8
  Q/Q Percent Change                      0.8       -0.1        0.1       -0.3        0.2        0.4
  Y/Y Percent Change                     -0.6       -0.2       -1.9       -0.6       -0.1       -0.7

Unemployment Rate (%)                     9.5        7.1        9.8       10.2        6.4        9.4
  Q4:10                                   9.7        7.4        9.8       10.9        6.6        9.6
  Q1:10                                  10.2        7.6       11.4       11.6        7.2        8.8

Real Personal Income ($Mil)          38,995.3  260,434.4  305,876.0  137,940.7  325,741.7   54,028.0
  Q/Q Percent Change                      0.9        0.9        1.0        0.9        0.8        0.3
  Y/Y Percent Change                      4.5        3.4        2.7        3.5        3.6        2.7

Building Permits                          714      2,414      8,471      3,569      5,837        364
  Q/Q Percent Change                    413.7       22.6       27.9       24.4       62.2       31.9
  Y/Y Percent Change                    138.8      -19.1       -7.3      -19.1       12.4      -13.3

House Price Index (1980=100)            560.2      417.1      311.9      317.3      401.6      219.8
  Q/Q Percent Change                     -1.5       -3.3       -2.0       -1.9       -2.3       -1.5
  Y/Y Percent Change                      0.3       -3.8       -2.6       -2.9       -2.2       -0.0

Sales of Existing Housing Units (000s)   10.0       82.4      140.8       68.4      112.4       28.4
  Q/Q Percent Change                     31.6       21.2       12.5        1.2       16.6        7.6
  Y/Y Percent Change                     13.6        9.0       -0.8       -1.7        2.2        7.6


[Charts, First Quarter 2001 - First Quarter 2011: Nonfarm Employment (change from prior year), Unemployment Rate, Real Personal Income (change from prior year), Building Permits (change from prior year), House Prices (change from prior year), FRB-Richmond Manufacturing Composite Index, and FRB-Richmond Services Revenues Index for the Fifth District and the United States, plus Nonfarm Employment and Unemployment Rate for the Washington, Charlotte, and Baltimore metropolitan areas.]

NOTES:

SOURCES:

1) FRB-Richmond survey indexes are diffusion indexes representing the percentage of responding firms
reporting increase minus the percentage reporting decrease.
The manufacturing composite index is a weighted average of the shipments, new orders, and employment
indexes.
2) Building permits and house prices are not seasonally adjusted; all other series are seasonally adjusted.

Real Personal Income: Bureau of Economic Analysis/Haver Analytics.
Unemployment rate: LAUS Program, Bureau of Labor Statistics, U.S. Department of Labor,
http://stats.bls.gov.
Employment: CES Survey, Bureau of Labor Statistics, U.S. Department of Labor, http://stats.bls.gov.
Building permits: U.S. Census Bureau, http://www.census.gov.
House prices: Federal Housing Finance Agency, http://www.fhfa.gov.


Metropolitan Area Data, Q1:11

                              Washington, DC   Baltimore, MD   Hagerstown-Martinsburg, MD-WV
Nonfarm Employment (000s)          2,394.4         1,249.4            95.5
  Q/Q Percent Change                  -1.3            -2.4            -2.1
  Y/Y Percent Change                   1.7             0.3             0.2
Unemployment Rate (%)                  5.9             7.7            10.3
  Q4:10                                5.9             7.6             9.7
  Q1:10                                6.8             8.4            10.9
Building Permits                     4,156           1,079             125
  Q/Q Percent Change                 119.2             8.2            16.8
  Y/Y Percent Change                  22.2           -26.7           -26.5

                              Asheville, NC    Charlotte, NC    Durham, NC
Nonfarm Employment (000s)            164.7           797.8           279.0
  Q/Q Percent Change                  -2.4            -1.1            -0.9
  Y/Y Percent Change                   0.7             0.5             0.7
Unemployment Rate (%)                  8.6            10.9             7.4
  Q4:10                                7.7            10.7             7.0
  Q1:10                               10.0            12.7             8.5
Building Permits                       287           1,429             456
  Q/Q Percent Change                  39.3            52.5            29.9
  Y/Y Percent Change                  -7.7           -18.2             2.2

                              Greensboro-High Point, NC   Raleigh, NC   Wilmington, NC
Nonfarm Employment (000s)            335.6           499.7           134.1
  Q/Q Percent Change                  -1.4            -0.5            -2.0
  Y/Y Percent Change                   0.0             2.6             0.4
Unemployment Rate (%)                 10.6             8.1            10.2
  Q4:10                               10.1             7.8             9.6
  Q1:10                               12.4             9.7            11.7
Building Permits                       649           1,093             389
  Q/Q Percent Change                  46.2            30.6             7.8
  Y/Y Percent Change                  20.9           -26.9           -37.3

                              Winston-Salem, NC   Charleston, SC   Columbia, SC
Nonfarm Employment (000s)            202.0           283.0           341.3
  Q/Q Percent Change                  -1.3            -0.8            -0.6
  Y/Y Percent Change                  -0.6             2.0            -0.1
Unemployment Rate (%)                  9.7             8.2             8.3
  Q4:10                                9.3             9.1             9.2
  Q1:10                               11.2             9.9             9.6
Building Permits                       201             719             786
  Q/Q Percent Change                 -12.2            26.6            40.6
  Y/Y Percent Change                 -21.5           -28.3           -15.2

                              Greenville, SC   Richmond, VA   Roanoke, VA
Nonfarm Employment (000s)            295.0          594.6          151.9
  Q/Q Percent Change                  -0.9           -1.3           -1.7
  Y/Y Percent Change                   1.5            0.4           -0.5
Unemployment Rate (%)                  8.3            7.3            7.1
  Q4:10                                9.3            7.3            6.9
  Q1:10                               10.5            8.4            8.2
Building Permits                       431            610            107
  Q/Q Percent Change                  17.1            1.5           91.1
  Y/Y Percent Change                 -19.4          -30.9            0.0

                              Virginia Beach-Norfolk, VA   Charleston, WV   Huntington, WV
Nonfarm Employment (000s)            719.1           146.1           112.6
  Q/Q Percent Change                  -2.1            -1.5            -1.8
  Y/Y Percent Change                  -0.2             0.6            -0.1
Unemployment Rate (%)                  7.3             9.1             9.5
  Q4:10                                7.1             8.3             8.7
  Q1:10                                7.9             8.5             9.2
Building Permits                     1,158              24               4
  Q/Q Percent Change                  21.8           -14.3           -55.6
  Y/Y Percent Change                   2.9           -48.9           -66.7

For more information, contact Sonya Ravindranath Waddell at (804) 697-2694 or e-mail Sonya.Waddell@rich.frb.org


OPINION
A Focused Approach to Financial Literacy
BY JOHN A. WEINBERG

An article in this issue of Region Focus begins with an observation on the difficulties many adults have in answering basic questions about key aspects of financial decisionmaking. Indeed, systematic research has revealed large gaps in knowledge about such things as compound interest. These findings suggest that many people would have difficulty assessing the trade-offs involved in even the simplest financial decisions. Even more so, then, people must struggle with the really big decisions we all face at some time in our lives — decisions about the acquisition and financing of education, or about homeownership, or about saving for retirement.

The demonstrably high level of financial improficiency in many parts of the population also presents a challenge for economic analysis, since our understanding of the aggregate behavior of households is based on a model that assumes individuals are capable of making the decisions that lie at the heart of household finance. If grasping the essential trade-offs is difficult for many individuals, then can we trust a model based on an assumption of sophisticated decisionmaking to give a good representation of the data? Perhaps surprisingly, there is evidence that at the aggregate level, or looking broadly across the population of households, such models do reasonably well at describing consumption and savings decisions.

[Pull quote: “Financial education could have substantial implications for individuals’ well-being, even if the overall effects on the macroeconomy are not large.”]

Does the fact that models of sophisticated consumers do well at capturing aggregate economic behavior imply that the problem of financial improficiency is small, and that resources dedicated to financial education would not yield large improvements in people’s well-being? I don’t think so. It is certainly possible that errors in decisionmaking, relative to a standard model of household choice, are not systematic enough to show up in aggregate behavior but that such errors still have large consequences for individuals. This is especially true of the large decisions that households must make — decisions that can have lasting consequences.

These most consequential decisions tend to be associated with major phases of an individual’s life cycle. Early on, people must make choices about education — choices that may imply delaying labor market participation and, thus, delaying earning income — in order to accumulate further human capital after secondary school. This can have a large impact on both an individual’s lifetime earnings ability and financial position in early adulthood, as the delay in earnings and the cost of education may need to be funded by debt.

Another early decision may be whether to purchase a home. This may not alter the actual housing services enjoyed so much, but it does have significant implications for the household’s balance sheet and to what risks it is exposed. Finally, as a household enters its peak earning years, plans for retirement become important. Savings and the accumulation of wealth through financial or real estate assets become key financial tools for such planning.

These decisions not only involve basic trade-offs between consumption now and consumption in the future, but they all bring with them a choice among financing strategies or instruments — how much debt to incur, what kind of loan to take on, what kind of savings instruments to use. And they are decisions that can have lasting effects on well-being, as well as having implications for how exposed a household is to economic shocks. Ill-informed decisionmakers are not only prone to make mistakes, but they also become more vulnerable to abusive financial practices.

I think it’s also important to note here that, while what may appear to be ill-informed financial decisions indeed often are, that is not always the case. Individual circumstances that cannot be observed by outsiders may lead households to make decisions that, in fact, are rational. If one’s future income stream is highly variable or unstable — for instance, if a person is self-employed in the first case or faces a potential layoff in the second — then it very well may make sense for that person to act differently than what we would normally perceive as optimal. That person may wish to save less now for long-term purposes such as retirement — especially, perhaps, in tax-preferred vehicles that can carry significant penalties for early withdrawal — in order to remain relatively liquid and better weather those more immediate financial shocks.

That said, I do not wish to downplay the importance of financial education. It seems likely that such efforts, targeted at people who are close to critical decision points in their lives, could have substantial implications for individuals’ well-being, even if the overall effects on the macroeconomy are not large. Indeed, effective financial education could be the single best strategy for consumer financial protection. While there may be a role for regulatory oversight and legal recourse, such as in the case of fraud, giving consumers the tools to better understand the financial choices before them should also help make unfair and misleading practices less profitable. RF

John A. Weinberg is senior vice president and director of research at the Federal Reserve Bank of Richmond.

NEXT ISSUE
The State of the Manufacturing Sector
Rapid productivity increases and continued output growth imply that the U.S. manufacturing sector is quite strong, despite large declines in employment. But recent research suggests that the picture is more complicated than it first appears. How healthy is manufacturing, and what does this mean for the U.S. economy as a whole?

A Little Inflation to Create a Bigger Economic Recovery?
Weak employment numbers and relatively contained inflation have led some economists to propose an unexpected solution: a short-term dose of inflation to kick the recovery into gear. Why do some think this would help, and what are the costs?

Recession on the Eve of Retirement
This year, the first of the 82 million baby boomers hit retirement age, on the heels of the worst recession since before they were born. The conventional wisdom is that boomers are being forced to delay retirement to shore up decimated retirement portfolios, but research shows there may be more to the story.

Policy Update
Despite their presence in Virginia well before the Jamestown settlement, Native Americans of the region have struggled for formal recognition from the federal government. Why is the process so hard, and why do the tribes pursue it?

Federal Reserve
The Dodd-Frank Act gives federal banking regulators and the SEC new authority over compensation and compensation-related governance practices at large financial institutions. What will the rules mean for covered institutions?

Economic History
Colonists extracted tar, pitch, and turpentine from Southeastern longleaf pines, and exported these goods to the world’s largest navy, the British. North Carolina produced the most, and earned its nickname, The Tarheel State.

Visit us online:
www.richmondfed.org
• To view each issue’s articles
and Web-exclusive content
• To view related Web links of
additional readings and
references
• To add your name to our
mailing list
• To request an e-mail alert of
our online issue posting

Federal Reserve Bank
of Richmond
P.O. Box 27622
Richmond, VA 23261

Change Service Requested

To subscribe or make subscription changes, please email us at research.publications@rich.frb.org or call 800-322-0565.

For more than two decades, the Richmond Fed has
conducted monthly surveys of hundreds of
manufacturing, retail, and services firms in the region.
Participants are selected to represent the industrial
structure of the District’s economy in terms of type, size,
and location. Over time, these surveys have proven to be
an accurate gauge of economic conditions in the region.
By comparing the surveys with benchmark measures for
the nation, such as the Institute for Supply Management
(ISM) manufacturing and nonmanufacturing indexes, they
provide insights into how the region is faring relative to the
nation as a whole. And, in mid-2011, all the indexes were
revised and updated for the first time in a decade, with the
introduction of recalculated seasonal factors over the entire
history of the surveys. While most indexes changed little,
the revised indexes conformed more closely to comparable national measures, without losing the distinctive characteristics of
Fifth District business activity.

[Chart: Richmond Manufacturing Composite Index vs. ISM Manufacturing Composite Index, monthly, January 1998 - January 2011]

[Chart: Richmond Service Sector Revenue Index vs. ISM Nonmanufacturing Business Activity Index, monthly, January 1998 - January 2011]

Manufacturing Survey
This survey covers a variety of economic measures,
including sales, orders, backlogs, inventory, employment,
and prices, as well as the expectation of most measures
over the next six months. The sales, orders, and employment indexes are combined as a weighted average to
create a composite index of manufacturing activity.
Not only do these indexes help explain how recessions
and recoveries are playing out in the District, but they
also provide region-specific measures that are not often
available on a monthly basis to monitor cyclical activity.
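As a concrete sketch of how such an index is built — the diffusion arithmetic follows the notes accompanying the data pages (percent of firms reporting an increase minus percent reporting a decrease), while the component weights below are illustrative placeholders, since the published weighting is not spelled out in this text:

# Build a diffusion index from survey responses and combine components into a composite.
def diffusion_index(responses):
    """responses: a list of 'increase', 'decrease', or 'no change' answers."""
    n = len(responses)
    up = sum(r == "increase" for r in responses)
    down = sum(r == "decrease" for r in responses)
    return 100.0 * (up - down) / n   # percent reporting increase minus percent reporting decrease

def composite_index(shipments, new_orders, employment, weights=(0.33, 0.40, 0.27)):
    """Weighted average of the three component indexes; the weights here are hypothetical."""
    w1, w2, w3 = weights
    return w1 * shipments + w2 * new_orders + w3 * employment

# Example: 45 of 100 firms report higher shipments and 30 report lower -> index of +15.
answers = ["increase"] * 45 + ["decrease"] * 30 + ["no change"] * 25
shipments_idx = diffusion_index(answers)            # 15.0
print(composite_index(shipments_idx, 10.0, 5.0))    # composite of the three component indexes

A positive reading means more firms report expansion than contraction, which is what allows the Richmond indexes to be tracked against national benchmarks.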
Service Sector Survey
This survey covers both retail and nonretail service
firms, capturing such economic measures as revenue,
employment, wages, and prices. And in the case of retail
firms, inventory, shopping traffic, and sales of big-ticket
items are tracked.

To find out what these surveys tell us about District economic activity, please see the District Digest section in Region Focus or
visit our website at http://www.richmondfed.org/research/regional_economy/surveys_of_business_conditions/index.cfm
Become a participant in one of our surveys and have your voice heard. It’s quick, easy, and confidential, and will give you the
opportunity to share your views on local business conditions. Please contact Judy Cox at judy.cox@rich.frb.org