
FIRST QUARTER 2012

THE FEDERAL RESERVE BANK OF RICHMOND

Sarbanes-Oxley 10 Years Later
Eurozone at a Crossroad
Interview with John B. Taylor

VOLUME 16, NUMBER 1
FIRST QUARTER 2012

COVER STORY

What We Don’t Know About Innovation: We know innovation
is important — but do we know how to make it happen?

Region Focus is the
economics magazine of the
Federal Reserve Bank of
Richmond. It covers economic
issues affecting the Fifth Federal
Reserve District and
the nation and is published
on a quarterly basis by the
Bank’s Research Department.
The Fifth District consists of the
District of Columbia,
Maryland, North Carolina,
South Carolina, Virginia,
and most of West Virginia.

FEATURES

The Dream Behind the Eurozone: How the region’s political aims
led to a complicated morning after

Open House: Will helping homeowners help the economy?

Sky Miles: Small airports support flexible flying

DIRECTOR OF RESEARCH
John A. Weinberg

EDITOR
Aaron Steelman

SENIOR EDITOR
David A. Price

MANAGING EDITOR
Kathy Constant

STAFF WRITERS
Renee Haltom
Betty Joyce Nash
Jessie Romero

EDITORIAL ASSOCIATE
Tim Sablik

DEPARTMENTS

President’s Message/The Limits of Limiting Financial Innovation
Upfront/Regional News at a Glance
Federal Reserve/When the Fed Makes Fiscal Policy
Policy Update/Money for Marrow?
Jargon Alert/Liquidity Trap
Research Spotlight/Playing the Waiting Game
Interview/John B. Taylor
Economic History/The Counterfeiting Weapon
Book Review/Grand Pursuit: The Story of Economic Genius
Around the Fed/Death and Taxes
District Digest/Economic Trends Across the Region
Opinion/Reflections on Sarbanes-Oxley 10 Years Later

CONTRIBUTORS

Charles Gerena
Rick Kaglic
Karl Rhodes
Sonya Ravindranath Waddell
DESIGN

ShazDesign
Published quarterly by
the Federal Reserve Bank
of Richmond
P.O. Box 27622
Richmond, VA 23261
www.richmondfed.org
www.twitter.com/RichFedResearch

Subscriptions and additional
copies: Available free of
charge through our website at
www.richmondfed.org/publications
or by calling Research
Publications at (800) 322-0565.
Reprints: Text may be reprinted
with the disclaimer in italics below.
Permission from the editor is
required before reprinting photos,
charts, and tables. Credit Region
Focus and send the editor a copy of
the publication in which the
reprinted material appears.
The views expressed in Region Focus
are those of the contributors and not
necessarily those of the Federal Reserve Bank
of Richmond or the Federal Reserve System.
ISSN 1093-1767


PRESIDENT’S MESSAGE

The Limits of Limiting Financial Innovation
Innovation typically brings to mind advances in technology and medicine, such as the personal computer
or the development of vaccines. Such innovations are
a key driver of economic growth, as discussed in the cover
story of this issue of Region Focus. But there is another kind
of innovation that plays a crucial role in our economy, one
that has received a great deal of attention in recent years:
financial innovation.
Modern financial innovations range from ATMs and
online banking to complex derivatives and currency swaps.
Many have had a positive effect on economic growth and
macroeconomic performance. Beginning in the 1980s, for
example, new credit products reduced borrowing costs for
both consumers and businesses, enabling them to better
smooth their consumption and investment in the face of
shocks, and potentially moderating the negative effects of
reduced spending and lending on the economy as a whole.
Other innovations, such as asset-backed securities or credit
default swaps, help to allocate capital and allow companies
and investors to protect themselves against risk.
Of course, many of these same products were at the heart
of the financial crisis of 2007-2008. Should that change the
way we think about the benefits of financial innovation?
I believe that it should not. At issue is not whether
financial innovation is inherently good or bad, but rather the
incentives market participants have to innovate, and the
regulatory environment in which they do so. In particular,
the size and ambiguity of the government financial safety
net give institutions an incentive to use financial innovations to take on excessive risk, believing they are insulated
from losses by an implicit government guarantee. According
to estimates by Richmond Fed researchers Nadezhda
Malysheva and John Walter, the safety net covered $25 trillion in liabilities at the end of 2009, or 59 percent of the
liabilities of the entire financial sector. Nearly two-thirds of that support is
implicit and ambiguous.
Outside the financial sector, the interests of innovators
tend to be aligned with the interests of society as a whole;
a new product or service generally will only be profitable if
it improves the well-being of households or businesses.
Within the financial sector, however, innovations often are a
means of “regulatory bypass,” an attempt to work around
the constraints imposed by regulators. For example, money
market mutual funds arose as a means of circumventing
regulatory constraints on deposit interest rates. Such innovations may offer legitimate benefits to end users, but
problems can arise when there is a mismatch between the
scope of prudential regulation and the size of the government safety net: Institutions that are not subject to
prudential regulation, but believe that they are part of the
safety net, often engage in increasingly risky behavior.

Prior to the financial crisis,
officials often followed a
policy of “constructive ambiguity” about the likelihood
of intervening. Policymakers
downplayed their willingness
to provide support, hoping
firms would limit their risk,
but still left room for intervention when necessary. In
practice, however, policymakers tended to intervene more
frequently, increasing the market’s expectations about the
likelihood of rescues.
The repurchase, or “repo,” market is an illustrative
example. A repo is a short-term collateralized loan that
provides borrowers with a low-cost way to finance a broad
range of assets and offers lenders an attractive rate of return
on a highly liquid investment. Repo financing becomes risky,
however, when lenders refuse to roll over their positions and
the borrower has trouble finding other ways of financing
its assets, as happened to Bear Stearns in March 2008. Bear
Stearns’ sale to JPMorgan Chase did benefit from government support — and the expectation of that support may
have led to a reliance on such fragile financial arrangements
in the first place.
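To make the mechanics concrete, here is a minimal sketch of the overnight repo arithmetic in Python; the collateral value, haircut, and rate are illustrative assumptions, not figures from this essay:

```python
# Minimal overnight repo arithmetic (all numbers hypothetical).
collateral_value = 100_000_000  # market value of pledged securities, in dollars
haircut = 0.02                  # lender funds only 98% of the collateral's value
repo_rate = 0.003               # assumed 0.3% annualized overnight rate

loan = collateral_value * (1 - haircut)   # cash the borrower receives today
interest = loan * repo_rate / 360         # one day's interest, money-market convention
repurchase_price = loan + interest        # what the borrower repays tomorrow

print(f"Borrow ${loan:,.0f} today; repay ${repurchase_price:,.2f} tomorrow.")
# If the lender declines to roll the loan over, the borrower must replace
# the full amount overnight -- the fragility described above.
```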
Policymakers can reduce this tension by clarifying the
boundaries of the financial safety net and making sure that
those within the safety net are subject to rigorous prudential
regulation. The response to the financial crisis has largely
focused on the latter. New regulations may succeed in
limiting risk insofar as they apply, but I believe the greater
concern is that we have not taken adequate steps to reduce
and clearly define the size of the safety net. Designing a
regulatory regime before we have determined the extent of
the safety net is “putting the cart before the horse.” Financial
firms and market participants will continue to have an
incentive to find innovations that benefit from the safety net
but bypass prudential regulation.
Enhancing prudential regulation is a valuable step
forward. But to start with, we must address the incentives
that lead to potentially harmful innovations in the first
place — and be careful not to limit those innovations that
do contribute to economic growth and well-being. RF

JEFFREY M. LACKER
PRESIDENT
FEDERAL RESERVE BANK OF RICHMOND


UPFRONT

Regional News at a Glance

Post-Panamax

East Coast Ports Prepare for Bigger Ships from the Panama Canal

[Photo: A vessel loaded with containers arrives at Norfolk’s port. Photography: Virginia Port Authority]

The Panama Canal is getting wider and deeper to accommodate the increasing size of
container ships in the world’s fleet. Likewise, the ports need to add depth and breadth.
And all eyes are on the Southeast, which has the nation’s
fastest-growing population, “but has the least capability
in terms of channel depth,” according to economist
Keith Hofseth of the U.S. Army Corps of Engineers, the
agency charged by Congress with deciding which port
expansions are a good taxpayer investment.
Of the major container cargo ports in the Southeast,
currently only Virginia’s Norfolk International
Terminals can accommodate “post-Panamax” vessels —
which carry more than twice the container volume as
those that can currently squeeze through the Panama
Canal — in all tidal conditions. The majority of
goods that come through Norfolk’s port, however, are
shipped to the Midwest and Upper Midwest, leaving no
port primarily serving Southern markets that can
receive the larger ships 24 hours a day. A feasibility study
for the Port of Charleston, which today can receive
fully loaded post-Panamax ships only at high tide, is
currently under way. The study alone will cost about
$20 million, and the deepening has an estimated price
tag of $300 million, according to spokeswoman Allison
Skipper. In July 2010, the Port of Savannah — second on
the East Coast in container volume — received word
that the Corps of Engineers approved its deepening
project based on engineering feasibility, economic viability, and environmental acceptability. Other possible
recipients of federal dollars include ports in Miami,
Tampa, Mobile, and Houston.
Though port projects may be justified, “now the
scramble will be to find the money to fund these investments,” says Hofseth. The Corps, a federal agency,
shares the cost of dredging projects with the ports,
which can raise money through bond issues, and local
governments. But the Corps’ historical budget implies
that it is unlikely that all ports will get money to expand,
he says. Expansion entails studies such as those
conducted in Savannah, which take years — ports not
already deep enough won’t be by the time the expanded
Canal is open in 2014 — and then ports that win
approval from required federal agencies will be in
competition for an undetermined pool of federal dredge
money. In preparation for this battle, ports have been
touting the regional economic benefits of greater port
traffic, as well as their respective port infrastructures
and proximities to regional rail lines.
For them, the stakes are potentially high. More than
95 percent of cargo imports to the United States arrive
by ship, much from China and other Asian countries.
Most imports from Asia that end up on the East Coast
initially arrive on the West Coast and travel east by rail,
highway, and waterway. Only a fifth arrive in East Coast
ports via the Panama Canal. A small fraction of imports
travel the globe in the opposite direction through
Egypt’s Suez Canal.
East Coast ports plan for their arrivals to grow. But
others in the shipping business say the Canal’s expansion won’t bring a sudden flood of new traffic. Not every
port can win, says Theodore Prince, a former rail and
ocean carrier executive. He heads a shipping consulting
firm based in Richmond. As ships get bigger, they call on
fewer ports per trip. “The big ships only make money when
they’re moving.” Also, ground and waterway transport from
the West — train, truck, and barge — is faster, he says.
A slight cost savings from a new shipping route often is not
worth adding several days to transit since that increases the
time between production and delivery. Some shippers are
so anxious to minimize this window that final destinations
are determined once the cargo has arrived in the United
States, based on to-the-minute forecasts of local demand.
The Corps of Engineers will wrap up a report in June
2012 on the options for port and waterway expansions to
accommodate post-Panamax vessels, but it will not recommend which ports should receive priority. The decision of
where to direct limited funds will be Congress’ to make.
— RENEE HALTOM

District Disparity

Income Gap Grows in Nation’s Capital, Narrows in Suburbs
[Chart: Washington, D.C., population, 1870–2010, in thousands of residents; series: Total, Black, White. Source: U.S. Census Bureau]

The income gap between whites and blacks living in
Washington, D.C., is among the highest in the nation,
according to U.S. Census Bureau figures released in
December. On average, white residents earn $3.04 for
every $1 earned by black residents. That’s the fourth-highest earnings difference in the nation when compared to
the 700 U.S. counties with the largest black populations,
according to William Frey, a demographer and senior
fellow at the Brookings Institution, who has analyzed the
data. The greatest white-black disparity was $4.15 to $1 in
New York County (Manhattan). D.C.’s gap has increased
modestly during the past two decades; in the 1990 census,
the ratio was $2.95 for every $1.
Washington, D.C., scores high on other measures of
income inequality as well. Residents in the 90th income
percentile earn nearly 22 times as much as those in the
10th percentile, compared to the national average of 11
times. Among cities of more than 100,000 people, only
Atlanta and New Orleans showed greater overall income
disparities.
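For readers who want the definitions behind these figures, here is a minimal sketch of how both ratios are computed; the earnings data below are hypothetical stand-ins, not the Census figures:

```python
import numpy as np

# Hypothetical earnings data (illustrative only; not the Census figures).
rng = np.random.default_rng(seed=1)
earnings = rng.lognormal(mean=10.8, sigma=0.9, size=10_000)  # dollars per year

# 90/10 ratio: earnings at the 90th percentile per dollar at the 10th.
p10, p90 = np.percentile(earnings, [10, 90])
print(f"90/10 ratio: {p90 / p10:.1f} to 1")

# Group earnings ratio: average earnings of one group per $1 of another's.
group_a_avg, group_b_avg = 91_200, 30_000  # hypothetical group averages
print(f"${group_a_avg / group_b_avg:.2f} earned per $1")  # prints $3.04 per $1
```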
The black-white gap has narrowed in the Maryland and
Virginia suburbs of D.C., however. In many of the counties
in the metro area, the gaps are among the lowest in
the nation. In Stafford County, Va., whites earn $1.17 per $1
earned by blacks, compared to $1.40 per $1 in 1990.
In Prince George’s County, Md., the nation’s wealthiest
majority-black county, the ratio has declined from
$1.33 per $1 to $1.22 per $1 over that same period.
Of the 21 U.S. counties with the highest per capita incomes
for blacks, 10 are in the D.C. metro area, including No. 1,
Loudoun County, Va.
The growing income gap in the city reflects a reversal of
migration patterns that date back to the Civil War.
Washington, D.C., became the first majority-black city in
the United States in 1957 as blacks were drawn by federal

jobs, which provided a path to upward mobility.
“D.C. was one of the first cities where blacks were
able to develop institutions and neighborhoods,” says
Roderick Harrison, a sociologist at Howard University and
the former chief of racial statistics at the Census Bureau.
“It is the earliest black middle class that one sees emerge.”
After World War II, white residents began moving to
the suburbs, while the city’s black population continued to
grow. By 1970, 70 percent of D.C. residents were black.
More recently, the trend has reversed as black
residents have moved to the suburbs. Nationwide, the
share of blacks in large metro areas who live in the suburbs
increased from 37 percent in 1990 to 51 percent in 2010;
the D.C. metro area had the third-highest increase.
Within the city, the black population fell more than
11 percent between 2000 and 2010, according to 2010
census data, and in 2011 the black share of the city’s population fell below 50 percent for the
first time since 1957. “D.C. was really at the forefront of
suburbanization,” says Frey. “The government jobs that
were available provided a good opportunity for socioeconomic mobility for African-Americans for many decades
before affirmative action and the real impact of the civil
rights movement,” which enabled many black residents to
follow whites out to the suburbs.
Not everyone who leaves the city is upper or middle
class; many of the suburbs’ new residents are from Wards
7 and 8, where poverty rates are as high as 40 percent. Still,
some residents have the means to cross the line into
neighboring Maryland, which increases the poverty concentrations in the neighborhoods they left behind,
Harrison says.

At the same time, gentrification and a reversal of the
previous decades’ white flight has brought new people and
businesses to the city. Median income growth was the
third highest in the nation during the 2000s. D.C.’s white
population increased 31 percent between 2000 and 2010,
and most are college educated. Nine in 10 whites in the
city have a college degree; two in 10 blacks do. New
government and high-tech jobs often require a college
degree, which bodes ill for D.C.’s income gap.
— JESSIE ROMERO

Copper Robbers

Rising Prices Fuel Thefts
Thieves in Charles Town, W.Va., recently struck an
elementary school, but their target wasn’t computers
or craft supplies — it was copper coils housed within heating and air-conditioning units. Across the country, homes,
businesses, and churches have reported stolen copper gutters or plumbing. Phone and power service has been
disrupted as telecom and power companies struggle to
replace copper wiring faster than it goes missing.
“It’s almost a weekly occurrence,” says Dan Page,
communications manager for Frontier Communications,
which provides landline phone and broadband Internet
services to roughly 95 percent of West Virginia. “Since
August 2011, we’ve had more than 120 aerial cable thefts.”
Like gold and silver, copper is an excellent conductor
of electricity. But unlike its more precious relatives,
copper has historically been much cheaper and more
readily available, leading to its ubiquitous use in infrastructure. Since 2003, demand from industrializing
countries like China has outpaced supply, as low prices in
the previous decade had led copper miners to scale back
production. As a result, the price of copper has climbed to
record highs — from less than a dollar per pound in 2002
to roughly $3 per pound today in 2002 dollars.
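Expressing today’s price “in 2002 dollars” simply means deflating the nominal price by the change in the overall price level. A minimal sketch, with rounded CPI levels assumed for illustration:

```python
# Converting a nominal price into constant 2002 dollars.
# CPI levels here are rounded assumptions, not figures from the article.
cpi_2002 = 179.9      # assumed average CPI-U, 2002
cpi_2012 = 229.4      # assumed CPI-U, early 2012
nominal_price = 3.80  # hypothetical nominal copper price, $/lb

real_price_2002_dollars = nominal_price * (cpi_2002 / cpi_2012)
print(f"About ${real_price_2002_dollars:.2f}/lb in 2002 dollars")  # ~ $3/lb
```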
Michael Baylous, public information officer for the
West Virginia State Police, says the sour economy makes
copper theft more attractive. The payoff for stolen
copper is often dwarfed by the damage thieves leave
behind, however. At the Charles Town school, the stolen
copper coils would fetch about $800 from scrap dealers,
but repairs will cost more than $300,000. Likewise, Page
reports that Frontier has spent $680,000 to replace stolen
cable in its coverage area.
Copper theft is also a problem elsewhere in the
District. Verizon employees reported missing copper wire
in Maryland, leading to the arrest of thieves who were
using a bucket truck to remove wires from telephone
poles. Dominion Virginia Power, which serves customers
in Virginia and North Carolina, says that copper theft has
resulted in power outages along highways, at airports, and
even at military bases. The Federal Bureau of Investigation
issued a report in 2008 warning that copper theft posed a
growing threat to infrastructure across the country.

[Photo: A Frontier Communications employee surveys the damage left behind by copper thieves in West Virginia. The company has replaced 38,000 feet of stolen cable since August 2011. Photography: Courtesy of Frontier Communications]
“We are very concerned about the public safety risks
that are associated with cable theft,” says Page. “People who
live in areas with no other means of communication are
losing their link to friends, family, and emergency care.”
States in the Fifth District and elsewhere have been
exploring methods to deter copper thieves. Virginia and
South Carolina require scrap metal sellers to obtain
permits; scrap yard employees are also required to record
identification information about each seller. Both states
also require some scrap copper sales to be paid by check,
which has helped to reduce theft. North Carolina is modeling its new laws on South Carolina’s.
West Virginia has increased the fines for metal theft.
Police are now authorized to stop vehicles suspected of
carrying stolen metal. Telecommunication companies,
utilities, and recyclers support the law.
Those deterrents may not be enough, however, given
the risks some thieves take to obtain copper. In February,
police in Princess Anne, Md., found the remains of a man
who was electrocuted while trying to steal copper
from an electrical transformer. He was not the first would-be thief to suffer that fate.
“I don’t know that you can fully deter this behavior,”
says Baylous.
— TIM SABLIK

BYOB: Bring Your Own Bag

Montgomery County, Md., Taxes Shopping Bags
“Paper or plastic?” is a question most grocery
shoppers expect at the checkout counter. But in
Montgomery County, Md., that question has changed.
Now it’s, “Did you bring a bag?” Since the start of the
year, shoppers have paid an extra nickel for any bags
that retail stores provide.
The new tax is intended to reduce litter in the county’s
waterways. A 2008-09 study prepared by the Anacostia
Watershed Society found 121 tons of litter in county
streams and rivers, and shopping bags were a major
component. In 2009, Montgomery County spent
$3 million on litter cleanup. The county plans to use
revenue from the new tax to pay for future cleanup
efforts. Although retailers previously did not charge extra
for the bags, legislators argued that the bags’ true
costs included the litter they generated.
Economists refer to such costs borne by third parties as
“negative externalities.” In this case, the public must pay
for litter cleanup. The new tax ensures that at least some
of the externality is priced into the bags customers use.
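A back-of-the-envelope calculation shows the logic; only the $3 million cleanup figure comes from this story, while the annual bag count is an assumption for illustration:

```python
# Back-of-the-envelope externality pricing. Only the $3 million cleanup
# figure comes from the article; the bag count is a hypothetical assumption.
annual_cleanup_cost = 3_000_000  # county litter cleanup spending, dollars
annual_bags_used = 80_000_000    # assumed disposable bags used per year

externality_per_bag = annual_cleanup_cost / annual_bags_used
print(f"Implied external cost: {externality_per_bag * 100:.1f} cents per bag")
# A nickel per bag need not match this number exactly; the point is that
# each bag now carries part of the cleanup cost it helps create.
```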
But covering cleanup costs isn’t the only reason legislators passed the law. By creating a cost for using
disposable bags, they hope the law will induce shoppers to
bring their own reusable bags instead.
Montgomery County based its law on one enacted two
years earlier in neighboring Washington, D.C. Evidence
shows that D.C.’s tax is having an impact on disposable
bag use. Although the D.C. tax only applies to stores that
sell food or alcohol, the District Department of the
Environment reports that 75 percent of surveyed
shoppers reduced their disposable bag use after the fee
became effective. Surveyed businesses reported bag
consumption dropped by half.
To date, the District tax has generated $3.9 million
since January 2010, less than forecast. For the measure’s
supporters, the lower revenue is a sign of the tax’s success,
not its failure; it means that more shoppers are changing
their behavior to avoid it. Initial reports show the
Montgomery County tax raised $154,000 in January 2012,
roughly the amount raised by the District’s bag tax in its
first month.
Of course, disincentives are not the only way to
change behavior. Some supermarket chains, including
Kroger and Safeway, have experimented with offering
5-cent rebates to customers who bring reusable bags. But
Kroger reported no difference in reusable bag usage at its
stores that offered a rebate. Early evidence from
Montgomery County and the District suggests that the
stick has been more effective in changing shopper
behavior than the carrot. Officials in other Maryland
counties will be observing the tax’s impact, as some
consider similar laws.
— TIM SABLIK

$19.20 to Cross NC in 2019?

More Fifth District States Propose Tolls on I-95
North Carolina and Virginia are pursuing plans to
place tolls on Interstate 95 to fund major improvements to the Fifth District’s most traveled highway.
Currently there are no tolls on I-95 south of Maryland.
In February, the Federal Highway Administration
granted conditional approval for tolls on I-95 in
North Carolina as part of the agency’s Interstate System
Reconstruction and Rehabilitation Pilot Program.
Virginia received conditional approval in September 2011
to switch a pending toll proposal from Interstate 81
to I-95. The federal initiative allows up to three states to
place new tolls on existing interstates that cannot
be adequately maintained and improved with other
sources of funding. (Missouri is the third state in the
program.)
North Carolina’s ambitious proposal would expand
and improve I-95 throughout the state. The N.C.
Department of Transportation estimates the cost at

between $10.3 billion and $12
billion to reconstruct, expand,
maintain, and operate the 181-mile stretch over 40 years. The
first phase would widen and
improve I-95 from its junction
with Interstate 40 near Benson
to its interchange with state
Route 211 at Lumberton. This
61-mile segment has the highest
levels of existing and projected
traffic on I-95 within the state,
according to North Carolina’s January 2012 “I-95
Planning and Finance Study.”

[Rendering: An electronic toll zone under construction in Richmond shows what I-95 toll zones might look like. Courtesy of the Richmond Metropolitan Authority]
The study recommends nine toll zones at approximately 20-mile intervals plus ramp toll zones at adjacent
interchanges. This would discourage out-of-state
[continued on page 41]


FEDERAL RESERVE
When the Fed Conducts Credit Policy
BY DAVID A. PRICE

As the Fed targets lending to help
specific sectors or institutions,
does it jeopardize its independence?

The U.S. Constitution states, “No Money shall be
drawn from the Treasury, but in Consequence of
Appropriations made by Law; and a regular Statement and Account of the Receipts and Expenditures of all
public Money shall be published from time to time.” The
power to appropriate money, which James Madison called
“the power over the purse,” seemingly gives Congress the
sole control of fiscal policy.
But just what is fiscal policy? In the view of some, the Fed
has made and carried out what amounts to fiscal policy on a
significant scale at various times in the past half-century,
most recently in response to the 2008 financial crisis.
In particular, the Fed has engaged in fiscal policy in credit
markets, also known as “credit policy,” in which it directly or
indirectly channels credit to private entities and foreign
central banks — in contrast with monetary policy, in which
the Fed creates bank reserves by purchasing only Treasury
securities from the public and holding them while turning
the interest over to the Treasury.
To be sure, the Fed has long extended credit to private
banks through its discount window. The scale of that lending
has been quite limited, however, in comparison to the Fed’s
actions following the financial crisis. Fed lending to private
entities and other central banks reached $1.5 trillion by the
end of 2008. Lending to foreign central banks had a modest
resurgence earlier this year, peaking at $109 billion in mid-February.
The Fed has legal authority to carry out such activities
thanks to emergency powers that Congress granted it in 1932
and expanded in 1991. In the eyes of some critics, however,
that authority is problematic because it more appropriately
belongs to the Treasury than the Fed.
“In fall 2008, the Treasury could have issued debt to fund
emergency actions, but that would have been politically
difficult,” says Richmond Fed economist Robert Hetzel.
“If you think that fiscal policy should be subject to democratic monitoring because it’s in the spirit of the
Constitution, that’s exactly the sort of political debate you
want to have. But it’s painful.”
Moreover, there is concern that such activities could
make it more difficult for the Fed to maintain its independence when conducting its core functions, especially the
setting of monetary policy. “The Fed basically made itself
an active player in fiscal policy,” says Charles Calomiris
of the Columbia University Graduate School of Business.
“The consequence is that the Fed loses its ability to have
itself viewed as outside the political process of spending and
taxing.”

Discount-Window Lending to Troubled Institutions
Discount-window lending by Federal Reserve district banks
provides liquidity on a short-term basis, usually overnight, to
depository institutions. The longtime dictum of central
banking has been that the discount window should be open
only to illiquid banks, not insolvent ones — that is, only to
banks that are sound, but which are facing a temporary
liquidity crunch. The extent to which the Fed carries out
lending through the discount window has been limited by its
short-term nature, by the constraint on the types of institutions with access to the window (supervised depository
institutions), and, in theory, by the requirement that the
institution not be in distress.
A rationale for closing the discount window to distressed
institutions is to avoid putting taxpayer funds at risk.
Institutions that borrow at the discount window must
pledge collateral, but such lending still creates risk indirectly: By enabling a distressed bank to make payments to
uninsured depositors and unsecured creditors, loans to a
distressed bank effectively move the deposit insurer — the
Federal Deposit Insurance Corporation — to the back of
the line if the bank’s distress reaches a point when the FDIC
must intervene.
The Fed’s district banks have not always heeded the
dictum to lend only to illiquid institutions, however. A 1992
paper by Anna J. Schwartz of the National Bureau of
Economic Research looked at discount-window lending
from January 1, 1985, to May 10, 1991, including the financial-strength scores that regulators had assigned to the
institutions. The scores were so-called CAMEL ratings (for
Capital adequacy, Asset quality, Management, Earnings,
Liquidity). Of the 530 borrowers that failed within three
years of the start of their discount-window borrowing, more
than 82 percent had a CAMEL rating of 5 at the time of their
borrowing, the rating reserved for “institutions with an
extremely high immediate or near-term probability of
failure.” More than 90 percent had a rating of 4 or 5.
“These loans were granted almost daily to institutions
with a high probability of insolvency in the near term, new
borrowings rolling over balances due,” Schwartz observed.
“In aggregate, the loans of this group at the time of failure
amounted to $8.3 billion, of which $7.9 billion was extended
when the institutions were operating with a CAMEL
5 rating.”
Earlier in the Fed’s history, two of the most famous
instances of Fed credit policymaking took place through the
discount window. The distress of Continental Illinois
National Bank and Trust Company in 1984, then the
seventh-largest bank in the country, worried policymakers
who perceived the bank as too big to fail. Despite the bank’s
effective insolvency, the Fed granted the bank and its holding company access to the discount window from May 1984
through February 1985 to keep its doors open, with the total
loan balances reaching as high as $8 billion ($17 billion in
present-day dollars).
Another episode grew out of the bankruptcy of the Penn
Central Transportation Co. in 1970. The Fed believed that a
financial crisis might result if the company defaulted on its
$82 million in outstanding commercial paper because that
might cause lenders to shy away from commercial paper in
general. After Congress declined to authorize fiscal action to
bail out the company, the Fed channeled credit to commercial paper markets indirectly by, in the words of its 1970
annual report, making clear that “the Federal Reserve
discount window would be available to assist banks in meeting the needs of businesses unable to roll over maturing
commercial paper.”
The Continental Illinois and Penn Central cases
remained the high-water marks of Fed credit policy for
nearly a quarter-century — until the summer of 2007.

Emergency Lending After the Financial Crisis
On the eve of the 2007 havoc in mortgage-backed securities
markets, the Fed had long followed a policy known as
“Treasuries only”: It held mainly Treasury securities and discount-window collateral. This policy both avoided the
exercise of fiscal power and kept risky assets off the Fed’s
balance sheet.
In response to the emerging financial crisis, the Fed instituted a series of major actions, the first of which was the
Term Auction Facility, or TAF. Open only to depository
institutions, the TAF was similar in concept to the discount
window, except that it relied on an auction mechanism to
control the volume of lending and to increase the anonymity of the borrowing banks. (Because the Fed publishes the
total weekly lending of each of the district banks, it is possible under some circumstances for banks to surmise which
other banks have borrowed from the discount window;
some observers believe this may inhibit discount-window
borrowing.) TAF loans, which had terms of 28 days or 84
days, were also longer-term than discount-window loans.
The total of TAF loans outstanding reached a peak of $493
billion in March 2009.
The Fed used its emergency powers to create a wide-ranging array of additional programs. Some of these
programs were based on a belief that certain financial markets were not functioning adequately. From January 2009 to
March 2010, to support housing and mortgage markets, the
Fed purchased $1.25 trillion of mortgage-backed securities
guaranteed by Fannie Mae, Freddie Mac, and Ginnie Mae.
To improve the market for asset-backed securities, such as

securitized auto loans and credit-card loans, the Fed created
the Term Asset-Backed Securities Loan Facility (TALF) in
November 2008 to make loans to owners of those securities.
The program peaked in March 2010 with assets of
$48.2 billion. The Fed also created lending facilities to
provide support to commercial paper, money-market funds,
and securities broker-dealers.
Most controversially, the Fed extended credit to rescue
the investment bank and securities firm Bear Stearns
Companies and the insurance company American
International Group (AIG). When Bear Stearns was poised
to collapse in March 2008, the Fed concluded that its failure
would destabilize the financial system. The Fed, acting
through the New York Fed, therefore used its emergency
powers to clean up the company’s balance sheet and facilitate its acquisition by JPMorgan Chase. The New York Fed
created a company called Maiden Lane for the purpose of
buying various risky assets from Bear Stearns and loaned
Maiden Lane $29 billion with which to do so.
In the case of AIG, the Fed believed that the global financial system would be at risk if the company failed and were
unable to make good on its credit-default swap (CDS) agreements. (Roughly speaking, CDS agreements are similar to
insurance against a borrower’s default.) The Fed announced
in September 2008 that it would provide the company an
$85 billion line of credit; later that year, it also formed two
companies, Maiden Lane II and Maiden Lane III, and
extended credit to them so that the former could purchase
mortgage-backed securities from AIG and the latter could
purchase collateralized debt obligations that AIG had
insured with its CDS agreements. Maiden Lane II borrowed
$19.5 billion from the Fed and Maiden Lane III borrowed
$24.3 billion.
The Fed has since arranged for Maiden Lane II to sell its
holdings and repay all of its loans. Although Maiden Lane
and Maiden Lane III have repaid most of their loan balances, the Fed still has some loans to those entities on its
balance sheet.
The Fed’s rescue operations for nonbanks were based on
an expansion of its emergency powers by the Federal
Deposit Insurance Corporation Improvement Act of 1991,
which freed the Fed from longstanding requirements concerning the quality of collateral from nonbanks. Federal law
in the past had generally allowed the Fed to provide emergency assistance to nonbanks only if the institutions’
intended use of the borrowings fell within a narrow set of
eligible purposes or if those institutions pledged collateral of
the same type required from member banks at the discount
window. The Fed had not used its emergency power to lend
to nonbanks since the 1930s. (Shortly after passage of the
1991 law, Walker Todd, then of the Cleveland Fed, expressed
concern in an article that “greater potential access to the
federal financial safety net could boost the risk-taking
incentives for nonbanks.”)
The rescue programs created in response to the financial
crisis were criticized from across the political spectrum
within Congress and elsewhere. In the Dodd-Frank Wall
Street Reform and Consumer Protection Act of 2010,
Congress responded in part by narrowing the Fed’s emergency lending powers. Among other restrictions, the Act
requires that any Fed lending programs must have “broad-based eligibility,” that they must be for the purpose of
providing liquidity to the financial system rather than to aid
a failing financial company, and that they must be approved
by the Secretary of the Treasury.
Columbia’s Calomiris argues that it was proper for the
Fed to use its emergency powers in situations where it was
not feasible to go to the Treasury or Congress because time
was of the essence — but, he says, that this was not the case
for most of the emergency programs. “No one can argue that
there wasn’t enough time for Congress and the Treasury to
act on the mortgage-backed securities markets,” he says.
“That was not a policy that was done over a weekend.”
Although such rescue operations could have been carried
out by the Treasury, relying on the Fed’s emergency powers is
attractive from the perspective of policymakers, says Marvin
Goodfriend, an economist at Carnegie Mellon University’s
Tepper School of Business and formerly senior vice president and policy advisor at the Richmond Fed. It avoids the
delays and uncertainties of the political process, and it
avoids an increase in the federal deficit (since Fed lending
does not count in the deficit as it is officially measured). But
the very existence of those powers may have fueled the
perception that large failing institutions would be rescued.
“It was the expansive credit powers granted by Congress
that made it virtually inevitable that those powers would be
exercised in a crisis in the future,” Goodfriend says.

Currency Swaps
To help foreign economies deal with the aftermath of the
financial crisis, the Fed established currency swap lines (also
known as liquidity swap lines) with numerous other central
banks, starting with the European Central Bank in
December 2007. The programs enable the foreign central
banks to offer short-term dollar loans to banks within their
jurisdiction using funds that the Fed has loaned to the central banks. The initial wave of swap programs continued
until February 2010. Swap programs with five central banks
were relaunched in May of that year and remain in operation. The Federal Open Market Committee (FOMC) voted
on November 28, 2011, to authorize the programs through
February 1, 2013, and to establish swap arrangements in the
currencies of the foreign central banks.
The programs are generally not regarded as a subsidy to
the foreign central banks or as a financial risk to the Fed. The
Fed charges the central banks a market-based interest rate.
The Fed suffers no exchange-rate risk since the exchange rate

is the same in both directions of the transaction. The foreign
central bank is responsible for covering any defaults.
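A stylized sketch of one swap’s cash flows, with hypothetical amounts, rate, and tenor, shows why: the dollars go out and come back at the same exchange rate.

```python
# Stylized central bank liquidity swap (amounts, rate, and tenor assumed).
eur_usd = 1.30                       # spot rate, locked in for both legs
dollars_lent = 1_000_000_000         # dollars the Fed provides up front
euros_held = dollars_lent / eur_usd  # euros the Fed holds until maturity

rate, days = 0.01, 7  # assumed market-based dollar rate and tenor
interest = dollars_lent * rate * days / 360

# At maturity the foreign central bank returns the dollars plus interest
# and takes back its euros at the SAME 1.30 rate, so a move in EUR/USD
# between the two legs never creates a gain or loss for the Fed.
dollars_returned = dollars_lent + interest
print(f"Fed receives ${dollars_returned:,.0f} and returns {euros_held:,.0f} euros.")
```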
The programs pose the institutional risk of increased
pressure on the Fed’s political independence. Some wonder
whether the Treasury, rather than the Fed, should fund any
such programs.
“There’s no reason in theory why it couldn’t be done
through the Treasury through the Exchange Stabilization
Fund, but there are always issues in lending to a foreign
country,” says Hetzel of the Richmond Fed. “Foreign aid is
subject to a lot of debate.”
Richmond Fed president Jeffrey Lacker dissented from
the November 28 vote by the FOMC to extend the programs. (Lacker voted as an alternate to then-voting member
Charles Plosser, president of the Philadelphia Fed.) Lacker
explained in a statement that he opposed the currency swap
programs because such lending “amounts to fiscal policy,
which I believe is the responsibility of the U.S. Treasury.”

Maintaining a Boundary
The Fed and the Treasury Department entered into a formal
accord in 1951 establishing that the Fed would carry out
monetary policy only to stabilize the economy, not to serve
the Treasury’s borrowing needs. The historic agreement was
a reversal of a practice in place since World War II, in which
the Fed used monetary policy to reduce the cost of Treasury
borrowing. Goodfriend and others have argued that the
temptation for policymakers to rely on the Fed to engage in
fiscal policy warrants a new Fed-Treasury accord to maintain
a boundary between their functions. Goodfriend argued in a
1994 article that among the principles of such an accord
should be that liquidity assistance, such as discount-window
lending, must not assist insolvent institutions (a principle
since incorporated into the Dodd-Frank Act) and that the
Fed should not use its balance sheet to “fund expenditures
that ought to get explicit Congressional authorization.”
The Fed and the Treasury Department did issue a statement in March of 2009 on the delineation of responsibilities
of the two institutions. While the statement indicated that
“decisions to influence the allocation of credit are the
province of the fiscal authorities,” and pledged Treasury’s
help in removing the Maiden Lane assets from the Fed’s balance sheet, it largely reaffirmed the Fed’s continued
long-term use of its emergency lending powers.
What extraordinary steps should the Fed be able to take
on its own in the midst of a potential financial catastrophe,
and when should policymakers be obliged to trudge, hat in
hand, to Capitol Hill to ask elected representatives for
approval? In the wake of the worst financial crisis since the
Great Depression, these questions remain only partially
answered. RF

READINGS
Goodfriend, Marvin. “Central Banking in the Credit Turmoil:
An Assessment of Federal Reserve Practice.” Journal of Monetary
Economics, January 2011, vol. 58, no. 1, pp. 1-12.


Lacker, Jeffrey M. “Government Lending and Monetary Policy.”
Speech at the Washington Economic Policy Conference in
Alexandria, Va., March 2, 2009.

POLICY UPDATE
Money for Marrow?
BY TIM SABLIK

Roughly 5,000 people in the United States receive
blood stem cells from a bone marrow transplant
each year, but twice as many patients are
diagnosed with blood diseases, such as leukemia, for which
blood stem cells may be the best or only treatment. One
California nonprofit hopes to close that supply gap by offering incentives in the form of scholarships or housing
subsidies to donors with rare bone marrow types. But until
a recent court decision, such a plan would have been illegal.
The 1984 National Organ Transplant Act (NOTA)
bans compensation for organs, including bone marrow.
MoreMarrowDonors.org, which plans to offer incentives to
donors, was part of a group that filed suit in federal district
court in California arguing that certain donations are outside the scope of NOTA. They conceded that the law may
have originally included bone marrow to protect donors
from a painful and potentially risky process. When NOTA
was written, bone marrow was extracted directly from the
hip bone via a large needle. The majority of marrow donations today, however, are collected using a less invasive
process called apheresis. Donors are given medication to
accelerate the production of blood stem cells, which are
what transplant recipients need rather than the bone marrow itself. These cells can then be separated from the
donor’s blood through the same process used for donations
of other blood components, such as platelets or plasma.
The process is both less risky and much less painful.
The U.S. Court of Appeals for the Ninth Circuit ruled in
December 2011 that since compensation for blood components is not prohibited under NOTA, compensation for
blood stem cells obtained using apheresis is also legal. (The
court did not address the constitutionality of the ban in
NOTA.) The U.S. Justice Department asked the court to
reconsider the decision, arguing that NOTA covers bone
marrow stem cells regardless of how they are obtained, but
the court rejected that request.
Economists have long argued in favor of some sort of
market system to address widespread organ shortages.
According to data from the United Network for Organ
Sharing, headquartered in Richmond, there are 10,065
individuals waiting for an organ transplant in the Fifth
District, and two-thirds of them have been waiting for a year
or longer.
Economic theory predicts that increasing the price
of a good will induce more sellers to enter a market,
increasing supply. For example, blood banks in the
United States regularly compensate people for donating
blood plasma, and this has not only prevented a
shortage, it has also resulted in a surplus — the United
States supplies about half of the world’s plasma,
exporting to countries that don’t compensate donors.
Researchers Nicola Lacetera of the University of
Toronto, Robert Slonim of the University of Sydney, and
Mario Macis of Johns Hopkins University conducted a field
experiment to see how economic incentives would affect
general blood donations, which are often not compensated.
They offered gift cards in varying denominations at Red
Cross blood drives. They found that donations increased at
drives offering incentives, and that effect rose with the value
of the reward. Also, donors at those drives were more likely
to persuade others to donate with them.
“Based on the results of our study on blood, and given the
similarities between blood donation and bone marrow
apheresis, I do expect marrow donations to increase when
compensation is allowed,” says Macis.
Blood and bone marrow are naturally replenished by
donors’ bodies, but most internal organs are not. Opponents
of compensation for all organ transplants have argued that a
marketplace for organs that can’t be regenerated, such as
kidneys, would exploit the poor and the desperate, who
would be most likely to face situations in which they feel
that selling organs is their only option.
“People deplore the degrading sale, a sale made in desperation, especially when the seller is selling something so
precious as a part of his own body,” Leon Kass, a professor
emeritus at the University of Chicago and the former chairman of the President’s Council on Bioethics, wrote in 1991.
Kass acknowledged that allowing for the sale of organs
could increase the supply. But Kass and other bioethicists
express moral aversion to putting a price tag on a human being.
“The idea of commodification of human flesh repels us,
quite properly I would say, because we sense that the human
body especially belongs in that category of things that defy
or resist commensuration,” wrote Kass.
Macis argues that the moral objection can go the other
way, as well. If two consenting parties agree to a transaction
that they believe can make each better off, then one could
raise a moral objection to a third party prohibiting that
exchange from taking place. In addition to purely economic
exchanges, Macis says there are other ways to provide incentives to organ donors. He cites the example of a “priority
rule,” implemented in Israel and Singapore, which grants
registered donors priority on organ waiting lists, reassuring
them that their generosity will be repaid if they find themselves in need of an organ.
Although the ruling from the Ninth Circuit is not likely
to result in immediate changes to organ donation in the
United States beyond blood stem cells, it has once again
raised the question of how best to solve a supply shortage
that confronts patients and doctors daily.
RF


JARGON ALERT

Liquidity Trap
BY RENEE HALTOM
Are there times when the central bank is powerless
to stimulate short-term economic growth? Some
economists say the economy is currently in such a
situation, often called a “liquidity trap.”
The phrase has a nebulous definition in economics due to
changes in the underlying theory since John Maynard
Keynes first introduced the concept in the 1930s. The broadest definition is a situation in which monetary policy cannot
stimulate the economy — the “trap” part — possibly
because interest rates have already been pushed to zero.
They have been at zero or close to it since December 2008.
A more precise definition of a liquidity trap is a situation
in which people have a virtually endless
demand to hold cash — an endless
demand for liquidity — relative to
other assets. In that situation, the
central bank’s increases to the money
supply fail to translate into increased
consumption or investment to get the
economy churning because the private
sector simply holds on to the cash.
The impotence of monetary policy in a
liquidity trap is often asserted to justify
alternative policies like fiscal stimulus.
Yet it would be hard to determine in
real time that the Fed’s expansionary
efforts are having no effect on the
economy given the myriad of competing influences — as a
practical matter, it would be knowable only after the fact.
Therefore, that definition may not offer much insight for
real-time policymaking.
Many economists argue that liquidity traps can’t occur at
all. Economic research suggests that central banks are far
from powerless when interest rates hit zero. For example,
quantitative easing has allowed the Fed to pump the banking
system full of excess liquidity to push down lending rates.
Additionally, the central bank can effectively ease lending
conditions further by creating expectations that policy will
remain stimulative, as the Fed has attempted to do since
August 2011 by stating that it expected to keep interest rates
very low for the foreseeable future. Financial markets
appeared to respond positively when each of these policies
were announced, suggesting that market participants don’t
believe the Fed’s policies to be impotent.
In fact, there is, in principle, no limit to how much
money the central bank could create; at an extreme, it could
purchase every interest-bearing asset in the economy.
Before reaching that point, people would likely start to bid
up the prices of nonmoney assets, making investment more
attractive and kickstarting economic activity.

What many economists seem to mean when they discuss
a liquidity trap is a limit on the central bank’s willingness to
stimulate the economy further rather than its ability to do
so. That is, there are costs to monetary expansion, the most
obvious being the risk of generating inflation.
Inflation has been contained since the Fed reached the
zero bound, but policymakers might, nonetheless, judge that
the economy will heal on its own with fewer costs than a
recovery encouraged by additional monetary stimulus. For
example, some economists, such as Philadelphia Fed
President Charles Plosser, have argued that easier monetary
policy could cause financial market distortions — making
some investments artificially cheap relative to others — and the misallocation
of resources down the road.
A central bank’s unwillingness to
stimulate the economy further — given
its assessment of the costs and benefits
of doing so — may be more plausible
than the conventional notion of a
liquidity trap in which the central bank
is literally powerless. Nonetheless, if
policymakers judge that further monetary expansion would not be a net benefit
to the economy, we may observe conditions that look and feel a lot like what
might be expected in the technical definition of a liquidity trap — namely, persistently weak economic
growth despite some strong measures by the central bank.
The evidence on liquidity traps, too, is murky. There are
three commonly suspected episodes: First is the Great
Depression, but economists Milton Friedman and Anna
Schwartz famously noted that the Fed didn’t actually keep
monetary policy easy in the mid-1930s. In fact, it inadvertently contracted the money supply due to an incomplete
understanding of how a new reserve requirement policy
would affect the financial system, making the Great
Depression worse. Second is Japan’s “lost decade” of low
economic growth in the 1990s (some economists even
include much of the 2000s). Many economists, however,
have argued that the Bank of Japan, too, became contractionary at points during that period, making it difficult to
argue that it tried all it could to boost growth. Finally, some
economists argue that we are in a liquidity trap following the
2008-2009 recession, given that we have not experienced a
strong recovery despite the Fed’s unprecedented efforts to
induce one. Though economic growth has been weak, many
Fed policymakers have argued that the Fed is not —
and never will be — out of ammunition, should conditions
warrant it. RF


RESEARCH SPOTLIGHT
Playing the Waiting Game
BY CHARLES GERENA

economy, uncertainty, and policy in articles published in
ommentators and lawmakers have blamed policy
10 major newspapers.
uncertainty for creating a “waiting game” that has
The second component captures the number of federal
made the recession deeper and the recovery slower
tax code provisions set to expire in the near future. This can
than it might have been. Business owners may put off
generate uncertainty because lawmakers often don’t reach a
investing in a new facility or hiring more salespeople if they
decision on whether to renew them until the last minute.
don’t know how changes in tax policy, government spendThe third component uses the Federal Reserve Bank of
ing, or regulation will affect their plans. If enough business
Philadelphia’s Survey of Professional Forecasters to measure
decisions are delayed, the overall economy suffers.
the amount of disagreement on the future course of
In a recent paper, Scott Baker and Nicholas Bloom of
consumer price inflation and federal, state, and local governStanford University and Steven Davis of the University of
ment purchases. These macroeconomic variables were
Chicago construct their own measure of uncertainty, test
selected because they are directly influenced by monetary
how well it corresponds to past changes in the business
policy and fiscal policy actions.
“The resulting index … looks sensible, with spikes around consequential presidential elections and major political shocks like the Gulf Wars and 9/11,” the researchers comment. The Lehman bankruptcy, the eurozone crisis, and the U.S. debt-ceiling dispute pushed the index to record highs.

To determine which policies exert the most influence on uncertainty, Baker, Bloom, and Davis analyze a narrower collection of news articles than those used for the news component of their index, applying search terms like “inflation.” While national security loomed large as a source of uncertainty after the 9/11 terrorist attacks, “extraordinary levels of policy uncertainty in 2010 and 2011 are dominated instead by concerns related to monetary policy and taxes,” note the researchers.

To assess how closely aggregate economic activity moves in response to changes in their uncertainty index, the researchers consider several models using monthly data from 1985 to 2011. They find that when policy uncertainty increases as it did over the last six years, private investment falls, bottoming out at a 16 percent decline within nine months of the spike. In addition, industrial production shrinks as much as 4 percent after 16 months and as many as 2.3 million jobs may be lost in the aggregate within two years.

These findings demonstrate only an association between high levels of policy uncertainty and weaker economic conditions. Still, they “reinforce concerns that policy-related uncertainty played a role in the slow growth and fitful recovery of recent years,” suggest Baker, Bloom, and Davis. RF

“Measuring Economic Policy Uncertainty.” Scott R. Baker, Nicholas Bloom, and Steven J. Davis. Manuscript, Stanford University and the University of Chicago, February 2012.

The paper discussed in this article can be found at:
http://www.policyuncertainty.com/uploads/BakerBloomDavis_Feb3.pdf


COVER STORY

We know innovation is important — but do we know how to make it happen?
BY JESSIE ROMERO

The printing press. The steam engine. Penicillin, personal computers, the Internet — these are turning points in history, innovations that have transformed modern life and contributed to an exponential rise in living standards throughout much of the world.
It is a foundation of economic theory that such innovations are the key to long-term economic growth. New ideas
and new technologies lead to rising productivity, which leads
to sustained increases in per capita income and living
standards over time.
As the United States continues its slow recovery from the
deepest recession since the Great Depression, restoring the
country’s “innovation economy” has taken on new urgency.
A raft of recent policy proposals from think tanks, trade
organizations, and politicians emphasize support for
small business, incentives for private-sector research and
development (R&D), more federal spending on science, and
education and immigration reform as strategies that will
lead to more innovation and thus to more growth and jobs.


These policies typically are predicated on a set of
beliefs — that small companies are more innovative than
large, that more government spending translates into more
innovation, and that innovation in general is a phenomenon
that can be measured and studied scientifically. But
innovation is an elusive concept. A novel idea isn’t enough
in and of itself; that idea must also translate into profitable
products and services, and lead to the creation of new economic value. How and why innovation actually happens —
and how to make more of it — are questions that researchers
are still trying to answer.

Sizing up Business Size
In 1979, Apple co-founder Steve Jobs visited Xerox PARC,
the innovation lab of the Xerox Corporation. There, he saw
the Xerox Alto, the first computer to operate with a mouse
and the graphical user interface familiar to today’s computer
users. Jobs was so excited that he returned a month later
with his own engineers, who paid close attention to what
they saw. “If Xerox had known what it had and had taken
advantage of its real opportunities, it could have been as big
as IBM plus Microsoft plus Xerox combined,” Jobs said in an
interview years later. Instead, the story goes, the nimble,
entrepreneurial startup launched the Macintosh and transformed computing while the established, slow-moving
corporation failed to realize the commercial potential of
what it had developed.
But the story isn’t so clear-cut. Apple wasn’t actually all
that small at the time; the Apple II computer had been a


commercial success for two years, and the company was just
a year away from a $101 million initial public offering
($279 million in today’s dollars), the largest since that of Ford
Motor Co. in 1956. By 1983, the year before it introduced the
Macintosh, Apple’s revenues were $983 million (more than
$2.2 billion today). And although Xerox failed in its first
attempt to market a computer with a graphical interface
(a successor to the Alto named the Star), Apple’s first
attempt to market such a computer, the Lisa, was also a flop.
Moreover, while the large company is faulted in hindsight
for not realizing the commercial potential of its inventions,
the resources it committed to research and experimentation
are what helped make innovations like the Macintosh
possible.
Small businesses are the beneficiaries of many policies
designed to support the next Apple. The Small Business
Administration (SBA) administers several programs that
help small businesses win federal technology contracts and
research grants, and federal agencies are required to award a
set portion of their R&D funds to small businesses. The
Small Business Jobs Act of 2010 enacted new tax cuts and
extended lending programs for small businesses, citing them
as “the engine of our economy.” And the current administration’s Strategy for American Innovation, released in 2009 and
updated in 2011, includes a number of provisions to help
small businesses and entrepreneurs, such as loans, training
programs, regional economic development grants, and a
promise to aggressively pursue large companies for antitrust
violations.
Over the past several decades, smaller businesses have
performed an increasing share of R&D in the United States,
according to research by Robert Hunt and Leonard
Nakamura of the Philadelphia Fed. They found that nearly
all of the increase in R&D between 1980 and 2000, a period
when private R&D as a share of GDP about doubled, is
accounted for by the increase in R&D at smaller firms.
Computers and other technologies made it easier for new
firms to enter the market and compete against large incumbent firms, and cheaper to develop new products. “We have
a body of evidence that a very significant portion of our
fairly good outcomes in terms of innovation in the last
30 years was driven by a structural change that favors
smaller and younger firms,” Hunt says. As Hunt notes, however, that change resulted from new technology, not from
new government policies.
The special attention paid to small businesses stems from
the belief that they have an inherent innovative advantage
over large businesses: They are less likely to have an interest
in maintaining the status quo, and they are more responsive
and quicker to change. As a result, they have a disproportionate impact on “disruptive” innovation — change that
creates an entirely new market — as opposed to large firms,
which tend to engage in incremental innovation, some say.
Not every small business matches the popular images of
the “inventor in the basement” or the company run out
of a garage. The SBA’s official definition of a small business is one with fewer than 500 employees — a definition that covers 99.8 percent of all employer firms in the
United States.
Economists have spent decades studying the complicated
relationship between market structure and innovation.
A large and highly influential body of research, beginning
with a seminal 1962 paper by Nobel laureate Kenneth Arrow,
argues that competition is more conducive to innovation
than monopoly. Arrow showed that because monopolies
maximize profit by raising prices and restricting quantities,
they have less output over which to spread the fixed costs
of innovating, and thus are less likely to invest in new technology. In addition, each firm in a competitive market may
have an incentive to innovate in an effort to escape the
competition, whereas monopolists don’t face the same
competitive pressures. But another large body of work
suggests that even the mere threat of other firms entering
the market might be enough to induce monopolies to act as
if they are in a competitive market.
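Arrow’s output argument can be put in stylized symbols (the notation is illustrative, not from his paper). Suppose an innovation trims production costs by \delta per unit sold and costs a fixed amount C to develop:

    \text{payoff to innovating} \;=\; \delta Q - C
    % a monopolist restricts output, so Q_m < Q_c, and therefore
    \delta Q_m - C \;<\; \delta Q_c - C

With fewer units over which to spread the fixed cost C, the same innovation clears the profitability hurdle less often for the monopolist.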
Economist Joseph Schumpeter, who pioneered the idea
of “creative destruction” as the engine of capitalism, argued
that large businesses — particularly monopolies — are
better qualified to innovate because they have the size to
take advantage of increasing returns to investments in R&D,
they have greater capacity to take on risk, and they have
fewer competitors to imitate their invention. Other economists also believe that large companies have greater
incentives to innovate, although not necessarily because
they are monopolies, as economist William Baumol of New
York University discussed in his 2002 book The Free-Market
Innovation Machine. According to Baumol, a market structure that involves competition among a few large businesses
has “innovation as a prime competitive weapon,” and thus
“ensure[s] continued innovative activities.”
This has been true in retail, where just five companies —
Wal-Mart, Kroger, Target, Walgreens, and the Home Depot
— account for 36 percent of the total sales of the country’s
100 largest retailers. Although Wal-Mart’s sales are more
than the sales of the next four companies combined, it has
continued to develop new supply chain technologies that
have both increased its own productivity and led to higher
productivity in the sector overall.
Large companies also are often the center of “innovation ecosystems,” the constellation of suppliers and developers that forms around a central business. Today, Apple — the world’s largest publicly traded company, as measured by market capitalization — is the center of just such a constellation. Apple and other large companies invest in key technologies and create stable platforms that make it easier for small companies to innovate, and the ecosystem as a whole can better handle both risk and scale, according to economist Michael Mandel of the Progressive Policy Institute. Small and large companies may have different advantages and incentives to innovate, but both are important to the process.


[Chart: R&D Spending as Share of GDP, 1953-2009. Series: Total R&D, Industry R&D, Federal R&D, and Other R&D, in percent of GDP.
NOTE: 2009 is the latest year data are available. “Other” includes college and university, nonprofit industry, and nonfederal government.
SOURCES: National Science Foundation; Bureau of Economic Analysis, Haver Analytics; Region Focus calculations]

Should the Government Spend More?
Whether they own a Mac or a PC, most people use their
computer to browse the millions of websites that have
sprung up in the past two decades. The modern Internet
grew out of ARPANET, a Defense Department project
during the 1960s and 1970s to develop a communications
network that could continue operating even if various command centers were destroyed. The Pentagon’s Advanced
Research Projects Agency funded research at private companies and universities nationwide into packet switching
(and later the TCP/IP model), and awarded development
contracts to numerous small companies, helping to create a
market for the new technologies. Today, the Internet is cited
as a prime example of the importance of government spending to scientific advances and innovation.
The theoretical case for such spending is strong, particularly in the case of basic research. New knowledge is a
“nonrivalrous” good — once an inventor has an idea, that
idea can be replicated or put to use by many other people. In
addition, there is often a long lag between a new idea and its
commercialization. As a result, the full economic value of a
new discovery is unlikely to be realized by the discoverer,
making the private sector unlikely to invest in a socially
desirable amount of research. The patent system is one way
to address this “market failure,” by awarding a temporary
monopoly to the inventor. But strict intellectual property
rights can also limit the use of new ideas, making it difficult
for researchers to build on earlier work. Another way to
correct potential underinvestment in research by the private
sector is for the government to fund the research itself, or
lower the costs of private-sector investment via subsidies or
tax credits.


Since the 1970s, federal spending on R&D has declined
as a share of GDP, although the overall level of R&D
spending has remained fairly constant due to an increase in
private-sector spending (see chart). Because federal spending is more likely, in theory, to be directed toward the type of
research unlikely to occur in the private sector, many are
concerned that the shrinking federal share hampers the
United States’ ability to discover the next big thing.
In practice, government spending on R&D isn’t always
allocated to projects that provide the greatest public good,
as political considerations might factor into the decisionmaking. And while the success stories — such as the
Internet; the development of hybrid seed corn, which
dramatically increased crop yields in the 1930s; or the mapping of the human genome, which is leading to new medical
technologies — are compelling examples of the benefits of
government spending on research, those cases aren’t necessarily applicable to other projects or to broad policy
decisions.
At most federal agencies, R&D spending is a mix of basic
research, applied research, and development. (Basic research
is to advance general knowledge, without a specific need or
product in mind. Applied research is directed toward
meeting a specific need. Development is the design and
production of actual products.) The exceptions are the
Department of Defense, where nearly all spending is for
development, and the National Science Foundation (NSF),
where nearly all is for basic research (see chart).
But these categories mask the fact that more than
90 percent of this spending, including that classified as
basic, could be considered “mission oriented,” that is, aimed
at fulfilling the specific goals of the funding agency rather
than at addressing a market failure, according to research by
David Mowery, an economist at the University of California
at Berkeley. A successful program thus might not be proof
that government involvement is necessarily the way to go.
Instead, “the economic effects … are linked to complementary policies or broader structural elements of the agencies’
missions. Apparently similar programs … may produce very
different outcomes in different contexts,” Mowery wrote in
a chapter of the 2009 anthology The New Economics of
Technology Policy.
In the case of the Internet, the Defense Department had
a unique and large-scale procurement strategy that encouraged new firms to enter the industry and contributed to the
rapid commercial diffusion of the new technologies. R&D
spending by the National Institutes of Health, one of the
agencies that funded the Human Genome Project, tends to
have large economic effects since many consumers of new
health care technologies are price-insensitive. Because
procurement strategies, demand conditions, and other
factors vary from agency to agency and from project to
project, caution is warranted when generalizing from one
project to another.
An additional source of federal support for R&D is
private-sector tax incentives. The largest incentive, the Research and Experimentation Tax Credit, was enacted in 1981 as part of an economic recovery program bill. U.S. companies claimed $8.3 billion in credits in 2008, according to the National Science Foundation. (2008 is the most recent year for which the IRS has published data.) In addition, about 35 states, including the entire Fifth District with the exception of Washington, D.C., offer tax incentives for R&D. It’s unclear, however, what effect such incentives actually have on R&D. For example, in a survey of 20 empirical studies, economists Bronwyn Hall of the University of California at Berkeley and John Van Reenen of the London School of Economics and Political Science concluded that a tax incentive increases R&D by about dollar for dollar. As the authors noted, however, there is a great deal of variation in the results of the studies depending on the methods used to calculate the effect. Hall and Van Reenen also pointed out that at least part of the increase could be due to “relabeling” — companies shifting activities in their budgets in order to qualify for the credit — rather than to genuine increases in investment. Private-sector firms also are likely to use the credits to first fund projects that have the highest rate of private return, rather than those likely to advance basic knowledge. This makes it difficult for policymakers and economists to determine whether the incentives are supporting R&D that firms would have undertaken anyway.

[Chart: Federal Agency R&D Spending. Shares of basic research, applied research, and development at the top six agencies: Defense (51.1%), Health and Human Services (26.7%), Energy (7.4%), National Science Foundation (4.6%), NASA (4.5%), and Agriculture (1.7%).
NOTE: Data are for the top six agencies as a share of total federal R&D spending.
SOURCE: National Science Foundation, “Science and Engineering Indicators 2012”]

How to Measure Innovation?
In order to study the effects of policy on innovative activity, economists and researchers must make a crucial assumption: Innovation is something that can be measured. But “the unit of analysis is quite complex,” says Julia Lane, senior managing economist at the nonprofit American Institutes for Research (AIR) and the former director of the Science of Science and Innovation Policy program at the NSF. Innovation depends on “a complex system of relationships, the whole process of sharing knowledge,” and is characterized by long time lags, Lane says, which makes it difficult to quantify after it has occurred — not to mention difficult to predict in advance.

At the macro level, the effects of innovation are often quantified as “total factor productivity” (TFP). In a basic model, economic growth occurs not only because labor and capital increase, but also because technological advances allow those inputs to be used more efficiently. Technological change cannot be observed directly, however, so the portion of growth in the model that cannot be attributed directly to labor or capital — the “residual” — is called TFP. Estimates of TFP are very sensitive to the assumptions underlying the model being used to measure it, and it’s possible that the term might capture factors other than technological change.
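The residual logic can be made concrete with a standard growth-accounting sketch, assuming a Cobb-Douglas production function with capital share \alpha (an assumption of this illustration; the article’s “basic model” is not spelled out):

    Y_t = A_t\,K_t^{\alpha}\,L_t^{1-\alpha}
    % taking growth rates and solving for TFP growth:
    \frac{\Delta A}{A} \;=\; \frac{\Delta Y}{Y} \;-\; \alpha\,\frac{\Delta K}{K} \;-\; (1-\alpha)\,\frac{\Delta L}{L}

Output, capital, and labor growth are measured from the data while \alpha is imposed, often a value near 0.3, so TFP growth is simply whatever is left over. That is why a different assumed \alpha, or mismeasured inputs, changes the estimate.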
Innovation can’t be measured directly at the firm or industry level either. Two of the most widely used proxies for innovation are patenting rates and R&D spending; a researcher can compare the number of patents issued or the amount of spending before and after a policy change to determine the effects of that policy. But such measures are lacking, says Hunt of the Philadelphia Fed. “We have this fundamental data constraint: We know what we want to study, but what we have are a bunch of imperfect measures.”

Changes over time in U.S. patent law and in patenting strategies mean that patenting rates might not be comparable across time periods or industries, Hunt says. Prior to the 1990s, for example, it was very difficult to patent software. But once federal courts began treating computer programs like other forms of technology, the number of software patents grew from 1,000 to nearly 25,000 per year, according to research by Hunt and James Bessen, director of the nonprofit organization Research on Innovation. Rather than an increase in innovation, this rise might reflect “defensive patenting,” a practice common in some high-tech industries whereby companies obtain a high volume of patents for minor or trivial inventions in an effort to block their competitors or protect themselves against future litigation. (See “Patents Pending,” Region Focus, Fourth Quarter 2011.)

Economists can try to address the limitations of patents by weighting patents according to how frequently they are cited by other patents, which gives some indication of a patent’s value, but patent quality and comparability across time and industry remain challenges for researchers.

Much of innovation policy is premised on the “notion that you can just drop the magic number of R&D in one end, and automatically out the other end comes innovation,” says Lane of AIR. But as with patents, an increase in R&D spending does not always reflect an increase in innovation. As a working group organized by the Organization for Economic Cooperation and Development (OECD) stated, “It is probably quite erroneous and misleading for appropriate and adequate policymaking for technology and competitiveness to equate R&D with innovative capacity.” The OECD, among others, is currently working to develop new measures of innovation that better capture the complex interactions between policy, firms, and the economy as a whole.

The Congressional Budget Office reviewed more than
three dozen studies that measured productivity changes at
the firm, industry, and economy levels using R&D spending
as a proxy for innovation. Overall, the studies concluded
that additional R&D spending had a positive effect on output and productivity, but the magnitude of the effect varied
tremendously. Most significantly, the effect of R&D spending was much greater in studies that compared different
companies with different levels of R&D than in studies that
followed changes in R&D over time at the same company.
This suggests that factors other than R&D spending are
responsible for productivity differences seen between companies, and perhaps that increases in R&D might not have a
large effect on the economy as a whole.
As difficult as innovation is to measure after the fact, it’s
even harder to capture in the present. It is only with the
benefit of hindsight that the revolutionary properties of a
new invention seem inevitable: Johannes Gutenberg didn’t
know his printing press would lay the groundwork for the
Renaissance and the Age of Enlightenment. Farmers were
initially reluctant to try the new hybrid seed corn. And
technologies with great promise — such as hydrogen-powered cars or Betamax videotapes — may fail to live up to
their potential. As economic historian Joel Mokyr of
Northwestern University wrote in a 1992 paper, “If every
harebrained technological idea were tried and implemented,
the costs would be tremendous. Like mutations, most
technological innovations are duds and deserve to be
eliminated. Yet … if no harebrained idea were ever tried, we
would still be living in the Stone Age. Unfortunately, it is
impossible to know in advance whether an invention is a
true improvement or a dud until the experiments are
carried out.”

Planting the Seeds
“We’re betting our future on the idea that investing in
science is going to lead us to more economic growth and
competitiveness,” says Lane of AIR. “But the fact is, we
don’t know how that works. To say, ‘We’re going to spend
money on R&D and 20 years later a miracle is going to
occur’ is not a very scientific approach.” Lane and other
researchers are working to develop new datasets and data
infrastructure that will help scientists, entrepreneurs,

and policymakers better understand the links between
investment, innovation, and economic growth.
In the meantime, attempting to plan or direct innovation
might not be possible. “Innovation is like a forest,” says
Robert Litan, vice president of research and policy at the
Ewing Marion Kauffman Foundation, an organization that
studies entrepreneurship. “You can make sure the soil is
right, and you can fertilize it. You can plant the trees, but
you don’t know which trees are going to grow the highest.”
Still, policymakers could have an effect on the fertilizer.
Many economists agree, for example, that immigration rules
should make it easier for immigrants with high-tech skills
to enter the United States and for foreign students to
remain here after earning their degrees. More than one-half
of new high-tech businesses launched in Silicon Valley
between 1995 and 2005 had a foreign-born founder, yet the
number of H1-B work visas for scientists and engineers has
been cut by two-thirds in recent years. Opening up immigration “would bring a lot more energy and ideas to America,”
says Litan.
Even if the number of H1-B visas were restored to its previous peak of 195,000 per year in the early 2000s, that
represents only one-tenth of 1 percent of the U.S. labor
force; there is also a need to train more Americans in the
STEM fields (science, technology, engineering, and mathematics). Education policies focused on achievement in these
fields, such as higher salaries for top-performing STEM
teachers and new community college programs, could help
students and workers gain high-tech knowledge or advanced
manufacturing skills.
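The one-tenth-of-1-percent figure is straightforward arithmetic, taking a U.S. labor force of roughly 150 million (an outside approximation for the early 2010s, not a number given in the article):

    \frac{195{,}000}{150{,}000{,}000} \;\approx\; 0.0013 \;=\; 0.13\ \text{percent}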
Completing new multilateral trade agreements could also
help encourage innovation, as a recent report by the
Brookings Institution noted. Companies in export markets
must compete internationally and can learn from technological advances in other countries. Finally, a more flexible
intellectual property system that can meet the needs of companies of different sizes and in different industries, and that
does not reward the filing of trivial or low-value patents,
could create better incentives for companies to innovate.
Such policy changes are not a guarantee that innovation and economic growth will occur, but they could help to create an environment in which ideas and firms can flourish. RF

READINGS
Arnold, Robert. “R&D and Productivity Growth: A Background
Paper.” Congressional Budget Office, June 2005.

Lane, Julia. “Assessing the Impact of Science Funding.” Science, June
2009, vol. 324, no. 5932, pp. 1273-1275.

Gilbert, Richard J. “Competition and Innovation.” In Wayne Dale
Collins, (ed.), Issues in Competition Law and Policy. Chicago:
American Bar Association Antitrust Section, 2006.

Mowery, David. “What Does Economic Theory Tell Us
About Mission-Oriented R&D?” In Dominique Foray (ed.),
The New Economics of Technology Policy. Northampton, Mass.:
Edward Elgar, 2009.

Hall, Bronwyn, and John Van Reenen. “How Effective Are Fiscal
Incentives for R&D? A Review of the Evidence.” Research Policy,
April 2000, vol. 29, nos. 4-5, pp. 449-469.


Organization for Economic Cooperation and Development.
Measuring Innovation: A New Perspective. OECD Publishing, 2010.

How the Region’s Political Aims Led to a Complicated Morning After
BY RENEE HALTOM

From 27 B.C. to 180 A.D., the
territories now covering much
of Europe shared some basic
governing structures. They had a
common legal system that influences
Western courts to this day, and even
shared a common currency — 2,000
years before the euro came along.
That era of peace, known as Pax
Romana, replaced two centuries of civil
war and conquest. Peace didn’t last, of
course. After the era ended with the
death of emperor Marcus Aurelius,
attempts to create a unified Europe
came mostly through force, a pattern
that persisted right up through World
War II. After the war, Winston
Churchill called for a revived notion of
European political unification to promote lasting peace on
the continent.
Hence the idea of a “United States of Europe” is
millennia old, and still in progress. Though Europe remains far from the degree of political and economic unification of the United States, today it enjoys greater
policy coordination and more cross-border trade than ever
before in the region’s long history.
Still, the viability of Europe’s tremendous strides toward
economic integration has been called into question during
the financial crisis that has afflicted the region since early
2010. Investors have become concerned about the sustainability of current government deficits in light of projections
for future spending and anemic economic growth in
countries such as Greece, Ireland, Italy, Portugal, and Spain,
charging them much higher interest rates for new debt.
When the debt of several governments was downgraded,
it hurt the financial position of banks that held it, leading to
a funding freeze across a continent that was already
hampered by the global recession. The problems forced
governments to consider dramatic fiscal retrenchments to
put their books in order, which potentially hurt their economies further in the short run and provoked widespread public protests.


Europe’s economic problems are rooted in long-standing
issues. The structural flaws of the European Union, and the
euro monetary union in particular, are a byproduct of the
political trade-offs required to achieve the last 60 years of
economic integration. While many economists remain
optimistic about prospects that the euro will pull through its
current crisis, most also concede that some of those
structural flaws will have to be rectified to ensure the euro’s
long-term survival.

Steps Toward Integration
The appetite for political reconciliation in Europe was
strong following World War II, especially in France and
Germany. To make war “not merely unthinkable, but
materially impossible,” in the words of French Foreign
Minister Robert Schuman, Germany and France, along with
Belgium, Italy, Luxembourg, and the Netherlands, in 1952
pooled the production of coal and steel through the
European Coal and Steel Community (ECSC), the first
formal step toward economic integration in Europe.
Politics aside, integration also had plenty of economic
justification. Trade allows two regions to specialize and
increase production, improving living standards. Expanding
the area of trade with additional countries should increase those gains. Thus, the next step in integration was to create
a common market, a free flow of labor, capital, and goods
across borders. Along with the ECSC, other economic
“communities” were created by 1957’s Treaty of Rome to
consolidate the production of major industries. In 1967, the
institutions governing the communities were combined into
what later became known as the European Community (EC).
European growth slowed with the worldwide oil crunch
of the 1970s, leading to a high-unemployment, low-growth
era of “Eurosclerosis.” That general economic malaise bled
into the 1980s and spurred hundreds of measures to remove
all remaining impediments to the flow of labor and other
production factors by 1992.
The stronger the EC grew, the greater was the incentive
to participate. Denmark, Ireland, and the U.K. joined the six
founding members of the ECSC in 1973, followed by Greece,
Portugal, and Spain in the early- to mid-1980s. Austria,
Finland, and Sweden joined in 1995 — the region had by then
received its current name, the European Union (EU) — with
eastern European nations joining in the 2000s after the euro
was adopted as the region’s common currency. Today there
are 27 members of the EU boasting more than half a billion
citizens; 17 of those nations belong to the euro monetary
union.
The EU’s diverse membership is divided into what are
informally known as the “core” and “periphery” of Europe.
The core includes the wealthy northern and central nations,
such as France, Germany, and Belgium, while the periphery comprises the poorer, mostly southern countries, such as Greece, Spain, and Portugal. The key question surrounding integration has always been whether membership would cause members’ incomes to grow closer together or further apart over time. Research by trade economists Paul Krugman and Anthony Venables in 1990 suggested that integration could at first hurt the periphery nations to the benefit of the core
as human capital and economic activity flooded to the
latter to take advantage of economies of scale. Eventually,
however, they predicted some activity would flow back to
the periphery to take advantage of cheaper wages.
Empirically, the effects have been uncertain. The
incomes of several periphery nations converged after joining, but researchers haven’t agreed on the extent to which
that was due to the virtues of economic integration.

(Non)Optimal Currency Areas
The differences between nations mattered most when it
came to adopting a common currency. Europe had debated
the costs and benefits of taking that step for decades. By
eliminating exchange rate risk and the direct costs of changing currency, a common currency promotes trade and
investment within the union. The major downside is that
regions belonging to a currency union are bound by a single
monetary policy, which can at times be too easy for some
nations and too tight for others. That’s mostly a problem
when nations experience different economic shocks and
business cycles.


Still, losing monetary autonomy could be worthwhile if
the economies have other means of adjusting to shocks.
That rule of thumb was provided by economist Robert
Mundell, currently at Columbia University, who in 1961
came up with criteria for when it makes sense for a group of
countries to share a currency — that is, whether the countries are an “optimal currency area” (OCA). Each city and
town doesn’t need its own currency, but neither should the
entire world share one; his goal was to identify the happy
medium.
Although the term “optimal currency area” might sound
as if the concept is mathematically precise, the criteria that
Mundell set out were more like general guidelines: First,
nations in a currency union should have high labor mobility
between them to provide adjustment to a boom in one and a
slump in another. Second, they should have flexible wages
and prices to accomplish the same. Third, they generally
should experience similar economic shocks. And fourth,
there should be a centralized mechanism, such as taxes and
transfers from a common fiscal authority, to help regions
adjust to localized shocks and weaknesses. A currency union
among regions not meeting these criteria would be at risk
for rougher business cycles and living standards that grow
apart rather than together, leaving some nations worse off
on balance. Mundell won the Nobel prize for his work on
OCAs and other ideas in 1999, just as the euro was being
launched.
Most economists today agree that Europe did not fit
Mundell’s criteria for monetary union compatibility. Labor
mobility there remains notoriously low, even now. Just
0.1 percent of the EU population moved between member
countries annually in the mid-2000s, compared to 2 percent
to 2.5 percent of Americans who moved between U.S. states.
That’s not for lack of policy support: A Spaniard can get a job
in France with his existing passport, no visa required. (The
EU migrant must sometimes pass regional licensing exams,
just as a lawyer relocating to the Big Apple would have to
take the New York bar exam.)
Unfortunately, the remaining barriers to European
mobility are difficult to solve through policy. Barriers to
moving in the EU are mostly personal, according to a study
produced for the European Commission, the executive body
of the EU. Language barriers are at the top of the list, which
also includes fears about finding relevant job opportunities
and cultural differences between old and new locations.
That may mean there is a natural limit to European mobility, and therefore also to the adjustment to economic shocks.
And Europe’s shocks are much more “asymmetric” than
those experienced by the U.S. states, a currency union success case, according to 1997 research by Barry Eichengreen
of the University of California, Berkeley and Tamim
Bayoumi, currently at the International Monetary Fund.
Even when localized shocks occur in the United States, the
centralized system of fiscal transfers helps counter them:
Social Security, unemployment insurance, subsidies to
nonprofits, and the progressive tax system in general.

[Figure 1: Government Borrowing Rates. Monthly 10-year government bond yields, 1995 through March 2012, for Greece, Portugal, Ireland, Italy, Spain, France, and Germany, with the euro’s introduction marked.
SOURCE: European Commission, data through March 2012]

Jeffrey Sachs and Xavier Sala-i-Martin, both currently at Columbia University, estimated in 1992 that federal taxes and transfers in the United States eliminated as much as 40 percent of declines in regional incomes. Europe has no such mechanism to address regional disturbances.
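In concrete terms, the estimate implies that when a state’s income falls by $100, lower federal tax payments and higher transfer receipts absorb up to $40 of the hit. The calculation below is illustrative, not a figure from their study:

    \$100 \times 0.40 = \$40\ \text{offset federally}, \qquad \$100 - \$40 = \$60\ \text{net regional decline}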
Some economists argued that it didn’t matter that Europe wasn’t quite an optimal currency area. It was possible that currency union could work in reverse, actually causing the euro area to become more fit to share a currency by increasing trade and synchronizing business cycles — in other words, that the attributes of an OCA could, to some extent, be generated “endogenously,” as economists put it. The effects of currency union on trade were an obvious area of focus: If commerce between nations picked up, their economies would naturally move more closely together. But the theory did not suggest that drastically different economies could be put in alignment by a currency union alone. Even though increased trade resulted from the euro monetary union — somewhere in the vicinity of 20 to 40 percent more since the euro’s launch, says economist Andrew Rose of the University of California, Berkeley, who contributed to the endogenous OCA literature — it wasn’t enough to make the euro area suddenly qualify as an OCA.

Despite these concerns, a new political impetus for monetary union arose when the Berlin Wall was torn down. “What made the euro feasible was the end of the Cold War and German unification,” says Jacob Kirkegaard at the Peterson Institute for International Economics. When East and West Germany unified in 1990, President François Mitterrand of France wanted to secure a more equal place at the bargaining table with Germany — a goal France had long held, but pushed for even harder given Germany’s new economic might. Helmut Kohl, the German “Chancellor of Unity,” as he became known, wanted to overcome that nation’s image as a source of instability and three major wars since 1870. He saw monetary union as a way to anchor Germany to Europe. “That’s why, in a relatively short period of time, about a year, European leaders negotiated a new and very, very far reaching European treaty,” Kirkegaard says, referring to 1992’s Maastricht Treaty to bring Europe’s economies closer together in support of a common currency.

The trouble, Rose says, was that the leaders focused on variables that would make the nations look more like an OCA on the surface rather than focusing on the real variables that Mundell emphasized. The Maastricht Treaty established that, to join the euro, countries must converge on nominal indicators — inflation, interest rates, and fiscal measures — that are conceptually different from the real, structural similarities that Mundell said were crucial to ensure nations didn’t suffer after having relinquished their monetary and exchange rate policies. “The criteria by which a country gets into the monetary union are simply unrelated to an optimal currency area,” Rose says.

Germany, hesitant to wed itself to less frugal countries, urged adoption of the Stability and Growth Pact (SGP) in 1997 to implement Maastricht’s fiscal criteria through the threat of sanctions for breaches. Annual budget deficits were to be kept below 3 percent of GDP, and national debt no larger than 60 percent of GDP, with exceptions allowed when local economies were weak.

Many economists noted the contradiction between the OCA criteria and Maastricht guidelines, but recognized that the objective of political unity was also a relevant consideration. “The standard of living of the typical European would be lower in the medium term and long term if the [monetary union] goes ahead than if Europe continues with its current economic policies” of integration without monetary union, predicted Martin Feldstein at Harvard, a prominent euro critic, in 1997. “But in the end, it should be for the Europeans themselves to decide whether there are net political advantages of [union] that outweigh the net economic disadvantages.”

From Calm to Crisis
Commerce denominated in euros began on January 1, 1999, and the currency was released in physical form in 2002. For the euro’s first 10 years, the economies’ fundamental differences didn’t seem to matter. The global economy was functioning well. Annual inflation stayed near the target of 2 percent set by the new European Central Bank (ECB), and, even more remarkably, inflation expectations remained anchored despite the ECB’s nonexistent performance history. Banks ramped up cross-border lending, and bank regulation became more aligned (although critics argue that Europe still has a long way to go in this regard). Even during the initial stages of the global financial crisis that started in 2008, the euro seemed to anchor periphery nations by preventing speculative attacks and high interest rates.

A byproduct of the euro’s initial success was that the interest rates at which governments could borrow converged toward the levels of Germany (see Figure 1), the economic anchor of Europe, despite large fiscal differences between the countries, says Alberto Alesina, an expert at Harvard University on both Europe and fiscal policy. Countries perceived this as a good thing because it allowed those with very high debts, such as Greece and Italy, to sustain them

(see Figure 2). Instead of taking the opportunity provided by low interest rates to get their fiscal houses in order, “some countries went on a borrowing binge,” Alesina says. “That was something that economists had not quite expected.”

[Figure 2: Government Debt. Percent of GDP, annual data, 1995-2011, for Greece, Italy, Portugal, Ireland, France, Germany, and Spain, with the euro’s introduction marked.
SOURCE: European Commission, annual data]

The fiscal limits in Maastricht and the SGP proved difficult to enforce. Italy, the third largest economy in Europe, and Belgium, the home of the EU capital, were allowed into the monetary union with gross debt almost twice as large as the agreements allowed. Greece falsified official economic data to become the monetary union’s 12th member in 2001, a charge it admitted to in 2004. Within a few years of the euro’s launch, even Germany and France had violated the pact’s deficit limits following the global recession of the early 2000s (see Figure 3). A German official counted in late 2011 that the SGP had been violated 60 times in its 12-year history, with the promised sanctions scantly applied.

Recognizing that the SGP wasn’t serving its intended purpose, it was amended in 2005 to make the fiscal rules more explicit and therefore more enforceable, but the ECB and many economists expressed fears that certain aspects — such as increased reliance on the discretion of the enforcement committee to determine what constituted an acceptable breach of deficit limits — would serve to let governments off the hook. Indeed, Alesina says, “when the financial crisis hit [in 2008], those countries would have been in a better position to deal with it” if they had been living within the treaty’s limits.

[Figure 3: Government Surplus (+) / Deficit (-). Percent of GDP, annual data, 1995-2011, for Germany, Italy, France, Spain, Portugal, Greece, and Ireland, with the euro’s introduction marked.
SOURCE: European Commission, annual data]

When markets began to doubt the viability of sovereign debt in the spring of 2010, Europe had no clear crisis mechanism in place to address these problems. Market volatility reflected that uncertainty, along with growing speculation that Greece and possibly other nations would be forced to leave the euro for being in drastic violation of fiscal rules without agreement to adequate fiscal reform. “Going into the crisis you really only had one institution [the ECB] that was acting on behalf of the entire euro area,” Kirkegaard at the Peterson Institute says. The ECB stepped in as lender of last resort by making large loans to financial institutions to preserve financial market functioning, though it is prohibited under the Maastricht Treaty from lending directly to governments.

The rest of the crisis response has been marked by one-off interventions and summits to strike deals in hopes of calming markets. The International Monetary Fund and eurozone member states have provided loans to governments in exchange for “austerity” measures to rein in budgets, and EU leaders created two temporary lending facilities guaranteed by member states and the European Commission. In one of the latest deals, holders of Greek government debt agreed in March 2012 to trade their bonds with ones of lower value, reducing the Greek government’s debt — which was by then in excess of 160 percent of GDP — by more than a quarter. It was the largest sovereign debt restructuring in history.

Reassuring markets that governments will avoid default in the short run has been one challenge; reassuring them of governments’ long-run fiscal sustainability has been quite another. In January 2012, most EU states agreed to a “fiscal compact” meant to prevent excessive deficits by writing limits into national constitutions. The compact is “the first step toward fiscal union,” ECB President Mario Draghi told the Wall Street Journal in February 2012, a step most European leaders now say is inevitable, but not easy. “Before we move to a fiscal union we have to have in place a system where countries can show that they can stand on their own. And this is the prerequisite for countries to trust each other.”

Previous monetary unions without fiscal union have failed. Examples include the Latin and Scandinavian unions of the 19th century, both of which dissolved after the economic shock of World War I. That’s no coincidence, argued economists Michael Bordo of Rutgers University and Lars Jonung of Lund University in Sweden in several studies comparing currency unions. The available research “tells you loud and clear that monetary unions within nation-states (that are also fiscal unions) do a lot better than international monetary unions,” Bordo said in a recent interview. (See “Interview: Michael Bordo,” Region Focus, Fourth Quarter 2011.) “My reading of history is that unless they go that way … they are not going to make it.”

continued on page 43

Will helping homeowners help the economy?
BY JESSIE ROMERO

Since September 2008, there have been approximately 3.4 million completed foreclosures in the United States, and between 1.4 million and 1.9 million more properties are currently in the foreclosure process. Many foreclosed homes are still sitting vacant: About 1 million more vacant homes are for sale than in the average market of the past two decades. New home construction — usually a source of economic growth after a recession — has been at historic lows for the past four years.

“It’s an economic, financial, and human tragedy that we have let the foreclosure problem fester as long as we have,” says Alan Blinder, an economist at Princeton University and former vice chair of the Federal Reserve Board of Governors.

Opinions differ, however, as to why the problem has continued to fester. On one hand, it’s possible that policy interventions have not gone far enough, and that additional support for underwater and distressed homeowners is essential to the recovery of the housing market, and by extension, the economy. On the other hand, continued government intervention has thus far failed to spur a recovery in housing, and instead might have served only to delay the inevitable bottom of the market. Given the costs and risks of new or expanded programs that attempt to prevent foreclosure, is the best course of action to let the market determine house prices, and allow the strengthening economy to stabilize the housing market?

Free Falling
Six years after the housing bubble popped, the market still looks grim. Nationwide, house prices have declined about 33 percent from the 2006 peak. Prices were flat in the fourth quarter of 2011 and were down 3 percent from their level one year earlier, according to the Federal Housing Finance Agency (FHFA). In the Fifth District, prices declined 2.4 percent during 2011.

The drop in house prices contributed to the wave of foreclosures. Overall, the Fifth District has fared slightly better than the nation as a whole, with about 3.1 percent of homes in foreclosure, compared to 4.4 percent nationwide, according to data from the Mortgage Bankers Association (see chart). In addition to the homes already on the market, there is a looming “shadow inventory” of homes that are more than 90 days delinquent, and thus likely to enter foreclosure, or that have already been foreclosed but are not yet listed for sale. Another 12 million homeowners are currently “underwater” on their mortgages — they owe more than their homes are estimated to be worth — and are thus a potential source of additional foreclosures.

[Chart: Foreclosure Inventory, U.S. Total and Fifth District. Percent of mortgages in the foreclosure process, quarterly, 2004-2011, for DC, MD, NC, SC, VA, WV, and the U.S.
NOTES: Foreclosure inventory is the percent of mortgages in the foreclosure process at the end of the quarter. Shaded area denotes recession.
SOURCE: Mortgage Bankers Association, Haver Analytics]

Problems in the housing market are felt throughout the economy. Households have lost more than $7 trillion in wealth. While estimates vary, research suggests that consumers spend between $3 and $5 less for every $100 lost in housing wealth. Consumer spending is about 70 percent of

GDP, and in previous post-war recessions, it has actually
risen slightly, since households try to smooth their consumption. During the 2007-09 recession, spending actually fell
almost 2 percent, contributing to the large decline in GDP.
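Scaling the wealth effect up gives a sense of the drag. This is a back-of-the-envelope illustration using the figures above, not an estimate from the research cited:

    \$7\ \text{trillion} \times \frac{\$3}{\$100} \approx \$210\ \text{billion}, \qquad \$7\ \text{trillion} \times \frac{\$5}{\$100} \approx \$350\ \text{billion}

Against a U.S. economy of roughly $15 trillion, that is on the order of 1.5 to 2.5 percent of GDP in forgone annual consumer spending.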
The loss of home equity also dampens the effect of monetary policy on the economy, since underwater homeowners
aren’t able to refinance and take advantage of low interest
rates.
In 2005, residential investment made up 6 percent of
GDP; today it is only 2.5 percent. The decline is manifest in
the construction industry, where the unemployment rate is
about 17 percent, and in sectors that depend on demand
from the housing market, such as cement and wood
products manufacturing.
Some economists believe the housing market also affects
the labor market through “housing lock”: If homeowners
with negative equity are unable to sell their homes, they will
not be able to move to areas with better employment
prospects. Underwater homeowners are 30 percent less
likely to move, according to research by Fernando Ferreira
and Joseph Gyourko of the University of Pennsylvania and
Joseph Tracy of the New York Fed. But there is debate about
how large the housing effect actually is. Research by Sam
Schulhofer-Wohl of the Minneapolis Fed suggests that
underwater homeowners are actually more likely to move;
rather than selling, they simply walk away from the homes.
Underwater homeowners also have the option of renting out
their homes and then moving.

Help for Housing
Policymakers have tried to address both the demand and
supply sides of the housing market. On the demand side, the
First Time Homebuyer tax credit, first enacted in mid-2008
and expanded twice in 2009, offered first-time homebuyers
a tax credit of up to $8,000 and repeat homebuyers a
credit of up to $6,500 for homes under contract before
May 1, 2010. Home sales rose 10 percent during the life of
the $29 billion program and the decline in house prices
slowed, but the effects were short-lived. The credit appears
to have gone largely to people who were planning to buy
homes anyway, and it merely changed the timing of their
purchase. Sales fell below their previous level in the months
after the credit expired, and prices reverted to their downward trend, falling 5.6 percent between May 2010 and
February 2012, according to the FHFA.
Historically low interest rates also have failed to boost
demand, which perhaps shouldn’t be surprising given the
state of the economy, says economist Paul Willen of the
Boston Fed. “There’s a reason people are reluctant to get
into housing. In 2005 people were over eager about housing
— if they’re under eager now, that’s understandable.”
On the supply side, the goal has been to reduce the
number of foreclosures by helping borrowers get mortgage
modifications or refinance at a lower interest rate. A modification changes the terms of an existing loan, for example
by lowering the interest rate or writing down the principal amount; refinancing replaces the old loan with a new loan.
In theory, mortgage modifications are a win both for homeowners, who get to keep their home, and for lenders, who
recover more than they would in a foreclosure. “The deadweight loss caused by a foreclosure is massive,” says Blinder.
“If the home can be saved, there is a gain to be shared
between the mortgagor and the mortgagee.”
The fact that more modifications have not occurred is
often attributed to the packaging of loans into mortgage-backed securities. The incentives of mortgage servicers and
the investors who own the loans are not always aligned; for
example, because it is time-consuming to offer modifications, servicers might have an incentive to move quickly to
foreclosure even when a modification would benefit both
homeowner and investor. On the other hand, investors
tend to oppose refinancing, since refinancing means that
mortgage bonds are prepaid, and bondholders must then
reinvest their money in lower-yielding investments.
Two government programs, the Home Affordable
Modification Program (HAMP) and the Home Affordable
Refinance Program (HARP), were implemented in 2009 to
reduce these frictions by paying incentives to lenders who
agree to offer mortgage modifications, and by refinancing
the loans of underwater borrowers whose loans are owned or
guaranteed by Fannie Mae and Freddie Mac (government-sponsored enterprises, or GSEs). Both programs have failed
to live up to expectations; about 970,000 modifications
have been offered through HAMP as of February 2012,
compared to initial projections of between 3 million
and 4 million, and only about 1 million loans have been
refinanced through December 2011, compared to projections of 4 million to 5 million.
Reasons for the low uptake include application processing problems, limited eligibility, and the reluctance of
lenders and the GSEs to offer modifications and refinancing
options. (Lenders have completed more than 4 million
modifications outside HAMP; the terms of proprietary
modifications tend to be less generous to borrowers, which
might make lenders more willing to offer them.) Both
programs were recently expanded to attempt to address
these problems, for example by making deeply underwater
homeowners eligible for HARP, eliminating GSE surcharges
on certain refinance offers, and tripling the incentives paid
to investors via HAMP. The Treasury Department also will
start paying incentives to the GSEs for principal reductions,
although the GSEs’ regulator, the FHFA, currently refuses
to offer them to borrowers. In April, the FHFA’s director
indicated that he would consider revising this position.
(No decision had been announced at press time.)
Many economists and policymakers believe that the
extensions of HARP and HAMP don’t go far enough. One
option, proposed separately by a Federal Reserve Board of
Governors white paper and the Obama administration,
among others, is to require Fannie Mae and Freddie Mac to
refinance non-GSE loans for underwater borrowers, potentially helping millions of additional borrowers take

advantage of low interest rates. While this proposal would
greatly increase the size and the risk of the GSEs’ balance
sheets, the Board of Governors concluded that the potential
benefits — stabilizing house prices, reducing foreclosures,
and boosting consumer spending — could likely outweigh
the costs.
Another proposal is large-scale principal reduction; for
example, Congress could legislate that the GSEs offer writedowns or have the government pay for them across the
board. Martin Feldstein, an economist at Harvard
University and chair of the Council of Economic Advisers
under President Reagan, proposed having the government
pay to reduce the principal for every homeowner whose
loan-to-value ratio was over 110 percent. He estimated that
the cost to help 11 million borrowers would be $350 billion,
writing in a New York Times op-ed, “As costly as it will be to
permanently write down mortgages, it will be even costlier
to do nothing and run the risk of another recession.”
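
For a rough sense of scale, dividing the totals Feldstein cites gives the implied average outlay per assisted borrower (the division is ours; his op-ed reports only the totals):

```latex
\[ \frac{\$350\ \text{billion}}{11\ \text{million borrowers}} \approx \$31{,}800\ \text{per borrower} \]
```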

Into the Unknown
The macroeconomic effects of housing intervention are
unclear, however. Economists at the Congressional Budget
Office (CBO) and the Massachusetts Institute of
Technology recently studied the effects of a hypothetical
refinancing program for both GSE and Federal Housing
Administration (FHA) borrowers. Their paper projected that the
program would not have a large effect on the economy as a
whole: About 2.9 million homeowners would refinance, leading to 110,000 fewer foreclosures, a relatively small number
compared to the size of the housing market. The authors
also projected that each homeowner would save about
$2,600 in the first year, for a total savings of $7.4 billion, a
small stimulus relative to the size of the economy. Other
estimates are much higher, however. Economists at
Columbia Business School and the Absalon Project,
a mortgage finance consulting firm, projected that expanding refinancing for just GSE loans could reach 14 million
borrowers and save $36 billion. If the program were extended to include FHA and Veterans Administration loans, it
could reach 30 million borrowers and save up to $70 billion.
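
A quick per-borrower check (derived here from the figures above, not taken from either study) suggests the projections assume similar savings per refinanced loan and differ mainly in how many borrowers they expect to reach:

```latex
\[ \underbrace{\tfrac{\$7.4\ \text{billion}}{2.9\ \text{million}}}_{\text{CBO/MIT}} \approx \$2{,}600
   \qquad
   \underbrace{\tfrac{\$36\ \text{billion}}{14\ \text{million}}}_{\text{Columbia/Absalon}} \approx \$2{,}600 \]
```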
The challenge is predicting how many borrowers will
participate. “It has always been a puzzle in mortgage finance
why so few people take advantage of refinancing,” says
Willen of the Boston Fed. “They refinance much less than
standard analysis would suggest they would.” The participation rate might be especially low at present, even if new
programs make it easier to qualify, since borrowers don’t
have the option to take cash out of the refinancing, according to Willen.
The impact of large-scale refinancing could also be
muted by the fact that the money saved by households is
money lost to investors who own mortgage-backed securities, as the CBO working paper noted. Losses to investors
potentially limit the stimulative effect of refinancing.
There is also considerable debate about the $25 billion
settlement reached in February between the nation’s five
largest mortgage servicers and 49 state attorneys general.
The settlement sets aside $3 billion for refinancing and
$17 billion for modifications for distressed homeowners,
including about $10 billion for principal reductions.
Compared to 12 million underwater homeowners and $700
billion of negative equity, the settlement is quite small.
The settlement looks larger, however, when compared to
3.9 million seriously delinquent loans, and could reduce the
number of such loans by about 10 percent, according to calculations by Bill McBride of the economics blog Calculated
Risk. The attorneys general believe that the impact will be
amplified if other lenders see that the principal reductions
required by the settlement are a cost-effective option, and
thus become more willing to offer reductions in the future.
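
To make the comparison concrete, the settlement’s principal-reduction component measured against aggregate negative equity (the arithmetic is ours, using the figures above):

```latex
\[ \frac{\$10\ \text{billion}}{\$700\ \text{billion}} \approx 1.4\ \text{percent} \]
```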
A major concern about principal reductions and other
modifications is moral hazard — the possibility that
borrowers would purposely default on their mortgages in
order to qualify for assistance, or that future borrowers
would be more likely to take out unaffordable mortgages
since they’ll expect to receive assistance in the future.
Another problem is one of information asymmetry: It’s
difficult for lenders to distinguish which borrowers actually
need help, or which borrowers are likely to default even after
a modification. “It’s a big problem from the lender’s perspective, because they don’t have full information about the
borrowers,” explains Urvi Neelakantan, an economist at the
Richmond Fed currently studying mortgage modification
programs. “They want to help the people who will succeed
with assistance, not those who will succeed without assistance, or who will receive assistance and then fail.” About
43 percent of HAMP modifications to date were canceled
before the trial period ended, often because the borrower
re-defaulted. Of those who went on to receive permanent
modifications, nearly one-quarter were more than 90 days
delinquent within 18 months, according to the Treasury
Department.
Research suggests that such informational frictions are a
greater impediment to mortgage modifications than securitization. Christopher Foote and Willen of the Boston Fed,
Kristopher Gerardi of the Atlanta Fed, and Lorenz Goette
of the University of Lausanne (Switzerland) found that there
was no significant difference in the likelihood of modification between securitized and nonsecuritized loans. The
reason, according to the authors, is that while lenders lose
money in a foreclosure, they also lose money when they
modify mortgages for borrowers who would have repaid anyway, or when assisted borrowers go on to default. “While
investors might be foreclosing when it would be socially
efficient to modify, there is little evidence to suggest they
are acting against their own interests when they do so,” the
authors wrote.
The authors’ research also suggests that the premise of
mortgage modification programs might be faulty. The
rationale for modifications is that many homeowners took
out “unaffordable” mortgages; if the monthly payment can
be lowered, then the homeowner is less likely to default. But
Foote and his coauthors found that the debt-to-income ratio
of the mortgage, the typical measure of unaffordability,
is a poor predictor of the likelihood of default. Instead,
falling house prices, expectations about future prices, and
especially the unemployment rate are all better predictors
of default. When unemployment is high and income is
volatile, a mortgage that is modified to become affordable
today might not remain affordable tomorrow.
The argument for wide-scale principal reduction is based
on concerns about the large number of underwater borrowers. It might not be an efficient approach, though. While
falling house prices are one factor in the default decision,
“most people with negative equity will not default on their
mortgages,” Willen says. In a related paper, Willen, Foote,
and Gerardi found that negative equity is a necessary — but
not sufficient — condition for default. Instead, their work
describes a “double trigger” theory: Negative equity must be
combined with some adverse shock, such as a job loss or
serious illness, before default occurs. Their results suggest
that a better focus for policymakers might be helping
homeowners cope with job loss or other shocks — and that an
improvement in economic conditions might be the best
cure for the housing market.

For Rent

Prices are falling for single-family homes; there are about
2.4 million on the market, and one-quarter of them are
foreclosures. At the same time, rents are rising across the
country, and the vacancy rate in multifamily housing has
dropped nearly two percentage points since the 2009 peak
(see chart). Large-scale conversion of foreclosed homes
(called real-estate owned, or REO homes) into rental
properties is one option for addressing this apparent mismatch
between the supply of homes available for purchase and
consumers’ demand for homeownership.

Until now, the Federal Housing Finance Agency (FHFA),
which regulates Fannie Mae and Freddie Mac, has been
reluctant to allow bulk sales of homes to investors. Fannie
and Freddie, together with the Federal Housing
Administration, own about half of all REO property on the
market. Bulk sales typically require taking a steeper discount
on the sale price than selling to an owner-occupant. Selling a
portfolio of homes also requires the REO holder to absorb
carrying costs such as property taxes and maintenance while
it assembles enough properties to make the sale attractive.
An additional obstacle has been the lack of financing
available to investors. Currently, mortgage products exist for
one-to-four family homes and for large multifamily
properties, but not for portfolios of single-family homes.
Some economists and policymakers have called for
government-subsidized financing as a way to encourage
investors to enter the rental market.

With millions more foreclosed homes projected to enter
the market in the next two years, the FHFA’s interest in
REO-to-rental programs is growing. In August 2011, the
FHFA issued a call for proposals on designing a rental
program. It received more than 4,000 responses. In
February, the agency announced the first pilot auction of
2,500 homes for qualified investors and began accepting
applications at the end of the month. There is also
speculation that government financing will be available in
some form. Several large broker-dealers and private-equity
firms are planning to bid, according to The Wall Street
Journal. The auction is intended to be the first step in a
national REO-to-rental program, although no further details
have been announced.

The other half of REO properties are owned by banks
and other investors. Some banks, including Charlotte-based
Bank of America, are exploring selling homes for rental or
acting as landlords themselves. Current supervisory policy
requires banks to dispose of REO property as quickly as
possible, but the Federal Reserve is considering issuing
guidance that would give banks leeway to hold REO
properties on their books for longer, and thus open the door
to a rental program.

While the goal of a rental program would be to reduce
the number of homes on the market and thus keep prices
higher, a large number of rental properties in a neighborhood
might actually lower property values if renters are perceived
as less stable occupants or as less likely to maintain their
homes than owner-occupants. An increase in the supply of
rental housing could also lead to lower rents, reducing
households’ incentives to purchase a home. But perhaps the
greatest hurdle to a successful rental program — and a factor
potentially holding back many institutional investors — is
the logistical challenge of managing a large number of
single-family homes. “Single-family homes are way too
idiosyncratic to have the economies of scale that would work
for a large organization,” says Paul Willen, an economist at
the Boston Fed.

Despite these concerns, many economists and
policymakers believe that an REO-to-rental program is the
best option for addressing the current and future supply of
vacant homes on the market. “There are at least some
investors who think they can make a profit out of it,” says
Alan Blinder, an economist at Princeton University and
former vice chair of the Federal Reserve. “If they lose money
trying, that’s capitalism.”

[Chart: Rental Market Conditions, 2005-2011. Rents (left axis, index) and multifamily vacancy rate (right axis, percent), quarterly, 2005 Q1 to 2011 Q3.
NOTES: Rents are based on the Consumer Price Index (CPI) for shelter. Shaded area denotes recession.
SOURCES: Vacancies, U.S. Census Bureau, Haver Analytics; rents, Bureau of Labor Statistics, Haver Analytics.]

— JESSIE ROMERO
Does Housing Come First?


Is assistance for borrowers a prerequisite for economic
recovery? Some economists and policymakers argue that the
policies enacted to date have only delayed the bottom of the
market. Although home prices have declined about a third
relative to their 2006 peak, they are still above their long-run average (see chart), and allowing prices to fall to the level
where supply matches demand might finally restore stability
to the housing market. It’s possible that withdrawing
government support for housing would leave millions of
homeowners even further underwater, or that another drop
in prices could permanently spook a large number of potential buyers. But in the long run, “there may be no pain-free
path to the eventual righting of the market,” wrote
Danielle DiMartino Booth and David Luttrell in a Dallas
Fed Economic Letter. “Allowing the market to clear may be the
path of least distress.”
While many commentators are concerned about the
“oversupply” of housing, there might in fact be enough
buyers — they just aren’t willing to buy at current prices.
Relative to population growth, the number of single-family
homes built in the United States during the past decade is
lower than during the 1990s. And currently, there are about
2 million fewer new households than would be expected
given population growth — eventually, the people who are
living with their parents to save money are going to want to
move into their own homes. Given the very low rates of new
construction during the past four years, pent-up demand for
housing might be building. “The population is growing, the
economy is growing, and eventually we need a place to put all
those people. At some point the value of housing has to go
up just because of population,” Willen says.

[Chart: House Prices, 1900-2011. Real house price index (1890 = 100), with labeled episodes: Great Depression, World War II, 1970s Housing Boom, 1980s Housing Boom, New Homebuyer Tax Credit.
NOTES: The 1970s and 1980s booms were largely regional. Between 1975 and 1980, real house prices in California rose 60 percent, affecting national home price indices. Price increases in the 1980s occurred in the Northeast and California. Data are annual through 1953, and quarterly thereafter. Prices are indexed to 1890=100.
SOURCES: The New York Times; Shiller, Robert. Irrational Exuberance. Princeton, N.J.: Princeton University Press, 2005.]

Those who believe that a recovery in the housing sector
is a prerequisite for economic recovery more broadly point
out that housing has led the way after previous recessions.
Typically, housing contributes about a half percentage point
to overall GDP growth in the two years following a recession; throughout much of the current recovery, housing’s
contribution has actually been negative. Previous recessions
weren’t precipitated by a boom and bust in housing like that
which occurred in the 2000s, however, so it might not be
surprising that the current recovery is different.
Since the end of 2011, there have been indications that
the economy is gaining strength: The unemployment rate
has declined from 9 percent to 8.1 percent since September,
and consumer spending is on the rise. There are also signs
that the housing market might begin to improve in 2012.
New housing starts picked up at the end of 2011 and beginning of 2012, and some forecasters predict that prices will
finally hit bottom in 2012. In March, the Pending Home
Sales Index published by the National Association of
Realtors increased 4.1 percent, and is 13 percent above its
level one year ago. If these trends continue, a recovery in
the housing sector could be the natural consequence of
economic growth more broadly. RF

READINGS

Blinder, Alan. “How to Clean Up the Housing Mess.”
The Wall Street Journal, Oct. 20, 2011.

Foote, Christopher L., Kristopher S. Gerardi, and Paul S. Willen.
“Negative Equity and Foreclosure: Theory and Evidence.” Federal
Reserve Bank of Boston Public Policy Discussion Paper No. 08-3,
June 5, 2008.

Foote, Christopher L., Kristopher S. Gerardi, Lorenz Goette, and
Paul S. Willen. “Reducing Foreclosures.” Federal Reserve Bank of
Boston Public Policy Discussion Paper No. 09-2, April 8, 2009.

“The U.S. Housing Market: Current Conditions and Policy
Considerations.” Federal Reserve Board of Governors White
Paper, January 4, 2012.

Weinberg, John A. “No Quick Fix for the Housing Market.”
Region Focus, Fourth Quarter 2011, p. 56.


Small Airports
Support Flexible Flying
BY BETTY JOYCE NASH

A computer screen displays tiny blue airplanes
tracking toward the Washington, D.C., area from
across the United States. At the moment, flights
are en route from Texas, North Carolina, and
Florida. This traffic is headed, not for Dulles International
or Reagan National airports, but for Manassas, Va.,
population 38,000.
Manassas Regional Airport is one of 4,247 publicly owned
general aviation, or GA, airports nationwide — airports that
handle nonmilitary and nonscheduled flights. There are
57 public general aviation airports in Virginia, of which
Manassas is the busiest. In the Fifth District, North
Carolina has the most of these, about 62, while Maryland has
the fewest, with about 16, according to the 2011-2015
National Plan of Integrated Airport Systems of the Federal
Aviation Administration (FAA). Though largely invisible
to the public, these airfields play an essential role in the
nation’s transportation infrastructure.
They enable hassle-free and flexible flying for recreation,
business, and public service — including medical evacuations, law enforcement, and disaster relief. What’s more,
these smaller fields declutter air and runway space at commercial airports such as Dulles, 14 air miles and 18 road miles
from Manassas. That leaves commercial airports free to
handle higher passenger and cargo volumes. Dulles itself put
Manassas Regional on the map, nearly 11 years ago.

Manassas to Mars
On 9/11, the FAA shut down civilian aviation traffic across
the country. When flights resumed at Dulles several days
later, only scheduled airline flights were allowed. The other
traffic had to go someplace, and that someplace was
Manassas.
“We were jam-packed with airplanes,” recalls Juan Rivera,
director of Manassas Regional. Operations — takeoffs and
landings — soared to 800 a day, from roughly 360, and limos
lined up 10 to 12 deep, waiting for Washington VIPs.
That went on for weeks, until Dulles was allowed to reopen
to GA flights. Manassas can accommodate jet traffic,
which requires a runway length of about 5,000 feet. Some
aviation customers stayed. They liked the convenient and
less-crowded Manassas airport, and the lower landing fees.
Before the 2007-09 recession, Manassas handled about
139,000 operations annually. Now, they’re at around
100,000. The recession hurt not only airports but aviation
in general, including aircraft manufacturing. Fewer planes
are flying. But on a Tuesday morning last winter, traffic
flowed steadily along the airport’s two runways. Student
pilots practiced in single- or twin-engines and business jets
came and went.
Business leases form the backbone of a GA airport’s
financial sustainability, according to David Byers of the
Aviation Institute at the University of Nebraska at Omaha.
“You can only charge the aviation community so much for
fuel and parking and other services,” he says. It’s really all
about managing real estate.
Manassas Regional pays its own way through leases, not
unusual for a busy general aviation airport outside a metro
area. But most publicly owned GA airports can’t sustain
themselves without a subsidy, especially in remote spots
where revenue opportunities are few and far between.
New leases have been a tough sell since 2007, but
Manassas Regional’s existing hangars are mostly full. There
are five flight schools on site, including one for helicopter
pilots, and myriad other businesses, including one that will
re-upholster or otherwise refurbish plane interiors.
And then there’s the small army of Ph.D. scientists working in aerospace at firms like Optical Air Data Systems and
Aurora Flight Sciences. Aurora relocated to Manassas
Regional in 1991, and flies staff to plants in West Virginia and
Missouri and another R&D center in Massachusetts.
Although there’s no launch pad, the firm’s longtime work on
its Mars orbiting detection device has sparked the slogan
“Manassas to Mars.”
For typical GA airports that need public subsidies,
economic development provides one rationale. Airports
rank at the top of economic developers’ must-have lists.
If there’s no airport nearby, says Virginia Department of
Aviation director Randy Burdette, site location scouts lose
interest. Without a runway, some businesses just can’t do
business.

[Photos: Capital Aviation Instruments & Avionics services and installs avionics at the Manassas Regional Airport; the 35-year-old firm also refurbishes aircraft. Aviation Dept. Manager Scott Moore pilots the Luck Companies’ Beechcraft to dozens of general aviation airports, mostly in the East. PHOTOGRAPHY: BETTY JOYCE NASH]

The Time Machine
The Rock Hill-York County Airport in South Carolina is one
of several satellite airports ringing Charlotte. The
airport is a preferred fly-in spot for companies doing business locally. One of these is Luck Companies, a firm founded
in 1923, and headquartered in Manakin-Sabot, Va., near
Richmond. Luck Companies includes several businesses —
crushed and architectural stone, real estate development,
and clay tennis courts — whose clients and suppliers are
scattered far and wide. The firm’s pilots fly its nine-seat
turboprop plane from its own landing strip to dozens of the
nation’s general aviation airports; it celebrates four decades
of business flying this year.
For the types of products and services handled by Luck
Companies, often there’s no substitute for meeting face to
face. When the firm begins business discussions, locating
the nearest GA airport is critical, says Luck’s aviation
department manager Scott Moore. He points to a map
studded with thumb tacks representing each GA airport
they’ve flown into.
The plane saves time. Moore routinely flies associates,
for example, to the firm’s Leesburg, Va., quarry. The trip is
half an hour by air compared to two and a half hours by car.
Everyone’s back by lunchtime.
“What we’ve created is a time machine,” he says, referring to the firm’s Beechcraft King Air. The firm has flown
more hours than ever during the recession. Moore says that
none of the architects, designers, or customers they’ve flown
has failed to place a major order.
The firm prefers the Rock Hill-York County Airport to
Charlotte-Douglas International when associates fly to the
studio in Pineville, N.C. In Charlotte, Moore says, they
would pay a $150 landing fee; at Rock Hill, they pay $50. But
the fee plays only a minor role. The real attraction is the
proximity of the local GA airport and avoiding the traffic
jams that can occur at commercial hub airports. Once
Moore spent 45 minutes waiting for takeoff in Newark, N.J.
That sounds like a trivial delay to today’s typical air traveler,
but time is money. It’s an unheard-of holdup in the GA
world. And then there are ground delays. At a GA airport,
the rental car is ready to go. “At worst, they have to walk in,
sign paperwork, get the keys, and walk out. They are at the
airport five minutes, no more than that ever,” Moore notes.
When Luck opened its studio in Charlotte, the firm ran
daily flights to bring in experts who had worked on the

Richmond studio. “We saved over $300,000 in fees by using
the plane,” he says. “If they’d had to go down there and
spend the night, all that time is on our clock. We’re able to
get them home for dinner.”
A public GA airport also serves just about every county in
South Carolina; the busiest handles about 60,000 operations in a year. Greenville Downtown Airport opened in 1928
as the city’s original passenger airport, and at the time was
conveniently located for textile business travelers. During
textile shows in that industry’s heyday, the airport’s apron
was crowded with planes, says airport director Joe Frasher.
Annual operations in the late 1970s climbed to 100,000.
Today, 70 percent of the 205 planes based at Greenville
Downtown Airport are business planes. This airport, unlike
many smaller ones, can fund itself. Times have been better,
but traffic picked up in 2011 by 6 percent over 2010.
The much-smaller textile industry may still fly employees
to remote plants, but most of the airport’s business nowadays comes from firms like Michelin and GE and the
Clemson University International Center for Automotive
Research. Three flight schools also operate on the grounds
and two charter companies base planes there. “Clemson and
the Medical University of South Carolina are also in and
out,” Frasher says.
These less-crowded GA airports also serve another
business market: show business. Celebrities of all stripes can
come and go without fanfare. Entertainers who play
Greenville’s venues regularly use Greenville Downtown
Airport. One day as Frasher was leaving, he noticed the
singer-songwriter Prince waiting for his limo in the airport
lobby. The driver, in a mix-up, had headed to the Greenville-Spartanburg International Airport instead. But Frasher says
Prince’s anonymity was assured, despite his flashy getup:
“There was no one to hide from.”


These days Eric Ramsdell is preparing for a whopper, the
Democratic National Convention, due in Charlotte this
September. Ramsdell is airport manager at the Rock Hill-York County Airport. Though convention traffic will spill
over to at least three other GA airports near Charlotte,
Rock Hill’s airport is only 20 minutes by car from the convention center. Ramsdell expects traffic not only from
delegates but also from corporate planes as companies from
all over the nation converge on Charlotte to host events.
Ramsdell especially welcomes this boost after the downturn. Annual operations peaked in 2007 at roughly 45,000;
they currently stand at about 35,000.
Private flying also is a secure way for business travelers to
work. “They can work in their own airplane without the fear
of someone looking over their shoulders,” Ramsdell says.
Roughly 11,000 U.S.-based firms of all sizes depend on
business air travel, according to the National Business
Aviation Association, and most, about 75 percent, use one
small turbine-powered plane or its equivalent. Business
aviation had been building until the recession. Sales of
business aircraft went from about $12 billion to $27 billion
between 2003 and 2008 before falling to $20 billion in
2009, along with corporate profits, according to Richard
Aboulafia, an aviation consultant with the Teal Group
Corp. He describes business aviation as a small but high-value niche.
The planes, he says, are not toys of the rich. “A lot of it is
corporate workhorse stuff — not having to ship [employees]
through a hub and wasting time.”
Luck’s Beechcraft is such a workhorse. The firm has also
used the plane for humanitarian trips: ferrying supplies to a
hurricane-damaged island and flying combat-wounded veterans and families. Businesses commonly use company craft
for such purposes. After Haiti’s earthquake in 2010, business
planes flew more than 700 trips, transporting passengers
and delivering supplies to relief groups. The Corporate
Angel Network matches private corporate jet schedules
with cancer patients who need transport.

The Aviation Dream vs. Reality
Though general aviation grew through much of the last
century, it never reached the heights predicted after World
War II, when enthusiasts thought American garages would
eventually house planes as well as cars. The market for
personal aircraft peaked by 1980. The industry faltered, in
part, because of legal actions against manufacturers, which
drove up prices. GA airplane shipments topped out in 1978
at 17,811; the nadir came in 1994, when 929 single- and multi-engine planes, turboprops, and turbojets were made. (That
year, Congress passed product liability reforms to aid the
industry’s rebirth.)
But innovations could yet transform flight, as demand for

business jets grows, if slowly. The aviation dream is alive and
well, certainly for Honda Aircraft Co. chief executive
Michimasa Fujino, who in the 1980s began the R&D for
“an advanced air-bound Honda.” The company employs
between 600 and 700 people in Greensboro, N.C., in a half-million-square-foot factory. Spokeswoman Kathleen Bangs
says the firm expects certification from the FAA in 2013 for
two proof-of-concept planes. A third is to be launched soon.
These are known as advanced light jets, comparatively fast,
light, and fuel efficient, with a range of 1,300 miles. The jet
will use substantially less fuel, which will make it more
affordable. Bangs says, “Jet fuel is a huge factor in terms of
operational costs.”
The market for these and other business jets will recover
slowly, says consultant Aboulafia. “Most forecasters, including me, seem to think we’ll get back to where we were
[pre-recession] in 2015 or 2016.”
The GA airports could gain as service to small cities
dwindles. Major airlines have cut regional jet use to small
cities because of rising fuel prices. It’s less expensive for
airlines to spread costs among many passengers on a larger
jet than it is to fly fewer passengers on a smaller one. In the
past two years, 27 cities have lost commercial air service
despite these airports’ subsidies.
All this may point to increasing charter business, including “fractional” leasing. Buyers may pay for a “share” of time
on the jet. Though the number of fractional share owners
fell slightly from a peak of 5,179 in 2008 to 4,862 in 2010,
that number could bounce back with an overall revival of the
small-jet market. The FAA forecasts that the jet fleet —
charter planes, among others — will average a growth of
4 percent a year over the next two decades. Flying hours for
jets are forecast to average 5.3 percent annual growth from
2012 to 2032, according to the FAA. (In contrast, the number of flying hours for piston-driven planes, the smaller
aircraft more likely to be used for recreational flying, is
expected to remain flat over the next decade.)
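
Compounded over the 20-year forecast horizon, those growth rates are substantial; as a rough illustration (the compounding is ours, not the FAA’s):

```latex
\[ 1.04^{20} \approx 2.2 \qquad 1.053^{20} \approx 2.8 \]
```

That is, 4 percent annual growth implies the jet fleet roughly doubles over two decades, and 5.3 percent annual growth implies jet flying hours nearly triple.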
Hank Brown owns the Greenville Jet Center, which
handles fuel sales and services, such as arranging charters
and operating a flight school, at the Greenville Downtown
Airport. “We are going to see more charter business,” Brown
says. “Every part of aviation was on the increase just prior to
the recession.”
Between dwindling commercial service, the hassles of
security check-in, and flight delays at hub airports, flights in
and out of GA airports may present a practical alternative,
says Burdette. “Some businessmen will say, ‘You know what?
My overhead is so high I’ll hire an air charter. I can be there
quicker and more directly and get home sooner.’”
RF
Emporia-Greensville Regional Airport is only one of rural
America’s general aviation airports. See sidebar on our website.

READINGS
“FAA Aerospace Forecast Fiscal Years 2012-2032.” U.S. Dept. of
Transportation, Federal Aviation Administration, 2012.


“General Aviation Statistical Databook and Industry Outlook
2010.” General Aviation Manufacturers Association.

INTERVIEW
John B. Taylor

[Photo of John B. Taylor. PHOTOGRAPHY: LISA HELFERT]

Stanford University economist John Taylor has straddled the worlds of academia and government service,
with distinguished, complementary careers in each.
His academic work has informed his efforts as a policymaker, and his experience in government has provided
insights about potential research questions and
how to frame them. The most well-known example of
the latter is the “Taylor Rule” — a straightforward,
concise formula designed to guide monetary policy and
largely remove discretion from the policymaking
process — which he developed after working at the
Council of Economic Advisers from 1989-1991. Many
observers have stated that the Fed and other central
banks have, in large measure, implicitly followed the
Taylor Rule, though Taylor argues the Fed strayed from
the rule in the mid-2000s.

Taylor has been a critic of the Fed’s actions during
the financial crisis and the subsequent recovery,
arguing that the Fed has unwisely engaged in credit
policy, threatening its independence and bringing into
question serious constitutional issues about which
institutions of government have the power to disburse
funds. He also has expressed reservations about the
2009 fiscal stimulus package. While many policymakers
and some economists have argued that the
“multiplier effect” of those government expenditures
was on the order of 1.5, Taylor and colleagues have
estimated that it was probably closer to zero or perhaps
even negative.

In his most recent book, First Principles: Five Keys to
Restoring America’s Prosperity, Taylor argues that the key
to economic success is economic freedom, which has
five defining principles: a predictable policy framework,
the rule of law, strong incentives, a reliance on markets,
and a clearly limited role for government. He then
explains how those principles can be applied in practice
to a number of current policy issues.

Aaron Steelman interviewed Taylor in Washington,
D.C., on Feb. 24.

RF: What were the policy events and theoretical
developments in the economics profession that helped
lead you to formulate what has been dubbed the
“Taylor Rule”?

Taylor: I first presented it at a Carnegie-Rochester conference
in November 1992. But I would go back quite a bit
before that. In some sense, I have been interested in policy
rules ever since I started studying economics. I had a professor
as an undergraduate at Princeton named Phil Howrey,
who taught time series analysis and his approach to macroeconomics
was to treat the economy very much as a dynamic
system, consistent with what we would today call dynamic
stochastic structural modeling. So I got interested in policy
from the point of view of feedback rules to stabilize a
dynamic economic system, and in many respects my whole
research focus has been from that perspective. Early on,
I worked on how to design policy rules when you don’t know
the model, and you have to do econometrics simultaneously
with your policy evaluation. I also did some stuff on optimal
policymaking when people are learning about the impact
of policy. Later on, I built models with sticky prices and
rational expectations in order to evaluate policy rules. And
in the 1980s, I was developing multicountry models with the
same purpose.
By the late 1980s, it appeared that we had done all this
research on policy rules and it didn’t seem like the central
bank was explicitly following that approach. So I decided it
was time to get practical and develop something workable
rather than a theoretical policy rule with, say, 20 variables on
the right-hand side. Another big change was moving toward
using the interest rate rather than the money supply as the
policy instrument in the feedback rules. So it goes back a
long time, and that rule was part of something I had been
looking for for many years. In a way, I did not think of the
1992 presentation as a big deal. I just was trying to find a way
to write down something that was consistent with all the
research I had been doing but also simple enough and
workable enough — and consistent with the way the Fed was
thinking about and doing policy, which was focusing on the
federal funds rate, even though that wasn’t talked about much.

RF: Could you describe what the Taylor Rule says about
how central banks ought to generally respond when the
inflation rate deviates from its target?

Taylor: The rule is quite simple. It says that the federal
funds rate should be 1.5 times the inflation rate plus .5 times
the GDP gap plus one. The reason that the response of the
fed funds rate to inflation is greater than one is that you want
to get the real interest rate to go up to take some of the
inflation pressure out of the system. To some extent, it just
has to be greater than one — we really don’t know the number
precisely. One and a half is what I originally chose because
I thought it was a reasonably good benchmark.
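
In symbols, with i the federal funds rate, π the inflation rate, and y the GDP gap, all in percent, the rule as Taylor states it here can be written (the notation is ours; the coefficients are his):

```latex
\[ i = 1.5\,\pi + 0.5\,y + 1 \]
```

With inflation at 2 percent and a zero gap, the rule prescribes i = 1.5(2) + 0.5(0) + 1 = 4 percent, leaving a 2 percent real rate; because the coefficient on inflation exceeds one, the real rate rises whenever inflation does.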

RF: From observing policy actions and now reading the
transcripts of FOMC meetings, to what extent, in your
opinion, has the Fed implicitly adopted something
like the Taylor Rule and when has it deviated from its
general framework?

Taylor: The biggest period where the deviations are apparent
is the 1970s. It would have been a terrible policy rule if it
had been estimated in the 1970s. I never really thought of it
as an estimate. I thought of it more as a recommendation.
I also think there were significant deviations from the rule
from 2003 to 2005, when basically there were rate cuts
greater than I think any reasonable interpretation of the rule
would suggest. So I think the period when the rule was
followed fairly closely was roughly from the 1980s through
2003. The way I think about it is that the Fed’s actions
have been largely consistent with the rule without using it
explicitly. We do know from the transcripts, though, that the
rule and other rules have been referred to fairly commonly.
I have never been to an FOMC meeting but a number of
members of the FOMC have talked favorably about it, from
Janet Yellen to Charles Plosser, so you know it’s out there.
In the late 1990s Chairman Greenspan told me that it
explained about 80 percent of what they were doing during
his tenure, but that doesn’t mean that he was looking at it
explicitly. And there is evidence that a number of foreign
central banks have acted in ways that are consistent with the
rule, which surprised me somewhat because I originally had
U.S. policy in mind.

RF: Empirically, do you observe any shortcomings with
the Taylor Rule as it was initially conceived?

Taylor: The worry I have always had, and many people have
pointed this out, is that it calls for a response to the deviation
of potential GDP from real GDP, and potential GDP is
hard to estimate. So there is an inherent uncertainty and it’s
much worse in emerging market countries where potential
growth is hard to estimate — China, for example. So people
have various ideas about that. One is to lower the coefficient
on the gap. But, of course, if you lower the coefficient too
much, then you are not responding adequately when there
is a recession, so that seems to me not to be the answer.
I basically now say, let’s just take an average of various
people’s estimates of potential growth and use that to
calculate the gap.

I think that is the biggest concern. You want central
banks to respond to inflation and you want them to avoid
large discretionary deviations so they don’t create their own
instability or inflation. This is related to the issue of the
Fed’s mandate. I think there should be a single mandate:
price stability. But it appears contradictory when the rule
has the central bank responding to real GDP. It is not
contradictory, however, because it is optimal to respond to
real GDP even if you are only interested in price stability
because that is indicative of where inflation is going.
But that is hard to explain sometimes.

RF: How much do you think the Fed’s close interaction
with the Treasury during the crisis — and the central
bank effectively conducting credit policy — has
compromised the Fed’s independence?

Taylor: I think the Fed engaging in credit allocation has
been a problem and has led to a sacrifice of its independence.
There are many reasons that it happened. But it really
doesn’t make much difference whether the Fed chose to
voluntarily get involved in fiscal policy or if it was persuaded
to do so. It makes people question why you need an
independent central bank to conduct monetary policy. I also
think it raises some serious constitutional issues about which agencies of
government are given the authority
to appropriate funds and whether
that was violated during the crisis.
John B. Taylor

➤ Present Position
Mary and Robert Raymond Professor of Economics, Stanford University, and George P. Shultz Senior Fellow in Economics at the Hoover Institution at Stanford University

➤ Previous Faculty Appointments
Princeton University (1980-1984) and Columbia University (1973-1980)

➤ Government and Business Positions Held
Under Secretary of the Department of Treasury for International Affairs (2001-2005); Member of the President’s Council of Economic Advisers (1989-1991); Senior Staff Economist at the President’s Council of Economic Advisers (1976-1977); Economic Analyst, Townsend-Greenspan and Company (1978-1981)

➤ Education
A.B. (1968), Princeton University; Ph.D. (1973), Stanford University

➤ Selected Publications
Author or editor of numerous books, including First Principles: Five Keys to Restoring America’s Prosperity (2012); Getting Off Track: How Government Actions and Interventions Caused, Prolonged, and Worsened the Financial Crisis (2009); Monetary Policy Rules (editor, 1999); Inflation, Unemployment, and Monetary Policy (with Robert Solow, 1998); and Macroeconomics: Theory, Performance, and Policy (with Robert E. Hall, 1986); and author or co-author of papers in such journals as the American Economic Review, Journal of Political Economy, Journal of Monetary Economics, and Econometrica

RF: How should the Fed reduce its balance sheet?

Taylor: Well, we can learn from what the Fed did after 9/11,
when it increased reserves quite a bit. There was a real
liquidity crunch in the markets and it was well within the
Fed’s role to respond that way. During the worst of the 2008
panic, the Fed also provided funds that increased the balance
sheet and if it had stuck to the exit policies that it pursued
following 9/11, those reserves would have been reduced
pretty quickly. But instead the Fed moved after the panic
into interventions in the mortgage market and the
medium-term Treasury market. Those actions, it seems to
me, raised many precedential issues. In fact, in the early part
of 2009, Don Kohn was on a panel with me at a conference;
I argued that while the Fed can talk about these temporary
interventions during the panic, I would worry that if the
recovery is slow, it will continue to do these sorts of things —
not because there is a liquidity problem, but just because the
economy is still sluggish. Kohn said, no we won’t do that.
But that, in fact, was what the Fed did.

So now we have a situation where there are massive
interventions that are not conventional monetary policy and
we need to get away from that. However, I’m not sure the
Fed will get away from such policies, because now people are
writing papers, including academic papers, which say the
Fed can and should do these things: It can have its role in
terms of setting the interest rate and it also can use its
balance sheet to supposedly stimulate growth. The reason it
can do that, people argue, is that the Fed now pays interest
on reserves and thus it can ignore the supply and demand for
money or reserves when setting the interest rate. I think
that is not a good approach. It is very unpredictable and it
will inherently raise questions about the independence of
the Fed. So I would like the Fed to go back to a world where
the interest rate is determined by the supply and demand of
reserves. That would prevent this extra instrument from
playing such a big role.

The other thing that happened during this episode was
that the interest rate got to the zero lower bound. That
generated this idea that something else had to be done, that
the balance sheet had to increase a lot. That is not the
implication. The implication is that when the interest rate is
at the zero lower bound, you should make sure money
growth doesn’t fall. Whatever aggregate you look at, you
need to make sure it doesn’t decline. That is much different
than massive quantitative easing.

RF: On balance, how effective were the fiscal-policy
actions implemented to help the economy recover?
If they were relatively ineffective, in your view, were
there structural problems in their design that led them
not to have the intended stimulative effects?

Taylor: In November 2008 I was asked to testify before the
Senate Budget Committee. As sort of a play on words, I said
we shouldn’t do “temporary, targeted, and timely” policies
but rather we should do “permanent, pervasive, and
predictable” policies and then I outlined four steps. I thought
that approach would have been promising. But in January
2009 a government white paper was issued that argued that
the multiplier of a temporary targeted stimulus was going to
be 1.5, and that was a big disappointment to me because
everything we had been teaching our students over the years
suggested that was not the case. I wrote a paper with some
colleagues and we arrived at the conclusion that the effect
would be one-sixth of what was estimated. In later research
I found that even that estimate turned out to be optimistic.
In fact, despite its large size, the 2009 stimulus did not
result in much of an increase in government purchases.
There were two reasons for that. One, there was virtually no
increase in federal purchases of goods and services. Second,
the logic that money sent to the states would be used for
infrastructure and to hire people to build roads turned out
to be flawed. It turns out, the best we can tell, the grants
were not used for increased purchases.
Instead, they were largely saved and
the states and local governments
borrowed less. So while we debated
what the multiplier was, the overall
effect of the stimulus was probably
negative because, to the extent that
the states were required to increase
transfer payments — in particular,
Medicaid — they actually reduced
government purchases, including
infrastructure.
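
Taken at face value, the figures Taylor cites imply a very small multiplier (his numbers; the arithmetic is ours):

```latex
\[ 1.5 \times \tfrac{1}{6} = 0.25 \]
```

That is roughly 0.25 rather than 1.5, consistent with the “closer to zero” characterization in the introduction, even before his later research lowered the estimate further.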

RF: In your most recent book,
First Principles: Five Keys to
Restoring America’s Prosperity, you
talk a lot about entitlement reform. Could you discuss
what you have in mind?

Taylor: The proposal is to bring federal spending as a share
of GDP to what it was in 2007. It went up a lot from 2008-2010 and according to the budget that was proposed last
year, it was going to stay out there at 23 percent or 24 percent
of GDP. The alternative proposal is to gradually bring spending back down to 19.5 percent, and thereby undo the binge
that occurred during the crisis. That raises the question of
what we do about entitlements.
Social Security is, in some sense, easier to deal with
because the current program increases the amount paid
per beneficiary in real terms by substantial amounts. For
example, a 30-year-old today is going to get much more in
retirement payments in real terms than a 60-year-old. So
what you need to do is to adjust that formula so that the two
receive the same level of benefits adjusted for inflation.
There is no reason why a younger person should expect to
receive more benefits than a person about to retire. If you do
that, you basically deal with the Social Security problem.
That’s my proposal and it is not unique to me, but I think
characterizing it in this way seems helpful. The counter to it
is that you should get benefits proportional to what you
earned over your lifetime, and my question is, what is Social
Security really for? I think Social Security was created to
make sure you had a good retirement and did not live in
poverty.
In terms of Medicare, there doesn’t seem to be too much
disagreement about how much growth there should be.
Roughly speaking, both parties agree that it should not grow
much faster than GDP. The difference is how do you do
that? One way is that you set the amount each beneficiary
gets so that it does not grow too fast and then you have individuals decide what type of insurance they will receive,
within limits, of course. And you also effectively means-test
it so that you don’t have as much growth for wealthier people as you do for poorer people, and you risk-adjust it. So
that seems to make sense to me. You are using the market
and since you need to control the growth, why not do it in a
way that is the least painful for people.


RF: In First Principles, you
discuss the dangers of “crony
capitalism.” Historically, when
one thinks of that term it is
often used in the context of
developing countries where the
rule of law is not particularly
well established, but you make
the case that we also now see it in
the United States. What do you
have in mind?

Taylor: The United States has traditionally been good on this due to
constitutional checks that have prevented the government from arbitrarily helping certain
groups or individuals. But as economists, I think we didn’t
emphasize how important those constitutional provisions
are. An example is how we thought about the Soviet Union
following its implosion. Many people thought that once the
new regime abandoned central planning and began to
embrace markets everything would be fine, but what has
been missing in Russia is the rule of law and that has caused
a lot of problems.
Compared to many developing countries, the United
States still does not have a significant problem with cronyism, but we are slipping a bit. It is an issue that is hard to
explain in the abstract. It is something that almost needs to
be experienced to be well understood. Some of the examples
of the United States slipping are fairly subtle. The bailouts
are one, where we skipped over the bankruptcy code and
even when we did use bankruptcy in the case of automobiles,
we gave preference to certain creditors who were not next in
line. In terms of the way the new health care law is applied,
there are a lot of waivers that are being given and the reasons
are not transparent. I think the same is true with too big to
fail, with more powerful entities receiving protection. So I
think cronyism is there and is a real danger. The other part
of crony capitalism that economists have talked about is regulatory capture, and I think we have a lot of evidence of that
in the case of Freddie Mac. One of the reasons for the
success of the deregulation movement with the transportation industry in the late 1970s and 1980s is that people
recognized that was a case of regulatory capture.
We need to deal with this issue for a lot of reasons,
including maintaining people’s faith in the market system.
When they see these inherently unfair policies, their trust is
naturally eroded.

RF: As someone who has spent decades in academia as
well as held high-level policy advisory and policymaking
positions, what have you taken from your experience in
government and how has it influenced your academic
work? And how did your academic work influence the
way you approached issues as a policy adviser and
policymaker?

Taylor: The short answer is a lot — and in both directions.
As an example, going back to the first Bush administration,
in the Economic Report of the President we decided that it
would be good to write down the advantages of policy rules
based on research that we talked about earlier in this conversation. Not everything was adopted, of course, but I think it
was useful to get it down on paper for policymakers and their
staffs to see. In general, I think having economists in government is a good idea, and that applies to many different
positions. When I was Under Secretary of the Treasury for
International Affairs, my job was mainly operational but
having an economic perspective was very helpful. I had
always been attracted to that job, by the way. When I
thought about the position I imagined I would be negotiating economic reform agreements with international
counterparts, but things turned out to be very different
because of 9/11. Instead a big part of the job was setting up a
new currency in Iraq and getting terrorist financing under
control. It was a fascinating time and a very challenging
experience.
When I went back to academia in the 1990s after my
experience at the CEA, I might not have come up with the
Taylor Rule if I hadn’t been in government. That was a
process of thinking through something that I believed
would help generate policy as well as be generally acceptable
to many at the Fed after having observed more closely how
the Fed operates. In fact, Alan Greenspan says that the Fed
should deserve an assist for the Taylor Rule because of
the type of conversations I benefited from when I was in
government. My undergraduate teaching also has improved
from being in government and having to explain economic
concepts to noneconomists. Also, in academia there are so
many different things that you can work on, but there are
only a few that are really helpful for policy. So you get a sense
of where you should focus your attention.
RF: Has the profession moved too much in the
direction of work that ultimately will not have policy
implications?
Taylor: There is certainly a place for work that will have no
policy implications at all. But I do sometimes think that
there could be more research that relates to policy. After
my stint in government in the early 1990s, I went back
to Stanford and started a series of conferences with the
San Francisco Fed. The idea was to get academic monetary
economists together with monetary economists involved in
the formation of policy. An even better example, of course,
that goes back much further are the Carnegie-Rochester
conferences. Karl Brunner and Allan Meltzer would consistently try to get topics that academics could work on that
would be very practical. I think the work that Brookings
does is very much the same way. There is a tendency among
academics to shy away from practical or operational policy-related topics, unfortunately. Policy research does not always
lend itself to elegant work of the kind that is appealing to
academics and for which the professional rewards are very
great. However, the long-run rewards from research on more
policy-related topics can be very large. Keynes was not a
particularly technical economist but he was interested in
policy and his work became very influential among academics because of that. The profession is still trying to formalize
many of the things that he wrote in the 1930s.
RF: What are the big unanswered — or understudied —
questions in macroeconomics?
Taylor: I have been saying for a while that the nexus
between finance and macro and trying to understand how
the monetary transmission process works is very important.
It’s not like we haven’t been trying to improve our models in
this direction for a while. The flow of funds data were originally collected for that purpose but I think the progress has
been kind of disappointing. To me, it’s not just the banking
sector, although that’s a big part of it. Rather, it’s the whole
financial system and how it interacts with the real economy.
There is a worry I have about some of the models that we
are using in macro, in particular the New Keynesian or
dynamic stochastic general equilibrium models. They are in
a funny halfway place between fitting some theory and fitting the data. To me, they have prior distributions which are
too precise and as a result they pay too little attention to the
data. But that’s a very general statement and I don’t have any
alternatives right now. But I think that quantitative macro
modeling is an area where more work needs to be done.
RF: Which economists have been most influential in
shaping your research agenda and your thinking about
economic policy issues?
Taylor: As I mentioned at the beginning of this conversation, Phil Howrey was important when I was first starting
out in economics. He really helped me to begin thinking
about macroeconomics in a much different way than if I
just had taken a standard undergraduate macro class. I would
also say my Ph.D. thesis adviser Ted Anderson, a mathematical statistician who gave me a way to think about models
and mathematics in a rigorous way, was very important for
my research. On policy issues, I learned a great deal from
Milton Friedman, from both reading his work about policy
rules which I was very interested in and then being a colleague of his. I have benefitted from a lot of interaction with
George Shultz whose experience as a statesman has been
helpful to my thinking about policymaking in practice. Alan
Greenspan also was an influence. When I first went to work
as an economist in Washington at the CEA in the Ford
administration he was the chairman, and I liked his
approach to data and ways to think about getting deeper
into the economic analysis. I later went to work for him at
his firm in New York and doing forecasting work there
helped me later in my career. There are so many people who
I have been fortunate to meet and get to know.
RF


ECONOMIC HISTORY
The Counterfeiting Weapon
BY KARL RHODES

Attacks against American currency began in 1776
During the Revolutionary War, Thomas Paine wrote
an open letter to British Gen. William Howe to
express outrage over the latest Redcoat atrocity.
“You, sir, have the honor of adding a new vice to the
military catalogue,” Paine charged, “and the reason,
perhaps, why the invention was reserved for you is because
no general before was mean enough even to think of it.”
What was this new outrage that made Howe seem
meaner than Genghis Khan? Firing upon surrendering
troops? Abusing prisoners of war? No, worse than that:
Howe was printing counterfeit money.
British forces were not the first to use counterfeiting as a
weapon of war. “Efforts in war or peacetime to undermine
the economies, societies and governments of adversaries by
falsifying their money have proliferated since ancient
times,” wrote journalist John K. Cooley in his 2008 book,
Currency Wars.
The British, however, wielded this monetary mace with
exceptional skill. Several months before the Colonies
declared independence, the British started counterfeiting
Continental currency (continentals) aboard the HMS
Phoenix, a gunboat anchored in New York harbor. By April
1777, New York newspapers were running the following
notice: “Persons going into other Colonies may be supplied
with any Number of counterfeited Congress-Notes, for the
Price of the Paper per Ream. They are so neatly and exactly
executed, that there is no Risque in getting them off, it being
almost impossible to discover, that they are not genuine.”
Plenty of colonists demonstrated their loyalty to the
crown by passing counterfeit continentals. Perhaps the most
notorious of these Tories was Stephen Holland, a well-respected resident of Londonderry, N.H., who organized an
elaborate network of friends and acquaintances, according
to the late Kenneth Scott, a historian at the City University
of New York who documented Redcoat counterfeiting in his
1957 book, Counterfeiting in Colonial America. Holland was
captured, but he escaped from prison before the Colonial
authorities could execute him.
“Damn him,” said New Hampshire patriot John
Langdon, who was helping to finance and fight the war.
“I hope to see him hanged. He has done more damage than
10,000 men could have done.”
Was Langdon exaggerating the impact of the British
counterfeiting weapon? Benjamin Franklin didn’t think so,
according to Scott. “Paper money was in those times our
universal currency,” Franklin wrote. “But, it being the instrument with which we combated our enemies, they resolved to deprive us of its use by depreciating it; and the most
effectual means they could contrive was to counterfeit it.”
Franklin grasped a nuance that was perhaps lost on Paine
and Langdon: Printing counterfeit continentals was indeed an
instrument of war, but so was printing genuine continentals.
The British lost the war, but they conquered the continental. “The artists they employed performed so well, that
immense quantities of these counterfeits … were circulated
among the inhabitants of all the States, before the fraud was
detected,” Franklin continued. “This operated considerably
in depreciating the whole mass, first, by the vast additional
quantity, and next by the uncertainty in distinguishing the
true from the false.”
Counterfeiting contributed to the complete devaluation
of the continental, but the notes probably would have lost
their value anyway because the Continental Congress
printed enormous quantities of them to fund the war. In a
letter to John Jay, president of the Continental Congress,
Gen. George Washington noted that “a wagon-load of
money will scarcely purchase a wagon-load of provisions.”

Colonial Counterfeiting
Should controlling the currency be a sovereign right or a
provincial prerogative? That question arose some 85 years
before the Revolution, when individual Colonies started
printing their own money, asserting some independence
from Great Britain in the process. Massachusetts printed
the first Colonial paper money in 1690.
“As soon as that experiment worked reasonably well,
other Colonies joined the game,” says Stephen Mihm,
associate professor of history at the University of Georgia
and author of the 2007 book, A Nation of Counterfeiters.
“By the 1730s, I believe every one of the original Colonies
had issued paper money.”
These emerging Americans demanded substantial quantities of notes because gold and silver coins (specie) were
scarce. Colonial counterfeiters did their best to supplement
the specie supply with forged coins made of pewter and
debased gold and silver, but it was the proliferation of paper
money that begat a new breed of counterfeiters in the New
World. By the 1730s, counterfeiting was a serious problem
throughout the Colonies, according to Scott. North
Carolina Gov. Gabriel Johnston expressed concern in 1735
over “the great Number of Counterfeits, which are gone
abroad into all the parts of the Province, by the villanous
Arts of wicked and ill disposed persons.”
Great Britain reasserted its power over Colonial paper

money with the currency acts of 1751
and 1764, and “Ben Franklin cited
them as one of the Colonies’ major
grievances,” Mihm says. But on the eve
of the Revolution, American counterfeiting had surpassed British
imperialism as the No. 1 threat to
Colonial currency.

[Photo caption: The fine print in the bottom margin of this counterfeit note says, “Fac-simile Confederate Note — Sold Wholesale and Retail, By S. C. Upham, 403 Chestnut Street, Philadelphia.” Upham’s customers could clip off this bottom line and spend the $5. Image used with permission from A Guide Book of Counterfeit Confederate Currency, ©Whitman Publishing, LLC.]

Currency Chaos
Following the Revolutionary War, newly minted American citizens greatly preferred coins over paper money because of their bad experience with continentals, both spurious and genuine. “The Continental Congress itself was kind of a counterfeiter,” Mihm quips.

The Constitution gave the federal government the right to “coin money,” and it prevented the states from issuing coins or paper money. The federal government provided some currency through its national banks, but the federal charters of those banks were allowed to expire.

In 1832, when President Andrew Jackson vetoed a bill to re-charter the Second Bank of the United States, the institution was well on its way to developing a common, uniform, and exclusive currency, Mihm says. “But when that bank was destroyed, lots of new state-chartered banks sprung up in its place,” and the shortage of specie persisted. Much like the Colonies had defied British laws, the states circumvented the Constitution by empowering state-chartered banks to issue bank notes.

State-chartered banks proliferated rapidly in the first half of the 19th century, and most of them issued their own brands of notes. By the 1850s, “the money supply became a great confluence of more than 10,000 different kinds of paper that continually changed hands, baffled the uninitiated, and fluctuated in value according to the whims of the market,” Mihm wrote.

While researching his 1995 book, Illegal Tender, historian David Johnson of the University of Texas at San Antonio found a story in the New York Times from 1862 that claimed 6,000 varieties of counterfeit bank notes were contaminating the money supply. Johnson questions some of the article’s statistics, but he does not quibble with the story’s conclusion that counterfeiting was “a national evil demanding a national remedy.”

Help was already on the way. The Legal Tender Act, signed by President Abraham Lincoln in February of 1862, designated a new national currency as “legal tender for all debts public and private.” These notes employed higher-quality printing, including the use of green ink on the back of the bills that branded them as “greenbacks.” The anti-counterfeiting measures were largely ineffective, but the new notes represented a giant step toward eliminating confusion and increasing enforcement. The national currency created economies of scale for counterfeiters, but it also made counterfeiting more risky, Mihm explains. “If you were counterfeiting the notes of the Merchants Bank of Virginia, you were attacking the Merchants Bank of Virginia. If you were counterfeiting greenbacks, you were attacking the Union.”

Confederate leaders also attempted to establish a national currency in the South, but they struggled to produce “graybacks” of sufficient quality and quantity. The poor quality was an engraved invitation to counterfeiters, and several Northern printers responded by forging large volumes of Confederate currency. To circumvent counterfeiting laws, Philadelphia printer Samuel Upham expanded the lower margin of his funny money and added a disclaimer that said, “Fac-simile Confederate Note — Sold Wholesale and Retail, By S. C. Upham, 403 Chestnut Street, Philadelphia.” It was easy for his South-bound customers to clip off this disclaimer and spend the money. (Upham’s shop was one block from Independence Hall and directly across the street from the building that had housed the Second Bank of the United States.)

Upham and a New York printer, Winthrop Hilton, openly advertised their Confederate facsimiles in much the same way the British had promoted their counterfeit continentals during the Revolutionary War. U.S. government officials did not actively encourage the counterfeiting of Confederate money, but they allowed Northern printers to continue the practice. At one point, however, they falsely accused Hilton of printing real Confederate currency and smuggling it to the Confederate government. He was arrested but never prosecuted, and after they released him, he continued his counterfeiting enterprise with renewed zeal.

“I now felt pretty certain that I would no longer be interrupted: I had even persuaded myself that my avocation was patriotic,” Hilton confessed in a New York Tribune story that was withheld from publication until his death in 1906. “Had not the British Government authorized or connived at the counterfeiting of our Revolutionary currency, as a war measure?”

According to Hilton, printers in New York, Boston, and Philadelphia (including Upham) produced hundreds of millions of spurious Confederate notes, initially circulating them through newspaper vendors. Hilton expressed no sympathy “for the slave-holders, planters, and brokers who sold their crops for worthless notes; but for the poor whites and the small negro dealers who were deceived by [the bogus bills], I never ceased to entertain the keenest regrets.”

Before the end of the war, Confederate money became nearly worthless, and many Southerners were conducting business with the new national currency of the North, Mihm says. “The greenbacks were in some cases conquering the South before the Union soldiers got there.”

The Secret Service
According to Secret Service lore, Secretary of the Treasury Hugh McCulloch received Lincoln’s approval to create the agency on April 14, 1865, the same day Lincoln was assassinated. It would be many years, however, before the Secret Service started protecting presidents. In 1865, the new agency was charged only with catching counterfeiters.

The first chief of the Secret Service, William P. Wood, had been superintendent of the Old Capital Prison in Washington, and about half of his early Secret Service recruits had criminal backgrounds, according to Johnson. “They were effective because they hired operatives who were connected with the underworld,” Johnson says. “When they went out on the street and worked a case, they were talking to their friends and acquaintances of long standing.” Lacking official police powers, Wood’s operatives often made citizens’ arrests during sting operations that would qualify as entrapment by today’s standards.

Wood left the agency in 1869, and the Secret Service abandoned his questionable tactics, but the agency continued to rely heavily on confidential informants and undercover operations. The most famous of these took place in 1876, when a Secret Service informant infiltrated a gang of counterfeiters plotting to steal Lincoln’s body from its tomb in Springfield, Ill. The gang planned to use his remains to ransom their highly skilled engraver, Ben Boyd, from an Illinois prison.

[Photo caption: This bogus greenback came from a plate engraved by Ben Boyd, one of the best American counterfeiters of the 19th century. His associates tried to steal Abraham Lincoln’s body and use it to ransom Boyd from prison in 1876. Photo courtesy of the U.S. Secret Service.]

“Ben Boyd was probably worth his weight in gold to the counterfeiters,” Johnson says. “He may have been the best engraver of counterfeit notes in the 19th century.” Working with Pinkerton detectives, the Secret Service thwarted the plot just as the counterfeiters were sliding Lincoln’s coffin out of its marble sarcophagus.

In the decades that followed, the Secret Service maintained its zealous pursuit of counterfeiters and its zero-tolerance policy toward anything that remotely resembled U.S. currency. The agency confiscated artists’ renderings of money, advertising leaflets that looked like cash, even 160 boxes of play money from R.H. Macy’s department store.

“It wasn’t the play money per se as much as what it symbolized, which was a disrespect for the national currency,” Johnson explains.

The zero-tolerance policy seemed to work. In 1911, the New York Times quoted a government estimate that only 0.001 percent of the money in circulation was counterfeit. Johnson concedes that this number may have been overly optimistic, but he says there is no doubt that “the Secret Service was so effective that counterfeiting of the 19th century type was no longer a viable occupation.”

Communists and Nazis
The Secret Service may have won the war against counterfeiting in the United States, but growing international demand for U.S. currency presented new threats from abroad.

Soviet Premier Joseph Stalin, for example, ordered his intelligence service to counterfeit $100 bills in the late 1920s, according to a 1984 article by Arnold Krammer, professor of history at Texas A&M. The quality of the Soviet counterfeits was excellent, but their distribution network was severely flawed. The operation started to unravel in
1930, when Berlin police seized a huge cache of the spurious
notes at a German bank that the Soviets were using as an
international distribution center. Two years later, large quantities of the bogus bills surfaced in Chicago after con man
Hans Dechow (aka Count von Buelow) persuaded some
local gangsters to launder the money during the height of the
Christmas shopping season. Eventually, the gangsters, the
count, and their communist suppliers landed behind bars.
The most famous case of government-sponsored counterfeiting occurred during World War II, when the Nazis
identified skilled printers and engravers among concentration camp prisoners and brought them together at the
Sachsenhausen concentration camp. Under constant threat
of death, the prisoners learned the counterfeiting trade and
printed millions of bogus British notes of such high quality
that they fooled bank officials in England and Switzerland
for a while. The Nazis used them to help fund the war. They
also attempted to counterfeit American dollars, but Allied
Forces closed in on them before they could circulate any
forged greenbacks.
The use of counterfeiting as a weapon of war has not
been limited to Nazis, Redcoats, and communists. There are
some reports that the United States counterfeited dongs,
the North Vietnamese currency, during the Vietnam War.
Certainly, U.S. forces dropped many millions of leaflets over
North Vietnam, including parodies and close approximations of North Vietnamese currency, but it is not clear
whether any of those leaflets were passable dongs. Versions
that came close featured detachable propaganda messages
on the sides of the notes, a ploy reminiscent of Upham’s
counterfeiting dodge during the Civil War.
The United States may have lost the shooting war in
Vietnam, but it won the currency war. Today, most
Vietnamese people prefer dollars over dongs, especially for
storing wealth and making large purchases. U.S. currency
remains popular in much of Southeast Asia. Dollars are
banned in North Korea, but the North Korean government
has counterfeited $100 bills for its own use, according to a joint study released in 2006 by the Federal Reserve, the
Secret Service, and the Treasury Department.
“Since 1989, the U.S. Secret Service has led a counterfeit
investigation involving the trafficking and production of
highly deceptive counterfeit notes known as supernotes,”
the report stated. “The U.S. Secret Service has determined
through investigative and forensic analysis that these highly
deceptive counterfeit notes are linked to the Democratic
People’s Republic of Korea (DPRK) and are produced and
distributed with the full consent and control of the North
Korean government.” Internationally, from 1996 through
2005, the public received approximately $22.4 million in
supernotes, and the Secret Service seized approximately
$50 million in supernotes. (The Secret Service declined to
update these numbers or provide further information about
its supernote investigation.)
With $1 trillion of U.S. currency circulating worldwide,
even a large counterfeiting operation may seem more like a
numismatic nuisance than a weapon of war. But Secret
Service agents aggressively pursue all counterfeiters,
whether they produce highly deceptive supernotes or
easily detectable inkjet knockoffs. Lessons learned from
economic history give them little choice.
In essence, the level of counterfeiting is a function of the
value of currency relative to the cost of counterfeiting it,
both the production cost and the risk of getting caught and
punished. That’s why the Treasury Department began
adding significant new security features to U.S. currency in
1996. These security enhancements have driven up the production cost of counterfeiting, and stronger enforcement by
the Secret Service has increased the risk of getting caught.
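The trade-off can be written as a single inequality (an illustrative sketch of ours, not taken from the joint study): passing a forged note pays only when

$$V \;>\; c + p\,F,$$

where $V$ is the face value realized per note, $c$ the cost of producing it, $p$ the probability of getting caught, and $F$ the cost of the punishment. Security features raise $c$, while enforcement raises $p$ and tougher sentences raise $F$.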
These dual deterrents have limited the counterfeiting
of U.S. currency to low levels. The joint study by the
Federal Reserve, Secret Service, and Treasury Department
concluded that less than 0.01 percent of Federal Reserve
Notes in circulation worldwide were counterfeit in 2005.
That’s not as stunning as the government estimate from
1911, but it does indicate that U.S. currency stands strong against all enemies foreign and domestic — for now.
RF

READINGS
Cooley, John K. Currency Wars: How Forged Money Is the New Weapon
of Mass Destruction. New York: Skyhorse Publishing, 2008.
Johnson, David R. Illegal Tender: Counterfeiting and the Secret Service
in Nineteenth-Century America. Washington: Smithsonian
Institution Press, 1995.
Malkin, Lawrence. Krueger’s Men: The Secret Nazi Counterfeit
Plot and the Prisoners of Block 19. New York: Little, Brown
and Company, 2006.

Mihm, Stephen. A Nation of Counterfeiters: Capitalists, Con Men,
and the Making of the United States. Cambridge, Mass.: Harvard
University Press, 2007.
Scott, Kenneth. Counterfeiting in Colonial America. New York:
Oxford University Press, 1957.
“The Use and Counterfeiting of United States Currency.”
Final report to the Congress by the Secretary of the Treasury in
consultation with the Advanced Counterfeit Deterrence Steering
Committee, September 2006.


BOOK REVIEW
Economic Thinking in an Age of Shared Prosperity
GRAND PURSUIT: THE STORY OF ECONOMIC GENIUS
BY SYLVIA NASAR
NEW YORK: SIMON & SCHUSTER, 2011, 558 PAGES

REVIEWED BY THOMAS M. HUMPHREY

A distinctive feature of the modern capitalist economy is its capacity to deliver sustainable, ever-rising living standards to all social classes, not just
to a fortunate few. How does it do it, especially in the face
of occasional panics, bubbles, booms, busts, inflations,
deflations, wars, and other shocks that threaten to derail
shared rising prosperity? What are the mechanisms
involved? Can they be improved by policy intervention?
Has the process any limits?
The history of economic thought is replete with
attempts to answer these questions. First came the pessimists Thomas Malthus, David Ricardo, James Mill, and his
son John Stuart Mill who, on grounds that for millennia
wages had flatlined at near-starvation levels, denied that
universally shared progress was possible. The problem was
seen to be labor’s prolific reproductive capacity, which
condemned the mass of humanity to bare subsistence living.
An “iron law of wages” dictated that temporary wage rises
above subsistence equilibrium would trigger the very population growth that eliminates the wage discrepancy.
Similarly, transitory wage declines below subsistence equilibrium produce starvation and population shrinkage until
wages return to their subsistence equilibrium along a
perfectly elastic long-run labor supply curve.
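In modern textbook notation (a stylization of ours, not the classicals' own), let $w_s$ denote the subsistence wage and suppose population $N$ grows whenever wages exceed it, while wages fall as population rises:

$$\dot{N} = \alpha\,(w - w_s)\,N, \qquad w = f(N), \quad f'(N) < 0, \quad \alpha > 0.$$

Any wage above $w_s$ then breeds exactly the population growth that drives it back down, which is why the long-run labor supply curve comes out perfectly elastic at $w_s$.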
Karl Marx and Friedrich Engels
accepted the iron law of wages, albeit
without its Malthusian trappings.
Unwilling to blame poverty on labor’s
inability to prudently keep its own
numbers in check rather than upon its
exploitation by capitalist managers,
they claimed that capitalists, by threatening to replace employed hands with
idle ones drawn from “the reserve army
of the unemployed” huddled at factory
gates, could force labor to accept subsistence wages while appropriating all
surplus value produced by labor for
themselves. To Marx and Engels, capitalism creates great wealth, but only for
the capitalist 1 percent who seize it
from its rightful owners.
This picture changed after the
1840s and ’50s when rises in the British workingman’s living standards signaled the demise of the
iron law and forced economists to recognize and explain the
phenomenon. Britain’s Alfred Marshall, popularizer of the
microeconomic demand and supply curves still used today,
was among the first to do so. He argued that competition
among firms, together with their need to match their rivals’
cost cuts to survive, incessantly drives them to improve
productivity and to bid for now more productive and
efficient workers. Such bidding raises real wages, allowing
labor to share with management and capital in the productivity gains.
Marshall interpreted productivity gains as the relentless accumulation over time of innumerable small, continuous improvements to final products and production
processes. Joseph Schumpeter, who never saw an economy
that couldn’t be energized through unregulated credit-financed entrepreneurship, saw productivity gains as
emanating from radical, dramatic, transformative, discontinuous innovations that precipitate business cycles and
destroy old technologies, firms, and markets even as they
create new ones. Schumpeter’s outcome, however, was much
the same as Marshall’s, namely an ever growing, ever more
affordable volume of goods whose steadily falling prices
enable all income classes, particularly the poor, to share in
their consumption.
Marshall and Schumpeter highlighted innovation and
technological advance. Other economists, notably Irving
Fisher, itemized additional necessary conditions — monetary and price level stability, absence of trade barriers, an
economic climate conducive to entrepreneurship (recognized also by Schumpeter) — required
to ensure that the capitalist machine
yielded perpetual, universally shared
progress.
With these additional ingredients
incorporated into it, the augmented
Marshall-Schumpeter model prevailed
until the interwar period. Then came
the destruction, mass unemployment,
poverty, and destitution wrought by
two world wars and the Great
Depression. Capitalism came under
fire, and confidence in the validity and
relevance of its explanatory model
waned.
Here again economists came to the
fore. They devised powerful new
theories to diagnose the economic
devastation and to prescribe policies to
remedy it so that capitalism could be restored to its former prowess. Such novelties as Fisher’s compensated dollar plan to stabilize the general price level by adjusting the gold content of the dollar, his monetarist theory of the business cycle as due to unanticipated fluctuations in the rate of change of the price level, his famous debt-deflation theory of great depressions in which attempts to retire nominal debt result in money stock destruction and price level declines that perversely raise real debt burdens, and his 100 percent reserves proposal all date from this era. So do John Maynard Keynes’ ideas of the superiority of domestic price level stability over foreign exchange rate stability, of the liquidity trap, of investment and government spending multipliers, and of the efficacy of fiscal stimulus. Finally, there was Schumpeter’s theory of the tax state and his capital levy proposal. All these innovations to economic theorizing were designed to put the derailed capitalist cornucopia back on track. Hard times called for imaginative responses. Economists were glad to oblige. In this way, economic theory contributed to economic progress.

Economic reasoning came to the aid of progress again at the end of World War II. Keynes and others, having studied the disastrous German reparations debacle after World War I, vowed to replace punitive reprisals on the defeated nations with generous recovery assistance. Both theory and experience suggested that such assistance would benefit victor and vanquished alike. The result was the Bretton Woods system, World Bank, International Monetary Fund, and the Marshall Plan, all of which witnessed the launching of the long postwar global expansion of 1946-2006.

Sylvia Nasar, a professor at the Columbia Graduate School of Journalism, former economic correspondent for the New York Times, and author of Nobel laureate John Forbes Nash Jr.’s biography entitled A Beautiful Mind, the only mathematician/economist’s biography ever made into an Academy Award-winning movie, traces the foregoing developments and much more in her beautifully written and intellectually compelling account of the evolution of economic doctrines. (The “grand pursuit” of her title refers to the quest to discover growth’s mainsprings, “economic genius” to those who found them or at least advanced the search.) As a history of economic thought, however, Nasar’s book, though completely captivating, is somewhat idiosyncratic and unconventional. Unlike standard histories, it focuses on just a few key individuals — Marx, Marshall, Schumpeter, Fisher, Keynes, Friedrich Hayek, Joan Robinson — writing during the period circa 1850-1950, but says relatively little of economists practicing before or since. Both Mercantilist and Physiocratic Schools are omitted. And British Classicals such as David Hume, Adam Smith, David Ricardo (except for his iron law of wages), and Henry Thornton get short shrift, as do neoclassical marginalists William Stanley Jevons, Leon Walras, Carl Menger, Knut Wicksell, and business cycle pioneers such as Clement Juglar and Wesley Clair Mitchell. Modern analysts including Robert Lucas, Thomas Sargent, James Tobin, Robert Solow, Franco Modigliani, Don Patinkin, Michael Woodford, and many others receive nary a mention. True, Milton Friedman makes an appearance, but mainly as a young Keynesian in the U.S. Treasury in the early 1940s where he devised income tax withholding at the source in order to facilitate the Treasury’s quick receipt of tax revenues. Curiously, little is said of Friedman in his later role as the leading monetarist critic of Keynesianism, the Federal Reserve, and big government. Likewise, little is said of Hayek’s profound postwar analysis of the price system as a market coordination, discovery, and information assimilating/synthesizing/economizing mechanism, although much is said of his Road to Serfdom critique of statist planning and control. Similarly, Paul Samuelson’s numerous pathbreaking contributions to theory are downplayed in order to highlight his erroneous prediction of the U.S. economy’s lapse back into depression following demobilization at the end of World War II. And the formulators and developers of recent rational expectations, real business cycle, and New Keynesian dynamic stochastic general equilibrium models are totally ignored.

In place of the missing economists, Nasar substitutes such noneconomists as Charles Dickens, the British novelist/journalist obsessed with the Victorian problem of eradicating poverty; Henry Mayhew, a British investigative reporter whose 88-part newspaper series definitively described the condition of London’s poor, circa 1850; and most notably Beatrice Potter Webb, a founder of both the London School of Economics and the Fabian Society. It was Webb who, with husband Sidney, hatched the idea of a tax-financed government social safety net both as a solution to the poverty issue and as a partial corrective of inequality arising from capitalist growth, thus paving the way for Britain’s cradle-to-grave welfare state of the 1940s, ’50s, and ’60s. But perhaps Nasar’s most puzzling selection is British economist Joan Robinson, who, after co-inventing (with E.H. Chamberlin) the theory of imperfect, or monopolistic, competition in the 1930s, later renounced that seminal work to become a sympathizer of the communist regimes of Stalin and Mao in the Soviet Union and China, respectively. It’s hard to see how Robinson fits into Nasar’s theme of the link between economic ideas and rising living standards.

Nasar’s unconventional treatment is noteworthy on two further counts. First, in contending that economic thought contributes to economic progress, she comes perilously close to implying that the former causes the latter, as if mere theorizing about progress makes it so. To this reviewer, the direction of causality is exactly the reverse: Technical advance and entrepreneurial initiative drive material
progress, which then stimulates improved economic theory
to explain and rationalize the process. Wal-Mart, Apple, and
Target, as well as the steel, rail, auto, aircraft, radio, TV, and
computer industries, all emerged as the brainchildren and
products of the efforts of their creators, not because economists anticipated them beforehand. Second, contrary to
standard thought texts, Nasar focuses primarily on the
personal histories — the lives, times, eccentricities, and
experiences — of her protagonists and only secondarily on
their contributions to economic analysis. In sum, she is long
on biographical detail, but relatively short on theory. While
these characteristics might seem to make her book more
suitable to the general reader than to professional economists, such is not the case. Her command of theory is
adequate to her task and sufficient to satisfy economists
while remaining completely accessible to the general reader.
This is especially true of her chapters on Schumpeter,
Fisher, and Keynes — the strongest analytical chapters of
the book.
Nasar’s comparison of Fisher and Keynes highlights the
resilience of their ideas, which continue to resonate in
policy discussions today where concepts like monetarism,
fiscal stimulus, zero interest rate bound, liquidity traps,
multipliers, debt leveraging and deleveraging, debt-deflation
cycles, fixed vs. flexible exchange rate regimes, gold vs. fiat
paper standards, external vs. internal devaluation, sticky
nominal wages, etc., are bandied about with abandon. In the
1920s and early ’30s, Fisher and Keynes were allied in their
monetarism. Both contended that misbehavior of the
money stock causes not just inflation and deflation but also
fluctuations in output and employment. Both believed that
causality runs from money to prices to real activity, with
disturbances to activity resulting from the stickiness of
nominal wage and interest rates in response to price level
changes. Fisher even estimated an empirical relationship
between price changes and unemployment, thus anticipating the famous Phillips curve (although unlike A.W. Phillips,
he traced causation as running from price change to unemployment rather than the reverse). Both Fisher and Keynes
held that money could not be trusted to take care of itself
but needed deliberate management by central banks
through discount window lending and open market operations. And both contended (1) that policy should aim at
stabilizing domestic prices instead of currency exchange
rates, and (2) that fixed exchange rate regimes (including the
gold standard) are inferior to floating rate regimes from a
stabilization standpoint because they deny a nation the
power to govern its own money stock, price level, and
nominal spending independently of other nations. For that
reason, Keynes criticized Britain’s 1924 return to gold at
the fixed prewar parity — a parity that necessitated painful
domestic price deflation (“internal devaluation”) and a
slump in real activity to correct the pound’s overvaluation.
Likewise both he and Fisher applauded President
Roosevelt’s 1933 decision to depart from the gold standard
and to let the dollar depreciate on the foreign exchanges. Both saw their predictions validated when the dollar depreciation, which rendered U.S. goods cheaper in foreign
markets, helped spark the partial recovery of 1933-37.
Fisher and Keynes parted company in the mid-1930s
when persistent mass unemployment seemed impervious to
monetary remedies. Fisher, while continuing to advocate
monetary policy as the only way out, nevertheless discovered excess leverage, or overborrowing, as a new obstacle to
policy’s effectiveness. His debt-deflation theory explained
how overleveraged borrowers, attempting to pay off their
debts with checks drawn on their deposit accounts, would
cause bank money contraction and price level deflation.
Such deflation, by raising the real burden of debts, would
induce further attempts to deleverage, leading to further
monetary contraction and further price deflation and so on
ad infinitum in a self-reinforcing spiral. Monetary policy
would have to reverse the vicious cycle of debt deleveraging
and price deflation before it could make inroads into the
depression. Fisher thought Roosevelt’s policy of reflating
prices to their pre-slump level would do the job.
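The vicious circle compresses into a schematic chain (a stylization of ours, not Fisher's or Nasar's notation): with nominal debt $D$, money stock $M$, and the price level $P$ moving with $M$,

$$\text{repayment} \;\Rightarrow\; M \downarrow \;\Rightarrow\; P \downarrow \;\Rightarrow\; D/P \uparrow \;\Rightarrow\; \text{more repayment},$$

so each round of deleveraging raises the very real burden $D/P$ it was meant to reduce, until policy reflates $P$.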
Keynes took a different route, abandoning monetary
policy for fiscal policy on the grounds that a “liquidity trap”
rendered the former ineffective and the latter effective in
depressions. He argued that with interest rates at near-zero
levels (as they were in the Great Depression) money
becomes a perfect substitute for Treasury bills in asset portfolios. At that point the demand for money becomes
infinitely elastic with respect to the interest rate such that
all newly central-bank-created money is absorbed into idle
hoards rather than into active circulation in the spending
stream. The result is to render monetary policy impotent at
the zero bound and to leave fiscal policy, with its multiplier
effect on income and spending, as the only game in town.
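In compact form (again our notation, not Keynes'): write money demand as $M^d = L(Y, i)$. The trap is the limiting case in which

$$\lim_{i \to 0^{+}} \frac{\partial L}{\partial i} = -\infty,$$

so that at near-zero rates any open-market addition to $M$ is absorbed into idle balances rather than spending, leaving monetary policy powerless while fiscal policy keeps its multiplier.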
Their policy differences notwithstanding, Fisher and
Keynes remained united both in their opposition to
“do-nothing” and austerity measures and in their dedication
to eliminating the depression through activist intervention.
In this connection, Nasar correctly emphasizes that
although Keynes is sometimes accused of being a socialist or
a socialist sympathizer, in actuality he was anything but. He
detested socialism and admired capitalism as the economic
system most conducive to individual liberty, personal initiative, and intellectual and artistic creativity. He sought to save
capitalism by restoring it to its full-employment potential
where those qualities could flourish.
Nasar’s book is full of surprises. We learn, for example,
(1) that Hayek and philosopher Ludwig Wittgenstein were
cousins, (2) that Schumpeter’s pioneering The Theory of
Economic Development was ignored by most economists and
critiqued with extreme hostility by others upon its publication, (3) that libertarian Hayek, the darling of American
conservatives, in the 1950s “despised Republican politicians,
all cars, and practically everything else about life in America,
including the absence of universal health insurance and
government-sponsored pensions,” (4) that Irving Fisher was
perhaps the first U.S. employer to make automatic cost

of living adjustments to the wages of his employees,
(5) that Marx in Das Kapital condemned the squalor of factory workers without ever setting foot in an actual factory,
(6) that philosopher Frank Ramsey wrote at age 19 a criticism of Keynes’ Treatise on Probability “so devastating that
Keynes gave up any notion of a mathematical career,” and
(7) that the Bretton Woods conference was crawling with
Soviet spies, including Treasury economist Harry Dexter
White, FDR adviser Lauchlin Currie, and the University of
Chicago’s Oskar Lange. But perhaps Nasar’s biggest surprise
is the cordial personal and professional relationship she
finds existing between Keynes and Hayek, the two main
rival macroeconomists in the 1920s and ’30s, and bitter foes
on the causes of the trade cycle and mass unemployment and
of the need for stabilization policy. Although both economists ordinarily were extremely critical of each other’s work,
it was Keynes who congratulated Hayek on the excellence of
his Road to Serfdom and who nominated him for membership
in the British Academy. And it was Hayek who wrote to
Keynes’ widow in 1946 that Keynes was “the one great man
I ever knew, and for whom I had unbounded admiration.”
In sum, Nasar’s is a fascinating and accessible work, one
that will reward all readers, economists and noneconomists
alike. True, the book is not perfect: rather, it is a somewhat
awkward amalgam of three smaller books pressed into one.
It is an economic history, largely of England and Vienna, of the period circa 1850-1950. It is a series of scintillating intimate portraits of a too small subset of great economists. And
it is a partial catalog of their theories and policy analyses.
One wishes Nasar had chosen to expand the third book to
include additional great economists and their theories.
And one wishes she had given that expanded third book
pride of place. But she did not choose to do so.
Nasar wrote the bulk of her book before the appearance
of the recent financial crisis and the Great Recession. She
opines that these disturbances neither invalidate her thesis of the long-run persistence of shared prosperity under capitalism nor necessitate revision of her book.
Maybe so, but this reviewer’s preference is that she extend
her coverage to include at least some of the economic and
policy debates sparked by these recent episodes. Given the
need to reassess mainstream macroeconomic thinking in
the light of its failure to predict the crisis, these debates
seem bound to impact the current and future evolution of
economic thought.
RF
Thomas M. Humphrey is a retired long-time economist
with the Richmond Fed’s research department and a
former editor of the Bank’s Economic Quarterly. He
specializes in the history of monetary thought, and
most recently has written on the history of the theory of
the lender of last resort.

UPFRONT
continued from page 5

motorists from using local roads to bypass tolls while allowing local drivers to make some short trips for free. But for
longer trips, North Carolina’s entire I-95 corridor would
become a toll road upon completion of phase one in 2019.
Under this scenario, the study estimates that each of the
three toll zones in the first phase would charge $3.84, while
each of the six toll zones in the second phase would charge
$1.28. So the owner of a passenger car crossing the entire
state would pay $19.20. North Carolina expects to collect all
fees electronically as cars move at full speed through the toll
zones using a transponder system that would be compatible
with E-ZPass and other toll programs along the I-95 corridor. Owners of cars lacking toll transponders would receive
bills in the mail. The study projects that the tolls would raise
nearly $30 billion over 40 years.
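The statewide figure is simply the sum of the per-zone charges:

$$3 \times \$3.84 \;+\; 6 \times \$1.28 \;=\; \$11.52 + \$7.68 \;=\; \$19.20.$$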
Virginia’s plan to charge tolls on I-95 is less ambitious and
detailed than North Carolina’s proposal. But the Old
Dominion expects to place tolls on I-95 that would raise
$250 million in the first five years and more than $50 million
per year after that. The tolls would help fund comprehensive
improvements outlined in Virginia’s “I-95 Corridor Vision
Plan.” The Virginia Department of Transportation estimates
that it would cost $12.1 billion to fully execute the plan along
its 179-mile stretch of I-95, so additional funding would have
to come from other sources.

Critics of toll roads claim they discriminate against poor
people. But tolls connect the cost of highways to the people
who use them, says Brian Taylor, director of the Institute of
Transportation Studies at the University of California, Los
Angeles. Federal and state fuel taxes made that connection
in the early days of driving, but increases in fuel efficiency
and reluctance to raise fuel taxes have created huge gaps in
highway funding nationwide.
“A great deal of concern has been raised, some of it
justified, about the equity of returning to tolls, but critics
have been silent about the equity of using sales taxes to fund
highways,” Taylor says. Raising general sales taxes to pay for
highways is “a doubly regressive approach,” while tolls tend
to raise a greater share of funding from wealthier motorists.
Taylor argues that sales taxes are inherently regressive, and
when their proceeds are used to fund highways, they become
doubly regressive because wealthy people tend to use highways more than poor people.
Taylor attributes much of the recent interest in tolls to
advances in technology. “We have eliminated the need for
the traditional toll booth,” he says. “Tolling is much more
practical now.” And the North Carolina and Virginia proposals should be more palatable to local drivers on I-95, he adds,
because many out-of-state motorists travel this Maine-to-Florida throughway.
— KARL RHODES


AROUND THE FED
Death and Taxes
BY CHARLES GERENA

“Aging and Strategic Learning: The Impact of Spousal
Incentives on Financial Literacy.” Joanne W. Hsu, Federal
Reserve Board of Governors Finance and Economics
Discussion Series Working Paper 2011-53, October 2011.

Women tend to live longer than men. When a woman’s
husband dies, she faces the prospect of dealing with
the household’s finances alone. In households where the
wife was already primarily responsible for financial matters,
she is accustomed to this responsibility. In other households, it requires the wife to adjust to a new role under
difficult circumstances.
Joanne Hsu, an economist at the Federal Reserve Board
of Governors, created a model to examine a married
woman’s incentives to increase her financial literacy, incentives shaped in part by her likelihood of widowhood in the future.
“In sum, the model predicts that a woman will acquire
financial knowledge very slowly at the beginning of the marriage and delay larger investments in human capital,” Hsu
explains. “The rate of investing will increase as the expected
time of widowhood approaches. After her husband dies, she
takes charge of the finances and accrues payoffs to her financial knowledge.”
Using data from a national survey of households and
other sources, Hsu ran the model and found that wives did
increase their financial literacy in various ways as their
husbands aged. As the time to potential widowhood grew
nearer, women accelerated their literacy efforts, though
the expected length of widowhood was not a statistically
significant incentive.


“Do Borrower Rights Improve Borrower Outcomes?
Evidence From the Foreclosure Process.” Kristopher
Gerardi, Lauren Lambie-Hanson, and Paul S. Willen, Federal
Reserve Bank of Boston Public Policy Discussion Paper 11-9,
December 2011 (also published as Federal Reserve Bank of
Atlanta Working Paper 2011-16, November 2011).

It’s commonly believed that the best way to stem the tide
of foreclosures is to strengthen protections for homeowners in default. Kristopher Gerardi at the Atlanta Fed,
and Lauren Lambie-Hanson and Paul Willen at the Boston
Fed decided to test this intuition by evaluating two types of
borrower protections. In both cases, foreclosures were
delayed but not prevented.
When someone defaults on a mortgage, the lender
usually has two choices — petition a court to foreclose on
the house and auction the property, or carry out the foreclosure process itself if the borrower agreed to give the lender
that right, known as the “power of sale.” In 20 states, only judicial foreclosures are permitted. Gerardi, Lambie-Hanson, and Willen found that the foreclosure process was
much longer in these states.
“A year after a borrower enters serious default, which we
define as becoming 90 days delinquent, lenders had auctioned off only 14 percent of properties in judicial states
compared to 35 percent in power-of-sale states,” noted the
paper’s authors.
This delay might seem like a good outcome because it
gives borrowers time to fix things. In fact, borrowers in
states with judicial foreclosures were no more likely to
become current on the mortgage or pay it off. They stayed in
their houses longer, but the unhappy ending still happened.
Gerardi, Lambie-Hanson, and Willen also examined a
Massachusetts law passed in November 2007 that suspends
foreclosure proceedings for 90 days. They compared
mortgage outcomes in Massachusetts before and after the
law’s implementation with outcomes in three neighboring
states with no major changes in their foreclosure regulations. Here, too, the borrower protection resulted in no
significant change in modification rates.
The authors surmise that the 90-day waiting period
wasn’t enough time for borrowers in default to solve their
problems. Massachusetts lawmakers may have come to the
same conclusion — they extended the waiting period to
150 days in August 2010. There is insufficient evidence to
evaluate the results of that change.
“How the U.S. Tax System Stacks Up Against Other G-7
Economies.” Anthony Landry, Federal Reserve Bank of Dallas
Economic Letter, vol. 6, no. 12, November 2011.

Anthony Landry, a senior research economist at the
Dallas Fed, combed through aggregate data from the
Organization for Economic Cooperation and Development
to see how U.S. tax policy compares with that of the other
Group of Seven industrialized nations: Canada, France,
Germany, Italy, Japan, and the United Kingdom.
As in other G-7 countries, income taxes account for the
bulk of the government’s revenue, mostly levies on workers’
paychecks rather than taxes on capital or corporate income.
Value-added and excise taxes on goods and services account
for a smaller percentage of revenue here, partly due to the
fact that the United States has the lowest consumption sales
tax rate among the G-7 nations.
Together with the lowest labor income taxes among the
G-7, this arguably puts the United States in a competitive
position globally to attract skilled workers. On the other
hand, the United States had the second-highest corporate tax
rate among G-7 countries (second only to Japan).
RF


EUROZONE

continued from page 20

An Uncertain Path
The degree to which fiscal consolidation will occur depends,
like much of the EU’s historical development, almost
entirely on political will. Countries whose governments
would provide stability to a centralized tax and spending
system, such as Germany, have little incentive to sign on if
indebted nations refuse longer-term fiscal reform within
their own borders. The new fiscal compact notwithstanding,
that has been difficult to achieve to everybody’s satisfaction.
Economists and European leaders are far from agreed on
which parties should make the greater concessions. Public
opinion may be another impediment. Europeans mainly
identify with their home countries rather than Europe.
“It matters because where you have your self-identity to a
large extent indicates in the name of what you’re willing to
be taxed,” Kirkegaard says.
The underlying problem of the eurozone’s structure
remains: The euro conjoins fundamentally different
economies. “Countries like Greece and Portugal have a serious competitiveness problem,” says Rose at UC Berkeley.
They are unable to produce as cheaply as the European core,
and unable to compensate to boost their growth and exports
by devaluing their currencies. That leaves only two options:
adjustment through higher unemployment and lower real
wages, which several countries are currently experiencing —
nearly a quarter of Spaniards are unemployed, the highest
rate in the eurozone — or structural reform in labor and
product markets to cheapen production, which is not an
overnight process. Until structural reform happens, their
lack of competitiveness leads to persistent capital outflows,
stagnating real wages, and worsening fiscal positions —
“exactly what you’d imagine coming out of the optimum currency criteria” when not followed, Rose says.
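The bind can be stated in one line (standard textbook notation, ours rather than Rose's): a member country's price competitiveness is its real exchange rate

$$q = \frac{e\,P^{*}}{P},$$

where $e$ is the nominal exchange rate, $P^{*}$ foreign prices, and $P$ home prices. Euro membership fixes $e$ permanently, so a country whose $P$ is too high can regain competitiveness (raise $q$) only by pushing down domestic prices and wages, the lower-real-wage adjustment described above.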
“That’s one of the main reasons that the problems have
proven so time consuming to solve for the euro area,”
Kirkegaard says. “Politically, it’s not just about writing a big
check and bailing out Greece. It’s about correcting some of
these design mistakes.”
RF

READINGS
Eichengreen, Barry, and Tamim Bayoumi. “Shocking Aspects of
European Monetary Unification.” In B. Eichengreen (ed.),
European Monetary Unification: Theory, Practice and Analysis.
Cambridge, Mass: MIT Press, 1997.
Frankel, Jeffrey A., and Andrew K. Rose. “The Endogeneity of the
Optimum Currency Area Criteria.” Economic Journal, July 1998,
vol. 108, no. 449, pp. 1009-1025.

Krugman, Paul, and Anthony Venables. “Integration and the
Competitiveness of Peripheral Industry.” In Christopher Bliss and
Jorge Braga de Macedo (eds.), Unity with Diversity in the European
Economy: The Community’s Southern Frontier. Cambridge and
New York: Cambridge University Press, 1990.
Mundell, Robert A. “A Theory of Optimum Currency Areas.”
American Economic Review, September 1961, vol. 51, no. 4,
pp. 657-665.


DISTRICT DIGEST

Economic Trends Across the Region

The Great Recession and State Unemployment
Insurance Funds
BY RICK KAGLIC

Roughly 8.8 million nonfarm payroll jobs were lost
nationwide during the labor market downturn of
the Great Recession, and fewer than 3.5 million
have been generated since the recovery got under way in
mid-2009. The official unemployment rate in the United
States has come down a bit from recession highs, yet
remains well above the peaks established following the
recessions in 1991 and 2001. Perhaps the most striking statistic in labor market data, however, is that the percentage of officially unemployed workers who have been out
of work for 27 weeks or more has been stuck in a range
between 40 percent and 45 percent (see chart below). This
is sharply higher than the previous peak established during
the deep double-dip recessions of the early 1980s, when
the share of long-term unemployed hit roughly 26 percent.
The persistence of long-term unemployment increased
the average length of time that workers were collecting
benefits and stressed states’ regular unemployment reserves,
in many cases exhausting them entirely. Yet even if a state’s
fund becomes insolvent, it is still statutorily obligated to
continue paying benefits to qualifying unemployed workers.
To do so, many states had to borrow money from the
federal government simply to meet their regular benefits
obligations. (This article sets aside the issue of extended
benefits since most were paid for by the federal government
during much of the period being discussed here.)
The recent downturn has had major effects on the unemployment insurance programs in Fifth District states,
especially those hardest hit, and has required states to take
various measures to meet their unemployment insurance
promises. Those measures, in turn, are likely to have effects
on businesses, workers, states’ budgets, and possibly even
the program itself.

[Chart: Long-Term Unemployment in the U.S. — the unemployment rate (left axis, percent) and the long-term share of unemployment (right axis, percent), 1960-2010. SOURCE: U.S. Department of Labor, BLS]

The Unemployment Experience in the Fifth District
The unemployment insurance program is a joint federal-state initiative that began back in 1935 to ease the burdens on workers following the Great Depression. When workers become unemployed due to circumstances beyond their control, they may become eligible to receive unemployment insurance benefits. The state pays these claims from an unemployment insurance trust fund derived from taxes on employers. Benefit levels, as well as the tax rates and the portion of wages that are taxable, vary considerably across states. The revenues from the state’s unemployment insurance taxes are held by the federal government in individual accounts for each state.

[Chart: Payroll Employment (three-month moving average), index Dec. 2007 = 1.00, for DC, MD, NC, SC, VA, and WV, 2000-2011. SOURCE: U.S. Department of Labor, BLS]

The adjustments that Fifth District states have had to
make, or will have to make, to their programs as a result of
the downturn depend primarily on two factors: the depth
and longevity of the state’s labor market contraction and
how well positioned its trust fund was heading into the
downturn. With regard to the first factor, the Fifth District
on average lost fewer jobs than the nation as a whole, but
there was considerable variation across the region. States
with stronger ties to the federal government and military
(Maryland and Virginia) and the District of Columbia fared
better than those tied to manufacturing and construction
industries (North Carolina and South Carolina). In the
Carolinas, the recession resulted in combined job losses
amounting to nearly 500,000 from peak to trough, or 8.1
percent of total payrolls. (The 8.8 million lost nationwide
represent 6.3 percent of the national total.) In the rest of the
District, job losses totaled 4.5 percent (see chart).
Beyond the magnitude of the job contraction in the
District, this cycle is notable for its prolonged and disappointing recovery. Employment growth during the most recent expansion pales in comparison to recoveries from prior deep recessions, contributing to longer spells of unemployment. Outside of the District of Columbia, where the recession was comparatively short, employment in each of the states remains below prerecession levels, with the states having varying degrees of success recapturing the jobs that were lost. On the far ends of the spectrum, West Virginia’s economy has regenerated more than 95 percent of its lost jobs, while North Carolina has regained less than a third. This jobs gap has left unemployment, and unemployment benefits payments, elevated in much of the Fifth District.

The second factor, how well positioned the state’s unemployment trust fund was to weather the sudden surge in unemployment insurance claims, was more within the control of policymakers. State governments have to perform a careful balancing act with their unemployment insurance funds. On the one hand, policymakers want to have enough in reserves to meet their obligations and mitigate the shock of rising unemployment to the economy during a recession. On the other hand, they want to minimize the tax costs associated with hiring a new worker so as not to stifle labor demand during expansions.

One measure that can be used to gauge a state’s preparedness is its Average High Cost (AHC) multiple. A useful way to think about the AHC multiple is the length of time, measured in years, that it would take for the state’s trust fund to run out of reserves if a significant recession were to occur. For the calculation of the AHC multiple, “significant” is the average of the three highest insurance payout years in the last 20 years. A multiple of 1.00 suggests that the state would have enough money in its trust fund to pay those benefits for one year if a severe recession were to hit. While there is no federal statutory definition of “adequately funded,” the U.S. Department of Labor suggests that states should have a multiple of at least 1.00 heading into a recession to be considered minimally solvent.
Pressure on the Trust Funds
occur. For the calculation of the AHC multiple, “significant”
To show how pressures began to materialize when the recesis the average of the three highest insurance payout years in
sion started, we can look at inflows and outflows for each
the last 20 years. A multiple of 1.00 suggests that the state
state’s unemployment trust fund. The chart below shows the
would have enough money in its trust fund to pay those
ratio of revenues collected to benefits paid by jurisdiction
benefits for one year if a severe recession were to hit.
for the period 2006 to 2011. The revenues component refers
While there is no federal statutory definition of “adequately
only to those payroll taxes collected from employers for the
funded,” the U.S. Department of Labor suggests that states
purpose of funding the state’s unemployment insurance
should have a multiple of at least 1.00 heading into a recession to be considered minimally solvent.
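In formula terms, the calculation described above can be written as follows; this is a stylized sketch of the concept, not the Labor Department’s full methodology, and the dollar figures are hypothetical:

$$\text{AHC multiple} = \frac{\text{current trust fund balance}}{\text{average of the three highest annual benefit payouts in the past 20 years}}$$

A state holding $500 million in reserves whose three costliest payout years averaged $400 million in benefits would thus have a multiple of 1.25, enough, by this yardstick, to cover roughly 15 months of severe-recession claims.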
So how adequately funded were the Fifth District’s unemployment trust funds heading into the downturn? Here again, there is quite a bit of variation across the region. Prerecession multiples ranged from a high of 1.11 in the District of Columbia to lows of 0.23 and 0.26 in North Carolina and South Carolina, respectively (see table). Based on the multiples, it appears that the District of Columbia had the only trust fund that was “minimally solvent” according to the Labor Department standard. While Maryland, Virginia, and West Virginia were not adequately prepared to deal with a significant recession, North Carolina and South Carolina appeared even less so.

Of course, prepared is a relative concept. No state was positioned well to deal with the severity of the Great Recession. The years used to calculate states’ AHC multiples generally fell in the era known as the “Great Moderation,” a period characterized by relatively prolonged economic expansions and two short and very shallow recessions in 1991 and 2001. Thus, the last two deep recessions, those in the mid-1970s and early 1980s, were not used to calculate the multiples.

An additional factor influencing the current health of states’ trust funds is how generous their unemployment insurance benefits were. In more normal economic times, states often sought to maximize the benefits paid to furloughed workers to help them through short stretches of unemployment. These benefits help workers to provide for their families during the unemployment spell, while at the same time helping to stabilize overall economic activity by minimizing the shock to aggregate demand. States have a lot of flexibility in determining the level of benefit payments to individuals. Most use a formula that pays half the worker’s wage up to a certain maximum, which may be adjusted for the number of dependents and other factors. To get a sense of relative generosity in each state, the actual average weekly benefit amount (AWBA) as a percent of the average weekly wage (AWW) can be used as a rudimentary proxy, as sketched just after the table below. A higher number suggests a relatively more generous benefits program. Based on this criterion, the programs in West Virginia, North Carolina, and South Carolina were relatively more generous at the start of the recession than that of the District of Columbia.

Preparedness and Generosity (as of 2007 Q4)

        Average High Cost Multiple    AWBA as a % of AWW*
DC                1.11                      22.4
MD                0.78                      33.0
NC                0.23                      38.7
SC                0.26                      35.7
VA                0.70                      33.1
WV                0.45                      39.2

NOTE: *AWBA denotes Average Weekly Benefit Amount; AWW denotes Average Weekly Wages.
SOURCES: U.S. Department of Labor, ETA, OWS, Division of Fiscal and Actuarial Services
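A minimal sketch of the benefit formula and the generosity proxy just described; the worker’s wage and the state maximum here are hypothetical, chosen only to illustrate the arithmetic:

$$\text{weekly benefit} = \min\bigl(0.5 \times \text{weekly wage},\ \text{state maximum}\bigr), \qquad \text{generosity proxy} = \frac{\text{AWBA}}{\text{AWW}}$$

A claimant who had earned $900 per week would qualify for $450 under the half-wage rule, but in a state with a hypothetical $350 maximum the benefit would be capped at $350. Reading the table with the proxy in mind, West Virginia’s 39.2 means the average weekly benefit there replaced about 39 percent of the average weekly wage, versus about 22 percent in the District of Columbia.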

Pressure on the Trust Funds
To show how pressures began to materialize when the recession started, we can look at inflows and outflows for each state’s unemployment trust fund. The chart below shows the ratio of revenues collected to benefits paid by jurisdiction for the period 2006 to 2011. The revenues component refers only to those payroll taxes collected from employers for the purpose of funding the state’s unemployment insurance program. A ratio greater than 1.0 indicates that the state was collecting more in state unemployment taxes than it was paying out in unemployment insurance benefits; less than 1.0 means the state was paying out more than it was taking in.

[Chart: Ratio of Revenue Collected to Benefits Paid, by jurisdiction (DC, MD, NC, SC, VA, WV), 2006-2011. SOURCES: U.S. Department of Labor, ETA, OWS, Division of Fiscal and Actuarial Services]

At the beginning of the period, prior to the recession, inflows were exceeding outflows in each state but South Carolina, and trust fund balances were rising. In 2007, however, initial unemployment claims increased from the prior year in four of the District’s six jurisdictions, and benefits payments increased in five. (Virginia was the lone exception.) With that, the revenue-to-benefits-payments ratios began to fall. Still, only two states, Maryland and South Carolina, experienced trust fund deficits for the year.

It was not until 2008, as the Great Recession got underway in earnest and job losses mounted quickly, that benefits payments exceeded revenues by significant margins in most jurisdictions. The pressure came from both directions: Not only did unemployment claims increase dramatically, but the pool of taxable employees also dwindled. As a result, District-wide benefits payments jumped 41 percent in 2008 while revenue collections dipped 5 percent. Conditions worsened considerably in 2009: Benefits payments in the District doubled from 2008 while revenues continued to fall. Revenue-to-benefits ratios plunged everywhere.

The shortfalls in 2009 were particularly severe in the Carolinas and Virginia, where the states were collecting roughly 35 cents in revenue for each dollar that was being paid out in benefits. The Carolinas had a twofold problem: Far more jobs were lost during the downturn, and benefits were relatively more generous. In addition to the increase in benefits payments, these two states experienced the biggest declines in revenue collections of District states. Virginia’s problems, in contrast, were mostly on the benefits side; benefits payments increased dramatically while collections declined only slightly. The District of Columbia, Maryland, and West Virginia experienced their own difficulties, but to a lesser extent.

Despite rapidly falling trust fund balances, states were still statutorily bound to continue paying benefits to qualified unemployed workers, even if that balance fell to zero. If the state can no longer go to its trust fund to meet its obligations (if it reaches insolvency), the state then has to borrow money from the federal government to continue making those payments. During the course of this downturn, and the continued strains on labor markets in its wake, four states — Maryland, North Carolina, South Carolina, and Virginia — were forced to borrow money from the federal government. South Carolina’s fund was the first to reach insolvency in late 2008, but the others were not far behind.

Unfortunately, those strains have continued. As of Sept. 30, 2011, three of the four had outstanding balances with the federal government. North Carolina had the largest balance at roughly $2.5 billion (which ranked as the nation’s fifth highest). South Carolina had a balance of more than $850 million, and Virginia owed about $211 million. Maryland had borrowed roughly $90 million in early 2010 to meet its obligations but was able to pay the balance off by year-end.

Thus, while the District of Columbia, Maryland, and West Virginia have put their respective trust funds on more solid footing, North Carolina, South Carolina, and Virginia remained indebted to the federal government at the end of 2011. In addition to paying interest on that debt, employers in these states have seen their effective Federal Unemployment Tax Act (FUTA) taxes increase by 0.3 percentage points as a penalty for the state having continuous unpaid loan balances for more than two years. That penalty will rise until the state’s debt is paid off.
QUICK FACT: If a state can no longer go to its trust fund to pay unemployment benefits — if the trust fund runs out of money — the state has to borrow money from the federal government to cover the cost of paying the benefits. North Carolina, South Carolina, and Virginia remained indebted to the federal government at the end of 2011.

Closing the Gap
So what is a state to do in order to restore health to its trust fund in times of continued stress? On the expenditure side, states have few practical options. They can cut weekly benefits payments, reduce the maximum duration for which those payments are made, or carry out some combination of the two. None of the above is a politically appealing option, however, especially during a downturn when they all fly in the face of the spirit of the program. The hurdles to reducing benefit levels and duration are even greater because of the severity of the most recent recession. (As part of the American Recovery and Reinvestment Act, the federal government temporarily provided 100 percent of the funding for states’ extended benefits programs, plans usually funded 50 percent by the state, but the law prohibited states that accepted the funding from reducing benefit levels unless existing state law allowed for it.)

At the time of the recession, the vast majority of states (including all of those in the Fifth District) had a maximum benefit period of 26 weeks written into state law. As is the case with reducing benefit levels, cutting the number of weeks of eligibility is an unappealing option during times of high and sustained unemployment. Nationally, few states have done so. In the Fifth District, only South Carolina has opted to take this step. In mid-2011, the state’s legislature voted to reduce the maximum to 20 weeks from 26 weeks.

As a practical matter, with little political appetite to slash benefits when needs are perceived to be the greatest, much of the adjusting is left to be done on the revenue collection side. States often make adjustments to the unemployment taxes they impose on employers based on the relative health of their trust funds. Some are triggered automatically when the trust fund attains a certain level of duress (as was the case in Maryland, Virginia, and West Virginia), while others require further legislative intervention (as in North Carolina and South Carolina).
The two variables through which a state can easily affect
the revenue stream are the tax rate and the taxable wage
base. States have a range of unemployment tax rates that are
assessed to employers based on their past experience with
the unemployment insurance fund (more claims against the
fund equal higher tax rates), as well as a separate rate for new
employers. The average tax rates have increased in each
Fifth District jurisdiction since 2007 (see chart).
States also have the option to increase the taxable wage
base. Because the federal unemployment tax applies to the
first $7,000 of a covered employee’s wages, all states have a
minimum tax base of at least $7,000. Individual states are
free to set the base as they see fit, however. In the Fifth
District, taxable wage bases range from a low of $8,000 in
Virginia to a high of $19,700 in North Carolina. Since 2009,
three states — North Carolina, South Carolina, and West
Virginia — have raised the taxable wage base to help shore
up their trust funds. Of those, North Carolina’s increase was
the smallest, as its base increased by just $400.
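To see how the rate and the base combine into an employer’s bill, consider a hypothetical 2.0 percent state unemployment tax rate (the rate is invented for illustration; the wage bases are the Fifth District figures cited above) applied to a worker earning more than the base:

$$0.020 \times \$8{,}000 = \$160 \text{ per employee in Virginia}, \qquad 0.020 \times \$19{,}700 = \$394 \text{ per employee in North Carolina}$$

Raising either the rate or the base therefore raises the per-employee tax, which is why states lean on both levers when rebuilding their funds.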
Unsurprisingly, the result of the changes in rates and
bases has been higher taxes on employers. The average tax
rate expressed as a percent of total wages, which reflects
changes in both the tax rates and the taxable base, has
increased in all Fifth District states since the trust funds
came under severe pressure. The most significant increase
has taken place in Maryland, where the average tax rate
more than doubled from 0.36 percent in 2008 to 0.94 percent in 2011. The least significant increase took place in
North Carolina, where the average tax rate edged up from
0.79 percent to 0.87 percent. It is perhaps surprising that
North Carolina’s adjustments on both the tax rate and tax
base are comparatively lower, considering that its trust fund
woes are the most challenging of District states.
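The measure used in this comparison can be written as a simple ratio; the worker’s earnings in the example are hypothetical:

$$\text{average tax rate (percent of total wages)} = \frac{\text{state UI taxes collected}}{\text{total covered wages}} \times 100$$

A $160 tax on an employee earning $40,000, for instance, works out to 0.4 percent of wages; holding wages fixed, the measure rises whenever the statutory rate or the taxable base rises.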

[Chart: Unemployment Insurance Average Tax Rates, as percent of total wages, 2008 vs. 2011, for the United States, DC, MD, NC, SC, VA, and WV. SOURCES: U.S. Department of Labor, ETA, OWS, Division of Fiscal and Actuarial Services]

Conclusion
As in most of the rest of the nation, unemployment insurance funds in Fifth District states came under extreme pressure during the Great Recession. In its aftermath, states have taken a variety of steps to continue paying unemployment insurance claims and to replenish reserves in their trust funds. With significant political constraints to cutting benefits, states mostly accomplished this by raising taxes on employers.

Virginia has triggers written into state law that automatically adjust tax rates when its unemployment trust fund reaches certain thresholds (up when needed, down when possible). In contrast, rates in North Carolina and South Carolina adjust only on legislative initiative. While legislators in the Carolinas ultimately raised tax rates, it appears the automatic triggers in Virginia’s law helped the state stabilize its revenues sooner and in a more orderly fashion. And Virginia expects that those tax increases, along with some revenue transfers and the increase in Federal Unemployment Tax Act (FUTA) tax dollars, will enable it to pay off its unemployment insurance debt by 2012. That would allow the commonwealth to avoid further interest payments and would reduce the FUTA taxes that employers pay.

South Carolina’s tax increases and stronger job growth are expected to allow the state to pay off its debt by 2015. In contrast, North Carolina’s trust fund remains out of balance and the state’s job growth lags that of other District states. Moreover, its path forward is unclear. Without more decisive policy actions, North Carolina’s trust fund problems will persist for the foreseeable future. This means that employers in the state will be facing higher unemployment insurance taxes, and considerable uncertainty surrounding them, at a time when labor demand is already weak.

Critics of the unemployment insurance program have argued that unemployment insurance benefits are contributing to persistently high unemployment rates by reducing the cost of being unemployed. Meanwhile, proponents argue that unemployment insurance also provides workers with some latitude to find “the right job,” one that makes the best use of their skill sets. As policymakers (federal and state) rethink their programs in the wake of the Great Recession, they are well advised to do so with an eye toward doing more than simply resolving trust fund imbalances. In the end, a well-rounded program that ties unemployment insurance benefits to efficient skills training and job matching programs may help ease labor market friction and speed the healing process.
RF

State Data, Q3:11

                                            DC         MD         NC         SC         VA        WV
Nonfarm Employment (000s)                727.5    2,550.0    3,922.3    1,836.6    3,681.8     756.2
  Q/Q Percent Change                       0.3        0.5       -0.1        0.2        0.1       0.7
  Y/Y Percent Change                       2.4        1.0        1.2        1.0        1.0       1.0

Manufacturing Employment (000s)            1.1      113.6      434.9      218.3      229.2      49.6
  Q/Q Percent Change                       3.2        0.2        0.1        1.5       -0.7       0.3
  Y/Y Percent Change                       0.0       -0.4        0.4        5.3       -0.2       0.5

Professional/Business Services
Employment (000s)                        149.7      397.0      512.5      229.6      660.1      62.7
  Q/Q Percent Change                       0.5        0.6        0.0       -0.3       -0.7       1.1
  Y/Y Percent Change                       1.3        2.2        4.8        4.1        0.9       3.0

Government Employment (000s)             244.3      508.6      695.6      342.3      707.9     152.6
  Q/Q Percent Change                      -1.9        0.6       -0.1        0.7       -0.4       1.8
  Y/Y Percent Change                      -0.1        1.1       -0.5       -1.8        0.7       0.1

Civilian Labor Force (000s)              342.5    3,070.3    4,655.4    2,159.1    4,310.5     798.9
  Q/Q Percent Change                      -0.3        0.1        0.1        0.1        0.5       0.1
  Y/Y Percent Change                      -0.5        0.2        1.0        0.3        1.3      -0.2

Unemployment Rate (%)                     10.5        7.2       10.7       10.4        6.4       8.1
  Q2:11                                   10.2        7.1       10.5       10.4        6.2       7.9
  Q3:10                                   10.1        7.8       10.7       10.9        6.8       8.5

Real Personal Income ($Mil)           39,720.4  261,512.9  306,567.3  138,230.8  326,347.0  54,615.8
  Q/Q Percent Change                       0.3        0.5        0.1       -0.2        0.3       0.0
  Y/Y Percent Change                       2.5        1.8        1.4        1.4        1.4       1.3

Building Permits                           889      3,228      7,844      3,638      6,163       517
  Q/Q Percent Change                      24.0       15.2       -9.6      -12.0       12.9      12.4
  Y/Y Percent Change                     278.3        3.3       -7.6        7.9        1.4      22.2

House Price Index (1980=100)             573.8      416.3      308.2      309.8      400.7     216.8
  Q/Q Percent Change                       0.3        1.8        0.6        0.8        1.1       2.3
  Y/Y Percent Change                       1.0       -4.0       -3.7       -4.4       -2.7      -1.4

Sales of Existing Housing Units (000s)     8.4       70.4      129.2       69.6      106.0      27.2
  Q/Q Percent Change                      -8.7       -7.4       -4.7       -0.6        1.9       7.9
  Y/Y Percent Change                       5.0       10.0       18.3       19.2        3.5       9.7

[Charts: Fifth District trends, First Quarter 2001 - Third Quarter 2011:
Nonfarm Employment, change from prior year (Fifth District vs. United States);
Unemployment Rate (Fifth District vs. United States);
Real Personal Income, change from prior year (Fifth District vs. United States);
Nonfarm Employment, Metropolitan Areas, change from prior year (Charlotte, Baltimore, Washington);
Unemployment Rate, Metropolitan Areas (Charlotte, Baltimore, Washington);
Building Permits, change from prior year (Fifth District vs. United States);
FRB—Richmond Manufacturing Composite Index;
FRB—Richmond Services Revenues Index;
House Prices, change from prior year (Fifth District vs. United States)]

NOTES:
1) FRB-Richmond survey indexes are diffusion indexes representing the percentage of responding firms reporting increase minus the percentage reporting decrease. The manufacturing composite index is a weighted average of the shipments, new orders, and employment indexes.
2) Building permits and house prices are not seasonally adjusted; all other series are seasonally adjusted.

SOURCES:
Real Personal Income: Bureau of Economic Analysis/Haver Analytics.
Unemployment rate: LAUS Program, Bureau of Labor Statistics, U.S. Department of Labor, http://stats.bls.gov.
Employment: CES Survey, Bureau of Labor Statistics, U.S. Department of Labor, http://stats.bls.gov.
Building permits: U.S. Census Bureau, http://www.census.gov.
House prices: Federal Housing Finance Agency, http://www.fhfa.gov.
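As a quick illustration of the diffusion-index arithmetic in note 1, with hypothetical survey shares:

$$\text{index} = \%\ \text{reporting increase} - \%\ \text{reporting decrease}$$

If 40 percent of responding firms report higher shipments, 25 percent report lower shipments, and the rest report no change, the shipments index reads +15.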

Metropolitan Area Data, Q3:11

                           Washington, DC   Baltimore, MD   Hagerstown-Martinsburg, MD-WV
Nonfarm Employment (000s)       2,433.2         1,295.0              97.6
  Q/Q Percent Change               -0.1            -0.3              -1.6
  Y/Y Percent Change                0.9             1.4               0.0
Unemployment Rate (%)               6.0             7.6               9.3
  Q2:11                             5.7             7.4               9.3
  Q3:10                             6.1             7.8              10.1
Building Permits                  4,827           1,285               156
  Q/Q Percent Change               11.0            42.0              10.6
  Y/Y Percent Change               43.4            -1.7               5.4

                           Asheville, NC    Charlotte, NC   Durham, NC
Nonfarm Employment (000s)         168.3           825.2             272.0
  Q/Q Percent Change               -1.0            -0.6              -0.7
  Y/Y Percent Change                0.7             3.0               0.2
Unemployment Rate (%)               8.3            11.1               7.8
  Q2:11                             7.8            10.5               7.2
  Q3:10                             8.4            11.4               7.5
Building Permits                    349           1,868               527
  Q/Q Percent Change               22.5            19.1              -3.5
  Y/Y Percent Change              -36.8            51.3             -16.1

                           Greensboro-High Point, NC   Raleigh, NC   Wilmington, NC
Nonfarm Employment (000s)         339.0           508.5             137.9
  Q/Q Percent Change               -1.3            -0.1              -1.0
  Y/Y Percent Change                0.0             1.6               0.1
Unemployment Rate (%)              10.7             8.5              10.7
  Q2:11                            10.1             7.8               9.7
  Q3:10                            10.8             8.4              10.0
Building Permits                    451           1,419               515
  Q/Q Percent Change                9.5           -33.4              13.9
  Y/Y Percent Change              -15.9             8.7              26.5

                           Winston-Salem, NC   Charleston, SC   Columbia, SC
Nonfarm Employment (000s)         205.7           295.7             344.0
  Q/Q Percent Change                0.5            -0.5              -0.8
  Y/Y Percent Change                1.7             2.8               0.4
Unemployment Rate (%)               9.7             9.3               9.6
  Q2:11                             9.3             8.6               8.7
  Q3:10                             9.9             9.2               9.2
Building Permits                    345             773               715
  Q/Q Percent Change                2.4           -25.0              -8.3
  Y/Y Percent Change               11.7            16.9              -8.6

                           Greenville, SC    Richmond, VA    Roanoke, VA
Nonfarm Employment (000s)         304.3           609.4             155.9
  Q/Q Percent Change                0.1            -0.7              -0.5
  Y/Y Percent Change                2.6             1.2               0.9
Unemployment Rate (%)               9.3             7.0               6.4
  Q2:11                             8.5             6.8               6.4
  Q3:10                             9.5             7.6               7.2
Building Permits                    413             864                98
  Q/Q Percent Change              -16.4            12.9              -3.0
  Y/Y Percent Change               29.9           -16.4             -13.3

                           Virginia Beach-Norfolk, VA   Charleston, WV   Huntington, WV
Nonfarm Employment (000s)         742.3           148.6             112.3
  Q/Q Percent Change               -0.2             0.5              -1.2
  Y/Y Percent Change                0.3            -0.1              -0.4
Unemployment Rate (%)               7.0             7.3               8.3
  Q2:11                             6.8             7.9               8.5
  Q3:10                             7.3             8.4               8.9
Building Permits                  1,409              43                21
  Q/Q Percent Change               19.9            38.7             -27.6
  Y/Y Percent Change               31.9             4.9             133.3

For more information, contact Sonya Ravindranath Waddell at (804) 697-2694 or e-mail Sonya.Waddell@rich.frb.org

OPINION
Reflections on Sarbanes-Oxley 10 Years Later
BY JOHN A. WEINBERG

In response to the 2007-2008 financial crisis, U.S.
lawmakers passed the Dodd-Frank Act (DFA), the
most sweeping financial reform package in decades.
Prior to the DFA, the legislation holding that title was
Sarbanes-Oxley, the 2002 regulatory response to fraudulent accounting practices by several of the nation’s largest
companies. SOX, as it became known, heightened
disclosure and auditing requirements for all publicly traded
firms to enhance transparency for investors. Ten years after
its passage, some reflections on its impact may be relevant
as regulators continue to implement the DFA.
The expectations placed on SOX — which passed
Congress with near unanimity — were extraordinary. One is
that it should have prevented events like the recent financial
crisis. Some of the firms that engaged in excessive risk-taking and ultimately received government support were
not transparent about their true financial conditions. But
the pervasive expectation of government support for
systemically important firms and markets was arguably a
larger catalyst for risk-taking than the more isolated
instances of financial misrepresentation that occurred.
Another claim often made around the time of SOX’s
passage was that it would impose hugely burdensome
compliance costs on firms, especially smaller firms with
modestly staffed compliance departments. Data on direct
compliance costs are sparse, but it is not obvious that they
have been as large as predicted. John Coates of Harvard Law
School suggested just five years after SOX that compliance
costs were on the order of $1 million for every $1 billion of
revenue, or about 0.1 percent of revenues, and that costs
appeared to fall with firm size and over time with learning.
This is evidence that U.S. firms are quite adaptable at
navigating — and perhaps eventually bypassing — new
regulations. Perhaps that adaptability and innovation
may have been better spent on other, potentially more
productive endeavors, but the quantitative impact of such
diversion of effort is hard to gauge.
One way to assess whether the costs of a regulation are
“too large” is to look at how the regulation changes behavior.
Since SOX applies only to public companies, the burden of
compliance costs could be manifested through a decline in
initial public offerings (IPOs). There has been a clear decline
in IPOs in the United States, from averages of 311 annually
from 1980 through 2000 to 102 per year from 2001 through
2009, according to University of Florida economist Jay
Ritter. The decline in IPOs is most prevalent for small firms
(those with less than $50 million in sales), which is what one
might expect if oppressive compliance costs were a primary
catalyst. But there are possible explanations other than
SOX. For example, Ritter and co-authors of a recent
study argue that decreasing profitability of small firms,
rather than compliance costs, has made it increasingly
desirable for those firms to be sold to larger firms rather
than to go public.
There is, however, some evidence that an increasing
number of firms have gone from public to private due to
SOX. Companies that go private often cite SOX as the
reason, and the number of private equity deals has grown
since SOX. Relatively small American firms were more
likely than their European counterparts to sell to private
buyers immediately following SOX, though not thereafter,
suggesting rapid adjustment to the legislation. Still, this may
not always be a bad thing. Going private might indicate that
SOX is working by restricting riskier firms to more sophisticated investor pools. Coates suggests that the increasing
use of private equity could be due to some firms exiting
or avoiding the public market rather than suffering a
loss in share value following increased disclosures. On the
other hand, if firms that go private accept funding on
less advantageous terms than they could have obtained
publicly, that could make them riskier, a potential social cost
of SOX.
In an ideal world, researchers could gain more clarity
on the effects of new regulations by studying the counterfactual — for example, what the world would have looked
like without SOX. That world would almost certainly have
involved more public scrutiny as a natural byproduct of the
accounting scandals. SOX may have prevented some
extreme cases of fraud, but had a few firms committed such
malfeasance — which no doubt would have been made more
difficult by enhanced attention from investors — those
actions might still have imposed fewer costs on the economy
than those created by SOX.
Today there are very large expectations surrounding the
DFA’s ability to solve perceived problems in financial
markets. At the time of its passage, SOX was thought to be
an inscrutable piece of legislation, both in terms of its length
and the degree to which regulators had to interpret the
written statute to implement Congress’ intent. Yet SOX is
orders of magnitude shorter than the DFA, and the expectations placed on the DFA for preventing the next would-be
crisis appear even greater. One lesson from SOX is that the
indirect and even direct effects of large-scale regulations are
not always obvious or expected. Ten years from now, economists will almost certainly be talking about the difficulty of
interpreting the true impact of many aspects of the DFA,
as they are — and may still be — with SOX.
RF
John A. Weinberg is senior vice president and director
of research at the Federal Reserve Bank of Richmond.


NEXT ISSUE
Cover Story
Labor force participation in the United States is at the
lowest rate in decades. While much of the decline is due to
demographic changes, many people have dropped out
because they’re discouraged about the job market. What
does the decline in labor force participation mean for the
U.S. economy — and can it be reversed?

Hope for Honduras?
This relatively poor Central American country plans to develop
a largely autonomous city on 1,000 square kilometers of
uninhabited land. Economist Paul Romer is pushing the idea
that “charter cities” can ignite economic growth by offering
better rules and governance. With Romer’s help, Honduras is
trying to test that theory.

Shale Gas Development
Deposits of natural gas locked in shale may promise
comparatively cheap and plentiful domestic supplies of energy.
But tensions have flared over the environmental costs of
“fracking,” a process used to extract the gas.

Federal Reserve
The Fed affects households primarily
through interest rates, inflation, and unemployment — but there’s no reason to believe
it affects everyone equally. Extremely low
interest rates have helped lower household
interest income by $400 billion since 2008,
while inflationary periods benefit borrowers
at the expense of lenders. Should the Fed
worry about these redistributional effects
when it steers the macroeconomy?

Economic History
North Carolina trucker Malcom McLean
sparked global trade in 1956 when he
shipped his first container load from
Newark, N.J., to Houston. He had a plan:
Boxes could travel, unopened during transit,
and be transferred by crane among ships,
trucks, and railcars. This slashed shipping
costs, which made it possible to ship goods
over long distances cheaply, leading to
today’s era of global trade.

Interview
John List of the University of Chicago on the
use of field experiments in economics

Visit us online:
www.richmondfed.org
• To view each issue’s articles
and Web-exclusive content
• To view related Web links of
additional readings and
references
• To add your name to our
mailing list
• To request an e-mail alert of
our online issue posting


To subscribe or make subscription changes, please email us at research.publications@rich.frb.org or call 800-322-0565.

First Quarter 2011
vol. 15, no. 1

Second Quarter 2011
vol. 15, no. 2

Third Quarter 2011
vol. 15, no. 3

Fourth Quarter 2011
vol. 15, no. 4
Cover Story
American Made:
The manufacturing sector is
stronger than you might
think — but new vulnerabilities
are emerging
Federal Reserve
Checking the Paychecks
Interview
Michael Bordo
Rutgers University

Cover Story
Why Aren’t We Creating More Jobs?
Job growth usually rebounds quickly
after a severe recession, but this
time is different
Federal Reserve
The Dodd-Frank Act and
Insolvency 2.0
Interview
Derek Neal
University of Chicago

Cover Story
Foreign Housing Finance:
How mortgage finance in America
differs from the rest of the
developed world
Federal Reserve
Sifting for SIFIs
Interview
Bruce Yandle
Clemson University and George
Mason University’s Mercatus Center

Cover Story
What Drives Changes in
Economic Thought?
Why economists study what
they do — and how the
crisis might change it
Federal Reserve
Stigma and the Discount Window
Interview
Joel Slemrod
University of Michigan