FALL 2007

• Subprime Mortgage Lending
• The Great Migration
• Interview with Susan Athey

THE FEDERAL RESERVE BANK OF RICHMOND
VOLUME 11, NUMBER 4, FALL 2007

COVER STORY

12 Downtown is Dead. Long Live Downtown: America is busy rebuilding its downtowns. But these are not the downtowns of yesterday.
Downtowns today may not be for everybody. They are a niche product, likely geared to a certain demographic or two, but their broader payoffs are important to the city. Downtowns are really being reinvented rather than restored to their former glory.

FEATURES

18 Armed Against ARMs: Educating low-income borrowers may be an effective — if oft-overlooked — way to help stabilize the mortgage market
The focus on the role of mortgage brokers and Wall Street — and even on regulators in the recent decline of the subprime housing market — is richly deserved. But another player deserves attention: borrowers.

21 Professional Prognosticators: Is forecasting a science or an art?
Models of all stripes can never perfectly predict the future because they are not exact replicas of the actual economy. To get an accurate forecast, you need information that gets closer to the current state of affairs.

24 Runs Make the Bank: The fragile capital structure of banks makes them inevitably prone to runs, and that's a good thing
Economists with ties to the Richmond Fed study how a bank's distinctive asset and liability structure is precisely what allows the bank to provide liquidity at all times; that is, to make funds available to both long-term borrowers and short-term depositors whenever a need arises.

28 Crash: In Virginia, private insurers test vehicles for safety
A nonprofit, private-sector organization performs functions that one might otherwise assume would be done by the government. Insurers who fund the Insurance Institute for Highway Safety see returns on their investments in other ways beyond goodwill.

DEPARTMENTS

1 President's Message/Looking Forward
2 Federal Reserve/Before the Fed
6 Jargon Alert/Signaling
7 Research Spotlight/Central Bank Governors
8 Policy Update/Resale Price Maintenance
9 Around the Fed/Greenspan's Rule
10 Short Takes/News from the District
30 Interview/Susan Athey
36 Economic History/The Great Migration
40 Book Review/Prophet of Innovation
44 District/State Economic Conditions
52 Opinion/When Disclosure is Not Enough

PHOTO: CORBIS IMAGES

Our mission is to provide authoritative information and analysis about the Fifth Federal Reserve District economy and the Federal Reserve System. The Fifth District consists of the District of Columbia, Maryland, North Carolina, South Carolina, Virginia, and most of West Virginia. The material appearing in Region Focus is collected and developed by the Research Department of the Federal Reserve Bank of Richmond.

DIRECTOR OF RESEARCH
John A. Weinberg

EDITOR
Aaron Steelman

SENIOR EDITOR
Doug Campbell

MANAGING EDITOR
Kathy Constant

STAFF WRITERS
Betty Joyce Nash
Vanessa Sumo

EDITORIAL ASSOCIATE
Julia Ralston Forneris

REGIONAL ANALYSTS
Matt Harris
Matthew Martin
Ray Owens

CONTRIBUTORS
Charles Gerena
Borys Grochulski
Thomas M. Humphrey
William Perkins
Ernie Siciliano

DESIGN
Beatley Gravitt Communications

CIRCULATION
Walter Love
Shannell McCall

Published quarterly by the Federal Reserve Bank of Richmond
P.O. Box 27622
Richmond, VA 23261
www.richmondfed.org

Subscriptions and additional copies: Available free of charge by calling the Public Affairs Division at (804) 697-8109.

Reprints: Text may be reprinted with the disclaimer in italics below. Permission from the editor is required before reprinting photos, charts, and tables. Credit Region Focus and send the editor a copy of the publication in which the reprinted material appears.

The views expressed in Region Focus are those of the contributors and not necessarily those of the Federal Reserve Bank of Richmond or the Federal Reserve System.

ISSN 1093-1767


PRESIDENT'S MESSAGE
Looking Forward
In this issue of Region Focus,
we take a look at the
business of economic forecasting. Some call it a science,
others an art. Clearly, forecasting contains elements of both.
And while no single forecast is
always going to be 100 percent
accurate, it’s also clear that
forecasts provide value to those
who read them, from Wall Street
to Main Street.
Forecasting is an input into
our everyday decisionmaking process. Your decision about
whether to buy a car, for example, is based on a personal
forecast that you will have the income to pay for it, and
perhaps on a forecast that the car will maintain value over
the years should you decide to sell it. Such a forecast may be
based on a detailed analysis or on a simple gut feeling.
The Federal Reserve System produces some of the most
detailed economic forecasts in the world. Recently,
Chairman Ben Bernanke announced that some of the
Fed’s forecasts would be released to the public on a more
frequent basis. Instead of semiannual projections with
horizons of two years, now there will be quarterly forecasts
with three-year horizons. Federal Open Market Committee
(FOMC) participants will also add overall (or “headline”)
inflation to their forecasts, which already encompass
changes in real gross domestic product, unemployment, and
core inflation (which excludes prices of food and
energy). Summary narratives will now accompany the
numerical projections, giving a richer account of the
Fed’s outlook.
These are important changes. The Federal Reserve Bank
of Richmond will continue preparing its own economic
forecast and submit it as usual to the Board of Governors.
The addition of the third year to the forecast horizon,
along with the new narrative, will give an indication of
individual members’ preferences for inflation and perspectives on other longer-term trends. This should shed more light
on the diversity of opinion around the FOMC table.
The perspective in, say, San Francisco may be quite
different than in Philadelphia, both geographically and
philosophically.
Increasing the frequency and the depth of Fed communications with the public is part of a broader strategy that
should help improve the effectiveness of monetary policy.
In part, this is because forecasts are crucial to the process
of conducting monetary policy in the first place. Changes
in the target federal funds rate do not affect the economy
immediately; there is a lag. Monetary policy is thus


necessarily forward-looking, aiming to anticipate how policy
actions will mesh with ever-moving economic conditions.
Externally, forecasts can help guide public expectations
about future monetary policy. If the public uses the
forecasts to gain a better understanding of where the Fed
believes the economy is headed, then it is more likely to
respond accordingly. Asset prices in financial markets,
for example, may be more likely to move in directions
favorable to the Fed achieving its objectives of price
stability and sustainable growth.
Meanwhile, a public that is able to compare economic
forecasts with central bank behavior can better discern
patterns in monetary policy. If the pattern is consistent,
policymaking becomes more credible, and inflation
expectations become anchored. This is particularly useful to
the Fed during times of economic shocks. The central bank
can take policy actions in response to shocks — such as a
spike in oil prices, for example — without shaking the
public’s confidence that the long-term inflation objective
remains the same. Moreover, expectations are crucial to the
behavior of inflation, and an informed public can better
learn to form inflation expectations that are consistent
with monetary policy.
A forecast of future economic conditions is just one
piece of information that the Fed shares with the public.
It also conveys its objectives, its current policy stance, and, to
some extent, its decisionmaking process. Together these
messages form the core of the Fed’s overarching strategy for
explaining its policy actions to the public.
Chairman Bernanke called the Fed’s communications
strategy “a work in progress.” Indeed, the past two decades
have witnessed an evolution in Fed communications. It was
only 14 years ago that the Fed first started announcing
policy changes at the time they were made. More recently,
FOMC minutes have been released three weeks after a
meeting, instead of five to seven weeks later. Each step
builds on past advances.
It’s important to keep in mind that the way the Fed
conducts monetary policy is not changing. For now, we are
just trying to explain it better.

JEFFREY M. LACKER
PRESIDENT
FEDERAL RESERVE BANK OF RICHMOND



FEDERAL RESERVE
Before the Fed
BY ERNIE SICILIANO

Between the Civil War and the founding of the Federal Reserve, the U.S. banking system was largely unregulated, with mixed results

The First Bank of the United States
opened in 1791 and operated for 20
years. This check was drawn on the
bank on March 26, 1794.

2

R e g i o n Fo c u s • Fa l l 2 0 0 7

The United States' first two
central banks were short-lived. The First Bank of the
United States, founded in 1791, was a
source of constant political debate,
and its 20-year charter was not
renewed. The Second Bank opened
in 1816, then lost its charter in 1836
under the antagonistic Andrew
Jackson administration. Thus began a
period when U.S. banking was significantly less regulated than today.
For a time, government intervention was limited to setting reserve
requirements. It was easy for any bank
to obtain a state charter, provided it
met the $100,000 minimum capital
requirement, about $2 million in
today’s dollars. Most alien to today’s
customs was that 7,000 state banks
issued their own currency. Yet the system functioned with surprising
efficiency. A financial press listed the
prices of all outstanding currencies, giving full information to the market.
During the 1850s, the number of state-chartered banks grew by 79 percent, and the availability of financial capital enabled strong economic growth in the antebellum era. "Those states that promoted financial development the most, either through liberal chartering, free banking, or broad-based branch banking experienced moderate to high rates of growth," economist Howard Bodenhorn wrote.
Some have labeled this the era of
“free banking.” It lasted until the
early years of the Civil War. It was
not market failure that derailed free
banking, but rather President
Abraham Lincoln’s war debts.


The National Banking Act
The North spent $3.2 billion to win
the Civil War, and Lincoln recognized that existing taxes and tariffs
could not cover the entire cost.
During the war, the federal government printed U.S. notes — paper
money called greenbacks. This fiat
currency was expected to be retired
after the war. But because the political environment favored an expanded
money supply, a limited amount
remained in circulation and can still
be exchanged for cash today.
However, the greenbacks that
remained could not entirely fund the
war so Lincoln instituted the
National Banking Act in 1863. The
act chartered national banks to compete with state banks. Banking at the
time was largely local because the
economy was not fully integrated. A
disproportionate majority of the
national banks were concentrated in
the Northeast, especially in New
York City. Because of the Civil War,
the government neglected to charter
banks in the South.
Not that the South cared.
Southerners generally distrusted the
federal banks as government overreach. Had the South not seceded,
Southern votes in Congress likely
would have prevented the passage of
the National Banking Act. After the
war, national banks in the South continued to lag the North because the
war had gutted Southern infrastructure, and so Northern banks were
viewed as more secure. Federal officials also were biased toward granting
national charters to already existing
banks, thus setting the South at an
even bigger competitive disadvantage.
The national bank notes would finance Lincoln's government because to issue them, banks had to
purchase government bonds. In the
event that a national bank defaulted,


customers could redeem the notes for
up to 90 percent of value at the
Treasury and the government would
cancel the banks’ bonds. Lincoln
hoped that the security that bond-backing
provided would cause people to use
the notes to the exclusion of state
bank notes. Still, state banks, especially the most profitable ones, were
reluctant to leave the status quo.
They feared the prospect of federal
regulation. By mid-1865, 85 percent of
American currency remained in state
bank notes.
The next year, a 10 percent tax was
imposed on state bank notes, which
put them at a severe disadvantage and
made national bank notes the nation’s
primary currency.

Growth of Retail Banking
The predictable consequence of
the 10 percent tax was the death of
state bank notes. The unintended
consequence was innovation in banking services. With the ability to issue
currency gone, state banks had to
invent new financial services to
remain in business.
Checkable deposits, although
around before the creation of the
bank act, grew in popularity after the
tax on bank notes. By 1881, checkable
deposits made up 82 percent of banking receipts. Checking became
especially popular among farmers
who lived in rural areas not widely
served by national banks.
In addition to checking, state
banks drew upon farmers’ need for
credit and issued real estate and commercial loans. Besides their proximity
to most farms, state banks derived a
competitive advantage in the loan
market because they were much less
regulated than national banks. Those
regulations that did exist were regularly flouted. Laws for commercial
lending dictated that loans be short-term, with promise of immediate
payment, but banks regularly made
loans to farms based on mortgages of
cattle. Loans were made for farmers’
long-term fixed investments, as
opposed to helping with moving
short-term sales. National banks had

higher loan limits and were prohibited from making real estate loans.
However, they, too, exploited lax
enforcement. In fact, roughly half of
all national banks were already making real estate loans before the law
was changed to allow them to do so.
As the farmers’ demand for credit
services grew, so did the demand
from wealthy people for banking
services. Speculation exploded at this
time, and banks fueled it by issuing
call loans, which were loans given to
investors to purchase stocks. If an
investor defaulted, banks could seize
his stock portfolio instead. By 1870,
one-third of all loans in New York
were call loans.
Trusts, which first developed
before the Civil War with the chartering of United States Trust Company
in 1853, also expanded, and by 1913
there were more than 1,800 trust
companies. While trusts traditionally
handled only land management for
the wealthy, they expanded their services to include investment banking
and even checking accounts. They
loaned freely and under no government regulation. Pretty soon, trusts
became almost indistinguishable
from state banks.

The Flaw(s) in the System
While the banking system was partly
responsible for the era’s robust
economic growth, it was not perfect.
The failure rate for non-national banks was around
17.6 percent (compared with 6.5 percent for
national banks), but a government
comptroller's review of the failures
between 1865 and 1911 found
that most were due to incompetence.
The comptroller found that only
13 percent of banks failed due to
adverse business conditions while
the rest failed due to corruption
or mismanagement.
If there was a fundamental flaw in
the system it was that banks were
vulnerable to runs, which often led to
wider panics involving other banks.
There were five panics between the
passage of the National Banking Act
and the Panic of 1907. Four panics

resulted in depressions, the lone
exception being the Panic of 1890.
Panics generally followed several
patterns. Sometimes there would be a
well-publicized default at a major
bank, often caused by economic
downturn or a big-name speculator
placing a bad bet on the market.
When the public found out, they lost
confidence in the banks and scrambled to retrieve their savings. At other
times, farmers would rush to get cash
from banks to move crops in the fall.
Banks had loaned more than they had
on reserve so they could not meet
all of their requirements. Because
there was no central bank or banking
system, these panics were confined
to specific regions and there was
little contagion.
There were many reasons why
banks struggled to deal with panics.
For example, the large concentration
of banks in New York, and the banks’
loose lending of call loans to risk-prone speculators, made defaults
more likely. Some economists have
argued that the banks adhered too
religiously to the reserve requirement
(usually around 25 percent) and were
too quick to stop making payments.
They argued that had banks dipped
below the reserve requirement to pay,
confidence would never have slipped
and panics would have stopped. On
the other side of the debate, economists, including Milton Friedman,
argued that banks’ closings were
necessary and actually reduced panics.
Friedman reasoned that had banks stayed
open and then failed, it would have
forced other banks to close, thereby
lengthening panics. Today, economists
still debate the extent to which the
banks’ behavior exacerbated panics.
What economists agree on is the
primary cause of panics: an inflexible
currency. (See “Runs Make the Bank”
on page 24, where we present an
economist’s story about panics as
deriving from the funding of illiquid
assets with liquid liabilities.) Unable
to expand currency to meet demand,
banks were handcuffed to a limited
amount of currency. Increasing the
number of national bank notes in



circulation was too costly because
for banks to get notes, they had to
buy government bonds. Greenbacks
— U.S. notes printed during the Civil
War that passed as legal tender —
were set at a fixed amount by the
government. The government also
had legislated steep reserve requirements of about 25 percent on
deposits, further constricting the
money supply.

The Clearinghouse Solution
After the Panic of 1857, banks devised
a market-oriented solution to address
panics. They established clearinghouses, or bank-like organizations,
whose purpose in part was to serve as
central places where banks could
hold reserves and borrow and lend to
each other. “The existence of the
clearinghouse suggests that private
agents can creatively respond to
market failure,” economist Gary
Gorton has written. “In fact, it
is almost literally true that the
Federal Reserve System was simply
the nationalization of the private
clearinghouse system.” When banks
faced high currency demand, they
would withdraw their reserves from
clearinghouses. But because clearinghouses were wary of risking collapse
by giving out their reserves, they
issued certificates worth 75 percent
of the value of the amount they held
for the banks. In exchange for the
certificates, banks would pay back
the value of the certificates plus
6 percent interest.
The clearinghouse certificates
began in New York City in 1860.
After 1860, other cities’ clearinghouses began issuing notes. By 1907,
the practice became so widespread
that A. Piatt Andrew, an assistant
secretary of the Treasury from 1910
to 1912 and assistant to the National
Monetary Commission, estimated
(with some questions over his accuracy) that among cities with more than
25,000 people, clearinghouses issued
a cumulative total of $330 million in
clearinghouse notes.
Over time, the practice evolved. In
1873, clearinghouses began pooling, or

putting all banks’ assets and liabilities
on a single balance sheet. The practice
added confidence to the banking
system because, by lumping all banks
together, it made failing banks seem
more stable. Also, clearinghouse
checks were issued. Although not
backed by anything, these checks
served as currency until they were
withdrawn, though they had to be
cashed at an official clearinghouse.
Clearinghouses later began issuing
loan certificates in substantially
smaller denominations. Originally,
certificates had been in $5,000 and
$10,000 denominations. However, as
the certificates began to be used in
the buying and selling of regular
goods, the clearinghouse system in
Atlanta, for example, began issuing
$10 certificates. Pretty soon, it was
even possible to get 25 cent certificates. Such small denominations
were necessary because when sellers
made change, the currency detracted
from bank reserves, so naturally
clearinghouses wanted sellers to use
certificates instead.
Although it did not completely prevent economy-wide panics,
the clearinghouse system greatly
improved the banks’ ability to meet
currency demands. Well-timed issues
of clearinghouse certificates are
credited with preventing large-scale
spreading of the panics in 1884
and 1890. Interestingly enough,
the default rate on clearinghouse
notes was low. In 1890, Spring
Garden National Bank defaulted
on $170,000 worth of clearinghouse loans from the Philadelphia
Clearinghouse Association, which
represented the only recorded default
of the era.
“The most extraordinary fact
associated with the several clearinghouse episodes between 1857 and
1907,” wrote economist Richard
Timberlake, “is that the losses from
all the various note issues, spurious
and otherwise, were negligible!”
However, the clearinghouses were
not without problems. At the time, it
was illegal for state banks to issue
private money, which included

certificates. Even if they were legal,
the certificates would be subject to
the 10 percent tax on state bank
notes. However, as with so many other
regulations, banking officials overlooked the obvious illegality of
clearinghouse notes because of
the clear benefits they provided to
the economy.
Clearinghouses also posed moral
hazard and conflict-of-interest
problems. With clearinghouse certificates largely available, banks might
be prone to profligate lending and
ignore their reserve requirements,
knowing that clearinghouses might
bail them out. In fact, as Gorton
notes, “In general, banks were not
allowed to fail during the period of
suspension of convertibility, but
were expelled from clearinghouse
membership after the period of
suspension had ended.” In addition
to the delayed suspensions, the
clearinghouses set reserve requirements and conducted their own
audits. The efforts could not
completely prevent loose lending, a
moral hazard problem that still exists
today with the Federal Reserve.

Panic of 1907
The biggest weakness of the clearinghouse system was that it did not do
anything to make more currency
available when the economy needed
it. For banks to acquire national bank
notes, they needed to buy bonds.
However, in 1900, the United States
returned to the gold standard, meaning the supply of government bonds
was tied to the supply of gold. The
government couldn’t buy bonds if it
didn’t have the gold to back it.
At first, the system worked well,
as the return to the gold standard
coincided with new gold discoveries.
The new gold meant that the government had money to put into the
economy, and in 1904 and 1907,
Treasury Secretary Lyman Gage used
the excess gold to inject money into
the economy by buying up bonds. He
timed the purchases so that the
money entered the economy around
the time farmers began demanding


currency to move crops to market.
However, the country’s banks still
remained handcuffed. Gage himself
advocated a “large central bank with
branches,” a harbinger of the Fed,
and the Panic of 1907 highlighted
the ill effects of an (essentially)
fixed currency.
The panic, easily the most damaging up to this time, began when F. Augustus
Heinze, famed speculator and president of Mercantile National Bank,
lost a huge bet on United Copper Co.
In less than 24 hours, he lost $50
million as the stock plunged from $62
to $15.
At first, the clearinghouse system
held up and Heinze’s banks were able
to clean up their balance sheets and
remain in business. However, some of
Heinze’s associates were not so lucky.
When it was reported that Heinze
was in financial trouble, the public
suspected his friends were in similar
straits and promptly rushed those
banks. Knickerbocker Trust, whose
president was an associate of Heinze,
paid more than $8 million in just
three hours as part of the run.
Because Knickerbocker was a trust
and not literally a bank, it could not
be bailed out by the clearinghouse.
The collapse of Knickerbocker
inspired a run on other banks. The
panic was quelled by the bailouts of
J.P. Morgan, who also enlisted the
support of other financiers like John
Rockefeller and Secretary of the
Treasury George Cortelyou. To help
stem the run on Knickerbocker
Trust, Cortelyou pumped $23 million
of taxpayer money into New York
national banks. Meanwhile, Morgan
managed to raise $25 million from

various financiers in 15 minutes after
a run on the Trust Company of
America. He would later finance
another $25 million to help the
brokerage firm Moore and Schley.
The bailouts re-instilled Americans’
confidence in the banking system,
and the panic itself lasted about a
month and a half.

Federal Reserve Act
Although the panic was brief, it had
lasting effects on legislators and they
decided to reform the banking system. The first attempt was the
Aldrich-Vreeland Act in 1908, which
deviated little from the clearinghouse
system. The act authorized the
Treasury Department to print out
a new series of notes that would be
lent to banks, like clearinghouse certificates, during times of crisis. The
only difference was that, unlike clearinghouse notes, these new notes were
subject to taxes. The new system
successfully averted its first panic in
1914, when, at the start of World War
I, Britain and Germany left the
gold standard, which caused a bank
run in the United States.
The act was intended to be just
a temporary solution, and its most
influential provision was the
creation of the National Monetary
Commission, made up of a number of
congressmen, including Sen. Aldrich
and Rep. Vreeland and Special Assistant
Treasury Secretary A. Piatt Andrew.
The commission went on a secret trip
to Jekyll Island, Ga., emerging with
a proposal to create the National
Reserve Association, which would
consist of a group of reserve associations with the power to issue

currency in exchange for reserves
as well as assets such as payments
for services.
Though it laid the groundwork
for the Federal Reserve, the
association was never approved by
Congress. Vreeland was a Republican,
and in 1912 Democrat Woodrow
Wilson won the presidency. For the
Democrats, it marked a change from
52 years of Republican rule interrupted
only by the Cleveland administrations. They were not going to spoil it
by voting for a Republican-sponsored
banking act. Appealing to their rural
and populist base, the Democrats
denounced it as a giveaway to wealthy
Northeast banks.
The Democrats responded by
passing the Federal Reserve Act in
1913 instead. It established up to
12 district banks that worked
with a seven-member committee in
Washington, D.C., to coordinate
and regulate banking in the United
States. The Federal Reserve banks
issued notes backed by gold to
increase the money supply. The
banks also served as a lender of
last resort by lending money to banks
to meet currency demands.
The Federal Reserve Act
patched up some problems of the
clearinghouse system. It eliminated
distortions caused by different states’
regulations and enforcement of laws.
Having the various districts meant
there would no longer be a piling up
of reserves in New York banks. Most
important, it addressed the issue of
currency elasticity. By issuing new
currency and lending to banks,
the Fed would be more effective in
meeting demand for currency.
RF

READINGS:
Calomiris, Charles W., and Larry Schweikart. “The Panic of 1857:
Origins, Transmission, and Containment.” Journal of Economic
History, 1991, vol. 51, no. 4.

Klebaner, Benjamin. American Commercial Banking: A History.
Boston: Twayne Publishers, 1990.

Degen, Robert A. The American Monetary System. Lexington, Mass.,
and Toronto: D.C. Heath and Co., 1987.

Moen, Jon R., and Ellis Tallman. New York and the Politics of Central
Banks, 1781 to the Federal Reserve Act. Federal Reserve Bank of
Atlanta Working Paper no. 42, December 2003.

Gorton, Gary. “Clearinghouses and the Origin of Central Banking
in the United States.” Journal of Economic History, 1985, vol. 45,
no. 2, pp. 227-283.

Timberlake, Richard. Monetary Policy in the United States: An
Intellectual and Institutional History. Chicago: The University of
Chicago Press, 1993.



JARGON ALERT
Signaling
BY VANESSA SUMO
Hyundai cars were once known for being faulty
and unreliable. They were the butt of American
late-night talk-show jokes, with one suggesting
that a good way to frighten astronauts was by placing the
Hyundai logo on the spacecraft’s control panel. But
Hyundai has since regrouped, investing heavily in making
much sturdier cars. And judging by the rave reviews, its
efforts have been largely successful.
The Korean carmaker, however, had to fight hard to
dispel its shoddy image. One way was to provide car
buyers with a very generous 10-year, 100,000-mile
warranty on its cars' engines and transmissions, the
first in the industry to do so. A warranty as bold as this
effectively backs Hyundai’s claims of a better car.
Consumers understand that a warranty would be too costly to
provide if the company knows
that its product will frequently fall apart. Buyers typically
cannot discern the quality of
a car before purchasing it, so a
warranty conveys a “signal” to
the buyer that this car is truly
as reliable as the company
says it is.
Signaling is used in a large
number of settings, where information about the strengths of a product or seller may be
difficult to observe directly and is instead communicated
indirectly through a signal. A company willing to
pursue an expensive advertising campaign likewise
tells consumers that it believes it has a quality product to
offer buyers; otherwise it wouldn’t spend the money
getting that information out to the public. Prior to widespread labeling regulation, food makers who wanted to
signal that their products were healthy often voluntarily
placed the ingredients and nutritional values of their
goods on packages. Even if many consumers were ill-equipped to make judgments about all of the information
provided, the fact that it was there demonstrated that the producers had nothing to hide from the health-conscious — in
fact, quite the opposite. Signals are also used in corporate
finance, such as when a firm takes on debt to signal its
confidence about future profits.
Without signals, buyers and sellers might have a frustrating time finding each other. Take the market for used
cars: If a buyer can’t tell the difference between good and
bad quality, then the best he is willing to pay is somewhere
in between. The problem is that the price is bound to be


lower than what the seller of the good car is asking, but
would undoubtedly make the seller of the bad car very
happy, because the price is much higher than what his car
is really worth. A possible result is that all good cars will
be taken off the market, and the used car lot will be left
with only the “lemons.” (In economics, this is known as
the problem of adverse selection.)
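A hypothetical set of numbers (ours, not the article's) makes the arithmetic behind the lemons problem concrete:

$$ E[\text{value}] = \tfrac{1}{2}(\$10{,}000) + \tfrac{1}{2}(\$4{,}000) = \$7{,}000 $$

If half the cars on the lot are good (worth $10,000 to a buyer) and half are lemons (worth $4,000), a buyer who cannot tell them apart will offer at most the expected value of $7,000. Any owner of a good car holding out for, say, $8,000 then withdraws it, and the lot fills up with lemons.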
Signals are also pervasive in the job market,
the example used by Stanford University economist
Michael Spence, who won the Nobel Prize for his influential work on signaling. Spence supposes that there are two
types of workers, one with a higher productivity than the
other. Both are looking for a job, and a prospective
employer or a firm would like to pay each
type according to what he is worth.
The problem is that the firm has
no way of separating the highly
productive types from the rest of
the pack, and so, as in the market for used cars, the firm will
simply offer the average wage.
But the more productive
fellow can do something to
distinguish himself, for instance,
by going to school. Acquiring
education signals to a prospective
employer that he is the more able
worker and deserves a higher wage. But what would stop
a less talented job candidate from also acquiring education, in the hopes of signaling that he is as good as the
others? Spence notes that the cost of schooling must be
much higher for the less productive worker for the signal
to be believable. This might be true, for instance, if he
takes a much longer time to finish an academic degree.
He would then find it unprofitable to go to school just to
convince the employer that he is more capable than he
really is.
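In stylized form (the notation is ours, a sketch of the idea rather than Spence's full model), education works as a separating signal only if the wage premium it earns covers the able worker's cost of schooling but falls short of the less able worker's:

$$ c_H \;\le\; \Delta w \;<\; c_L $$

where $\Delta w$ is the extra wage paid to a worker identified as highly productive, $c_H$ is the high-productivity worker's cost of acquiring the credential, and $c_L > c_H$ is the low-productivity worker's cost. Under this condition the able worker finds schooling worthwhile, the less able worker does not, and the employer can treat the credential as a believable signal.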
Taken to the extreme, the theory of signaling suggests
that people acquire schooling because it is valuable as a
signal, but it does not make them more productive.
The truth is probably somewhere in the middle, that
education is partly about acquiring skills and partly about
trying to communicate one’s ability to a job recruiter. But
even some signaling, while beneficial for the individual
and the firm, can be a waste of resources from a broader
societal viewpoint. Indeed, if people had perfect information, then a car dealer who aims to convince customers that he is not
some fly-by-night operator would not have to spend so
much money on building that swanky showroom.
RF

ILLUSTRATION: TIMOTHY COOK


RESEARCH SPOTLIGHT
Influential Chairmen
BY ERNIE SICILIANO

"Do Markets Care Who Chairs the Central Bank?" By Kenneth N. Kuttner and Adam S. Posen. National Bureau of Economic Research Working Paper 13101, May 2007.

Former Federal Reserve Chairman Alan Greenspan reportedly received an $8.5 million advance for his memoir, The Age of Turbulence. The lofty price illustrates the large cachet of the title "Chairman of the Federal Reserve," a position that is widely perceived as second in power only to the president.
Indeed, "central banks' policies can have significant macroeconomic effects, and it is often assumed that the governor exerts a disproportionate influence over those policies," say economists Kenneth Kuttner of Oberlin College and Adam Posen of the Peterson Institute for International Economics, explaining the lionization of bank governors. In their new study published by the National Bureau of Economic Research, "Do Markets Care Who Chairs the Central Bank?" they find that markets respond to central bank governors even before they have a chance to act.
Kuttner and Posen looked at the behavior of markets after a new governor is announced and find that announcements result in fluctuations in exchange rates, bond yields, and, to a lesser extent, stock prices. Such fluctuations indicate that markets expect certain behaviors from the new governor.
Kuttner and Posen test two related hypotheses of what financial markets anticipate from new governors. (The authors use the term "governors" interchangeably with "chairmen.") First, markets believe that new central bank governors are "weak" on inflation until proven "strong." If true, then this hypothesis would mean that the announcement of new governors would be associated with heightened inflation expectations. Second, markets may interpret the announcement of a new governor as a harbinger for future monetary policy, but without the presumption that new chairmen will be "weak."
To test their hypotheses, the authors analyze data from 1974 to 2006 from 15 industrialized countries with flexible exchange rates. They found 62 announcements of a new central bank governor. The economists divided the announcements into 42 "newsworthy" and 20 "non-newsworthy" announcements. Non-newsworthy announcements were when the incoming governor was already anticipated, while newsworthy appointments were surprise resignations by incumbents or unknown appointments.
If, as the first hypothesis suggests, incoming governors are initially viewed as weak, Kuttner and Posen argue that expected inflation will rise, causing falling exchange rates, rising bond yields, and falling stock prices. However, the authors find that financial markets do not follow this trend when a new governor is named. The lack of directional movement suggests that financial markets do not specifically view incoming governors as weak or strong.
The markets' reactions indicate that the announcement of a central banker provides some tidbit of information about future policy. The markets do respond to this tidbit, the authors find, but the reaction occurred only on the day of the announcement, and there was no significant reaction in the two days before or after the announcement. (Indeed, this is what one would expect if the announcement was not leaked in advance and the capital markets efficiently incorporated the new information.)
Further demonstrating the efficiency of financial markets, the economists found that the foreign exchange market reacts only to newsworthy appointments — as the market had already priced in previously named governors. The bond market reacts to newsworthy events, but curiously also reacts to non-newsworthy events. Stock markets react only to newsworthy events. According to the authors, the weaker significance is probably due to the fact that stock prices reflect future earnings more so than central bank policy. Moreover, future earnings are affected by many factors, of which central bank policy is only one. However, the economists cited "a few strong reactions" in the stock market, such as in 2005 when Ben Bernanke took control at the Federal Reserve.
Such strong reactions are emblematic of U.S. financial markets, which generally react more aggressively than foreign markets. Kuttner and Posen offer two explanations: First, U.S. data "tended to contain a larger element of surprise than many of the other appointments in the sample," and thus may have biased the results. Second, the Federal Reserve's announcements may face more scrutiny in America — the result of more aggressive press coverage, a more active Federal Reserve, a lack of "a clearly defined policy mandate" such as inflation targeting, or what the authors describe as a "certain American institutional tendency to 'personalize' monetary policy." By that, the authors refer to the tendency of the public to attribute the effectiveness of monetary policy to the individual personality or wisdom of the chairman.
RF



POLICY UPDATE
The Supreme Court Rules on Retail Price Pacts
BY BETTY JOYCE NASH

When a Texas retailer marked down its Brighton
brand leather collection, the manufacturer cut
off its supply. That set off a chain of legal cases
that finally wound up in the U.S. Supreme Court.
Earlier this year, the court overturned the presumption,
and almost 100 years of antitrust legal precedent, that
resale price maintenance arrangements (RPMs) always,
per se, violate antitrust laws. RPMs are agreements that
give manufacturers a say over the prices retailers charge for
their goods. The court ruled 5-to-4 in Leegin Creative
Leather Products, Inc. v. PSKS, Inc. that those cases
should be decided by the “rule of reason” rather than be
considered automatically, or per se, illegal. Manufacturers
traditionally have sidestepped such agreements by “suggesting” retail prices.
University of Virginia economist Kenneth Elzinga noted
that it’s never made any economic sense for resale price
maintenance to always be presumed anticompetitive. In fact,
price agreements can enhance distribution and marketing
that may benefit consumers and promote competition.
Elzinga served as the economic expert for the manufacturer
in the case.
“Resale price maintenance can give downstream retailers
incentives to offer more in-store information and services
about a product, stay open longer hours, display a product
more attractively, and offer other retail amenities that will
expand the demand for the product [benefiting the product’s manufacturer], and make the shopping experience
more attractive [benefiting consumers],” Elzinga says.
Their marketing investments will pay off and “not be
subject to free riding by discounting retailers who do not
offer these services but free ride off the retailers who
do,” he says.
Since the 1911 decision which held that it is always illegal to
use market power to set prices, there have been gargantuan
changes in the retail industry. It’s not likely that big-box retail
companies, in a strong position to dictate terms to manufacturers, would be interested in resale price maintenance
contracts, especially in light of intense international price
competition. That leaves smaller retailers and boutiques,
where service is more important than price, as the most likely
partners in RPM agreements. But Mallory Duncan, counsel
for the National Retail Federation, says all manufacturers will
ask themselves whether they want to lose the push from low
price leaders by retrenching to full-service stores.
Quentin Riegel, vice president for litigation for the
National Association of Manufacturers, says the interpretation may have a modest effect. But price agreements will be
hard and expensive to defend, so few companies will adopt


them, he predicts. “First of all, if a company wants to set the
retail price of its product, it’s going to have to do so
in the face of competition,” he says. “Their first hurdle
[is that] they have to believe that price is really going to
increase sales.” Second, the firm will need a “very good reason to do it that’s competitively justified,” Riegel says. He
adds that it’s still illegal (with triple damages) to set unjustified price floors. Now, however, a plaintiff in a vertical
pricing case must prove that competition has been lessened.
The National Retail Federation, unlike the National
Association of Manufacturers, filed no brief on the issue —
its members sit on both sides of the fence. Duncan points
out that there’s been tension in the law. As long as there was
no explicit price maintenance, manufacturers could do
business with whomever they wished, even pulling product
“if someone wasn’t looking.”
Power retailers might decide to throw their weight
behind a competitor who is not going to condition sales,
Duncan says, and that could radically shift market share.
The Consumer Federation of America opposes the
court’s decision. So does the American Antitrust Institute
(AAI), which insists that higher prices will result.
There's also fear that the decision will stifle the kind of retail innovation
seen over the last century, especially if
manufacturers and retailers get together on deals.
However, economists think it is unlikely manufacturers
would want to discourage competition among retailers
because that would hurt sales.
The AAI also says it will be too expensive to successfully
bring a “rule of reason” case, so it’s “inevitable that
Leegin will mean an increased incidence of anticompetitive
RPM and higher prices for consumers.”
But Elzinga points out that it’s also expensive to lose a
case under the per se rule and unfair if the action did not hurt
competition, as in the Leegin case. “RPM contracts are
voluntary contracts between manufacturers and retailers,”
he says. “That alone should afford them some protection
from litigation or regulation. With regard to Leegin, most
stores who sold the Brighton brand were pleased to enter
into the ‘Brighton Pledge’ to maintain the resale prices that
Leegin requested. No one held a gun to anybody’s head on
either side of the transaction.”
But uncertainty abounds as to how states will react.
Thirty-seven states, including the Fifth District states of
North Carolina, South Carolina, Maryland, and West
Virginia, filed briefs in opposition. Some states said they
will enforce the per se rule despite the Supreme Court
decision because they have explicit rules against resale
price agreements.
RF


AROUND THE FED
Greenspan’s Rule


BY DOUG CAMPBELL

“A Taylor Rule and the Greenspan Era.” Yash P. Mehra
and Brian Minton. Federal Reserve Bank of Richmond
Economic Quarterly, Summer 2007, vol. 93, no. 2, pp. 229-250.

Stanford University economist John Taylor suggested
what became known as the “Taylor rule” in 1993 as a
means for central banks to control inflation while stabilizing
the economy. In general, the Taylor rule instructs policymakers to lean against the wind — to keep interest rates
relatively high when inflation is elevated or employment is
above full, and to set a low target rate when conditions are
reversed. Policymakers take into account the “output gap”
— the difference between actual and full-employment
output levels — and the difference between actual inflation
and the central bank's target level. Overall, following
the Taylor rule may help the Fed implement policy, insofar
as its predictability helps generate reasonable public
expectations about future short-term interest rates.
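For reference, Taylor's original 1993 rule can be written as follows (this is the textbook version with Taylor's illustrative coefficients, not necessarily the exact specification Mehra and Minton estimate):

$$ i_t = \pi_t + r^{*} + 0.5\,(\pi_t - \pi^{*}) + 0.5\,y_t $$

where $i_t$ is the target federal funds rate, $\pi_t$ is inflation over the previous year, $\pi^{*}$ is the inflation target and $r^{*}$ the equilibrium real interest rate (each set at 2 percent in Taylor's example), and $y_t$ is the output gap. The forward-looking, smoothed variant described below replaces $\pi_t$ and $y_t$ with forecasts of core inflation and the output gap and moves the actual funds rate only gradually toward this prescription.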
While Taylor originally proposed the rule as a guide to
policy, he and other economists also established that the rule
neatly summarized actual monetary policy behavior during
the 1980s and 1990s. More recent research suggests that
policy actions taken by the Federal Reserve under former
Chairman Alan Greenspan followed the Taylor rule but with
“interest rate smoothing” — that is, making changes in the
target federal funds rate in small, cautious, and predictable
movements. Also, some economists have found that
monetary policy follows a “forward-looking” Taylor rule,
focusing on expected economic developments and seeking
an equilibrium rate consistent with price stability and
full employment, and that it focuses on “core” inflation.
(The core inflation measure usually eliminates items like
energy and food products.)
In a new paper, economists with the Richmond Fed
generally confirm that monetary policy under Greenspan is
accurately described by the Taylor rule. Further, Yash Mehra
and Brian Minton find empirical support that the
Greenspan Fed’s policy rule “was forward-looking, focused
on core inflation, and smoothed interest rates.” A key
innovation of their paper is that it uses real-time data
(the numbers available to policymakers at the time of their
decisions) for economic variables and then checks whether
the results change with final, revised data. Also, the authors
used state-of-the-art forecasts from the Fed’s Greenbook.
The authors do identify a few periods of departure from
the rule, probably due to special macroeconomic developments. But overall their research suggests that the Taylor
rule “predicts very well the actual path of the federal funds
rate from 1987 to 2000.”


“Doing Good or Doing Well? Image Motivation and
Monetary Incentives in Behaving Prosocially.” Dan Ariely,
Anat Bracha, and Stephan Meier. Federal Reserve Bank of
Boston Working Paper No. 07-9, Aug. 27, 2007.

Participants in a unique experiment were asked to donate
to one of two charities — one perceived by the
donors as "good," the other as "bad." They were randomly assigned
to either public or private settings. In return, some donors
received monetary incentives. The authors set up this experiment to test the notion that, when it comes to prosocial
behavior, people won’t respond very strongly to monetary
incentives in public settings. Individuals seeking social
approval want to signal traits which are generally seen as good
— like charitable giving and volunteering. But if people are
offered a tax break for a donation — and everybody knows
about it — then this may erode the image gain.
Their results bear out this intuition: The “bad” charity
did better when donors operated in private settings, and vice
versa with the “good” charity. “Monetary incentives are more
effective in facilitating private, rather than public, prosocial
activity," the authors conclude. "People want to be seen as
doing good; without extrinsic incentives, an observer will
attribute the prosocial act to one’s innate good traits which
motivate people to behave prosocially.” A possible policy
implication is that government should expect tax benefits
for items like environmentally friendly water heaters to be
more popular than for hybrid cars — because neighbors can’t
see into people’s basements.


“Economic Theory and Asset Bubbles.” Gadi Barlevy. Federal
Reserve Bank of Chicago Economic Perspectives, Third
Quarter 2007, vol. 31, no. 3, pp. 44-59.

The author provides a contrarian view on asset bubbles.
Chicago Fed economist Gadi Barlevy says that the
popular press inaccurately defines a "bubble" as a situation in
which the price of an asset has risen so high so fast that it is
susceptible to a collapse. Academics prefer a more rigorous
definition: “a situation where an asset’s price exceeds the
‘fundamental’ value of the asset.”
Of course, many asset prices do display bubble-like
tendencies, in both the popular and academic sense. In such
cases, Barlevy warns that meddling with bubbles can be
treacherous. The main reason that bursting a bubble might
be advantageous is that bubbles "divert resources
from other productive uses.” But pricking a bubble might
aggravate some fundamental inefficiency in the economy,
or make some households worse off.
RF



SHORT TAKES
THE STATE OF THE ARTS

Struggling for an Encore


Money troubles nearly closed the Charleston, S.C.,
symphony orchestra’s doors for good in 2006.
The orchestra has been a fixture on the city’s performing
arts scene for more than 70 years.
Finances have been touch and go for several years, says
Leo Fishman, president of the Charleston Symphony
Orchestra's board of directors. Each crisis brought short-term solutions and unusual donations. In 2003, for instance,
the symphony’s full-time musicians agreed to an 18 percent
pay cut for three years just to keep the orchestra playing,
a typical move for arts nonprofits when finances fizzle.
This is not a new problem: Symphony orchestras nationwide struggle to balance budgets, and some have folded.
The problem is particularly acute in mid-sized cities
like Charleston.
Savannah, Ga., 100 miles south of Charleston, lost its
symphony in 2003, for instance. Charlotte’s symphony music
director has decided to step down in 2009, reportedly over
the group’s precarious finances.
“The forces that are driving their financial squeeze
haven’t changed,” says Kevin McCarthy, an arts and cultural
affairs expert at RAND Corporation. Symphonies fight
growing competition from other entertainment as well as
aging audiences.
The dependence on public and private contributions has
been predicted since at least 1965 when economists W.J.
Baumol and W.G. Bowen, formerly at Princeton University,
dissected arts groups’ economic structure.
Technology brings little in the way of increased efficiency
for performing arts groups, yet they still face increasing
costs, just like any business. “The output per man-hour of
the violinist playing a Schubert quartet in a standard concert
hall is relatively fixed, and it is fairly difficult to reduce the
number of actors necessary for a performance of Henry IV,
Part II,” the authors wrote in “On the Performing Arts: The
Anatomy of Their Economic Problems.”
And since a symphony is a “supplier of virtue,” it makes
sense that it “distribute its bounty as widely and as equitably
as possible,” Baumol and Bowen wrote. And so it isn’t
possible to raise ticket prices enough to pay the bills.
For such groups to flourish, a wide variety of funding sources
must be tapped.
But support for midsized orchestras in cities like
Charleston can pose a problem. Nationally, less than half
of a symphony’s revenues comes from earned income,
according to the American Symphony Orchestra League.
Private contributions, endowments, and government grants
make up the rest. Public money represents about 4 percent
of revenues. The Charleston Symphony Orchestra, with an annual budget of more than $2.3 million, receives no state or federal funds, but does receive funds from the city, Charleston County, and the nearby town of Kiawah Island. The orchestra also raises money from individuals and corporations. In fact, a successful fund-raising effort allowed the orchestra to finish the most recent fiscal year with a surplus.
To draw crowds, particularly occasional concert-goers, symphony orchestras must stage blockbuster performances and invite superstar musicians. But big productions require big budgets, and that means only big groups can invest in those expensive performances.
— VANESSA SUMO

PHOTO: Music director David Stahl has been faithful to the Charleston Symphony Orchestra since 1984. (PHOTOGRAPHY: CHARLESTON SYMPHONY ORCHESTRA)

STABILITY, CREDIBILITY

D.C. Makes Fiscal Progress

When the new mayor of Washington, D.C., Adrian Fenty, took office in 2007, he inherited a government in better shape than it was when Anthony Williams took the job in 1999. In Williams' eight years as mayor and his previous tenure as the city's chief financial officer, he was widely credited with bringing stability and credibility to a government plagued by scandal and insolvency.
"His reputation as a comptroller and accountant was a big factor in building confidence for investment in the city," recalls Tim Priest, an economist by training who leads the marketing efforts at the Greater Washington Board of Trade.
For example, with its investment-grade bond rating, the city has been able to raise capital for infrastructure and social projects. Those included mixed-income developments that have replaced public housing complexes and a new stadium for the Washington Nationals baseball team.
The District's experience illustrates the relationship between economics and stable, responsive, and fiscally sound government.
Economists avoid passing judgment on what forms of

governance are good or bad for economic development,
according to Beth Honadle, director of the Institute for
Policy Research at the University of Cincinnati. Rather, they
“empirically study what the likely effects of various
approaches will be relative to a number of generally
accepted criteria or measures.” These criteria may include
equity, efficiency, and the influence of government actions
on private business decisions.
Economists do have some idea of what works and
what doesn’t when it comes to governance. “Government
discourages the attraction of industry, new business
formation, and the retention and expansion of existing
industry when it under invests in education, fails to control
crime rates and protect people and property through public
safety, and allows public infrastructure to deteriorate so that
it impedes transportation and the sustenance of health,
peace, and quality of life,” Honadle says.
For example, a local government facing a fiscal crisis may
drastically cut "nonessential services," cuts that undermine
quality of life. In the long run, this may deter new residents
and businesses, which can inject new tax revenue and
spending into a community. Also, borrowing may become
prohibitively expensive, since the government’s risk of
default – real or perceived – is greater. Thus, fewer funds
are available for municipal projects.
The District’s fiscal progress has contributed to the city’s
economic progress. “The District’s record over these last
eight fiscal years of consistently balanced budgets … has
taken the city’s bond rating from ‘junk’ status up to grade
A, a first for this city,” noted Alice Rivlin during her Senate
testimony in July 2006. (The former Federal Reserve
governor chaired the Control Board that took over management of Washington’s local government from 1995 to 2001.)
Private investors have been confident enough in
Washington’s government to make long-term commitments.
More than $12 billion of projects were completed in
Washington, D.C., between 2001 and 2005. “When Mayor
Williams took office nine years ago, there was a huge surge
in real estate investment in the city,” says Priest of the Board
of Trade. “The migration of residents out of the city
stabilized and job growth strengthened.”
— CHARLES GERENA
RELOCATION STATION

N.C. Workers Bound for Richmond

Bobby Hines has worked for Philip Morris USA for 28
years, the last eight of them at a plant in Cabarrus
County, N.C. He transferred from Louisville, Ky., when
PM USA closed down that shop. Now, he may again pull up
stakes, this time for Richmond, when the company shuts
down its North Carolina facility by decade’s end. But that
would be his last stop, if he’s even offered a position,
because Richmond will be the last remaining domestic
plant for the makers of Marlboro.
After the closing announcement, the company set up a
“Richmond room” at the Concord plant, and has said
it will pay $50,000 bonuses to workers who relocate.
“They give you updates on house sales” as well as data
on schools, recreation and other information about the
area, says Hines. He is president of Local 229-T, the Bakery,
Confectioners, Tobacco Workers and Grain Millers
International Union. Union rules, he says, require the
company to offer jobs to members if they have openings.
Most of the 1,900 hourly employees are members of one of
two unions. “I guess it all depends on how many openings
and how many people retire up there [in Richmond],” he
says. “I hope I get the opportunity.”
Falling U.S. cigarette consumption and exports have
driven the Richmond-based company to close the plant.
In 2006, PM USA expanded the North Carolina plant,
adding 12 high-speed cigarette machines and an 11-story
automated storage facility. But even that, and the $1
million that state and local officials contributed to keep
the company in place, couldn’t make up for the stateside
consumption slide. The firm announced in June that it will
produce cigarettes closer to where the customers are —
overseas — through its sister company Philip Morris
International, based in Switzerland. It will return the
$1 million.
Domestic demand for cigarettes has continued to fall –
Philip Morris USA cigarette sales declined by 1.1 percent in
2006 compared to 2005 – and the company now sells four
times as many cigarettes overseas as it does here. The firm
will shift its export production, about 20 percent of
cigarettes made at the Cabarrus plant, to Europe.
The consolidation to the Richmond plant won’t be
complete until 2010, and by then those who choose or are
chosen to relocate should know Richmond pretty well.
While it may be common for firms to cultivate and
place their salaried employees in various locations, it is
an unusual move to do so for hourly workers. It could be
designed to lighten the blow of the surprise announcement, says North Carolina State University economist
Mike Walden, or simply to draw on their high skill levels.
Union negotiations likely played a role, too.
“We try to demonstrate that we value our employees,”
communications manager Paige Magness says, confirming
that their training will benefit the firm. “I think our effort
to attract them to Richmond to keep them in those jobs
[makes that] evident.” She does not yet know, however,
how many of the hourly workers or of the more than 500 salaried workers
will be offered jobs at the Richmond plant, or, of course,
how many will choose to leave North Carolina.
The shutdown marks the end of the cigarette giant’s
hefty contribution to the municipal tax base, $5 million in
2006, as well as the ancillary community spending that
the plant’s high wages generate. Tobacco manufacturing
largely has faded from North Carolina’s economic
landscape, with the last big operation consisting of
6,800 employees who remain in Winston-Salem at the
R.J. Reynolds Tobacco facilities.
— BETTY JOYCE NASH


DOWNTOWN IS DEAD. LONG LIVE DOWNTOWN!
America is busy rebuilding its
downtowns. But these are not
the downtowns of yesterday.

Greenville, South Carolina

BY VANESSA SUMO

Deb Ayers Agnew remembers the thousands of people who had gathered in downtown Greenville,
S.C., waiting for eggs to drop from the sky. It was a
few weeks before Easter day of 1958. A helicopter, an uncommon sight at that time, was about to drop prized plastic eggs
that contained candies and gift certificates from participating Main Street merchants.
Downtown in those days was accustomed to the crowds
that habitually converged there to work, shop, dine, and
amuse themselves. After all, downtown was the center of
everything. “All the main things that you would need in life
could be purchased strictly by walking up and down Main
Street,” Ayers Agnew says. Her family owns Ayers Leather
Shop, which opened at the bottom floor of the grand
Poinsett Hotel almost 60 years ago (it has since moved to
another location on Main Street). Throngs of locals and outof-towners would patronize Greenville’s downtown
amenities, she recalls.
But as in most downtowns across America, the automobile portended the decline of Greenville’s city center. Stores
and businesses followed the people
who moved their homes to the suburbs. Even Greenville’s Easter event
was organized to compete against the
shopping centers that were starting to
spring up, says Ayers Agnew. When the
first indoor mall opened in the area in
the late 1960s, the downtown exodus
began. As malls prospered, the big
department stores and smaller stores
moved out. Even Ayers Leather Shop
opened a store in this mall. It kept its
downtown store, though, because the
rent there had become cheap and it
made sense to keep it for storage and
repairs. Downtown Greenville in the
1970s had become fairly abandoned
and somewhat seedy.
Today, cities all across America are
busy reviving their downtowns. From
Richmond to Raleigh, and from
Charleston, W.Va., to Charlotte, business and government leaders in the
Fifth District are trying to build up
their downtowns, with mixed results
among them. Greenville, a city of
about 56,000 people, has been slowly
rejuvenating its center for more than
25 years. Other cities have visited
downtown Greenville to take notes on
how to proceed with their own revitalization efforts.
It is clear from the crowds that walk
around on a warm summer evening
that Greenville is achieving much of
what it had set out to do. On a typical
Thursday night, there could be a concert playing by the river against a
backdrop of restored industrial buildings, while another band plays to
mostly 20- and 30-somethings after
work, drinks on hand, in an outdoor
plaza on tree-lined Main Street.
Shakespeare could be performed in the
park to delighted families sitting on the
grass and enjoying the outdoors, while
a minor league baseball game plays to
sports fans in a new stadium down the
street. All these events would likely be
well-attended and all within reasonable
walking distance (it is about a mile
from one end of Main Street to the
other end). Main Street is lively even
after 5 p.m., when many other city centers would look like ghost towns after
office workers have gone home.

Downtown Greenville will never be
the center of industry that it was in the
19th and early 20th centuries. It will
no longer house most of the offices or
shops. Instead, there will
always be a mall or an office park just a
few miles away. “The day of downtown
as the center of the regional economy
is dead almost everywhere,” says Joel
Kotkin, an expert on cities and author
of The City: A Global History. There is
simply no way to compete with the speed
and comfort of the automobile, which
will take you anywhere, anytime you
want. Greenville understands this.
“We realized that we couldn’t make it
into what it was before,” says Nancy
Whitworth, director of economic
development for the city. Greenville’s
city center bears little resemblance to
what it was in its heyday — save for the
bustle of people.
Today’s downtowns are different, as
they surely have to be if they hope to
compete with various concentrations
of shopping, business, and entertainment. What they offer is an urban
lifestyle where one can live, work, and
play, and where walking is a predominant form of transportation. As such,
downtowns today may not be for
everybody. They are a niche product,
likely geared to a certain demographic
or two, and whose broader payoffs are
important to the city. In this sense,
downtowns today are really being
reinvented rather than restored to
their former glory.

An American Invention
The word downtown was coined in
America. In the early 19th century,
New Yorkers referred to the northern
section of Manhattan as “uptown,”
and to its southern end when speaking
about “downtown.” But the words
gradually took on a more functional
meaning. The business district became
commonly known as downtown, while
the residential area became known as uptown. By the
1870s, writes Massachusetts Institute
of Technology urban studies and
history professor Robert Fogelson, the
functional meaning had largely taken
over the geographical because in very
few cities was downtown south and

uptown north. “Downtown lay to
the south in Detroit, but to the
north in Cleveland, to the east in St.
Louis, and to the west in Pittsburgh,”
notes Fogelson.
In the early days, American cities
clustered around water-based transportation nodes, says Edward Glaeser,
an urban economist at Harvard
University, in an interview. Eastern
cities formed in spots that hit the sea
or a harbor, while inland cities were
built on riverways or canals. One of
New York City’s great manufacturing
industries, sugar refining, was located
close to the water. Because refined sugar
crystals coalesce during a long, hot sea
voyage, raw sugar was shipped from
the Caribbean and refined in New York.
Moreover, to take advantage of
economies of scale, sugar refining
was consolidated in one place so
refineries were set up close to the
port. From here, refined sugar could
be transported to the rest of the
country and to Europe.
People and businesses then gravitated toward this center of activity.
“Ports and railway stations were
massive pieces of infrastructure, and
they could not be produced willy-nilly
throughout metropolitan areas,”
wrote Glaeser and Matthew Kahn of
Tufts University in a working paper for
the National Bureau of Economic
Research. Even when other forms of
locomotion such as buses opened up
the city, it still made sense to cluster
commercial activity around transportation hubs. People would then
move around by hub and spoke — they
would arrive by train or bus and from
there walk to their destination.
Another transportation innovation
that encouraged the formation of a
high-density urban area was the
elevator (in particular, the “safety
elevator” invented by Elisha Otis). By
allowing people to move vertically,
downtowns could build higher and
higher, instead of pushing farther out.
But just as transportation technology shaped downtown’s dominance,
the internal combustion engine weakened its relevance. “The car and the
truck have had an immense decentralizing effect,” says Glaeser. Cars and trucks allowed people to travel from point to point, rather than move by hub and spoke. The economies of locating by ports and railway stations greatly diminished. Moreover, because nothing could beat the speed of the car (it significantly reduced commuting time), residences and jobs became increasingly spread out. “I think of transportation technology as very much driving the urban form,” says Glaeser. As a result, Americans today live in less-dense areas miles from the city center, and traditional downtowns contain only a small share of metropolitan employment, Glaeser and Kahn note. For instance, across the 150 metropolitan statistical areas they analyzed, only about a quarter of total employment is within three miles of the city’s center.

[CHART: Where the Young and the Baby Boomers Want to Be. Twenty-five- to 34-year-olds made up 24 percent of all downtowners in 2000, compared with only 13 percent in 1970. The group of 45- to 64-year-olds was a close second, comprising 21 percent of downtown residents in 2000. The chart plots the share of downtown residents by age group (under 18, 18 to 24, 25 to 34, 35 to 44, 45 to 64, and over 65) for 1970, 1980, 1990, and 2000. NOTE: Based on author’s analysis of selected cities using data from the U.S. Census. SOURCE: “Who Lives Downtown” by Eugenie Birch, The Brookings Institution, November 2005.]

Although downtowns are more robust in bigger cities like Boston and San Francisco, these are still a far cry from what they once were, writes Fogelson. “Nowhere in urban America is downtown coming back as the only business district … The almighty downtown of the past is gone — and gone for good. And it has been gone much longer than most Americans realize.”

Reinventing Downtown
Today, many centers of activity can exist almost side by side because they serve different functions at different levels of density, says Barry Nocks, an urban planning professor at Clemson University.
In Greenville, Haywood Mall and the shopping belt along Haywood Road are less than a 15-minute drive from downtown. A few miles farther out is a big-box strip on Woodruff Road. Right across is Verdae, a planned mixed-use development with homes, offices, a shopping center, and a golf course. A cluster of new office spaces is located nearby, composed of the Millennium Campus (a technology and research office park) and Clemson University’s International Center for Automotive Research. And then there’s downtown. Because cities can support these various concentrations, downtowns that are making a comeback have had to reposition themselves to offer something different, knowing that they can no longer aspire to be the centers of everything. And just as transportation has defined the urban landscape, the renewed interest in downtown is rooted in the most rudimentary form of transportation: walking.
Some say that there is a growing interest in “walkable urbanism,” or the privilege of walking between restaurants, entertainment venues, the grocery, the shops, and possibly to work. Christopher Leinberger, a downtown redevelopment expert and visiting fellow at the Brookings Institution, thinks that there is a very strong demand for a walkable urban environment, including downtowns. Many city and business leaders seem to think so, too, and they’ve been reinvesting in their city centers to capitalize on these trends. Downtowns may be a good place to do this because they are already workplaces, and there is often a lot of architecture and history there to make them authentic and interesting places. But cities are adding another dimension to their downtowns today. They are remaking them into a place where people can live.
That is perhaps the biggest difference between the downtown of today and yesterday, and one of the keys to sustaining its growth. “The downtowns that we’re building today are being driven by housing,” Leinberger says. In the early days, people didn’t really live downtown. The city center contained offices, warehouses, factories, and stores, but typically not residential dwellings. Those who did reside there often had relatively low incomes. But today, people who choose to live downtown are often those who can afford to live anywhere they please.
The demand for downtown living seems to be driven by the tastes of those in their 20s and 30s as well as by empty nesters tired of keeping big homes and big yards and wanting the convenience of many things they need close by. A November 2005 Brookings Institution report that analyzes the downtown population in 44 cities finds that downtowns have a higher percentage of young adults and college-educated residents than the country’s cities and suburbs. (In this study, the city is defined by the political boundaries at the time of the census and includes the downtown. The suburb is the metropolitan statistical area and includes the city.) Twenty-five- to 34-year-olds made up about 24 percent of downtown residents in 2000, closely followed by 45- to 64-year-olds at 21 percent. As baby boomers age, more empty nesters may opt to live downtown. The report also finds that the downtown population grew by 10 percent during the 1990s, a sharp turnaround following 20 years of overall decline. The same trend is observed in the number of households — an
important driver for the housing
market — that grew by 13 percent in
the 1990s. In downtown Baltimore,
Md., for instance, the number of
households grew very rapidly in the
1990s, in spite of a dip in the city’s
overall household population during
the same period.
Downtown residents are important
in providing the base needed to
support shops and the restaurants as
well as to ensure that people will still
be around on weekdays after 5 p.m.
and on weekends, hence making the
streets safer and more pleasant. But
how can a city entice potential residents and nonresidents to come to
downtown after years of ignoring it?
Perhaps by paying attention to the
kind of place people are looking for.

A Place Built for People
“Lawrence Halprin loved manipulating water,” says Robert Bainbridge,
former director of the South Carolina
Design Arts Partnership. Bainbridge is
talking about a public plaza that
Halprin, one of the finest landscape
architects in the country, designed
for downtown Greenville around the
late 1970s. “Halprin believed in touchable water. There is no railing between
you and the water,” says Bainbridge.
In a way, the new downtown
Greenville is just like that: People can
touch it.
This is evident in Halprin’s
streetscape design of Main Street, the
starting point of downtown’s reinvention. In 1979, Main Street was
narrowed from four lanes to two in
order to widen the sidewalks.
This allowed more space for people to
walk around and for restaurant patrons
to dine outside. Trees were planted and
parallel parking spaces were replaced
with diagonal ones along the street.
The sidewalk pavement blends into
the intersection, giving pedestrians a
feeling of continuity even while crossing the street. The plans were careful
not to exclude the automobile and
make the place entirely pedestrian.
“Americans come by car,” says
Bainbridge. The combination of a
narrower street, wider sidewalks, and

a canopy of trees creates a sense of
enclosure to what used to be an
unfriendly wide-open space.
The streetscape may have created a
fresher-looking downtown, but the
businesses weren’t going to go there
just because it looked pretty. “Anchor
projects” were needed to spur interest
in the area, and these have been
planned and placed over a one-mile
stretch of Main Street.
The Greenville Commons — a
cluster of buildings that includes
a hotel, a small convention center, an
office building, and a public park —
opened in 1982 at the point where the
new streetscape begins. Less than half
a mile away by the Reedy River is the
Peace Center for Performing Arts,
which opened in 1991, so that people
could get into the habit of going downtown on evenings and weekends. The
Westend Market is just a few blocks
down, an old cotton warehouse converted in 1994 into a mix of offices,
shops, and restaurants. And at
the end of the current concentration
of activity on Main Street is a new
baseball stadium that opened in 2006,
which was modeled after Fenway Park.
(The stadium is home to the
Greenville Drive, a minor league
affiliate of the Boston Red Sox.)
These catalyst projects have
spawned other private developments,
from the construction of new buildings like the RiverPlace, the largest
private investment so far in downtown
Greenville, to the rehabilitation of old
buildings. Downtown revival has
sparked interest in the preservation of
many historical structures with fine
architecture, which in turn has helped
downtown set itself apart from the
competition. “It conveys the character
of the market,” says Robert Benedict,
a historic preservation consultant
in Greenville.
Throughout downtown’s revitalization efforts, the city has made sure
that buildings all come down to street level
in a way that engages people walking by. For
instance, the Wachovia office building
on Main Street used to be set back far
from the sidewalk. Following the city’s
design guidelines, a private developer

built a new low-rise structure that
wraps around the part of the office
building that faces busy streets, effectively aligning it with the rest of the
buildings. Restaurants and shops
occupy the ground floor of this new
mixed-use structure while apartments
were built above.
The city has planned its parking
garages in a way that they are, as much
as possible, out of sight from the
street. A good example is a mixed-use
project called the Bookends, which
occupies a whole block in a street off
Main. The city wanted to rebuild a
parking garage that stood there but
didn’t really need all that space. So it
sold off a slice of the property on each
side facing the street, while the
parking garage was constructed in
between, hence the name.
The same mixed-use philosophy
repeats in almost all the buildings on
Main Street. Restaurants and shops
are placed at the street level, residences
on the upper floors, and sometimes
office spaces in between. It works well
because no one wants to live on the
ground floor, and many people don’t
want to walk up a flight of steps to
enter a store. The result is an almost
continuous row of restaurants and
shops on Main Street.
Greenvillians will say that public-private partnerships, perhaps a
fuzzy concept for some, have played an
important role in successfully putting
together many of the projects
downtown. “The public-private partnerships are really what have made
downtown Greenville what it is today,”
says Mary Douglas Neal, the city’s
downtown development manager. In
the early days, Greenville had a downtown development organization, but the city
later brought the rebuilding effort entirely within its
economic development department.
Rebuilding downtown required a
tremendous amount of coordination
from all the departments of the city
(police, fire, building codes, planning,
public works, etc.).
The city has taken on many roles
at different levels of involvement,
but it is mainly in charge of making,
facilitating, and following through
on the plans for downtown. “We promote
ideas,” says Mayor Knox White, who
has been at the helm of the city since
1995. Sometimes, it will pitch in more
investments to take on the risk that a
private developer is not willing to bear.
The only time that the city developed
a project entirely on its own was in
rehabilitating the Westend Market.
The city could not get a private developer to come. But the old cotton
warehouse’s location (the building was
donated to the city) was important to
the city to anchor that end of Main
Street. The Westend Market was
eventually sold in 2005 at a profit.
But the city sees its role as
stimulating private investment, in
doing things that would enable the
private sector to do business in downtown Greenville. “The private sector is
the real engine here. No matter what
you’re doing from the public-sector
standpoint, if you don’t get the private
sector … you’re going to stall out,”
says Whitworth. In every project, an
agreement is reached as to what the
city can do for the developer and
what the developer can do for downtown. In general, the city builds and
operates everything outdoors that is
on public grounds, which usually
includes the parking garage, while the
private developer takes care of
everything indoors.
Most of the public infrastructure
has been paid for by Tax Increment
Financing (TIF), an arrangement
designed to capture the tax dollars
from an increase in an area’s property
value thanks to public investment.
The new tax revenue collected is
used to pay for development costs of
that “TIF district.” Greenville has two
such districts. But the city has
also been able to tap funds from
other sources, such as a 2 percent
“hospitality tax” on prepared meals
and beverages to pay for a pedestrian
bridge in Falls Park. In all, the city
has spent about $150 million in
rebuilding downtown, with Greenville
leaders believing the investments
would benefit residents as a whole.
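The arithmetic behind a TIF district can be sketched in a few lines of code. The tax rate, base value, and growth figures below are hypothetical, chosen only to show how the increment is captured; they are not Greenville's actual assessments.

# Illustrative sketch of tax increment financing (TIF) arithmetic.
# All figures are hypothetical, not Greenville's actual assessments.

TAX_RATE = 0.012           # combined property tax rate (1.2 percent, assumed)
BASE_VALUE = 100_000_000   # district's assessed value when the TIF district is created

def tif_increment(current_value):
    """Tax revenue diverted to development costs: the tax on growth above the frozen base."""
    growth = max(current_value - BASE_VALUE, 0)
    return growth * TAX_RATE

def general_fund_share(current_value):
    """Tax revenue that keeps flowing to the regular taxing bodies, as before the TIF."""
    return min(current_value, BASE_VALUE) * TAX_RATE

# Suppose public investment helps lift assessed value to $140 million:
value_now = 140_000_000
print(f"Increment for development costs: ${tif_increment(value_now):,.0f}")        # $480,000
print(f"Still flowing to the general fund: ${general_fund_share(value_now):,.0f}")  # $1,200,000

The point of the sketch is simply that only the growth above the frozen base, not the existing tax base, services the district's development costs.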
And it takes time. “One of the key
things is that it really does take 25
years. You have to think that far ahead
and commit to doing it. This place will
still be a great place in 25 years because
it was done right,” says Bainbridge.
And if there’s any doubt as to
Greenville’s seriousness in rejuvenating its downtown, one need only
be reminded of that vehicular bridge
on Camperdown Way that formerly
crossed Main Street and the Reedy
River. A few years ago, a decision
was made to tear down that section
of the bridge to expose a beautiful
60-foot waterfall, which many
residents did not even know was
there. An elegant cable foot bridge
now stands in its place. Today,
Greenvillians not only have a unique
piece of nature to enjoy at the heart
of downtown, but also something to
put on their postcard.

When Does it Make Sense to
Rebuild a Downtown?
Rebuilding their centers is understandably on many cities’ wish list.
There is something unsatisfying about
letting a place just wither away, especially if it is one with much history and
great architecture. Also, an eyesore of
a downtown may tarnish the city’s
reputation. Some think that a vibrant
city center can jumpstart — or is an
important element of — economic
success, while others are more skeptical of pinning a city’s hopes on a
downtown. The bottom line on whether efforts to bring downtowns back to life pay off is tricky to find.
Greenville, it seems, has benefited
from public-private partnerships
aimed at reviving the city center. But
such development may have happened
organically, without government
involvement. Also, it’s unclear that
other cities hoping to revive their
downtowns could replicate Greenville’s
success with similar redevelopment
programs. In short, there is no uniform rule, so cities must look hard at
whether there is a clear demand for a
downtown revitalization or clear benefits from doing so.
Such a demand is probably less likely to be found in struggling cities like

Detroit and Cleveland. “The last thing
you want to do is build excess infrastructure in a declining region,” says
Glaeser. After all, the hallmark of a
moribund area is when there is too
much infrastructure relative to
demand. A downtown may not be a silver bullet either. Glaeser cites the
experience of Buffalo, N.Y., where a
snazzier downtown hasn’t done much
to stem the population outflow. Job
growth in the Buffalo-Niagara area
has been dismal for a very long time.
Glaeser also casts doubt on a
popular reason why cities want a cool
downtown. Cities want to appeal to
the “creative class,” but it isn’t clear if
that is mostly what these types are
attracted to. “There is some confusion
about who the creative people are,”
says Glaeser. He notes that the
cappuccino-sipping young professional
is just a small fraction of this
group. Creative people may just as
likely be highly educated 40-year-olds
with two kids. As incomes increase,
more amenities are demanded, but
safe neighborhoods, good schools,
and fast commutes are probably
paramount for this group. Thus, if
the intention is to recruit those
high-value-added workers, it might be
best if a city pays attention to those
basic amenities first.
But many think that while schools
and safety are important factors, a city
can capitalize on the growing interest
in downtown living and use it as a
starting point to uplift an area.
“Leaders are starting to realize that
while a downtown isn’t a guarantee
to a strong economy, it is certainly
somewhat of a prerequisite for
success,” says Jennifer Vey, a fellow at
the Brookings Institution. Leinberger
likewise thinks that part of the reason
why some metropolitan areas are
healthy is because they’ve rejuvenated
their downtowns. In this view, a
strong downtown can aid in recruiting
companies and workers, bolster the
regional economy, and help adjacent
lower-income neighborhoods.
In Greenville, the economy wasn’t
doing badly in the 1980s and early
1990s when the push to turn around

BookFall08Final

1/28/08

2:41 PM

Page 17

downtown began. Once a textile giant
that made the city very prosperous in
the early 20th century, Greenville has
been trying to make up for that lost
manufacturing power by diversifying
into services and durable goods. “We
didn’t have to make choices about
where we would put our emphasis,”
says Whitworth. “The natural growth
was happening in the suburbs so
we focused internally, in downtown.”
South Carolina also has very limited
annexation opportunities, so the city
had to redevelop areas that it
already had. Moreover, city leaders hoped
that a strong city center would
help the low-income neighborhoods
around it, by bringing in not only jobs
but also the attention and eventual
support for these downtrodden areas.
But Greenville leaders say that they
are getting much more in return.
And Brian Reed, a market researcher
at the real estate firm The Furman
Company, says that part of the reason
why the suburban office market is
catching up is because of downtown.
This draw of downtown is a selling
point for a lot of professional servicetype organizations that choose to
locate in the Greenville suburbs,
Reed notes.
The growing activity there is also
why Clemson’s business school
decided to locate its Renaissance
Center in the historic Liberty Building
on Main Street. (Clemson University is
about 30 miles from downtown
Greenville.) The center serves as a
work area and meeting place for
students working with companies in
Greenville like Michelin, a large French
manufacturer of tires, whose U.S. headquarters is based in Greenville. Caron

St. John, director of Clemson’s Arthur
M. Spiro Center for Entrepreneurial
Leadership, says that the business
school wanted to be associated with
downtown “because it’s attractive, so
dynamic, and a fun place to be.”

Sustaining the
Downtown Option
For now, Greenville is a work in
progress. It is difficult to get a precise
estimate of the number of people
living in downtown Greenville, but
there are about 1,215 residential condo
units and more are on the way. This
can be thought of as roughly equivalent to the number of households in
downtown. The flurry of residential
condo building in recent years has
been well-received, with some units
going for more than $1 million.
Other projects that have been eager to
get off the ground have not yet done
so, because construction costs have
risen faster than the price that these
condos can fetch in the market, says
Charlie Whitmire, developer of
the Bookends.
There are middle- to upper-income
residential neighborhoods around
downtown, which some say has helped
to support its growth. But unless these
Greenvillians are avid walkers, these
households will have a choice on
which direction to take the family car:
downtown or out to the mall. This
makes downtown residents a crucial
aspect of the sustainability of downtowns, says Clemson economist Curtis
Simon, because these are the people
who will likely patronize a downtown
grocer, for instance.
Office workers are important, too,
as they bring in another aspect of

demand. The office market in the
central business district seems to be
doing very well, with rents high and
vacancy rates low. The restaurants are
enjoying good business, partly because
of a very strong lunch crowd of office
workers. Stores, on the other hand,
have not fared as well, and there have
been a number of closings. People
seem to prefer shopping in the
mall, but regional stores like North
Carolina-based Mast General seem to
be doing well in downtown. The retail
space is changing, however. A Publix
grocery store and a Staples office-supply store just opened in downtown.
The success of a downtown revitalization depends on a number of
factors. Part of it is about commitment, having good leaders, and
executing a plan well. But there are
other elements that are more uncertain than guaranteed. If you build it,
will residents and businesses come?
Will it be a center of ideas? Will people
have fun there? Will it uplift the
neighborhoods around it?
The only thing that is certain is
that downtown’s roles have changed
and diminished greatly from their
once very powerful position. This is
what cities must understand. The car
remains king, and downtowns might
have a hard time competing with
that, with other centers of ideas
and of consumption. Downtown has
become an option that will, like it
or not, simply exist side by side with
malls, big-box retail strips, and office
parks. But a downtown does not have
to be obsolete. If the demand is there
and if it is done the right way, a
downtown may be able to hold up
well against its competition.
RF

READINGS
Birch, Eugenie L. “Who Lives Downtown.” Living Cities Census Series, The Brookings Institution, November 2005.

Fogelson, Robert M. Downtown: Its Rise and Fall, 1880-1950. New Haven: Yale University Press, 2001.

Ford, Larry. America’s New Downtowns: Revitalization or Reinvention? Baltimore: The Johns Hopkins University Press, 2003.

Glaeser, Edward L., and Matthew E. Kahn. “Sprawl and Urban Growth.” NBER Working Paper No. 9733, May 2003.

Glaeser, Edward L., Jed Kolko, and Albert Saiz. “Consumer City.” Journal of Economic Geography, 2001, vol. 1, no. 1, pp. 27-50.

Kotkin, Joel. The City: A Global History. New York: Modern Library Chronicles, 2006.

Leinberger, Christopher B. “Turning Around Downtown: Twelve Steps to Revitalization.” Research Brief, The Brookings Institution, March 2005.


Armed
against ARMs
Educating low-income borrowers may be an effective
— if oft-overlooked — way to minimize mortgage losses
BY DOUG CAMPBELL

Back in October 2003, Donna Turner had her eye on a house. It was a modest house, priced to sell at $150,000. For the Raleigh market, that was something of a steal. But Turner had a few financial obstacles to overcome before she could live the American Dream. She was a single mother who worked as a certified nursing assistant, earning about $23,000 a year. Her credit report was pocked with poor choices and understandable setbacks, from delinquent cell phone payments to unwieldy medical bills.
It added up to a credit score in the mid-500s, putting her somewhere among the 15th percentile of the nation’s debt seekers. By all definitions, Turner was a “subprime” borrower, a credit risk so great that mortgage lenders would charge her extra — if they chose to take her on at all — before putting up the funds necessary to close on her dream home.
Turner’s story could have gone several different ways at that point. She might have been able to secure a subprime loan, perhaps one of those now much-maligned “adjustable rate mortgages” (ARMs), which would inevitably balloon in the years to come, making it impossible for her to keep up with payments. Turner would end up as another subject in a newspaper article about the hardships consumers face when taking deals from unscrupulous lenders. It’s a familiar tale of late.
Or she could have somehow come up with the monthly payments, even after they increased with interest rates. It’s less likely you’ve heard of that story, even though it’s actually more commonplace than the first one. Remember: The majority of subprime loans are in fact being repaid on time.
Both interesting stories. But perhaps a better one is what actually happened. Turner didn’t take out a home loan in 2003. Instead, she first walked into the Raleigh offices of Downtown Housing Improvement Corp., or DHIC. There she met Sheila Porter, who goes by the title of mortgage manager. Together they spent the next year and a half plotting a turnaround strategy. It entailed Turner taking a new job, lowering her expectations about how much of a home she could afford, and paying off her bills.
When she had done that, her credit score had risen about 100 points — right on the border between the ability to obtain a subprime or regular loan. With Porter’s help, Turner found the latter, as well as downpayment assistance. She obtained a conventional, 30-year fixed-rate mortgage, originated by a reputable bank. Her monthly payment: $686, including insurance and property taxes. In March 2005, mortgage loan in tow, Turner closed on a brand-new, 1,500-square-foot, three-bed, two-bath home for $122,000.
More than anything, Turner says, she came away from her mortgage counseling experience with an appreciation for the commitment she was making. “I mapped out a plan, thought it through, and stayed the course,” Turner says today. “I had to sit down and decide whether I wanted to do this. That sense of commitment is one of the best things I took away.”
The focus on the role of mortgage brokers and Wall Street — and even on regulators in the recent decline of the subprime housing market — is richly deserved. But another player deserves attention: borrowers. The extent to which subprime borrowers were grossly misled, took calculated risks, or simply didn’t understand the details of the contracts they entered into, is unclear.
But if there is anything to be learned from Turner’s experience, it is that financial education can make a difference. What if all subprime borrowers received the counseling that Turner did? Would we even be talking about the problems in the subprime market?

Subprime Primer
Though standards vary, in general a credit score of 660 (around the national average) or higher may qualify for a “prime” loan. There is also a near-prime, sometimes called “Alt-A,” category of loans for borrowers with credit scores between 580 and 660. Subprime borrowers usually are those with credit scores lower than 580 (though by some measures, scores below 620 qualify).
Federal regulators define such borrowers as those with records of delinquency
or bankruptcy, and debt-to-income
ratios of 50 percent or more.
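Those score bands amount to a simple lookup. The cutoffs in the sketch below follow the description above (660 for prime, 580 as the usual subprime line, 620 under some measures) and are illustrative only, not any lender's actual underwriting rule.

# Rough credit-score bands as described above; actual standards vary by lender,
# so treat these cutoffs as illustrative only.

def score_band(score, subprime_cutoff=580):
    """Classify a borrower by credit score; some measures use 620 as the subprime cutoff."""
    if score >= 660:
        return "prime"
    if score >= subprime_cutoff:
        return "near-prime (Alt-A)"
    return "subprime"

print(score_band(705))                        # prime
print(score_band(610))                        # near-prime (Alt-A)
print(score_band(610, subprime_cutoff=620))   # subprime under the stricter definition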
As of this fall, prime loans were not
showing signs of major trouble. The
overall delinquency rate (between 30
and 90 days overdue) has stayed close
to 4 percent since the early 1990s,
according to a Chicago Fed paper,
though rising to about 5 percent in the
past year. Fixed-rate, 30-year mortgages in fact remain at historically low
levels of delinquency, at around 2 percent. The problem has been in the
subprime market.
Subprime mortgages didn’t gain
much attention until recently, but
their growth began in the early 1990s.
Interest rates were declining, and
some high-risk borrowers turned to
them as a means to refinance existing
mortgages. Meanwhile, technological
improvements made it easier and
cheaper to “score” borrowers’ credit
risks, helping to increase volume in the
subprime category.
Subprime mortgages (defined here
as loans obtained by borrowers with
credit scores less than 620) have
indeed seen a sharp increase in
delinquencies, overall at more than
13 percent in early 2007, with ARMs
leading the way at 14 percent. More to
the point, the growth in subprime
mortgages has been astonishing, rising
from 6 percent of all loans as recently
as 2002 to 20 percent at the end of
2006. (This 20 percent figure includes
a 5 percentage point share for Alt-A
loans.) The share of subprime loans
that are ARMs — with the highest
delinquency rates — stood at 50 percent (or about 7.5 percent of all
mortgage loans) at the end of 2006.
Not only are subprime loans risky, but
half of them are the riskiest possible
— ARMs. Meanwhile, the share of
prime loans that are ARMs stood at
18.2 percent at the end of 2006.
Subprime borrowers of any type will
pay between 2 and 3 percentage points
more than the prevailing prime rates.
For example, a hypothetical subprime
loan originated this fall might carry an

annual rate of 8.4 percent, compared with 6.4 percent for a prime borrower. (Historically, the subprime spread has been between 200 and 300 basis points, but in recent months has widened.) A 30-year, $250,000 loan at the subprime rate would require monthly payments of $1,904 compared with $1,563 for a prime loan — a difference of more than $4,000 a year. Economists with the St. Louis Fed put it this way: “At its simplest, subprime lending can be described as
high-cost lending.”
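The payment gap cited above follows from the standard fixed-rate amortization formula, M = P * r / (1 - (1 + r)^-n), where r is the monthly rate and n the number of monthly payments. A minimal sketch using the article's figures (a 30-year, $250,000 loan at 8.4 percent versus 6.4 percent) reproduces the numbers:

# Standard fixed-rate mortgage payment: M = P * r / (1 - (1 + r) ** -n),
# where P is the principal, r the monthly rate, and n the number of monthly payments.

def monthly_payment(principal, annual_rate, years=30):
    r = annual_rate / 12
    n = years * 12
    return principal * r / (1 - (1 + r) ** -n)

subprime = monthly_payment(250_000, 0.084)   # about $1,904 a month
prime = monthly_payment(250_000, 0.064)      # about $1,563 a month
print(f"Subprime: ${subprime:,.0f}  Prime: ${prime:,.0f}  "
      f"Annual gap: ${(subprime - prime) * 12:,.0f}")   # a bit over $4,000 a year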
Many of the largest originators of subprime loans are not banks. New Century Financial Corp., for example, is a real estate investment trust and was the nation’s second-largest subprime originator before seeking bankruptcy protection this spring. Other nonbanks are parts of bank or thrift holding companies. Also in the top 10 are banks like Wells Fargo and CitiFinancial, as well as thrifts like Countrywide Financial. But what distinguishes a subprime from a prime loan is the perceived credit risk of the borrower. A subprime loan may include features like interest-only payments or zero downpayment or adjustable rates, but it doesn’t have to. All these features are also available to prime borrowers. So when we discuss subprime loans, we are generally considering mortgages to high-risk borrowers, or those who fail to provide adequate documentation on their income, or to those with high debt-to-income ratios.
Unquestionably, the subprime revolution extended credit to those who in previous decades were shut out of the homeownership market. On the other hand, it may seem like asking for trouble by charging the poorest, or the most debt-ridden borrowers extra. Or, as others have postulated, it may be perilous to offer complicated financial instruments to relatively unsophisticated consumers — and low-income borrowers tend to fall into that category.

[CHART: Share of U.S. Mortgages in Foreclosure, June 30, 2007, broken out by loan type: prime fixed, prime ARM, subprime fixed, subprime ARM, Federal Housing Administration, and Veterans Administration (segment shares of 2, 11, 16, 17, 18, and 36 percent). SOURCE: Mortgage Bankers Association]

Does it Work?
Therein lies the motivation for thinking about the power of financial education. Reliable data are difficult to find on the impact of pre-homeownership counseling. With mortgage loans being sold to investors, tracking them over time is difficult. There are also many different forms of counseling (from workshops to intense, months-long individual programs), and a dearth of formal tests matching different programs with different outcomes.
In a survey of the literature on credit counseling, Richmond Fed economist Matthew Martin draws some conclusions that may be quite pertinent to the subprime market’s decline. Based on his reading, Martin says it’s clear that some households make mistakes in personal financial decisions, and that “mistakes are more common for low-income and less-educated households.” As such, low-income households tend to benefit the most from financial education.
A widely discussed study found that, for low-income borrowers, there is a connection between prepurchase counseling and avoiding delinquency. In 2001, researchers with Freddie Mac showed that borrowers have a 19 percent lower delinquency rate after counseling. Of the different sorts of counseling, one-on-one was found to be most effective, with a 34 percent decrease in delinquency compared with 26 percent for group sessions and 21 percent for home study.
Similar studies have tried to adjust for self-selection — the problem that results will be skewed because people who seek counseling in the first
place are likely those committed to
improving their credit. These studies
found little difference between
self-selectors and others in terms of
the difference that counseling made
on their behavior.
In a 2006 paper, economists
Valentina Hartarska of Auburn
University and Claudio Gonzalez-Vega
of Ohio State University found that
counseling has a significant effect
on borrowing behavior, as it makes
low-income borrowers more aware
of all their financial options — from
refinancing to default. Counseled
borrowers grow more “ruthless” in
their decisionmaking, an outcome
that may not always be so great for
lenders. Borrowers, for example,
would now understand that it might
make more sense for them to default
than to refinance.
Hartarska notes in an interview that
her study was fairly limited. It drew
from the experience of an Ohio bank
that provided counseling services as
part of its Community Reinvestment
Act requirements from 1996 to 2000.
The authors looked at a total of 1,338
loans over three papers, comparing
those that occurred before counseling
(1992 to 1995) and those in the post-1996 period. That said, Hartarska
believes the results to be quite robust,
and perhaps useful to lenders.
“It means that you can educate and
then you can price your risk based on
the experience that borrowers will
behave slightly differently from what
you would have expected of people with
their income and credit score,” she says.
Hartarska also adds that much more
study needs to take place, a process that
could be aided if lenders made more
data available to researchers.
It may help outcomes, but mortgage
counseling isn’t free. It is supported in
part through government grants.
NeighborWorks America is the main
backer of local nonprofit housing
organizations, with 240 members
across the country. It distributes much
of its $115 million annual (federally supported) budget to groups like DHIC in
Raleigh. The local organizations can do
a number of things with the money,
from developing properties to hiring
financial educators.
Most NeighborWorks-backed organizations offer some sort of prepurchase
counseling, says Douglas Robinson,
NeighborWorks spokesman. Perhaps
because of that, local nonprofit housing groups see better results from their
clients: The default rate on subprime
mortgages taken out by their clients is
less than 3 percent, Robinson says,
compared with about 13 percent for all
subprime loans.
“If more families and more
households had taken advantage of
prepurchase counseling, whether
prime or subprime borrowers, they
would have been better armed,” says
Robinson. “Mortgages can seem to be
perfect that day but with any instant
gratification, if you think about it,
maybe it’s not a good thing.”

“Mortgage Ready”
The sort of homeownership counseling
that Donna Turner received at DHIC
is fairly rigorous — up close and
personal, and not cheap to provide.
In 2006, the center shuttled about
480 people through its program, and
210 ended up buying homes that year.
A big chunk of DHIC’s clientele exists
because of lender requirements.
The city of Raleigh, for example,
offers some low-income residents
up to $20,000 in downpayment
assistance, but orders first that they
complete a DHIC counseling program. Charlotte-based Bank of
America instructed dozens of its
clients in the past year to attend
DHIC seminars as part of their
mortgage qualification process.
Almost everyone who comes to
DHIC, initially, would be considered a
subprime borrowing candidate. The
counselors here talk about getting
clients “mortgage ready.” The charge
for this service is $25.
Like a lot of nonprofit housing
organizations, DHIC derives most of
its operating revenues from development projects, where it builds
low-income housing. Grants provide
cash for services that don’t pay for
themselves, including homeownership

counseling. DHIC owns rental housing and in 2004 and 2005 sold 54
homes at its MeadowCreek subdivision, where Turner now lives.
There is a class on adjustable rates.
The counselors walk their clients
through “good-faith” estimates point
by point, highlighting potential trouble spots like high upfront fees or
the possibility of ballooning rates
down the road. For many, there is
subsequent one-on-one counseling to
improve credit scores before even
trying to secure a loan.
Are some brokers trying to sell
products that borrowers probably
can’t handle? Probably, DHIC counselors say. “A lot of our clients are told,
‘Do it now and then you can refinance
in a year,’” says Porter, who was
Turner’s main mortgage counselor.
“But they probably have to come up
with more out-of-pocket money to do
that because they won’t have enough
equity built up to cover all the costs.
I’ve had clients come in with a good-faith estimate, and with their credit
score, and I’m thinking, ‘Why are you
being offered this?’”
And yet, some borrowers simply
act on what they want to hear, ignoring
what they know is true.
“It’s more complicated now,” says
Saundra Harper, a sales manager and
counselor at DHIC. “You’ve got so
many different products that have
come on board, like interest-only
loans. I’ve seen lenders come up with
some unbelievable things.”
DHIC does not keep track of its
clients in a systematic way after they
complete their counseling, so there is
no way to say how effective the
programs have been. Anecdotally,
DHIC staffers offer up evidence like
Turner. And they wonder why there
isn’t a bigger push to support
pre-homeownership counseling for
low-income borrowers. “We’ve been
asking that question for a long time,”
says Gregg Warren, DHIC president.
He attributes some of the lack of
motivation to the way mortgages are
sold to investors, seemingly reducing
the risk that lenders carry, and thus
continued on page 39


Professional
Prognosticators
Is Forecasting a Science or an Art?
BY DOUG CAMPBELL


Jim Smith tells the story this way:
It was the summer of 1971. Smith was the director of
credit market research at Sears, Roebuck and Co. One
day the chief executive, a man named Gordon Metcalf,
strolled into Smith’s office and talked about his recent visits
with international suppliers. Overseas, Metcalf said, there
was growing sentiment that the dollar was overvalued.
Metcalf realized that if the dollar decreased in value, it could
hurt Sears’ business. Sears needed a clearer picture of the
future impact of such a change.
“Get together with your friends and see what you can
do,” Metcalf ordered. So Smith dialed up his friends at the
University of Pennsylvania, where the famed Wharton
Econometric Forecasting Associates (WEFA) was housed.
At the time, the notion that the gold-backed dollar might
ever float in value was still considered far-fetched by some.
But WEFA spent a few weeks tweaking a short-term model
and ran some simulations for Smith, who duly reported the
results to the executive suite.
It turned out to be highly valuable information, especially after Aug. 15, 1971, when the Nixon administration brought
an end to the Bretton Woods Agreement of 1944 that fixed
exchange rates worldwide. From then on, the dollar would
float, its value determined by the constantly changing balance of supply and demand. While most other firms were
caught off guard, Sears was ready.
“We saved and made a ton of money as a result of forecasting,” Smith says today from his office in North Carolina,
where he is chief economist with Parsec Financial in

Asheville. “That model pretty well played out with all that
happened over the next two to three years.”
This tale underlines the worth of a good forecast. In his
time, Smith has made a few. In fact, after Sears he went on to
become one of the nation’s most celebrated economic forecasters. Since the late 1990s, the Wall Street Journal has three
times named him the nation’s most accurate forecaster.
But is there such a thing as a “star” forecaster? Are there
a handful of prognosticators whose abilities consistently surpass the crowd? If so, then you would think they are either in
possession of superior instincts or superior mathematical
models. Perhaps it’s a little of both.
Models of all stripes can never perfectly predict the
future because they are not exact replicas of the actual
economy. To get an accurate forecast, you need information
that gets closer to the current state of affairs. Maybe a certain forecaster is friends with a banker who provides the tip
that more loans are going past due. That’s information the
forecaster would want to incorporate. Of course, information can be wrong. The loan problems might have been
limited to that single bank.
“It takes a great deal of tender, loving care to get the
forecasts to run properly,” Smith says. “Nobody is perfect
every time.”

Stars
Forecasters are constantly being ranked. Besides the Wall
Street Journal, there are rankings and surveys in USA Today
and BusinessWeek, as well as in the monthly Blue Chip
Economic Indicators, a must-read for
chief economists at Fortune 500 firms.
The surveyed forecasts encompass
firms ranging from Morgan Stanley to
FedEx on measures ranging from GDP
to housing starts, usually predicting
changes to a tenth of a percent. Over
time, a handful of forecasters stand
out. These are the star forecasters, and
it’s fair to say that Stuart Hoffman is
among them.
Hoffman is chief economist at
Pittsburgh-based PNC Financial
Services Group. The Wall Street Journal
named him one of the nation’s top
forecasters from 1988 through 2006,
a remarkable run. And Business
Week named him the most accurate
forecaster for 2004.
Hoffman develops his forecasts the
way many others do. He uses a basic
model and monitors data ranging from
consumer spending to productivity.
He lets the model run for four to six
quarters out to “see what the key economic trend looks like.” Then he
makes adjustments “based partly on
intuition and conversations with other
economists, particularly people in the
business who are contacts I have.”
This talking-and-listening approach is
most useful for short-term estimates.
It is this network of contacts to which
Hoffman attributes his success. That
and his distance from Wall Street,
where there is a tendency, Hoffman
believes, for economists to get too
caught up in the state of financial
services and ignore other sectors of
the economy.
Though there are more data available today, which are quicker both to
obtain and to process, and models are
more intricate, Hoffman isn’t so sure
his forecasts are much superior to
what they were 20 years ago.
“Forecasting is still as much of an art
as it ever was,” he says.
Smith agrees with that assessment.
Though he is skilled in econometrics
— a leading tool of forecasters, which
uses both theory and statistical techniques to evaluate data — Smith
believes that good forecasts are the
result of good information. He attributes his predictive success to his
ability to listen. “I have never found
a substitute in my 35 years of doing
this for asking people what they
think is going on,” Smith says. “There’s
always somebody who knows more
than you do, and you’re well-advised
to listen.”

Building Crystal Balls
Modern-day forecasting history
begins with Jan Tinbergen and
Lawrence Klein, who both received
Nobel Prizes primarily for their work
in building multi-equation econometric models. In the 1950s Klein’s
models of the U.S. economy became
the most widely used. In 1963, he
established WEFA, which used a
model bearing the association’s name.
Smith was tapping into a more evolved
version of this model when helping Sears
anticipate the impact of a floating dollar. As the cost of computer power
declined, forecasting models grew
richer and more complex.
For a time, there were three major
economic forecasting models —
one used by WEFA, another by Chase
Econometrics, and a third by Data
Resources Inc., developed by its
founder Otto Eckstein. All three
of these entities later merged to
become Global Insight, today the
largest economic forecasting firm in
the world, with 600 employees and
about $100 million in annual revenue.
Leading rivals to Global Insight
include Macroeconomic Advisers,
founded by former Federal Reserve
Governor Laurence Meyer, and
Moody’s Economy.com.
If you had models that could
perfectly predict the future, then that
would be one thing. But as Robert
Lucas acknowledged with rational
expectations theory, the world is an
uncertain place. Changes in economic
conditions can be no easier to
predict than the next roll of the
dice. People are forward-looking. As
government policies and economic
conditions change, so do people’s
expectations about the future and
hence their actions; moreover,
people’s actions respond both directly
to present conditions — today’s prices,

holding future expectations constant
— and to expectations of the future. It
is difficult to build a model capable of
incorporating all these factors.
Certainly, it is impossible to make
predictions on measures like GDP
with precision to even a tenth of a
percentage point.
“As long as you take the model
forecast for what it is, models are very
useful tools,” says Roy Webb, a senior
economist with the Richmond Fed
who has studied forecasting accuracy.
“The danger is you assign these numbers
more significance than you should.”
There is considerable academic
debate about which sorts of models are
best — for various purposes one
might choose between structural econometric
models and simpler vector autoregressive (VAR) models. Among the key
differences is that structural models
use economic theory to constrain
the possible relationships among
variables, while VARs are often
considered “atheoretical” because
they tend to let the data speak for
themselves.
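To make the distinction concrete, here is a minimal sketch, with invented series and coefficients rather than any real forecasting model, of how a small VAR is fit by least squares and iterated forward. Nothing in the estimation imposes economic theory, which is why such models are often called "atheoretical."

    # A minimal, purely illustrative VAR(1): two made-up series, least-squares
    # estimation, and a four-quarter-ahead forecast. No economic theory enters
    # beyond the choice of variables and lag length.
    import numpy as np

    rng = np.random.default_rng(0)
    T = 200
    y = np.zeros((T, 2))                        # e.g., GDP growth and inflation
    A_true = np.array([[0.5, 0.1], [0.0, 0.7]])
    for t in range(1, T):
        y[t] = A_true @ y[t - 1] + rng.normal(scale=0.5, size=2)

    # Estimate y_t = c + A y_{t-1} + e_t, equation by equation.
    X = np.column_stack([np.ones(T - 1), y[:-1]])
    coefs, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
    c_hat, A_hat = coefs[0], coefs[1:].T

    # Iterate the estimated system forward four quarters.
    forecast, last = [], y[-1]
    for _ in range(4):
        last = c_hat + A_hat @ last
        forecast.append(last)
    print(np.round(forecast, 2))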
Some observers argue that all the
subjective fiddling that goes into
modeling strips the models of any scientific
legitimacy. "Add factors" introduce an
extra degree of human error into the
process, inevitably fouling it up.
Despite such concerns, that's how
most forecasters operate. They use a
model to get a sort of baseline, and
then add in factors that may not yet be
showing up in the data or that the
model may ignore. Take the
U.S. macroeconomic model used at
Global Insight. It has about 1,900 variables, with data points coming from
national income and product accounts,
price indexes, and 25 different interest
rates. Then economists take over.
“Forecasts are a combination of
econometrics and judgment,” says
Sara Johnson, a managing director
and economist at Global Insight.
“The econometrics help us to draw
statistical relationships based on
the historical record. Economists can
then insert their judgment based on
how current conditions might differ
from the past, based on factors
that the models cannot, or do not,
fully incorporate.”
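The division of labor Johnson describes can be sketched in a few lines. The numbers below are invented and no commercial model works this simply; the point is only that the published figure is the model baseline plus a judgmental adjustment.

    # Illustration only: a model baseline for GDP growth plus judgmental
    # "add factors" an economist applies for things the model misses.
    model_baseline = [2.4, 2.2, 2.1, 2.0]       # percent, four quarters ahead

    # Judgment call: a credit squeeze not yet visible in the data shaves
    # growth, fading out over the forecast horizon.
    add_factors = [-0.3, -0.2, -0.1, 0.0]

    published = [round(b + a, 1) for b, a in zip(model_baseline, add_factors)]
    print(published)    # [2.1, 2.0, 2.0, 2.0]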

Which Way?
The apparent slowing of economic
activity from the third to the fourth
quarter led some to wonder whether
the economy was approaching a turning point. This is when forecasters
really earn their keep. “Whenever
there’s volatility, demand for our services increases,” Johnson says. “Our
clients are watching our forecasts and
analyses even more closely and having
more frequent contact with us.”
For an industry strategist trying to
figure out what to do next, this might
seem a tempting time to rely on an
average of many forecasts of the
aggregate economy. That is because
the average consistently beats individual forecasters. An Atlanta Fed study
examined forecaster rankings in
the Blue Chip Economic Indicators
Survey. It found that the consensus
forecast performed better over time
than any individual forecaster —
although several forecasters did quite
well. “This result is a ‘reverse Lake
Wobegon’ effect: none of the forecasters are better than the average
forecaster,” the authors wrote. “There
are superior forecasters, but no individual has access to all of the
independent information from all of
the forecasts that is incorporated into
the consensus forecast.”
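A small simulation, using made-up numbers rather than the Blue Chip data, shows the arithmetic behind that result: when forecasters' errors are largely independent, averaging cancels much of the noise that any one of them carries.

    # Illustration only: each forecaster sees the truth plus independent noise.
    import numpy as np

    rng = np.random.default_rng(1)
    truth = 2.5                                    # hypothetical "true" GDP growth
    years, forecasters = 40, 30
    forecasts = truth + rng.normal(scale=0.8, size=(years, forecasters))

    individual = np.abs(forecasts - truth).mean(axis=0)        # error per forecaster
    consensus = np.abs(forecasts.mean(axis=1) - truth).mean()  # error of the average

    # In this stylized setup the consensus error comes in below even the
    # best individual forecaster's average error.
    print(round(consensus, 2), round(individual.min(), 2))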
This underlines a truth that will
come as no surprise to fans of
The Wisdom of Crowds by James
Surowiecki — who argues that collective information tends to be more
reliable than individual assessments.
And yet, some individual firms and
forecasters do consistently outshine
others. For example: Blue Chip
Economic Indicators hands out an annual
award for the best forecasting record
over the past four years, based on projections of real GDP, the consumer
price index, three-month Treasury
bills, and the unemployment rate.
A few firms, including Global Insight
and Macroeconomic Advisers, are
dependably in the upper echelons
of the rankings. (Notably, the rankings
don’t point out forecasters who
consistently miss; there is no “Most
Inaccurate Forecaster” award.)
Randell Moore, editor of the Blue
Chip survey, notes that DuPont has
won the annual honor three times in
the past three decades — but each
time with a different chief economist.
“I don’t detect that any individual is
particularly good over long periods of
time at forecasting,” Moore says.
“That’s why using the consensus
appears to make the most sense.”
An interesting exception may be
the Federal Reserve’s “Greenbook.”
Certain economic projections from
the Greenbook are released to the
public after a five-year lag, and studies
have shown that those projections are
quite reliable compared with private
forecasts. The Greenbook process
is a back-and-forth between the large
Federal Reserve Board model and
subjective add-ons by staff experts.
In the most cited study, economists
Christina Romer and David Romer
with the University of California at
Berkeley attributed the Greenbook’s
accuracy to the finding that the Fed
“appears to possess information about
the future state of the economy that
is not known to market participants.”
Princeton University economist
Christopher Sims found that the
Greenbook even beats most of the
Fed’s own model-based forecasts. Sims

agrees that there is some evidence,
though not complete, that “the superiority of the Fed forecasts arises from
the Fed having an advantage in the
timing of information — even with the
view that this might arise entirely
from the Fed having advance knowledge of its own policy intentions.”

Shrinking Industry
For all the potential payoff that a good
forecast can deliver, the business of
economic forecasting has been contracting. In the heyday of the 1960s
and 1970s, it was customary for big
companies to keep economics departments, with several analysts reporting
to a chief economist. But cost-cutting
began in the 1980s, as many firms saw
they could simply contract for forecasting services, or rely on published
consensus forecasts. Bank mergers
also led to consolidation of economic
research departments.
Smith believes that businesses
that give up in-house forecasters
will see it reflected in their bottom
line. "There's no way to cope with all
the changes that come up and have a
feel for whether something is a major
shift, or a tempest in a teapot that
will pass, unless you have your own
internal group,” Smith says. “You won’t
find a consensus for steel demand, or
for vehicle output, and that’s of
huge importance to many industries.
It’s a small investment and you only
have to get a few things right to pay
for themselves.”
Of course, even in-house forecasters get things wrong, as Smith readily
admits about his own career. This is
why Smith likes to quote perhaps his
field’s oldest of axioms: “He who lives
by the crystal ball must learn to love
the taste of broken glass.”
RF

READINGS
Bauer, Andy, Robert Eisenbeis, Daniel Waggoner, and Tao Zha.
“Forecast Evaluation with Cross-Sectional Data: The Blue Chip
Surveys.” Federal Reserve Bank of Atlanta Economic Review,
Second Quarter 2003, vol. 88, pp. 17-31.
Romer, Christina D., and David H. Romer. “Federal Reserve
Information and the Behavior of Interest Rates.” American
Economic Review, June 2000, vol. 90, no. 3, pp. 429-457.

Sims, Christopher. “The Role of Models and Probabilities in the
Monetary Policy Process.” Brookings Papers on Economic Activity,
September 2002, no. 2, pp. 1-62.
Tulip, Peter. “Has Output Become More Predictable? Changes in
Greenbook Forecast Accuracy.” Federal Reserve Board of
Governors Finance and Economics Discussion Series, August
2005, no. 31.

Runs Make the Bank
The fragile capital structure of banks makes them inevitably
prone to runs, and that's a good thing
BY VANESSA SUMO
Banks are one of the most powerful and enduring
institutions of all time. They have survived runs and
panics and the Great Depression. They have folded,
divided, and merged. They have withstood and participated
in the parade of financial innovation. And even flourishing
capital markets could not make them obsolete.
The persistence and pervasiveness of banks suggest that
they provide a unique service. Companies, for example,
overwhelmingly prefer banks when seeking financing outside their own coffers. “Bank loans are the predominant
source of external funding in all the [industrialized] countries,” note economists Gary Gorton of the University of
Pennsylvania and Andrew Winton of the University of
Minnesota, authors of a survey on financial intermediation.
Instead of borrowing from banks, firms could secure the
funding they need through the sale of a stock or bond, by
going directly to the capital market. However, “in none of
the countries are capital markets a significant source of
financing,” Gorton and Winton note. “Equity markets are
insignificant.” Their observations come from a 1990 study
that looks at the sources of net financing by nonfinancial
enterprises from 1970 to 1985. In the United States, about
24.4 percent of investment by firms was financed by bank
loans, 11.6 percent by bonds, and only 1.1 percent by shares.
Studies have also found that the stock market price of a
firm responds more favorably to the announcement of a new
bank loan or the renewal of an existing one, compared with
news of an offering of company securities in capital markets.
Others have shown that if a borrower’s bank fails, it can
cause a substantial loss to the borrower because his valuable
relationship with a bank is destroyed. In other words, it
won’t be easy for a borrower to switch financiers if his bank
shuts down.
But what specifically makes banks so special? What is it
about the way they organize themselves that sets them apart
from other businesses? The fact is, as dominant as banks are,
their basic structure is actually quite fragile. On the asset
side, banks make loans to borrowers that are typically long-term and are inherently illiquid, not easily converted to cash.
On the liability side, depositors expect that they can withdraw their money anytime they need to. However, this may
force banks to sell their assets, possibly at a much lower
price, if depositors demand more money than what the bank
has readily available. Thus, the bank’s activities on both
sides of the balance sheet, although valuable, appear to be
ruinously incompatible.
To protect banks and their clients from this apparent vulnerability, financial regulators have typically responded with
supervision, safety nets, and even proposals to downsize and
restrict banks’ activities. However, University of Chicago
Graduate School of Business economists Douglas Diamond,
who is also a visiting scholar at the Richmond Fed, and
Raghuram Rajan say that there is actually a good reason for a bank’s choice
of such a delicate arrangement.
Far from being a concern, a bank’s
distinctive asset and liability structure
is precisely what allows the bank to
provide liquidity at all times; that is, to
make funds available to both long-term borrowers and short-term
depositors whenever a need arises.
The explanation for this surprising
result comes from a rather catastrophic
prospect built into a bank’s fragile
capital structure: the threat of runs.

Bank Runs
When the public suspects that a bank
may become insolvent, depositors will
rush to take out their money in desperate hope that they won’t be last in
line. The sudden demand for cash can
force a bank to sell assets prematurely
at a loss and, consequently, may cause
that bank to fail, whether or not it was
healthy prior to the run. On a scale
that affects many banks, runs can disrupt economic activity and cause
financial distress to many people.
Perhaps paradoxically, the possibility of bank runs arises from a valuable
service that banks perform: transforming illiquid assets or bank loans
into liquid liabilities or deposits,
according to a 1983 paper by Diamond
and Washington University economist
Philip Dybvig, considered the most
important and well-known analysis of
bank runs. In other words, the ability
to provide funds to depositors on
demand even if the bank holds mostly
illiquid assets on its balance sheet is
what makes a bank a bank. But it is
also why banks are vulnerable to runs.
A depositor may want to invest his
money but is worried that tying up his
funds will make it difficult to withdraw, except at a considerable loss,
when a personal need suddenly arises.
Banks — as opposed to another investment vehicle — can improve upon this
situation by getting all the depositors
together and pooling everybody’s risk
of holding an illiquid asset. This works
well because banks know with some
certainty that for a given pool of
depositors, only a fraction will ordinarily take out their money at any
given time. Thus, banks can offer
depositors a way to get out on better
terms than would have been available
to them had they invested individually.
But this solution also opens up the
possibility that things may not go
according to plan. If depositors panic
and turn up earlier than expected, then
those who will come to the bank later
know that they may not get
as much as they were promised, and
indeed may not get anything at
all because the bank will not have
sufficient resources. Thus, a "first-come-first-served" rule induces the
very real possibility that if some depositors ever get a whiff that a bank may
be in trouble, even those who were previously not concerned about the bank’s
health will rush to withdraw their
money. “If a run is feared, it becomes a
self-fulfilling prophecy,” says Diamond.
Whether the rumor was true or not
and whether depositors believe it or
not, no depositor wants to be the last
one to line up at the bank’s door. This
summer, depositors at British bank
Northern Rock raced to take out their
money when news leaked out that the
central bank would provide emergency
funds to the troubled bank.
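A stylized numerical sketch, with figures invented for illustration rather than taken from Diamond and Dybvig, shows both sides of the arrangement: the pooling that lets a bank offer better terms than individual investment, and the arithmetic that makes a run self-fulfilling once everyone shows up at the same time.

    # Invented numbers in the spirit of the Diamond-Dybvig setup.
    depositors, deposit = 100, 1.0
    early_n = 25                           # depositors who normally need cash early
    early_value, mature_value = 1.0, 2.0   # asset pays 1 if cashed early, 2 at maturity
    promised_early = 1.28                  # bank promises early withdrawers more than 1.0

    # Normal times: only the early depositors withdraw, the rest wait.
    liquidated = early_n * promised_early                    # 32 units cashed in
    remaining = depositors * deposit - liquidated            # 68 units stay invested
    late_payout = remaining * mature_value / (depositors - early_n)
    # Early withdrawers get 1.28 instead of 1.00; late ones get about 1.81
    # instead of 2.00. A risk-averse depositor who does not yet know which
    # type she will be prefers this smoother split.
    print(round(late_payout, 2))

    # A panic: all 100 demand the promised 1.28 at once, first come, first served.
    total_if_all_liquidated = depositors * deposit * early_value   # only 100 available
    served = int(total_if_all_liquidated / promised_early)
    print(served)   # roughly 78 are paid in full; everyone behind them gets nothing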
Deposit insurance is one way to
prevent runs and is provided in many
countries. The purpose and terms may
differ, but deposit insurance in general
assures that no matter what happens
to the bank and no matter how many
people come to withdraw, depositors
will always get the amount that they
were promised. The government is a
natural insurance provider because it
has the authority to tax, say Diamond
and Dybvig, so it can guarantee to
come to the bank’s rescue without
having to hold a large amount of liquid
assets to back up that claim. A deposit
insurance law commits the government to insure banks, which is a
stronger pledge than more discretionary policies such as suspending the
convertibility of deposits to cash.

Runs as a Commitment Device
One would think that a bank’s fragile
structure is surely a weakness, for how
can bank runs be a good thing? But
according to Diamond and Rajan, this
weakness is also its strength. In a series
of papers written in 2000 and 2001,
Diamond and Rajan argue that banks
as we know them today choose such a
structure because the possibility of a
run is what gives them the power to
provide liquidity, which is the very
thing that makes banks unique.
The story begins in a theoretical
environment where banks don’t exist.
An entrepreneur needs to finance a
project and a lender has money to
invest in it. Only the entrepreneur has
the specific skill to generate the highest cash flow possible from this
undertaking, so once the investment is
made, the project would be worth
much less in somebody else’s hands. In
this case, a lender’s investment in that
project is said to be illiquid. One could
think of a top-rated chef who wants to
open a restaurant. If he decides to quit
before the restaurant opens, then the
lender can seize the restaurant, but he
would have difficulty finding another
chef of the same caliber to operate it.
The plot gets thicker if the lender
himself needs cash at some interim
date. To obtain the money, the lender
can opt to borrow against the loan he
made to the chef, by promising to collect the cash flows generated from the
restaurant venture on behalf of a new
investor. However, the investor knows
only too well that the lender might be
tempted to pay back less than what
they agreed upon. If the investor
thinks that the lender cannot commit
to being honest, then it would be
impossible for the lender to borrow an
amount equivalent to the full value of
the loan. The consequence of this
chain of illiquidity is clear: Either the
loan to the chef will not be made in the
first place or the cost to him of borrowing money will be very high,
because the lender will need to be
compensated for the illiquidity of
the loan.
The way to resolve this dilemma is
for the lender to write a contract that
guarantees investors can take out their
money at any time they please. In this
way, if the lender tries to extract more
money by renegotiating the contract
and offering investors less than what
had been promised, then investors will
quickly withdraw all their funds
because they assume others will
do the same, leaving the lender empty-handed. A run is painful for the lender
because his income depends primarily
on the service he provides as an intermediary between the entrepreneur
and the investors, so a run will drive
his income to zero. Therefore, the
lender would never attempt to
renegotiate the contract and will
always strive to give investors what
he promised.
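A toy calculation, with numbers invented for illustration, makes the commitment logic concrete: reneging only pays if the lender keeps his intermediation income, and a run guarantees that he does not.

    # Invented numbers illustrating why the demandable contract deters reneging.
    collected_from_chef = 12.0      # cash flow the lender can extract as intermediary
    promised_to_investors = 10.0    # what the demandable contract owes investors

    honest_income = collected_from_chef - promised_to_investors    # 2.0

    # If the lender tries to pay investors less than promised, every investor
    # can demand repayment at once; the project is liquidated and the lender's
    # intermediation income falls to zero.
    income_after_run = 0.0

    print(honest_income > income_after_run)    # True: reneging never pays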
As it turns out, this type of “relationship” lender is exactly the kind of
bank we have today, one that lends
money for long-term projects but at
the same time collects short-term
deposits. A delicate capital structure
that is vulnerable to runs is what
makes the bank’s commitment credible and effective. This ensures that
depositors will always be willing to put
their money in the bank, and that
there will always be a steady supply of
funds for the bank to lend to entrepreneurs. If the bank ever misbehaves,
then the depositors will run and the
bank will shut down.
Thus, if depositors couldn’t run on
the bank, then there would be no
way to create liquidity. While it may
seem counterintuitive to think of a
bank run as a good thing, it is actually
only the possibility of one that is
desirable. “The threat of a run, great;
the fact of a run, that’s bad,” says
Diamond.
The commitment to discipline
banks is convincing because it promises
to punish even if the punishment is
painful for the depositors themselves.
“This is going to hurt me as much as it
is going to hurt you, but I will do it
anyway. Therefore, you know that if
you mess around, you’re going to get
the sanction imposed on you,”
explains Diamond by taking the
depositor’s perspective. Even if it is
not in the depositors’ collective interest to pull their money out, they will
rush to the bank anyway when they
spot a crime in progress.

The Narrow Banking Alternative
Stuart Greenbaum, former dean
and professor emeritus of finance at
Washington University, thinks that
while Diamond and Rajan’s proposal
has some merit, “building in a weakness because the weakness will make
you strong” sounds a bit like “hotel
music.” It’s pleasing, but it makes too
much of a bank’s delicate capital structure. “It’s one of those arguments
where you find virtue in a weakness,
developing compensating strengths
for some sort of disability you might
have,” says Greenbaum.
It could be desirable to avoid a fragile structure altogether, according to
economists who believe that a 100
percent reserve requirement should be
imposed on deposits that can be withdrawn on demand (this group includes
Milton Friedman). Such a proposal
would effectively narrow a bank’s
activities by requiring it to invest
demand deposits solely in "safe" short-term assets like Treasury bills, as
opposed to illiquid assets such as
loans. Putting deposits in very liquid
assets makes the banking system run-proof. It precludes a bank run because
depositors know with certainty that
their deposits are backed by investments the bank can quickly convert
into cash. A narrow bank could be
chartered separately, while other institutions that lend to longer-term
projects would be forbidden to finance
these projects with demand deposits.
Narrow banking would make the
financial system a more stable place
because it would provide greater safety
against bank runs, says Greenbaum.
But it would come with a cost. Under a
narrow banking arrangement, deposit-taking banks would lose that special
ability to turn illiquid assets into liquid
liabilities. “It provides a greater degree
of safety, at a cost of the production of
liquidity through mismatching [of
assets and liabilities],” Greenbaum
says. In other words, banks would not
be able to use the rich mass of demand
deposits to fund projects that have a
much longer duration. Economists
agree on this, but disagree on just how
large that cost is.

An analysis by Neil Wallace,
an economist at Pennsylvania State
University, attempts to quantitatively
compare these opposing worlds, by
extending the original Diamond and
Dybvig model of fragile banking to
include the possibility of a narrow
banking system. Overall, he finds that
the narrow banking alternative is
undesirable. “It eliminated any role for
banking,” says Wallace. History is rife
with episodes of panics and runs, and
perhaps narrow banking can prevent
that, but at what cost? Wallace thinks
it might be substantial. “Using narrow
banking to cope with the potential
problems of banking illiquidity is analogous to reducing automobile
accidents by limiting automobile
speeds to zero,” writes Wallace in his
paper. Diamond and Rajan agree.
They think that narrow banking
would essentially “kill liquidity
creation and result in lower credit
availability to borrowers.”
Greenbaum, however, thinks otherwise. “It doesn’t preclude the production of liquidity,” Greenbaum says.
He says that there are other ways of
creating liquidity without using
demand deposits, in particular by “mismatching” other financial instruments
on the bank’s balance sheet. For
instance, instead of using the money
from checking accounts and transforming these funds into loans, another
institution can take a one-year time
deposit and lend out a three-year loan.
Hence, in this view, banks do not need
the threat of runs to create liquidity.
(However, some ways of creating liquidity may not be immune from
run-like events. Recently, “structured
investment vehicles,” which issue commercial paper backed by longer-term
assets such as mortgages, had trouble
rolling over their paper when investors
started doubting the quality of the
underlying assets.)
Nonetheless, no country has ever
experimented with narrow banking
and Greenbaum says it will probably
never happen. And so in the existing
banking system where banks' long-term assets are funded mostly by
demand deposits, regulators have
responded with oversight, stops, and
safety nets. “We make the best of it.
We do it with regulation, we do it with
monitoring, and all sorts of restrictions in order to avoid the worst
instability. That’s the basic fact of
the case,” says Greenbaum.

The Implication for Safety Nets
The threat of runs, say Diamond and
Rajan, keeps banks from misbehaving,
because if they ever do anything that
people perceive might impose a loss
on depositors, the bank would be
closed immediately. If so, then certain
safety nets like deposit insurance,
which is often thought to prevent
jumpy depositors from running on
the bank, may actually reduce the
incentive for banks to behave well
because it removes the depositors'
commitment to run. So why have
deposit insurance?
In the real world, unexpected
events can cause losses, even if they
have nothing to do with a bank’s
behavior. For instance, if the economy
is hit by a recession, a bank’s investments may not generate as much
return as expected, and as a result, the
bank may not be able to deliver what it
promised to its depositors. Thus,
while the threat of runs keeps banks
from misbehaving, the real-world
uncertainty might make banks excessively
susceptible to panics.
In this case, deposit insurance
could be helpful by tempering depositors’ nerves, but where to draw the line
is tricky. On the one hand, bank panics
and their dire consequences should be
avoided, but on the other, a fully
insured bank will lose the disciplining
mechanism that was built into its capital
structure — full insurance would make banks more
likely to take big risks. As a result,
deposit insurance would require additional financial regulation because the
onus to impose the appropriate penalties now lies with the regulator.
“Deposit insurance is only going to
work well if regulators are good at actually closing banks whenever they
misbehave,” Diamond says.
But if there is a sense that some
banks may be too big to fail, regulators
may be hesitant to carry out that punishment. Diamond thinks that having
limited deposit insurance likewise
disciplines the regulators themselves,
because if they intervene to bail out
a bank, then this very public event
will receive scrutiny by the political
process, which can subsequently
improve regulation. Hence, in assessing how much of a bank’s deposits
should be insured, regulators must try
to get as much as possible of the good
and very little of the bad. They would
have to weigh the importance of
enforcing discipline against ensuring
financial stability.
The implications of Diamond and
Rajan’s proposal for deposit insurance
also hold true for capital adequacy
rules. Bank capital includes long-term
claims such as equity and long-term
debt, “softer” claims that are not subject to runs. As such, too high an
amount of bank capital is not desirable
because it impairs the bank's ability to
create liquidity by removing the
depositors’ incentive to punish. But if
banks keep too low a buffer, then they
might fail too often. Indeed, banks
themselves will choose some amount
of capital, regardless of government
regulation.
The question, then, is whether
regulators want to stipulate an amount
other than that level, keeping in mind
the trade-off between creating
liquidity and stability in the financial
system. If stability is considered the
more important goal and a higher
minimum capital requirement is stipulated, then regulatory standards ought
to be more intense to keep the banks
in check. If these standards are good,
then a higher level of capital requirement won’t compromise too much of
the bank’s unique ability to provide
funds to those who need it and at
the same time will make the bank less
vulnerable to the vagaries of the
business cycle.
Despite their apparent fragility,
banks have persevered through
centuries and continue to be integral
to the economy. Indeed, one can
recognize the might of banks by the
grandeur of their buildings and marble
interiors, just as the palaces of the
past were iconic of the stature of kings
and queens. And just as the power
of the monarchies relied on the
allegiance of their subjects, the
strength of banks depends mostly,
as it turns out, on even the littlest of
their depositors.
RF

READINGS
Diamond, Douglas W. "Banks and Liquidity Creation: A Simple Exposition of the Diamond-Dybvig Model." Federal Reserve Bank of Richmond Economic Quarterly, Spring 2007, vol. 93, no. 2, pp. 189-200.

Diamond, Douglas W., and Philip H. Dybvig. "Bank Runs, Deposit Insurance, and Liquidity." Journal of Political Economy, June 1983, vol. 91, no. 3, pp. 401-419.

Diamond, Douglas W., and Raghuram G. Rajan. "A Theory of Bank Capital." Journal of Finance, December 2000, vol. 55, no. 6, pp. 2431-2465.

Diamond, Douglas W., and Raghuram G. Rajan. "Liquidity Risk, Liquidity Creation and Financial Fragility: A Theory of Banking." Journal of Political Economy, April 2001, vol. 109, no. 2, pp. 287-327.

Diamond, Douglas W., and Raghuram G. Rajan. "Banks and Liquidity." American Economic Review Papers and Proceedings, May 2001, vol. 91, no. 2, pp. 422-425.

Gorton, Gary, and Andrew Winton. "Financial Intermediation," in George Constantinides, Milton Harris, and Rene Stultz (eds.), Handbook of the Economics of Finance, 2007, vol. 1, part 1, pp. 431-552.

Greenbaum, Stuart I., and Anjan V. Thakor. Contemporary Financial Intermediation. Orlando: The Dryden Press, 1995.

Wallace, Neil. "Narrow Banking Meets the Diamond-Dybvig Model." Federal Reserve Bank of Minneapolis Quarterly Review, Winter 1996, vol. 20, no. 1, pp. 3-13.

CRASH

In Virginia, private insurers test vehicles for safety.
Isn’t that the government’s job?
BY DOUG CAMPBELL

Even without the painful screams, the sound of screeching tires and busting glass is sickening. On Tuesday, July 24, a compact, pointed sled cruising at 31.1 mph hit a 2007 Ford Explorer carrying two BioSIDs, or "small-stature female side-impact" dummies. The impact, centered just between the driver-side doors, threw the vehicle back 10 feet, shattered the windshield, and violently whipped the seat-belted dummies about the passenger cabin.

For a moment afterward, it was dead silent. Then the lights went up. A polo-shirted man stepped up to the crash scene and quickly began sweeping away the tiny shards of glass. A team of at least 12 engineers descended, pushing computers on wheeled trays. Now it was time to learn the extent of the dummies' injuries.

The venue for this staged accident was the Insurance Institute for Highway Safety's (IIHS) Vehicle Research Center in Ruckersville, Va., just outside of Charlottesville. The VRC, for short, conducts about 70 of these side-impact crashes each year. Each is attended by representatives of the crashed vehicle's manufacturing company, in this case Ford. After all the data are processed, the IIHS will issue a report card of sorts, grading the Explorer on how effectively it protected the dummies. The very best models earn a "Top Safety Pick" designation, a Good Housekeeping Seal of Approval for the automobile industry. Companies often use IIHS-produced video footage of the most successful crashes in their TV commercials.

A "poor" rating, on the other hand, can translate to slumping sales and costly redesigns. This was the case with the Pontiac Transport, a minivan whose poor safety designation in 1997 prompted an overhaul that resulted in the newly dubbed Uplander, which garnered a good rating from the IIHS in 2005. It goes to show the sometimes powerful influence IIHS ratings can have. "There is no question that our ability to do these crash tests and show the differences among vehicles and their different amounts of protection is forcing the automakers to change their designs," says Adrian Lund, IIHS president.

In the institute's early days, rear-end tests were the staple. A slim minority of vehicles back then were gathering "good" safety ratings. Today, the institute rarely bothers with rear-end tests because the clear majority of vehicles are performing so well on that standard. Instead, it relies on spot checks and data provided by the automakers themselves.

An important thing to understand about the IIHS is that it was created by and is still funded by insurance companies. It is a nonprofit, private-sector organization performing functions that one might otherwise assume would be done by the government. It does so perhaps in part because of the goodwill it generates by improving vehicle safety. But it is also true that the insurers who fund the institute see returns on their investments in other ways.

With safer vehicles, claims are reduced. Minimizing losses is obviously useful to insurance firms, inasmuch as it reduces potential payouts from claims. Even more useful are the data gleaned from IIHS crash tests. With information about the expected severity of injuries — to both vehicle and human bodies — insurers can fine-tune premiums to maximize profits. It is an instance of private-sector initiative in performing a role — improving automobile safety — ordinarily assigned to government.

"We're bullish enough on the outcome of what the institute has done and the data that comes out that it's well worth the investment," says Dave Skove, an executive with Progressive Insurance who served as IIHS chairman in 2005. Like many insurers, Progressive has a target underwriting profit margin, in its case 4 percent. "We're interested in the margin. So if cars tend to be safer and we can help that, then great." By extension, it is often in the interest of automobile companies to reveal information about the safety of their products, as positive reviews can have a positive impact on sales. For this reason, automakers are quite cooperative with IIHS' efforts.

Early Days
The Insurance Institute for Highway Safety was born in 1959. Some of the nation's biggest insurers — Allstate, State Farm, and Nationwide among them — initially put their money into research on driver-education programs. After a time, the research produced some surprising findings: Driver-education programs don't help reduce crashes among teens, because they tend to help youths get licensed at younger ages. So IIHS leaders decided to take a new approach, turning away from the focus on drivers themselves toward the cars they drive. They recruited William Haddon, the former head of what is now the National Highway Traffic Safety Administration, to study the safety features of automobiles.

The government first started consumer car crash tests in the late 1970s.
Until that time, automakers disputed
the notion that “safety sells.” But with
the crash data, consumers for the first
time could compare vehicle ratings
based on objective data. Increasingly,
safety features became standard-issue
selling points.
For years the institute relied on
government data or performed crash
tests on a limited basis. But the
cavernous building in Ruckersville
allowed IIHS researchers to conduct
their own tests on a vast scale in a
controlled environment. Besides side-impact tests, they perform (with
decreasing frequency) rear-end and
frontal crashes. The standard barrier
that slams into tested vehicles aims to
replicate the sort one often finds on
the road in the early 21st century;
namely, sport-utility vehicles or large
trucks. While certainly not perfect
stand-ins for real-world crashes,
IIHS tests provide objective, easily
comparable results that consumers
and others can use in making
purchasing decisions.
Today, IIHS announcements make
headlines the world over. On Aug. 15,
for example, came side-impact results
for luxury sedans. Acura and Volvo
were among the manufacturers
claiming the coveted highest ratings,
while BMW came out as the worst
performer. A BMW spokesman
explained to the Associated Press that
test results can vary based on a number
of factors: “This was one test on one
day on one car.”
The side-impact test is IIHS’
biggest. The institute says that side-impact crashes are the most common
type of fatal crash in the nation, killing
about 9,000 people each year. With its
$14 million annual budget, IIHS can
afford about 70 side-impact crashes a
year. Its expenses include buying vehicles right off dealer lots. (Though auto
firms might be quite happy to provide
cars for free, IIHS seeks to ensure that
the cars it tests are identical to the cars
consumers actually buy.) At the VRC,
teams of engineers must be paid,
dummies built and refurbished (a fully

instrumented dummy costs about
$125,000), and the antiseptically clean
building itself maintained.

Payoff
Insurers pay prorated amounts to keep
IIHS running. Membership accounts
for about 70 percent of the private
passenger insurance market. The IIHS
accomplishes a number of goals for
insurance companies. Among them is
positive PR from nonprofit efforts to
reduce traffic fatalities. Another is the
pecuniary benefit of all the data captured by the VRC as well as those
collected by the institute’s sister
organization, the Highway Loss Data
Institute (HLDI). Vehicles with side
airbags, better stability control, or
less susceptibility to crushed bumpers
may get discounts when premiums
are considered.
The HLDI is a huge trove of
valuable information for insurance
companies. Basically, participating
firms furnish their own loss information, which is then processed and
mined by the HLDI, which in turn
makes much of what it finds public, such
as loss rates by vehicle make and
model. Insurance firms can use some
of the same data to precisely price
their premiums.
The weekly, sometimes twice-weekly, side-impact crash is a veritable
spectator event. Usually on hand are
representatives of various insurance
agency claims departments. On the
day of the Ford Explorer crash, a group
of State Farm adjusters joined engineering students from the nearby
University of Virginia. A viewing deck
overlooks the crash spot, where a
fresh-off-the-lot Explorer has been
wheeled into place.
The crash aims to replicate one
of the most common accidents: a
relatively slow-moving vehicle gliding
through an intersection getting hit on
the side by a faster-moving car. In the
hour before the test, engineers make
sure all the sensors are working and
the vehicle is properly prepped. The
Explorer's original fluids have been
drained and replaced with something nonflammable. Its sides are strapped in tape. The
two small-stature female dummies — a
driver and a passenger directly behind
— have different colors of paint
applied to different parts of their
bodies. That way, it’s easier after the
crash to see where their bodies came
into contact with the vehicle. (Females
aren’t always tested — the IIHS stable
of dummies includes men, women,
and children of various sizes. But
females are used most often because
their injuries tend to be the worst in
side-impact crashes, the IIHS says.)
With four minutes to the crash,
everybody clears the floor. The stage
area is lit by 750,000 watts of lightbulbs. A bay door rises. Two football
fields away, a sled sits. The countdown
begins, and then the sled begins its
short trip, being pulled along on a belt.
It sounds like a small aircraft about to
take off. It reaches 31.1 mph just before
impact, but watching live it seems
much faster. Then the crash.
Cameras of both the still and
motion variety capture every angle.
Images of the crash immediately
begin to replay in a loop on TV
monitors posted about the hall.
The sled hit just where it was supposed
to. The dummies are still in their seats,
a bit slumped. Paint is visible on
airbags where the dummy heads were
slapped. Damage to the vehicle will be
assessed later. (In a nutshell, it’s
totaled.) But information about the
extent of the dummies’ injuries is
quickly forthcoming: The rear passenger came out virtually unscathed, with
good protection for her head and
neck, torso, and pelvis and legs.
The driver was also in good shape
overall, though the pelvis/leg measure
earned a “marginal” rating because of
the indication that “a fracture of the
pelvis would be possible in a crash of
this severity.”

Aftermath
The results were not exactly surprising
to Ford, a company that has earned
more Top Safety Pick designations in
the past year than any other automaker.
Ford spokesman Dan Jarvis points out
that the company’s own tests include
continued on page 43

INTERVIEW
Susan Athey
Every two years, the American Economic Association awards the John Bates Clark Medal to "that American economist under the age of 40 who is adjudged to have made the most significant contribution to economic thought and knowledge." Susan Athey of Harvard University was awarded the Medal in 2007. Past winners include a host of economists who have gone on to greatly influence the profession, including Paul Samuelson, Milton Friedman, Kenneth Arrow, Robert Solow, and Gary Becker; more recent recipients include Paul Krugman, Kevin M. Murphy, and Andrei Shleifer.

Athey's research is hard to sum up in a few words. She is perhaps best known for her methodological work. But as she describes in the interview, many of her methodological contributions stem from looking at applied problems, finding the existing tools unable to answer those questions, and then developing new methods to solve them.

Her applied work has touched many fields, from the economics of organizations, where she has looked at how firms might improve their mentoring systems for talented young employees, to auction design, where she has examined how the government could more efficiently run procurement auctions and auctions for natural resources such as timber. She also has helped us better understand the conditions under which collusion among firms might be expected and the possible welfare effects of such cartelization. And, of interest to monetary economists, she has considered why it is often desirable to limit the discretion of the central bank so that price stability can be achieved.

Athey has long ties to the Fifth District, having grown up in Maryland and then attending Duke University as an undergraduate. Aaron Steelman interviewed Athey at her office on the Harvard campus on Oct. 9, 2007.

RF: You have worked across several fields using many different approaches to answer important questions. Can you explain how your basic and applied work fit together or complement each other?

Athey: What I find most exciting about economics is the fact that real policy issues and problems always can point the way to interesting research questions. But I also tend to be an abstract thinker and I like to understand the limits of an answer — and how particular or general that answer is, depending on different circumstances. That tends to take me from a situation where I am, on the one hand, immersed in a policy problem and trying to understand the answer, to where another part of my brain is trying to find the abstractions which that problem fits into — for instance, what other problems might be like this one. So while working on the policy paper I might have learned something along the way that is more broadly applicable and that might bring me to write a methodological paper subsequently. I haven't tended to take a tool and apply it to lots of different applications. I tend to have an application and then develop the tool. To me, it's a natural process of trying to understand a problem, recognizing the shortcomings of the existing methods, and then developing new tools to better answer similar problems.
RF: Can you give an example of the interplay between your methodological work and your work on policy problems?

Athey: Probably the best example comes from a case where I started working on a very applied problem — collusion in auctions. To get at that problem, I developed tools for analyzing ongoing relationships in dynamic models with private information. That methodological work led me to connect with macroeconomists who were interested in the issue of discretion in monetary policy. I knew nothing about that issue from an applied perspective, but I did understand a lot about providing incentives to privately informed agents. So that was an example where I got to learn about a new applied problem but my contribution was more on the methodological side. So, ultimately, it came full circle — from one applied problem to another. And that's a bit unusual for me. But it can work well, because if you have different conceptual insights, you might attack a long-standing problem in a different way. Plus, in this case, I got a great chance to learn a little bit about macroeconomics.

RF: How did you become interested in the topic of mentoring from a research perspective?

Athey: The question of how mentoring affects diversity in organizations was the first problem that I posed independently as a scholar. I started on it in my second year of grad school. The work was motivated by a simple observation. A lot of male graduate students played in regular basketball games with male faculty members. But women and nonathletic males were not particularly welcome. It turned out that a pretty high share of the students who played in these games got plum research assistant positions over the summer. So I started thinking about why that was happening and what the impact was on eventual outcomes for students, schools, and the profession. I also thought that a lot of things I was seeing weren't really entering the debate about affirmative action and why firms might want to actively manage the process of diversity.

I developed a model that included the idea that people might have more effective mentoring relationships with people of the same type. The model had competing forces. On the one hand, if people are more efficient at mentoring people of the same type, then there could be some benefit to having a homogenous organization. On the other hand, talent is scarce and so it could be that your star student or your star young employee is of an opposite type, and if that is the case, you might lose out on that talent. It also seemed that there were probably diminishing returns to having a huge majority of one type. For instance, even if men were more effective at mentoring men, the last man you add to your faculty might not add that much value to mentoring the existing men.

So we looked at these trade-offs and at how a myopic organization might fare, as well as how a farsighted organization might evolve. We derived conditions under which there might be multiple steady states for a profit-maximizing organization. If it started out relatively homogeneous, the firm might find it profitable to discriminate against the minority because they will have a hard time succeeding. But if they happen to find someone of the minority type who is so talented and such a good fit that they do succeed, then that might make it worthwhile to hire more employees of the minority type and move toward a diverse steady state. At that point, the organization might implement a voluntary and profit-maximizing affirmative action program as an investment in the ability to mentor future minorities. One of the key assumptions in such a model is that there is a scarcity of talent for people who match an organization's needs. To find that talent, firms might have to look for people who by some characteristics do not tend to fit the profile of their existing workers. Initially, that can cause some problems but ultimately be beneficial to the firm. So you might take some short-term hit in profits but over the long run it can be a good investment.

This goes beyond my model, but I think it's important to note that social conventions are often arbitrary. For instance, a Southern law firm might have a hunting trip for its annual retreat. But young associates, and perhaps especially young female associates, might have no interest in hunting. So if they changed the retreat to something that was more gender-neutral, in a couple of years, only a few of the long-standing partners might care and you would appeal to a broader pool of talent. So that's outside of my model, but my model does have these trade-offs in diversity, where you are not as effective at mentoring majorities of either type when you are diverse. In the long run, though, my belief is that people get better at mentoring those from another type as social norms change and they get a little experience doing it.

RF: How did you get interested in auction design?

Athey: When I was heading off to college I needed a summer job, so I worked as a receptionist for a company that sold computers to the government at auction. My family also
sells timber and cattle at auction. So I had some exposure already, but it was while working at that summer job that I recognized that the way the government ran its procurement auctions led to some inefficient behavior. One of my friends introduced me to Bob Marshall, a professor at Duke who was working on defense procurement. I shared with him what I had observed, because while I knew that the procurement process could be improved, I did not know how to put this issue into formal models or how to conceptualize what was happening. I wrote a paper about the topic that gathered a lot of the institutional information and with his guidance put it into an economic framework.

I was fascinated by observing Bob's work on theory models that seemed to hit the nail on the head: They were right, insightful, and I learned something that I hadn't known before. As a result of this research, he was asked to testify before Congress about changes in the procurement system. A lot had happened in the few years since I took that summer job as a receptionist. Senators were listening to the suggestions we had to reform the process and that was very gratifying. The tools of economics allowed us to develop a formal analysis of the issue. That was what really got me interested in auctions. The theme that emerged from this case runs through a lot of my applied work. In the end, yes, the auction rules are important but you also have to get the broader context correct.

Susan Athey
➤ Present Position
Professor of Economics, Harvard University
➤ Previous Faculty Appointments
Massachusetts Institute of Technology (1995-2001) and Stanford University (2001-2006)
➤ Education
B.A., Duke University (1991); Ph.D., Stanford University (1995)
➤ Selected Publications
Author or co-author of papers in such journals as the American Economic Review, Quarterly Journal of Economics, Journal of Political Economy, and Econometrica
➤ Awards and Offices
Winner, John Bates Clark Medal, 2007; Fellow, Econometric Society; Co-Editor, American Economic Journal: Microeconomics

RF: What were some of the flaws in the bidding process that you observed?

Athey: With auctions, the problems are often not just in the design of the auction itself. You have to design a market, and there are a whole set of rules in a market — for instance, who can participate, what gets sold, and how it is divided to be sold. So the design decisions of a market are much broader than the auction itself. In this particular context, there was no problem with the auction; there was a problem with the regulatory environment. The government had created a very streamlined process for protesting a procurement. If a bidder thought that a procurement had been misallocated — perhaps a procurement official had been biased or there was some error in the process — the costs to appeal were very low and the procurement would immediately be delayed for 45 days while a board reviewed the protest. This seemed like a good idea, but what they hadn't taken into account was that many of the smaller procurements had very short delivery dates, and you had to immediately start delivering on the procurement when it was awarded. So a small business might have brought in a couple of million dollars worth of inventory, and then 20 days into the procurement, the award would be protested, at which point everything would be frozen with the company sitting on this relatively large amount of inventory for 45 days with an uncertain resolution to the protest.

This could potentially pose some serious problems for the company with the winning bid, which everyone knew. So the protesting bidder would often approach the awardee and ask for a settlement. This type of side payment was encouraged by procurement officials because they just wanted their computers and from their perspective, the faster a protest was resolved, the better. A few companies came into existence that were not legitimate — they saw how the protest system was handled and made money just by asking for bribes, in effect, from legitimate companies that had been awarded procurement contracts. These protesting companies could have never fulfilled the contracts themselves.

So it was a very inefficient system where companies were regularly being held up and pressured into side payments. We saw that we could develop a model which could capture what was going on and guide policies for improving incentives while preserving the original intention of the protest system.

RF: Can you discuss your work on timber auctions? What did the U.S. Forest Service do incorrectly that the Canadian government seemed to improve upon?

Athey: My papers are not directly about that second question, but I think they can help shed some light on it. The U.S. Forest Service doesn't raise revenue, generally. That's a problem. But that's not a problem of auction design. It's a problem of market design and incentives facing the agency. Because the Forest Service has not been run with the goal of revenue maximization, lots of tracts get sold that do not generate much revenue for the government. In many cases, the government would reimburse the firms for road construction and essentially the value of the timber was not much more than the cost of building the roads. There also have been a lot of issues of regulatory capture.

In Canada, timber is such an important natural resource that the government cannot afford to essentially subsidize the timber industry in this way. The government needs the
revenue and there is significant public interest in the
program, so it does operate a revenue-generating enterprise.
The Canadian problem is that the government owns a very
large fraction of the resource. So they have worked hard to
design a system that could deliver the best possible
incentives for efficient behavior, such as getting the right
trees cut at the right time, getting the right timber replanted,
and getting the right mills built, as well as bringing in
revenue for the government.
To illustrate the issues that have to be solved regarding
market design, nobody is going to build a mill if they
don’t have some idea of future supply. So the Canadian
government engaged in various forms of long-term
contracting, which is a very sensible thing to do. But once
you have the mills built, you have to find a way to price the
timber that is going to those mills. Historically, they used
various forms of administered prices. The United States
complained about that. So British Columbia introduced a
system where they used auctions to create spot markets
for timber, and the prices on that spot market were
used to calibrate prices for timber harvested under
long-term contracts.
RF: In which industries — or types of industries — is
collusion most common? And how can policymakers
respond to such noncompetitive behavior to improve
the functioning of those markets?
Athey: Collusion often occurs in markets where you tend to
have homogeneous products, fairly inelastic demand, and
high fixed costs and low marginal costs. Examples include the
lysine and vitamin industries. There is a small number of
firms that have made big investments in plants. They need a
markup to survive and they are continually bidding on
business from big customers.
There have been some firms that have been in a number
of markets where collusion might be desirable and they got
very good at colluding. For instance, Archer Daniels Midland
(ADM) was in both the lysine and vitamin markets and they
helped to organize fairly effective cartels. In those kinds of
environments, you expect strong pressure for those firms to
find some way to soften up their price competition because
the underlying conditions of the marketplace are so severe.
It is common in procurement to have a fairly small
number of firms consistently bidding against one another.
So we have seen it in school milk and road construction.
And some things that the government does can actually
make it easier for firms to collude. In order to maintain
transparency, the government tends to reveal a lot of
information about procurement and also tends to break
things up into smaller procurements, creating lots of
auctions. That creates the conditions where firms can more
easily arrive at tacit collusion.
The auction design can make a difference. For instance,
it’s much easier to collude in an open-bid auction than in a
sealed-bid auction. That’s something my empirical research

confirms. In my work, open auctions do not yield as much
revenue as you would expect, and that is consistent with
the theory that collusion is easier in that environment.
It’s certainly possible to collude in sealed-bid auctions.
But it’s especially easy to collude in open auctions, because
there really isn’t much gain from deviating today. To see why,
imagine that a bunch of bidders have all agreed to bid low in
an auction and then you show up and you deviate. As soon as
you start bidding above the agreed price, your competitors
can respond. They can outbid you. In a sealed-bid auction,
however, a firm can deviate and their competitors cannot
immediately respond. They can only respond in the future.
In an open auction, if you are not the most efficient firm,
you cannot gain at all by deviating to win the auction. If you
are the most efficient firm but you were not designated by
the cartel to win, then you can gain in the present day
by deviating. But you might not gain that much, because
your opponents can bid you up. At best, you can gain
the competitive profit today while in a sealed-bid auction
you can gain the collusive profit today.
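As a rough, purely illustrative sketch of the payoff logic above (the bidder values, the agreed price, and the function names are invented for this example and are not drawn from Athey’s research), the difference between the two formats shows up in how much a cheater can earn before rivals are able to react:

```python
# Illustrative sketch only: toy numbers and hypothetical names.
# A bidding ring designates one member to win at a low agreed price. Compare
# the one-shot gain a non-designated bidder gets from cheating in a sealed-bid
# auction (rivals cannot respond until future auctions) with an open ascending
# auction (rivals can bid back immediately).

def sealed_bid_deviation_gain(deviator_value, agreed_price, tick=1):
    """Cheater secretly bids just above the agreed price, wins at roughly the
    collusive price, and pockets close to the full collusive profit today."""
    return max(0, deviator_value - (agreed_price + tick))

def open_auction_deviation_gain(deviator_value, rival_values):
    """Rivals observe the deviation and bid up to their own values, so the
    cheater wins only if it is the most efficient bidder, and then earns only
    the competitive profit (its value minus the best rival's value)."""
    return max(0, deviator_value - max(rival_values))

values = {"designated_winner": 100, "deviator": 90}
agreed_price = 10  # the low price the ring settled on

print(sealed_bid_deviation_gain(values["deviator"], agreed_price))                     # 79
print(open_auction_deviation_gain(values["deviator"], [values["designated_winner"]]))  # 0
```

With these toy numbers the cheater pockets nearly the full collusive margin in the sealed-bid format but nothing in the open format, which is the sense in which open bidding lets the ring punish deviations on the spot.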
RF: In your opinion, how effective is antitrust policy
in preventing collusion?
Athey: Typically, tacit collusion, where firms do not
make formal agreements, tends not to get prosecuted. The
prosecutions that take place typically occur because firms
have gotten together and done something explicitly illegal —
like fixed a bid or met in a smoke-filled room and exchanged
side payments. My research addresses the following
questions: If that’s the main way firms get caught, why do
they take that risk? Why can’t they do pretty well with tacit
collusion? My research suggests that bribes and communication can be helpful for firms in achieving the most efficient
cartel. So, in principle, if they are very patient and sophisticated, they may be able to arrive at a scheme of tacit collusion
that does allocate efficiently. But if firms are less patient, they
may not get there. Bribes can help them settle up today to
compensate those who give up market share. So if one firm is
more efficient than the others or has extra inventory, it can
pay the other firms to hold back production. If you do not
have transfers to do that, you just have to make some
promise that in the future you will take a turn and let the
other firms produce. But that’s a long way off, it’s not clear
that people will follow up on the promise, and things become
murky without the side payments.
Tacit collusion also becomes easier when there are many
rounds of bidding. If you give firms a lot of opportunity to
interact and if any particular action they might take does not
have a huge impact on final outcomes, then firms are able
to communicate through the marketplace and don’t necessarily need to get together to talk. For example, in Federal
Communications Commission auctions, Firm B may bid
against Firm A in some city that Firm B does not have a
natural interest in to signal to Firm A to stay out of those
areas that Firm B considers to be its core markets. If it’s early


in the process, those prices are not going to be the final
prices. So the firms are able to communicate in the early
stages of the price discovery process and divide up the
markets to decide how the licenses are allocated. Firms can
use other techniques, such as putting signals in the trailing
digits of their bids. Instead of bidding a round number, they
would use patterns of numbers to communicate with each
other. But if you have less frequent, larger auctions where
there are not a lot of opportunities to communicate through
action, firms tend to need to
get together and explicitly communicate to arrive at a similar
arrangement.
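Purely as an illustration of the trailing-digit idea (the dollar figures and the three-digit convention below are hypothetical, not the FCC’s actual rules or any documented scheme), the mechanics amount to appending a market number to an otherwise round bid:

```python
# Illustrative sketch: a round bid with a market code appended as its last
# three digits. The amounts and the encoding convention are hypothetical.

def encode_bid(round_bid_dollars, market_code):
    """Replace the trailing three digits of a round bid with a market code."""
    assert 0 <= market_code < 1000
    return (round_bid_dollars // 1000) * 1000 + market_code

def decode_bid(bid_dollars):
    """A rival reads the last three digits as the signaled market number."""
    return bid_dollars % 1000

bid = encode_bid(2_450_000, market_code=378)
print(bid)              # 2450378
print(decode_bid(bid))  # 378
```

The appended digits cost the sender only a few hundred dollars relative to a round-number bid, yet any rival scanning the publicly revealed bids can read them.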
RF: I would like to return to
your research on discretion
in monetary policy. Can you discuss your work on inflation
targeting — about the possible
virtues of and problems with
limiting central bank autonomy?


Athey: You might ask: Why does the central bank need
discretion at all? Why can’t we make rules that depend on
publicly available information? You can think of different
motivations for having central bank autonomy. A leading
motivation must be that you believe the central bank understands something that is difficult to quantify or write down as
a function of public observables. It’s not that the central bank
has access to better raw information, but perhaps there is a
lot of subjectivity in evaluating publicly available data and
because of that, reasonable experts would arrive at different
conclusions based on the same data. If the central bank has
some expertise in analyzing those data — and if it has access
to some nonpublic data, which it does — then there can be an
argument for discretion. The problem is they also have a
classic time inconsistency problem. There can be a benefit to
a surprise inflation. So the question becomes, how do you
provide incentives in a world where the agency you are trying
to incentivize has a social objective at heart, but they have
private information and a time inconsistency problem?
The fundamental economic insight is that in an environment like that, where the mechanisms you have for
providing incentives have social costs, it is often not worth
the cost to provide incentives. If the central bank decides it
is optimal to increase inflation a little bit today, inflation
expectations may go up in the future. How do you weigh
the future costs with today’s benefits? The answer is not
self-evident. In fact, it depends on the nature and distribution of the private information. But for a wide set of
circumstances, it is not worth it to try to provide incentives.
It is desirable, much more often than you might expect, to
simply establish an inflation cap and limit autonomy.
The reasons for that are fairly subtle. But that same kind of
idea has also arisen in my work on collusion.
In some circumstances, firms collude best by just setting a

fixed price and sharing the market evenly rather than attempting to divide up the market in an efficient way.
You need pretty efficient instruments for providing incentives
to make it worthwhile to provide those incentives. When
resolving the trade-off between suboptimal decisions and
inefficient instruments for incentives, you have to account for
the indirect effects of the decision policy, because you will
have to distort what happens in some states of the world
to preserve incentives to make the best decisions in other
states of the world. Those indirect
spillovers wind up pushing you
toward less efficient decisions.
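The flavor of that trade-off can be shown with a deliberately crude numerical sketch. This is not the model in Athey’s work on the subject: the uniform distribution of the bank’s private information, the quadratic loss, the bias of 0.5, and the cap of 1.5 are all assumptions invented for the example, and the dynamics of inflation expectations are ignored entirely.

```python
# Deliberately crude sketch, not Athey's model; all numbers are assumptions.
# The bank privately observes the socially optimal inflation rate s but is
# tempted to add a bias. Compare average squared losses under a rigid rule,
# full discretion, and discretion limited by an inflation cap.
import random

random.seed(0)
BIAS = 0.5
CAP = 1.5
states = [random.uniform(0.0, 2.0) for _ in range(100_000)]  # private optima

def avg_loss(policy):
    """Mean squared gap between chosen inflation and the private optimum."""
    return sum((policy(s) - s) ** 2 for s in states) / len(states)

mean_state = sum(states) / len(states)
print("rigid rule:", round(avg_loss(lambda s: mean_state), 3))          # about 0.33
print("discretion:", round(avg_loss(lambda s: s + BIAS), 3))            # 0.25
print("capped:    ", round(avg_loss(lambda s: min(s + BIAS, CAP)), 3))  # about 0.17
```

In the toy numbers the cap preserves most of the value of the bank’s private information while truncating the bias where it would do the most damage; the subtler part of the real argument, which this sketch omits, is showing that a cap remains desirable once expectations and incentive constraints are modeled properly.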
RF: What would you consider
your most important contribution to econometrics or methodology more generally?

Athey: I would not say that my
most important methodological
contribution is in econometrics. I
think that I, among other people,
have influenced applied practice in industrial organization
and the analysis of auction data by paying a lot of attention to
non-parametric identification. I have been able to push the ball forward in delineating in which kinds of auction environments you could possibly learn the primitives of models and in which kinds of environments that is just not possible. I
think that is an important set of facts to know when you go
to start a project.
I also have emphasized specification testing to provide
more systematic ways to justify assumptions that you make.
Rather than just marching forward with a set of assumptions
for a structural model, I have emphasized ways to test those
assumptions and have more confidence in your work. I hope
that I have focused more attention at the beginning steps of
a project, when you are conceptualizing which question you
can ask and what assumptions you should make.
Even assuming that you have a very large and good data set,
there is a lot of value in determining early on whether you
can answer your question with a minimum of extraneous
simplifying assumptions. Could I answer the question just
using the assumptions that I believe to be good approximations for reality or that are testable, rather than relying on
assumptions of functional form or unrealistic assumptions
about the environment? I hope that by doing that early
work, people will abandon projects to which the answer is no
or focus their attention on what additional piece of data
would turn the answer from no to yes. For example, if you
want to do structural work on common-value auctions, you
are going to need some data beyond bidding data, such as
information about the underlying value of the object
obtained from observations after the auction ends (e.g. how
much oil was extracted from an oil lease). So before you even
begin a project, you should find that kind of data, otherwise
the project will not be fruitful.

BookFall08Final

1/28/08

2:41 PM

Page 35

RF: I read on your Web site a short article that you
wrote for middle-school students about applying math
to real-world problems. How do you think economists
can help students become more interested in economics
and not necessarily scared off by the sometimes very
technical nature of the discipline?
Athey: I think a big issue is finding the problems that
will engage students and showing them that economics can
provide real insights. One thing that has made it easier
for me to engage undergraduate students is eBay. It is
still a relatively new company; someone not much older than
the students founded it; they can see how it allows
them to buy something they otherwise might not be able to
get; and they are forced to think a little bit about bidding
strategy and market design when they interact with the
system. It allows them to think about which kind of
economic institutions you might like and which might be
more appropriate for certain goods. There are many things
on eBay that might initially seem puzzling but that conform
quite well to economic theory. So through this example you
can get students engaged and improve their understanding of
something they have already encountered and puzzled over.
That is quite powerful.
I think another example is the economics of social
networking sites like facebook.com and myspace.com.
These are also institutions they interact with, yet the design
decisions are evolving and the dominant market structure
has not yet been determined. They can see how market
design matters.
There are other broad topical areas that can get students
engaged, such as the economics of sports or the economics
of the entertainment industry. Finding the applications that
resonate with the students or the population in general and
then showing them how a little bit of structured thinking
can substantially improve their understanding — I think
that’s where you get the power of economics. I’m still
amazed at how, in the business world, having a coherent
and structured way of approaching problems can allow
someone like me to walk into an industry meeting and talk
to brilliant people managing large companies
and still have unique insights for them. That’s because I have
these really powerful tools at my disposal. Economics allows
you to think several layers deeper. Without that structure,
you just get lost in a muddle.
RF: You are the co-editor of the American Economic
Journal: Microeconomics, one of four new journals
launched by the American Economic Association. What
niche do you aim to fill that is not currently served by the
many and varied academic journals already in existence?
Athey: There are a lot of journals, but there are not a lot of
really good journals. Most of them are fairly secure in their
position. So there is not a lot of competition on service. An
enormous amount of time is wasted with slow refereeing

processes and revisions that may improve the paper but are
not worth the time required to make them. So a big goal for
me is to have an outlet for the kind of work that I like, where
people can get good service in a general-interest outlet.
A secondary issue is that for more technical work there are
not that many options from a general-interest perspective.
Your papers fall to the field journals very quickly. Basically,
what I want is a journal that gets the cost-benefit analysis on
revisions right, that turns around papers fast, and that
reaches a broad audience with technically rigorous work.
RF: How has winning the John Bates Clark Medal
affected your life, both personally and professionally?
Athey: Receiving an honor like the Clark Medal puts me in
the position of being an ambassador for economics to the
general public. Given how passionate I am about economics,
I view that as an exciting opportunity. Also, when you win
the Clark Medal, you get a lot of media attention — and
with that, a lot of correspondence from people you may
know only slightly or not at all. As the first female winner,
I received hundreds of e-mails from women in other
male-dominated professions. These people felt compelled to
tell their own stories and it made me realize the power of
being a role model. Whether you like it or not, graduate
students look ahead at the people who are leading the
profession, and it appears that a substantial number of them
are looking at me. That’s not something that I
chose — or even can control — but it has happened, and it
has been gratifying to know that I may have inspired more
women to jump into mathematically oriented professions
such as economics.
RF


ECONOMIC HISTORY
The Great Southern Migration
Throughout much
of the 20th century,
people streamed
out of the South,
rearranging the
social, political, and
economic landscape

The Great Migration brought families
like this to Chicago and other industrial
economic magnets in the Midwest and
Northeast. For blacks, the migration
promised not only job opportunities but
also escape from the segregated South.

James Macbeth moved to New
York from Charleston, S.C., in the
boom years of the Great
Migration. It was the 1950s, a decade
when some 1.1 million blacks left the
South. His father had departed many
years before, too many for him to
remember just which year it was. The
elder Macbeth worked for the postal
service in New York City. By the time
Macbeth was ready for college, he
moved to Pennsylvania and his mother
later joined his father in New York.
The elder Macbeths also worked at the
Carolina Chapel of Mickey Funeral
Service in Harlem, founded in 1932, far
from its original Charleston, S.C.,
home base. Macbeth works there now.
Macbeth is but one of 8 million
black and 20 million white
Southerners who streamed to cities in
the North or West, with the heaviest
flows between about 1915 to 1970.
Blacks migrated in higher percentages
than whites, and so this “Great
Migration” redistributed the racial
population. It changed job markets,
politics, and society. And culture. For
blacks, the exodus urbanized a formerly agricultural and dispersed people,
allowing them visibility in accomplishing social goals. Effects of white
migration were less dramatic and,
in many cases, temporary, coinciding with
the wartime and postwar industrial boom.

Migration: A Sorting Mechanism
People migrate in search of better living conditions. Sometimes freedom from war and oppression supplies the necessary energy required to overcome the inertia inherent in the status quo.

Sometimes it’s a better job. Or both.
Migrations affect jobs, wages,
geography, housing, education — all
economic activity. Migrations also
reveal how workers sort themselves
into jobs in different locations.
“It’s a complex process in which
workers and employers match up, and
it’s absolutely essential in an economy
that changes rapidly over time,”
says economist William Collins
of Vanderbilt University. “In other
words, migration — the movement of
workers from place to place — is a
key part of the story of how labor
markets work.”
The Great Migration ebbed and
flowed with the world wars. The first
period dated from about 1915 to 1930
— World War I and after — and
slowed with the Depression.
Migration picked up again as military
production — steel and aluminum
plants, shipyards, aircraft plants, and
military installations — for World War
II created jobs in the Great Lakes
corridor from New York to Chicago
as well as on both coasts. People kept
moving even after the war, as the
economy grew.
While the migration north and
west from Southern states began in
earnest in the century’s first decade,
more than 40 years before Macbeth’s
personal odyssey, the exodus was
growing even stronger at the time of
his departure.
Macbeth, like most black migrants, laughs when he says his father
headed north because “everybody said
the streets were paved with gold.” But
the laughter subsides when he talks
about segregation, the “Jim Crow”
laws that prevented blacks from voting
and more.
In all former Confederate states,
less than 5 percent of eligible blacks
were registered to vote as late as 1940,
according to historian David Kennedy.

PHOTOGRAPHY: CHICAGO DEFENDER, SEPT. 4, 1920

BY BETTY JOYCE NASH

(Women, black and white, did not
receive voting rights until 1920.)
By 1900, Southern states had instituted
racial separation: drinking fountains,
schools, waiting rooms. Few industrial
jobs existed in the South, and Jim Crow
affected those too. For instance, in 1915,
South Carolina required segregated
workrooms in textile mills. Infant mortality rates for blacks were nearly
double those for whites in 1930 (10 percent and 6 percent, respectively).
Blacks could expect to live 15 fewer
years than whites, 45 compared with 60.
Moving destinations varied. Southerners aimed for meccas like
Chicago or Detroit if they were from
Mississippi or Alabama. But the goal
was New York, Philadelphia, or
Boston if they hailed from the
Carolinas and elsewhere along the
Eastern Seaboard. Historian Spencer
Crew, who has studied the migration,
says that blacks in the early years followed whatever rail routes crossed
their towns. Trains pulled into
Southern stations filled with goods
and pulled out filled with the people
who could afford to go.
Economists have been curious
about why blacks waited some 50 years
after the Civil War to exit the South in
significant numbers. By the early
1900s, only a couple hundred thousand
blacks (and about 716,000 whites) were
leaving. The Great Migration peaked
in the 1970s when some 1.5 million
blacks and 2.6 million whites left
the South.
Theories have pointed to European
immigration as a “deterrent” to black
migration, especially in those early
years. Data show that blacks “moved
at times and to places where foreign-born immigrants were less prevalent …
the Great Migration would have gotten under way earlier than it did if
strict immigration controls had been
adopted earlier,” Collins wrote in a
paper on the subject. As World War I
stifled that European flow, it simultaneously created demand for workers to
fill industrial jobs previously available
only to whites.
[CHART: Regional Distribution of Black Population, 1900–2000 — percent of the black population living in the South, West, Midwest, and Northeast, by decade. SOURCE: U.S. Census Bureau, Demographic Trends in the 20th Century, November 2002]

While blacks were not hired into skilled jobs in the Northern industries until mid-century, they did find lower-level jobs, according to Crew, who now
directs the Underground Railroad
Museum in Cincinnati. “In the North,
because of the war, there was a real
shortage of labor, and as a consequence, opportunities for African
Americans opened up, mostly in the
iron mills and slaughter houses.” Crew
notes that the better-paid, higher-skilled jobs were not available to blacks
until the post-World War II years —
and even then, they were hard to get.
In 1920, for instance, 70 percent of
Southern black men worked in
unskilled or service jobs compared to
22 percent of Southern white men.
By 1970, according to historian James
Gregory, that number had fallen to 35
percent for Southern-born black men
and a barely changed 24 percent for
Southern-born white men.
The agricultural depression of the
1920s, sparked by wartime overproduction and rock-bottom crop prices,
accelerated migration even further
during that decade. The cotton for
which the Southern states were
famous was devastated by the boll
weevil. In 1920, South Carolina farmers produced 1.6 million bales, the
biggest in the state’s history, but two
years later they counted 493,000, the
fewest since the Civil War. Add to that
an agricultural deflation in which
peanut prices fell from $240 to $40
per ton in one season, corn from $1.50
to 50 cents.
That and mechanization forced
many white and black agricultural
laborers off Southern fields for good.

“The 1922 harvest season was followed
by the largest wave of migration in the
history of black Carolina,” according
to Black Carolinians: A History of Blacks
in South Carolina from 1895 to 1968
by I.A. Newby. Some 59,000 blacks
left rural areas of 41 South Carolina
counties between November 1922 and
June 1923.

Migrant Characteristics
Blacks who migrated tended to be
more educated than those who stayed,
while the reverse was true of whites,
according to Duke University economist Jacob Vigdor. He has studied
changes in migration patterns and
migrant characteristics. Before World
War II, educated blacks were more
likely to migrate north because they
could better afford it. (Families who
could afford the opportunity costs of
sending their children to school,
he notes, could more likely pay for
a move.)
Plus, they valued the educational
opportunities they heard about up
North. It’s not that the North always
turned out to be a “promised land” for
blacks, Crew says. But there was hope,
the brightest of which was better education. “People [were] bringing their
kids with them in the hopes they
[would] have a better future,” he says.
Vigdor reports median years of schooling completed among black migrants
from most Southern states as eight or
nine in 1940 among those born from
1913 to 1922.
Early migrants were, on average,
younger as well as better educated


than non-migrants. They could read
newspapers, letters, or flyers that
described the migration. “In each age
cohort, highly educated blacks living
outside their state of birth were more
likely to reside in the North than in
the South,” Vigdor writes. “In the oldest cohort, highly educated black
interstate migrants were 35 percent
more likely to reside in the North.” In
1940, educated blacks were likely to
choose a Northern destination, but
that trend began to change in 1970,
with more educated blacks turning
back to the South.
These patterns might have implications for human capital and economic
outcomes of later generations of
native-born blacks. Economics literature, Vigdor notes, links outcomes
with characteristics of fellow ethnic or
racial group members, especially in
segregated environments.
As decades passed, life for black
youth who remained in the South was
still tough. An article published in
1967 in a New York biweekly newspaper, The Reporter, noted that the
poorest county in South Carolina,
Williamsburg, lost 14,636 blacks from
1950 to 1960. That was more than half
its black population. Among those who
stayed, even black students with some
college lacked opportunity. Here’s how
the author describes the situation,
based on interviews with black families:
“But when the time came for them to
find jobs, there were none. One by one,
Davis’ four sons and three daughters
packed up and left for New York.”

Chain Migration
Migrants drew on the help of friends,
relatives, and friends of friends
in the search for a new life up North.
Sometimes industrial recruiters,
desperate for labor and sometimes
strikebreakers as World War I and
immigration policy choked off the flow
of whites from other countries, trolled
Southern towns for would-be migrants,
some offering free train tickets.
Northern cities’ newest Southern
arrivals, white and black, didn’t always
find the welcome they sought, and they
tended to stick together. Some natives

derided “hillbilly” and Southern
accents. Entire blocks of Chicago
and Detroit were known as little
Appalachia. There is still a faint legacy
of “Bronzeville,” a black district just
now undergoing a renaissance of sorts,
also in Chicago.
Early on, new migrants were often
“portrayed in unflattering terms by
contemporary observers,” according to
University of Washington sociologist
Stewart Tolnay. Even sociologists like
W.E.B. Du Bois wrote, of the earliest
migrants to Philadelphia, that their
Southern backgrounds were a handicap
as they tried to adapt to life in the
Northern city. And, at first, even
Northern black newspapers such as the
Chicago Defender discouraged blacks
from settling in Northern cities.
But by 1918, the Defender was
selling 130,000 copies, three-fourths
of those outside Chicago in cities like
Richmond, Norfolk, and Savannah,
Ga., with smaller circulations in towns
dotted throughout the South. Much of
this was in response to the Southern
press, which “built into a crisis story
about potential labor shortages for
Southern agriculture,” according to
Gregory in The Southern Diaspora.
The black-owned newspapers in
the South warned whites of an
“exodus,” should whites fail to open
doors to change. Meanwhile, white
publishers pondered what to do
about the out-migration, worrying
in headlines about labor shortages.
White-owned Northern newspapers
often focused on the negatives of the
influx, Gregory wrote.
Although migrants to Northern
cities were better educated than their
Southern counterparts, their new
Northern neighbors, white and black,
described them as illiterate. “And their
growing numbers were sometimes
viewed as a potential threat to the
racial status quo that offered Northern
blacks a relatively comfortable coexistence with whites, if not actual racial
equality, ” according to Tolnay. Later
anecdotal portraits of migrants,
however, are kinder — perhaps native
Northerners had gotten used to the
new migrants. Still, race riots erupted

in Chicago, Detroit, and Harlem, while
Ku Klux Klan terrorism and lynching
marred life in the segregated South.
Black migrants tended to settle
together, and they organized themselves socially, according to Newby.
“In every city where significant
numbers of them settled, there were
Palmetto College Clubs or Palmetto
state societies, which, in purely social
matters at least, eased the transition to
urban living for many migrants.”
Until migration picked up in World
War I, there was little separation
of the races in neighborhoods. For
instance, the 5,000 blacks who lived in
Detroit in 1910 had lived among other
immigrants. But with the influx of new
migrants, blacks were channeled into
the city’s slums. Even if migrants could
afford a home, there were the tools of
zoning and restrictive covenants that
prevented them from purchasing in
certain neighborhoods until government intervened with housing laws.

Going Home
By 1970, blacks who were educated
were more likely to head for a
Southern destination than their
less-schooled counterparts, a trend
that continues. Economists and historians suggest by way of explanation
that discrimination had begun to ease
in the South, with conditions for
blacks being more hospitable as civil
rights gained ground. It’s also possible
that the Northern cities to which they
had moved had become less desirable
as industrial strength of the Great
Lakes region waned and joblessness
eroded neighborhoods.
As late as the tail end of the 1960s,
the 14 states with the largest number
of blacks leaving were all in the South.
But a decade later, migration had
leveled off, and reversed.
For whites, the entire migration
tended to be more of a “circulatory”
trend. For instance, in the late 1950s,
according to Gregory, for every 100
white Southerners who migrated
north or west, 54 returned home, and
that number increased to 78 by the
late 1960s.
“Turnover was the key dynamic of


the white diaspora,” writes Gregory.
“Fewer than half of the nearly 20 million whites who left the South actually
left for good. That means that the
white diaspora is best understood
as a circulation, not as a one-way
population transfer.”
But black return migration was only
about a third of the rate of white migration during most decades. Some did
come back even as others departed.
For instance, in 1949, some 43,000
black Southerners returned, about 1.7
percent of all Southern-born blacks
living in the North and West.
Still, in the 1970s, the return flow of
blacks to the South was evident —
more moving in than moving out.
Between 1975 and 1980, Virginia, the
Carolinas, and Maryland were among
the states gaining the most black
in-migrants, according to demographer
William Frey.
Frey analyzed migration data from

four decennial censuses. Among other
findings, the South netted black
migrants from all other U.S. regions
during the 1990s, completely reversing
the migration stream. Charlotte,
Norfolk-Virginia Beach, Raleigh-Durham, and Washington-Baltimore
were among the 10 most-preferred
destinations during that time. Atlanta,
however, was the strongest magnet.
New York, Chicago, Los Angeles, and
San Francisco lost blacks during the
same period. Also noteworthy: Blacks
were more likely than whites to pick
Southern destinations. Maryland,
North Carolina, and Virginia were
among the 10 states that gained the
most black college graduates during
the late 1990s.
Black reverse migration reflects
economic growth, improved race
relations, “and the long-standing
cultural and kinship ties it holds for
black families,” according to Frey.

James Macbeth, who is 71 and
beginning to think about retirement,
may move back to Charleston.
His parents, both dead, are buried
in South Carolina, and his siblings
have scattered throughout Southern
cities in a return migration of their own.
Over his lifetime, Macbeth witnessed the chain of events that people
like his father set in motion.
The migratory tide, once it began
going out, forced change as it
rearranged population, employment,
education, attitudes, art, music,
sports, transportation, recreation,
housing, and more. The Great
Migration was driven by more than
the opportunity to improve working
conditions — at least for blacks. James
Macbeth’s father didn’t leave
Charleston just for a good job in New
York at the post office. “He just
couldn’t get along with segregation
in the South.”
RF

READINGS
Collins, William J. “When the Tide Turned: Immigration and the
Delay of the Great Black Migration.” Journal of Economic History,
September 1997, vol. 57, no. 3, pp. 607-632.

Gregory, James N. The Southern Diaspora: How the Great Migrations
of Black and White Southerners Transformed America. Chapel Hill:
University of North Carolina Press, 2005.

Frey, William. “The New Great Migration: Black Americans’
Return to the South, 1965-2000.” Brookings Institution Center
on Urban and Metropolitan Policy. The Living Cities Census Series,
May 2004.

Vigdor, Jacob L. “The Pursuit of Opportunity: Explaining
Selective Black Migration.” Journal of Urban Economics, 2002, vol.
51, no. 3, pp. 391-417.

ARMED AGAINST ARMS • continued from page 20
causing them to lack incentive to
ensure that borrowers are “mortgage
ready.” (It should be pointed out
that lenders do carry risk even when
they sell their mortgages because
over the long term, if defaults are widespread, then they are certainly worse
off in terms of their future ability to
originate loans and sell them.) “Are we


going to expect Wall Street investors to
support homeownership counseling?”
he asks rhetorically.
Almost three years after her
purchase, Donna Turner is keeping up
with her monthly payments and
tending a small garden out back. She is
the very picture of a happy, responsible
homeowner. “I had always lived with

somebody. And after you pay your part
of the bills, they say get out,” Turner
says. “So I was determined to get to the
point where nobody could ever tell
me to get out again.”
Turner did it. Economic research
suggests that, while it won’t come close
to working for everyone, she needn’t be
the only exception.
RF

READINGS
Campbell, John Y. “Household Finance.” Journal of Finance, August
2006, vol. 61, no. 4, pp. 1,553-1,604.
Chomsisengphet, Souphala, and Anthony Pennington-Cross. “The
Evolution of the Subprime Mortgage Market.” Federal Reserve
Bank of St. Louis Review, January/February 2006, vol. 88, no. 1,
pp. 31-56.

Hartarska, Valentina, and Claudio Gonzalez-Vega. “Credit
Counseling and Mortgage Termination by Low-Income
Households.” Journal of Real Estate Finance and Economics, 2005,
vol. 30, no. 3, pp. 227-243.
Martin, Matthew. “A Literature Review on the Effectiveness of
Financial Education.” Federal Reserve Bank of Richmond Working
Paper No. 07-03, June 15, 2007.


BOOK REVIEW
Analyst of Change
PROPHET OF INNOVATION: JOSEPH SCHUMPETER
AND CREATIVE DESTRUCTION
BY THOMAS K. MCCRAW
CAMBRIDGE, MASS.: HARVARD UNIVERSITY PRESS, 2007
719 PAGES
REVIEWED BY THOMAS M. HUMPHREY

Moravian-born, Vienna-educated Professor Joseph
Alois Schumpeter, who liked to say of his
aspirations to be the world’s greatest economist,
horseman, and lover that only the second had given
him problems, was a study in contrasts. He relished his fame
as one of the interwar years’ premier economic theorists, yet
modestly declined to mention his work in his Harvard
classes or in his exhaustive book on the history of
economic thought. (Citations to his work were inserted into
that book by his wife after his death.) An obsessively hardworking, morose (indeed often depressed) writer in private,
he affected a public image of carefree, cheerful ebullience. A
notoriously easy grader to his students, he often gave himself
low marks in his diary. A one-time banker, he relied upon the
women in his life to balance his checkbook. He chronicled
the evolution of the auto industry but never learned to drive.
He admired mathematics but failed to employ it in his
work. A harsh critic of the static, steady-state equilibrium
thinking of the neoclassical marginal utility/marginal productivity school, he nevertheless declared one of its founders, the
French neoclassical equilibrium theorist Leon Walras, the
greatest economist of all time.
All of his life Schumpeter championed capitalism yet was an expert on
Marx, Marxist economics, and the
entire socialist literature. A Marxist
economist, Paul Sweezy, was among his
closest Harvard friends. He was a political conservative and antisocialist who
notwithstanding served as Finance
Minister for a socialist government in
post-World War I Austria. He lauded
capitalism’s superior performance
while predicting the system’s death
from too much success. He preached
creative destruction — the incessant
tearing down of old ways of doing
things by the new — as capitalism’s
inescapable iron law, yet he was unprepared when his own work fell prey to it.
The 1990s saw the publication of at
least three biographies of this complex,

40

R e g i o n Fo c u s • Fa l l 2 0 0 7

paradoxical figure. Now comes Thomas McCraw’s definitive
and elegantly written study to top them all. Drawing upon
Schumpeter’s diary, correspondence, early drafts, and
published works, McCraw, a Pulitzer Prize-winning
emeritus professor of Business History at Harvard, paints a
vivid picture of Schumpeter’s life and times, his loves and
achievements. Readers will choose their favorite parts of the
book. Most enlightening to this reviewer is McCraw’s survey
of Schumpeter’s scholarly contributions. Ironically, McCraw
writes that he is “not concerned with Schumpeter’s
economic thinking, narrowly construed,” but with his “life
and his compulsive drive to understand capitalism.” But that
is a false dichotomy because Schumpeter’s theories cannot
be divorced from his attempts to come to grips with
capitalism: Each guided and shaped the other. In any case,
McCraw provides a perceptive and accurate account of
Schumpeter’s academic greatest hits and misses.

Greatest Hits
Hits include first and foremost the path-breaking
and seminal The Theory of Economic Development, published
in 1911 when Schumpeter, then 28, was in what he
called his scholar’s “sacred third decade” of peak creativity.
Other hits followed including the subtle and provocative
Capitalism, Socialism and Democracy, and the mighty
History of Economic Analysis, which Schumpeter worked on
throughout the whole decade of the 1940s, and which was
edited and published by his third wife, Elizabeth, four years
after his death in 1950.
Schumpeter pushed one idea all his
life: that capitalism means growth and
growth requires innovation. The book
that put him on the map,
The Theory of Economic Development,
states for the first time his vision of
capitalism as the economic system
that delivers faster growth and higher
living standards (especially of the
middle- and lower-income classes)
than any other system, albeit in a
disruptive, jerky fashion. Like a
perpetual motion machine, capitalism
generates its own momentum internally without the need of outside
force. Even technological change,
seen by some as an exogenous propellant, is treated by Schumpeter as
a purely endogenous matter, the
product of economically motivated
human ingenuity.


Breaking from received wisdom, Schumpeter replaces the static equilibrium analysis of his neoclassical marginalist predecessors and contemporaries with a dynamic disequilibrium theory of cyclical growth. His key building blocks are profits, entrepreneurs, bank credit creation, and innovation. Profits (supplemented perhaps with a desire to create a business dynasty) motivate entrepreneurs, who, financed by bank credit, innovate new goods, new technologies, and new methods of management and organization. These innovations fuel growth and generate cycles.

Why cycles? They arise when the first successful entrepreneur overcomes the stubborn resistance of incumbent interests and eases the path for other entrepreneurs. The resulting bunching of innovations (not to be confused with mere inventions, which Schumpeter saw as occurring more or less continuously over time) boosts investment spending, which bids prices above costs and raises profit margins thereby triggering the upswing or prosperity phase of the cycle. The high profit margins then attract swarms of imitators and would-be competitors into the innovating industries. Output overexpands relative to the demand for it, prices fall to or below costs thus eliminating profit margins, and the downswing or recession phase begins. The recession continues, weeding out inefficient firms as it goes, until the economy absorbs the innovations and consolidates the attendant gains thus clearing the ground for a fresh burst of innovation.

If the upswing has been accompanied with speculative excesses nonessential to innovation, the downswing may overshoot the new post-innovation equilibrium. Then the cycle enters its depression phase where the excesses are expunged and the economy returns via a recovery phase to equilibrium. Schumpeter stressed that the latter two phases and the phenomena that generate them are unnecessary for cyclical growth and could be prevented by properly designed policy. It’s not speculative bubbles but rather the discontinuous clustering of innovations in time plus their diffusion across and assimilation into the economy that produces real cycles of prosperity and recession.

Profits, entrepreneurs, bank credit, innovation — all are essential to the growth of per-capita real income in Schumpeter’s model. Remove any one and the growth process stops. Innovation, for instance, is abortive in the absence of bank credit creation necessary to effectuate it. Cash-strapped entrepreneurs cannot build their better mouse traps from thin air. They require real resource inputs and loans of newly created bank money to hire them away from alternative employments. In highlighting this observation, Schumpeter effectively abandoned the classical dichotomy notion that loan-created money is a mere sideshow, a neutral veil that together with metallic money determines the nominal, or absolute, price level while leaving real economic variables unaffected. Not so, said Schumpeter. For him, money and credit are integral to the process of real economic growth and so have real effects.

Schumpeter’s most popular hit was his 1942 book Capitalism, Socialism and Democracy. In it he coins the term “creative destruction” to denote capitalism’s incessant killing off of the old by the new. The book contains his famous end-of-history prediction that capitalism’s very successes, not its failures and contradictions as prophesied by Karl Marx, will produce social forces — the routinization and depersonalization of innovation, the destruction of the image of the entrepreneur as romantic hero, the creation of a class of intellectuals hostile to capitalism — which undermine the system and lead to its demise.

If capitalism cannot survive, can one rely upon its successor, socialism, to deliver the goods and amenities of life efficiently and fairly? Yes, said Schumpeter, who proceeded to provide the supporting argument. Many readers took him at his word, but not McCraw. He sees Schumpeter’s “defense” of socialism as a devastating satire that mocks the system instead of bolstering it. Schumpeter, in other words, comes not to praise socialism, but to bury it. In the end, Schumpeter’s case for socialism rests on extremely abstract theoretical conditions unlikely to be realized in practice. All of which creates a problem: If Schumpeter sought to show that socialism was a practical impossibility, then why did he predict its ultimate triumph over capitalism? One wishes that the real Schumpeter would please stand up.

As for democracy, Schumpeter viewed it as a political market in which politicians compete for the votes of the electorate just as producers compete for consumers’ dollars in markets for goods and services. Always skeptical of consumer rationality, he believed that market power resides more with vote seekers than with the electorate, whose apathy, ignorance, and lack of foresight enable politicians to set the policy agenda and to manipulate voter preferences. Even so, he felt that capitalism, as long as it operates within a proper legal framework, is largely self-regulating and so requires little intervention. It thus constrains politicians’ market power more than does socialism. McCraw fails to note that these ideas mark Schumpeter as a forerunner of the modern public choice school.

The last hit in the Schumpeter canon is his History of


Economic Analysis, whose title expresses his contention that
the rise of analytic techniques in economics is part of the
economic growth process and must be studied as such.
The History, in terms of its scholarship, breadth of coverage,
richness of content, originality of interpretation, and
wealth of resurrected valuable ideas, ranks with Jacob Viner’s
1937 book Studies in the Theory of International Trade as the
finest history of thought ever written. Scholars still mine it
for ideas today. Among other things, it provides sparkling
accounts of the quantity theory, the gold standard, Say’s Law,
the development of production and utility functions,
and much more.

Greatest Misses
Apart from an unfinished book on money, Schumpeter’s
misses include his massive, two-volume Business Cycles (1939),
which he wrote entirely by himself with no research assistance. Seven years in the making, it emerged stillborn from
the press. McCraw, however, values the book for its historical narrative of the vicissitudes of firms in five industries and
three countries. But Schumpeter’s contemporaries saw only
the book’s prolixity, discursiveness, and lack of focus. Most
of all, they rejected its contrived, mechanistic analytical
schema composed of three superimposed cycles — the
50-year Kondratieffs, 9-year Juglars, and 4-year Kitchins, all
named for their discoverers — into which Schumpeter
forced his data. As if these flaws weren’t enough to sink
Business Cycles, it had the bad luck, and bad timing, to appear
when J. M. Keynes’ celebrated General Theory was sweeping
the field. Everybody talked about Keynes’ book, few
about Schumpeter’s.

Schumpeter and Keynes
Schumpeter fumed when Keynes and Keynesian economics
upstaged him in the 1930s and 1940s. Economists preferred
Keynes’ theory to Schumpeter’s because it seemed to
offer a better explanation of and remedy for the Great
Depression, and because it possessed greater policy
relevance and was more amenable to the mathematical
modeling, econometric testing, and national income
accounting techniques just beginning to come into vogue
in the ’30s.
Schumpeter should have foreseen this state of affairs.
It was consistent with his doctrine of creative destruction in
which new theories, like new goods and new technologies,
displace the old in a never-ending sequence. Here Keynes
was the innovator whose analysis of capitalism rested
on such novel concepts as the multiplier, marginal propensity
to consume, marginal efficiency of capital, and liquidity
preference function. Taken together, these Keynesian
innovations were bound, according to the creative
destruction doctrine, to supplant Schumpeter’s
old-fashioned theory.
Instead of accepting this outcome, Schumpeter reacted
exactly as he had described entrenched interests doing
when threatened by an innovation that disrupts their

accustomed status quo: He put up stubborn resistance.
His resistance, however, was motivated not so much by
simple self-interest, or desire to protect his own theory, as
by his scientific judgment that Keynesian economics was
fundamentally unsound.
Schumpeter accused Keynes of assessing capitalism on
the basis of a short-run, depression-oriented model when
only a long-run growth-oriented one would do. He scorned
Keynes’ claim that capitalistic economies tend to be
perpetually underemployed and in need of massive government deficit spending to shore them up. He attacked the
“secular stagnation” notion that capitalists face vanishing
investment opportunities and slowing rates of technological
progress when the opposite is true. He rejected the
contention that income must be redistributed from the rich
(who save too much) to the poor (who cannot afford to save)
in order to boost consumption spending and aggregate
demand. Nonsense, said Schumpeter. The insatiability of
human wants ensures that income, regardless of who
receives it, will be spent in one way or another.
McCraw does a fine job discussing Schumpeter’s
criticisms, all of which were valid, penetrating, and correct.
He fails, however, to note that Schumpeter essentially
attacked the wrong target. For it was not so much Keynes as
his British and American disciples — people like Joan
Robinson, R. F. Kahn, Abba Lerner, Schumpeter’s Harvard
colleague Alvin Hansen, and others — who were largely
responsible for the doctrines, especially their extreme
versions, that Schumpeter countered. But McCraw rightly
points out that Schumpeter slipped when he opined that the
Keynesian-style permanently mixed economy, or public
sector-private sector partnership, was unsustainable and
could not last. The private sector, Schumpeter reasoned,
would become addicted to government expenditure
stimulus and demand ever-increasing amounts. In this way,
the public sector would expand relative to the private one
and the economy would gravitate to socialism. Time has
proved Schumpeter wrong. Private and public sectors have
coexisted in a fairly stable ratio in most developed countries
for the past 60 years.

Controversial Issues
Schumpeter held politically unpopular opinions in the 1930s
when New Deal activism and populist anti-business
sentiments were on the rise. He opposed President
Roosevelt’s New Deal reforms on the grounds that they
hampered entrepreneurship and growth. For the same
reason, he opposed Keynesian macro demand-management
policies designed to tame the trade cycle. In his view,
because growth is inherently cyclical, one flattens the cycle
at the cost of eliminating growth. Other controversial
opinions, all corollaries of his work on innovation and
creative destruction, flowed from his pen.
Of income inequality he wrote that the gap between rich
and poor is a prerequisite to and a relatively harmless
byproduct of growth in a capitalistic system. The rich are


necessary since it is they and not the poor who save and
invest in the innovation-embodied capital formation that
lifts the living standards of all. Moreover, high incomes
provide both incentive and reward for the entrepreneurs
who propel growth. No one need fear that an unequal
distribution will condemn them to poverty. The Italian
economist Vilfredo Pareto’s notion of the “circulation of
the elites” assures that. The ceaseless rise and fall
of entrepreneurs into and out of the top income bracket
means that it will be occupied over time by different
people, many of them drawn from the ranks of the poor.
The poor replace the rich and the rich the poor in
never-ending sequence.
In assuming a high degree of mobility across income
groups, Schumpeter may have overlooked an education
barrier. He failed to acknowledge that a superior education,
increasingly a prerequisite to entrepreneurship and wealth
in today’s high-tech world, is more affordable by the rich,
enabling them and their offspring to stay on top.
Monopolistic firms and monopolistic profits hardly
worried Schumpeter. He thought that monopolies, unless
protected by government, are short-lived, inherently self-destroying, and require no antitrust legislation. Their high
profits attract the very rivals and producers of substitute
products that undercut them. For the same reason, he
regarded antitrust laws aimed at breaking up large,
nonmonopolistic firms as ill-advised. Not only are big
firms often more efficient than small ones, but their
research and development departments house teams of specialists functioning collectively — and routinely — as an
entrepreneur who creates innovations that drive growth.
Indeed, the very existence of R&D departments indicates
that big firms realize they must continually innovate to
stay alive.
Schumpeter’s politically unpopular opinions continued
into the wartime years of the 1940s. He distrusted
Roosevelt, suspecting him of trying to establish a
dictatorship. And he had mixed emotions about the Axis
nations, Germany and Japan. He despised their military
establishments, leaders, and advisors. But he admired the
people and cultures of the two countries and feared that the
United States would impose punitive reprisals at war’s end.
Most of all, he saw the United States’ wartime ally, the
Soviet Union, as its chief long-term foe, and thought


that it would need Germany and Japan to serve as
buffers against the communist nation. These views found
little sympathy among Schumpeter’s friends and associates
in the ultrapatriotic environment of the early 1940s, a
circumstance that caused him much unhappiness.

Schumpeter Today
The new improves upon and kills off the old. True enough.
But what’s new and what’s old may lie in the eye of the
beholder. Today’s cutting-edge theorist and mathematical
modeler may regard Schumpeter’s analysis as older than old, a
pre-Keynesian, pre-monetarist, pre-new classical/rational
expectations relic. Accordingly, Schumpeter’s name is stricken
from required reading lists in many top graduate economic
programs where theory is king. To businessmen, journalists,
and historians seeking not abstract theory but rather practical
understanding of global capitalism, however, his work is as
fresh and insightful as the day he penned it. Journalists speak of
a renaissance of Schumpeterian economics and of a reversal
of his relative ranking with Keynes. Although McCraw does
not say so, Schumpeter undoubtedly would be pleased, but
hardly surprised, by the revival of his work. It fits his
description of the zigzag path of doctrinal history in which
sound economic ideas get lost or forgotten only to be
rediscovered and restored to their proper place.

A Complaint
A great book deserves a great index, or at the very least
an adequate one. McCraw’s book has neither. Lacking
comprehensiveness and precision, the index creates problems
for readers searching for particular items in the text. It is
inexcusable that the index fails to cover the 188 pages of
endnotes containing valuable scholarly information and
constituting a fourth of the book. One can fault the publisher,
not the author, for this oversight. Luckily, it does little to mar
McCraw’s outstanding text. Elizabeth Schumpeter wrote
that her husband “loved to read biographies.” It’s a sure bet
that he would have enjoyed this one.
RF
Thomas M. Humphrey, a retired senior economist at
the Richmond Fed and long-time editor of its
Economic Quarterly, has written extensively about the
history of economic thought. He can be reached at:
moneyxvelocity@comcast.net

CRASH • continued from page 29

component-level examinations as well as simulations with
dummies and sometimes cadavers (the latter led by universities). “It’s a lot more complex when we’re doing the testing,”
Jarvis says. “We have to design for 1,001 different scenarios
and we have to design so that occupants have the best level
of protection in every one of those scenarios.” With regard
to the possible injuries to the pelvis of Ford Explorer passengers, Jarvis says that even with multiple crash tests in

consistent settings, there will be variation. Also, injuries
suffered by dummies don’t always translate to injuries suffered
by real people.
That said, Jarvis says Ford sees value in IIHS testing, as
well as that conducted by governments around the world.
“All of the public domain testing has upped the ante and
increased the debate in the level of design and safety testing,”
he says. “We certainly learn things from them.”
RF


DISTRICT ECONOMIC OVERVIEW
BY MATTHEW MARTIN

Fifth District economic activity
advanced at a moderate pace
in the second quarter as
continued declines in housing
market activity constrained growth.
In contrast, labor market conditions
remained strong as District services
firms maintained a brisk pace of
hiring and posted healthy revenue
gains. Also, District households
experienced solid income growth
during the second quarter.

Housing Markets Retreat
Overall, Fifth District housing
market activity declined further in
the second quarter. The pullback
in residential construction activity
deepened a bit, with building
permit issuance down 16.2 percent
compared to last year. Existing home
sales were lower as well. Sales in the
District fell 11.1 percent since the second quarter of 2006.
Slower home construction and
sales were accompanied by slower
home price growth during the period.
While the pace of growth lessened in
the second quarter, appreciation in
every District jurisdiction — with the
exception of West Virginia —
remained in positive territory. After
peaking at 14.3 percent in the second quarter of 2005, overall year-over-year price growth in the District has drifted lower, settling at 4.0 percent

in the second quarter of this year.
Considerable variation in home price
performance remains, however. Rates
of appreciation have pulled back
sharply along the coast and in the
Washington, D.C., metro area, while

holding steady or even accelerating
modestly in many markets across
the Carolinas.

Labor Markets and Services
Sector Activity Steady
District labor market conditions
remained generally healthy in the
second quarter. Employment growth
was steady at 1.5 percent compared
to a year earlier — matching the
first-quarter mark — with payroll
expansions recorded in all District
jurisdictions. Reports from the household survey also indicated solid labor
market fundamentals. The Fifth
District’s unemployment rate held
steady at 4.2 percent, keeping the
region’s rate lower than the national
rate by 0.3 percentage point.

Economic Indicators

                               2nd Qtr. 2007    1st Qtr. 2007    Percent Change (Year Ago)
Nonfarm Employment (000)
  Fifth District                      13,872           13,816            1.5
  U.S.                               137,864          137,447            1.4
Real Personal Income ($bil)
  Fifth District                       942.7            940.0            3.9
  U.S.                               9,882.0          9,867.3            3.9
Building Permits (000)
  Fifth District                        53.9             50.5          -16.2
  U.S.                                 404.4            361.5          -23.6
Unemployment Rate (%)
  Fifth District                        4.2%             4.2%
  U.S.                                  4.5%             4.5%


The majority of the employment
growth during the quarter occurred in
the District’s services sector. Job gains
were particularly strong in education
and health services and in business services, with year-over-year increases
of 3.3 percent and 2.5 percent,
respectively. Other assessments of
the services sector were also
upbeat. The revenue index from
the Richmond Fed’s survey of
service-providing firms rose two
points in the second quarter to
finish at 9. Additionally, the retail
revenues index rebounded in the
second quarter, climbing into
positive territory at 5, up from -9 in
the first quarter. Survey readings on
services employment in the District
were positive as well.
Goods-producing industries did
not fare as well in the second quarter,
however. Our survey of manufacturers
indicated generally lower levels of new
orders and shipments since the end of
March, though the index for overall
activity rebounded into positive territory in the June survey. On the
employment front, District factories
continued to shed workers during the
second quarter. Our manufacturing
employment index finished the quarter at -6. By contrast, employment in
the District’s construction industry
continued to increase despite the pullback in home building activity, buoyed
by solid nonresidential activity.

Households Faring Well
Steady job and income growth helped
strengthen household financial conditions in the second quarter. Overall,
real personal income in the District
was up 3.9 percent compared to last
year, with solid growth in most
District jurisdictions. Other measures
of household financial conditions were
mixed. Mortgage delinquency and
foreclosure rates were moderately
higher in the second quarter, though
in many parts of the District they were
below recent peaks.
RF


[Charts: Nonfarm Employment; Unemployment Rate; Real Personal Income; Nonfarm Employment, Metropolitan Areas; Unemployment Rate, Metropolitan Areas; Building Permits; FRB—Richmond Services Revenues Index; FRB—Richmond Manufacturing Composite Index; House Prices. Each panel covers First Quarter 1996 - Second Quarter 2007; employment, income, permit, and house-price series are shown as change from prior year. District-level panels compare the Fifth District with the United States; metropolitan panels show Charlotte, Baltimore, and Washington.]

NOTES:
1) FRB-Richmond survey indexes are diffusion indexes representing the percentage of responding firms reporting increase minus the percentage reporting decrease. The manufacturing composite index is a weighted average of the shipments, new orders, and employment indexes.
2) Metropolitan area data, building permits, and house prices are not seasonally adjusted (nsa); all other series are seasonally adjusted.

SOURCES:
Real Personal Income: Bureau of Economic Analysis/Haver Analytics.
Unemployment rate: LAUS Program, Bureau of Labor Statistics, U.S. Department of Labor, http://stats.bls.gov.
Employment: CES Survey, Bureau of Labor Statistics, U.S. Department of Labor, http://stats.bls.gov.
Building permits: U.S. Census Bureau, http://www.census.gov.
House prices: Office of Federal Housing Enterprise Oversight, http://www.ofheo.gov.
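To make the diffusion-index arithmetic in note 1 concrete, here is a minimal sketch with hypothetical response counts; the equal component weights are an assumption for illustration only, since the composite's actual weights are not listed here.

    # Illustrative sketch of the diffusion-index arithmetic described in note 1.
    # Response counts are hypothetical; equal component weights are assumed,
    # not the Richmond Fed's published weighting.

    def diffusion_index(n_increase, n_decrease, n_total):
        """Percent of firms reporting an increase minus percent reporting a decrease."""
        return 100.0 * (n_increase - n_decrease) / n_total

    # Hypothetical survey of 120 manufacturers:
    shipments  = diffusion_index(40, 25, 120)   # 12.5
    new_orders = diffusion_index(35, 30, 120)   # about 4.2
    employment = diffusion_index(28, 36, 120)   # about -6.7

    # Composite: weighted average of the three component indexes (equal weights assumed).
    composite = (shipments + new_orders + employment) / 3.0
    print(round(composite, 1))  # 3.3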

For more information, contact Matthew Martin at 704-358-2116 or e-mail Matthew.Martin@rich.frb.org.


STATE ECONOMIC CONDITIONS
BY MATTHEW MARTIN

Economic conditions in the District of Columbia
remained generally healthy in the second quarter as
strong payroll growth outweighed softening residential real
estate activity. Employment growth accelerated during the
period, advancing at a 2.4 percent annual rate compared to
last quarter’s 1.1 percent mark. The region’s housing market
pullback deepened, however, as both existing home sales
and new construction declined, while delinquency rates
edged higher.
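Figures like the 2.4 percent pace above are quarterly changes expressed at annual rates. As a minimal sketch of the usual conversion (compound annualization is assumed here; the exact convention behind the published numbers is not stated in the text):

    # Convert a quarter-over-quarter percent change into an annualized rate.
    # Compounding over four quarters is assumed as the standard convention;
    # this is an illustrative assumption, not a statement of the source's method.

    def annualized_rate(qoq_percent_change):
        quarterly = qoq_percent_change / 100.0
        return ((1.0 + quarterly) ** 4 - 1.0) * 100.0

    # Example: a 0.6 percent gain in a single quarter works out to roughly
    # 2.4 percent at an annual rate.
    print(round(annualized_rate(0.6), 1))  # 2.4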
Overall, labor market conditions improved in the second
quarter, propelled by a 7.1 percent increase in professional
and business services employment.

[Chart: U.S. and D.C. Employment Growth Since Jan. 2001 (Index: Jan. 2001 = 100); District of Columbia vs. United States. SOURCE: Nonfarm Payroll Employment, BLS/Haver Analytics]

Government payrolls
also posted solid gains during the period. Employment in
the sector was up 3.4 percent at an annual rate, on the heels
of two consecutive quarters of decline. Turning to household conditions, the District of Columbia’s unemployment
rate dipped 0.2 percentage point to finish at 5.6 percent —
its lowest point in nearly seven years. However, even with
the improvement, the unemployment rate remained the
Fifth District’s highest mark.
On the residential real estate front, housing market conditions deteriorated further since the end of March. After a
slight uptick to begin the year, existing home sales reversed
course in the second quarter, falling 10.3 percent. Home
sales were also lower compared to the previous year, with
sales activity down 7.1 percent since the second quarter of
2006. The decline in sales contributed to continued softness
in home prices. The District of Columbia’s House Price
Index (HPI) — published by the Office of Federal Housing
Enterprise Oversight (OFHEO) — was unchanged in the
second quarter. On the other hand, the region experienced
mild appreciation over the past year as its HPI was up 4.6
percent compared to a year earlier.

Slowing residential real estate activity during the period
coincided with slightly higher mortgage delinquency and
foreclosure rates. The District of Columbia’s delinquency rate
increased 0.5 percentage point during the second quarter to
finish at 3.7 percent, though it remained well below its
recent peak of 6.0 percent. The region’s foreclosure rate
edged up 0.1 percentage point to settle at 0.6 percent.

Maryland

Maryland's economy slowed a bit in the second quarter as
mild employment growth and continued weakness in
residential real estate markets tempered growth prospects.
Payroll employment growth moved lower during the quarter,
advancing at a 0.6 percent annual rate versus 1.7 percent last
quarter. Job gains were limited due to further declines in
manufacturing payrolls and a slight dip in professional and
business services employment. Employment performance in
the state was also sluggish compared to a year earlier; payrolls
expanded by less than 1.0 percent since the second quarter
of 2006.
The report on household economic conditions was a bit
more upbeat, however. Maryland’s unemployment rate was
unchanged during the second quarter at 3.7 percent and 0.2
percentage point lower than a year ago, though a portion of
the stability in unemployment can be attributed to a slight
reduction in the state’s labor force. The readings on income
growth were mixed. Although real income growth remained positive in the second quarter, the rate slowed to just 0.6 percent at an annual rate, down from 4.8 percent in
the first quarter. However, real income in the state increased
3.5 percent over the past year, up from last quarter’s mark
of 3.2 percent.
In housing markets, activity in the state declined in
the second quarter, spurred by a drop-off in sales and construction activity.

[Chart: U.S. and MD Employment Growth Since Jan. 2001 (Index: Jan. 2001 = 100); Maryland vs. United States. SOURCE: Nonfarm Payroll Employment, BLS/Haver Analytics]

Existing home sales declined by a
wide margin during the second quarter, falling 19.7 percent
compared to the first quarter and 21.1 percent compared to a
year earlier. Additionally, building permit issuance fell 27.8
percent over the past 12 months. Both declines were the
largest among District jurisdictions.
Despite a weaker housing market, home prices continued
to move higher in the second quarter. Maryland’s HPI rose at
a 3.2 percent annual rate, a full percentage point higher
than the first-quarter mark. In addition, its HPI was
4.7 percent higher than a year ago, though the increase was
the state’s smallest since 1999. On a less rosy note, Maryland’s
mortgage delinquency rate moved higher in the second quarter. The state’s overall delinquency rate rose to
4.2 percent, but remained well below the recent peak of
6.4 percent registered in the third quarter of 2001.
The delinquency rate for subprime mortgages set a new
high watermark, however, climbing to 13.8 percent, up from
11.2 percent in the first quarter.

North Carolina

The North Carolina economy remained on generally
solid footing during the second quarter, though labor
market growth was less robust. Compared to a year earlier,
total employment was up 2.2 percent versus 2.4 percent
in the first quarter. Job gains were centered in the
state’s services industries, but an increase in the rate of

U.S. and NC Employment Growth Since Jan. 2001
Index = Jan. 2001 = 100
106

INDEX LEVELS

104
102
100
98
96
94
01

02

03

04

05

06

North Carolina

07
United States

SOURCE: Nonfarm Payroll Employment, BLS/Haver Analytics

manufacturing job losses constrained overall growth.
Professional and business services employment increased by
3.7 percent over the past 12 months, while manufacturing
payrolls contracted 1.3 percent.
The household survey provided a less optimistic view of
labor market conditions as North Carolina’s employment

rate rose 0.3 percentage point to finish at 4.8 percent.
Additionally, labor force growth in the state slowed to 1.6
percent over the past year, down from 2.4 percent in the
first quarter.
On a brighter note, household financial conditions
improved in the second quarter as North Carolina posted
solid income growth. In fact, the state experienced the
strongest income growth among District jurisdictions
during both the second quarter and the past 12 months.
Furthermore, the state’s 4.8 percent increase in personal
income was nearly a full percentage point above the
national mark over the same period.
As in most other jurisdictions, North Carolina’s housing
sector continued to slump. Building permit issuance
across the state declined 16.7 percent compared to the same
quarter last year, while existing home sales declined
4.5 percent over the same period. Soft construction and
sales activity in the second quarter accompanied a slowdown
in home appreciation. North Carolina’s HPI increased
0.8 percent during the three-month span compared
to a 1.7 percent increase last quarter. Nonetheless,
the state saw housing prices increase 7.1 percent since the
second quarter of last year — the largest year-over-year gain
in the Fifth District.
In other housing news, North Carolina’s overall
mortgage delinquency rate edged higher in the second
quarter to 5.5 percent compared to 5.3 percent a year earlier.
The subprime delinquency rate also rose during the quarter,
increasing 2.1 percentage points to 15.5 percent.

South Carolina

Economic conditions in South Carolina deteriorated a bit
in the second quarter amid softening labor markets and
continued housing woes. Employment growth moderated
during the period as payrolls expanded just 0.1 percent since
the end of March. The weak employment performance was
due, in part, to the state’s first decline in construction payrolls
in two years, in concert with an intensification of manufacturing job losses. On a brighter note, education and health
services employment was up 7.1 percent compared to the
previous year. State professional and business services firms
also posted solid payroll gains — employment in the sector
expanded 1.9 percent during the quarter, the largest increase
in the District over the period.
On the household side, South Carolina’s unemployment
rate fell 0.5 percentage point to finish at 5.6 percent — a mark
which, despite the drop, tied the District of Columbia for
the highest rate in the Fifth District. Household financial
conditions were boosted by solid income growth over the second quarter.

[Chart: U.S. and SC Employment Growth Since Jan. 2001 (Index: Jan. 2001 = 100); South Carolina vs. United States. SOURCE: Nonfarm Payroll Employment, BLS/Haver Analytics]

Real personal income increased 3.7 percent
compared to the same quarter last year. Nonetheless, solid
income growth coincided with increased mortgage delinquencies. Delinquency rates among both conventional and
subprime borrowers moved higher since the end of March,
finishing at 3.3 percent and 15.7 percent, respectively.
Like the rest of the District, South Carolina’s recent
housing woes persisted in the second quarter. Cumulative
building permits through the second quarter were 19.8 percent
lower than 2006 levels. Additionally, existing home sales were
down 9.3 percent from a year ago with especially large
declines in coastal markets. Soft housing market activity
contributed to lower rates of home price appreciation.
South Carolina’s HPI was up 6.3 percent over the last 12
months versus last quarter’s mark of 7.6 percent.
As was the case with sales, home price growth was slower near
the coast due in part to sharp reductions in demand for second homes. The HPI for the Charleston metro area, for
example, was down slightly in the second quarter.

Virginia

On balance, economic conditions in Virginia improved
residential real estate markets.
Payroll employment growth was strong across Virginia.
Nonfarm payroll employment increased at a 2.6 percent
annualized rate in the second quarter and 1.4 percent since the second quarter of 2006. Most of the gains occurred in the services
sector, led by a 5.5 percent jump in professional and business
services employment. The state also experienced an increase
in manufacturing employment during the second quarter.
The expansion marked the second consecutive quarterly
gain in factory payrolls following 10 quarters of losses.


The economic conditions of Virginia’s households were
also solid during the second quarter. The unemployment
rate inched higher by 0.1 percentage point to finish at 3.0
percent, but remained the lowest rate in the Fifth District.
The unemployment rate has hovered near the 3.0 percent
mark over the past 12 months even amid a sizable
1.6 percent increase in the labor force. Solid job prospects in
the period accompanied stronger personal income growth
across the state. Virginia’s real income growth over the past
year accelerated to a 3.5 percent annual rate, up from
3.1 percent in the first quarter.
On the other side of the coin, Virginia’s residential real
estate market remained a weak spot in the state’s economy
during the second quarter.

[Chart: U.S. and VA Employment Growth Since Jan. 2001 (Index: Jan. 2001 = 100); Virginia vs. United States. SOURCE: Nonfarm Payroll Employment, BLS/Haver Analytics]

Existing home sales dropped sharply compared to the first quarter and a year earlier.
Home sales fell 15.3 percent over the past year, while building
permit levels dropped 17.6 percent over the same period.
The decline in both sales and construction corresponded
with a further deceleration in home price appreciation as
year-over-year growth in the HPI slowed to 3.7 percent.
Adding to the less upbeat housing report, Virginia’s overall
mortgage delinquency rate increased to 3.7 percent compared
to last quarter’s 3.1 percent mark. Increased delinquencies
among subprime borrowers accounted for much of the
second-quarter jump as that rate moved higher from 11.0 percent to 13.4 percent. Nonetheless, both overall and subprime
delinquency rates remained well below recent peak levels.

West Virginia

The pace of West Virginia economic activity waned a bit in the second quarter as weak employment performance and slower residential real estate activity weighed on growth.


Behind the Numbers: Consumer Confidence
A leading consumer confidence index is released once a
month by an independent research organization called the
Conference Board. It is a survey of 5,000 households
(returned by about 3,500), asking participants whether they
are positive, neutral, or negative about a short list of
economic conditions in the present and near future. Out of
the responses the Conference Board builds indexes tied to
the base year of 1985. The method is similar to that used by
the other main consumer confidence index provider,
Reuters/University of Michigan Surveys of Consumers.
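As a rough sketch of how such responses can be turned into an index, consider the following illustration; the relative-value formula and the 1985 base figure are assumptions for exposition, not the Conference Board's published methodology.

    # Stylized sketch of turning survey responses into a confidence-style index.
    # The "relative value" formula and the 1985 base value are illustrative
    # assumptions, not the Conference Board's actual method.

    def relative_value(positive, negative):
        """Share of positive answers among respondents expressing an opinion."""
        return positive / (positive + negative)

    BASE_1985 = 0.55  # assumed average relative value in the 1985 base year

    def confidence_index(positive, negative, base=BASE_1985):
        """Index the current relative value so that the base year equals 100."""
        return 100.0 * relative_value(positive, negative) / base

    # Hypothetical month: of roughly 3,500 returned surveys, 1,400 are positive,
    # 1,300 neutral (which drop out of the ratio), and 800 negative.
    print(round(confidence_index(1_400, 800), 1))  # about 115.7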
These indexes may be useful in forecasting what
consumers will spend in the future and perhaps provide
insights into current economic conditions not captured in
other data. In fact, studies have shown a strong correlation
between consumer confidence and consumer spending.
But do consumer confidence indexes do more than
confirm or support other data? That was the question
posed by economist Dean Croushore with the University
of Richmond.
Croushore noted that previous research has shown that forecasts are not improved by adding consumer confidence indexes. To double-check, Croushore tapped into a real-time data set developed by the Federal Reserve Bank of Philadelphia. This allowed him to take the view of a forecaster operating at the time those forecasts were made. Even then, consumer confidence indexes don't seem to add much: "The bottom line: If you are forecasting consumer spending for the next quarter, you should use data on past consumer spending and stock prices and ignore data on consumer confidence."
— DOUG CAMPBELL

[Chart: Index of Consumer Sentiment, 1994-2007. SOURCE: Reuters/University of Michigan Survey]

Labor market conditions in the state softened further in the
second quarter. West Virginia’s unemployment rate increased
0.2 percentage point during the period to settle at 4.4 percent,
though the mark was below the state’s 4.9 percent rate a year
ago. Additionally, payroll growth in the state was weak. Total
nonfarm employment expanded just 0.6 percent over the past
year — the smallest percentage increase among Fifth District
jurisdictions. Steep manufacturing job losses weighed on
overall job gains, while employment growth in both the mining
and construction sectors decelerated somewhat.

[Chart: U.S. and WV Employment Growth Since Jan. 2001 (Index: Jan. 2001 = 100); West Virginia vs. United States. SOURCE: Nonfarm Payroll Employment, BLS/Haver Analytics]

The weak job growth in West Virginia accompanied
softer income growth in the second quarter. Real personal
income increased at an annual rate of 0.4 percent during
the period compared to last quarter’s 3.5 percent increase.
Over the past year personal income levels grew just
2.8 percent, the lowest mark among all Fifth District
jurisdictions.
Residential real estate remained a soft spot in West
Virginia’s economy during the second quarter. Activity
continued its retreat as the number of building permits
issued during the period fell 7.3 percent short of year-earlier
levels. Existing home sales were off more sharply,
declining 17.4 percent for the quarter and 13.6 percent
over the previous year. Moreover, West Virginia
was the only state in the Fifth District whose HPI
contracted during the second quarter. The state’s
HPI edged lower at a 0.9 percent annual rate, though
the index remained 4.4 percent higher than a year earlier.
Additionally, the overall mortgage delinquency rate
increased to 6.8 percent — 0.6 percentage point above
the first-quarter level. The increase in the overall
rate was due in large part to a substantial jump in
the number of subprime delinquencies. The state’s subprime
delinquency rate swelled 2.3 percentage points to finish
at a District-high of 18.1 percent.
RF



State Data, Q2:07

                                            DC          MD          NC          SC          VA          WV
Nonfarm Employment (000's)               698.0     2,609.6     4,101.2     1,924.4     3,779.4       759.4
  Q/Q Percent Change                       0.6         0.1         0.5         0.1         0.6         0.1
  Y/Y Percent Change                       1.6         0.9         2.2         1.3         1.4         0.6
Manufacturing Employment (000's)           1.6       134.3       546.8       244.8       286.8        59.5
  Q/Q Percent Change                       2.1        -0.3        -0.4         0.3         0.2        -0.2
  Y/Y Percent Change                      -9.3        -1.6        -1.3        -3.9        -1.3        -2.6
Professional/Business Services
Employment (000's)                       159.8       401.7       488.4       218.1       647.2        60.9
  Q/Q Percent Change                       1.7        -0.1         0.5         1.9         1.4         1.4
  Y/Y Percent Change                       4.4         2.0         3.7         0.0         3.4         1.4
Government Employment (000's)            233.9       471.1       686.9       333.0       680.8       144.6
  Q/Q Percent Change                       0.8         0.1         1.2         0.4         0.9         0.0
  Y/Y Percent Change                       0.5         0.0         1.1         0.5         1.0         0.3
Civilian Labor Force (000's)             319.9     2,998.6     4,529.3     2,148.8     4,051.0       814.0
  Q/Q Percent Change                      -0.3        -0.9         0.2        -0.5         0.0         0.2
  Y/Y Percent Change                       1.4        -0.2         1.6         1.3         1.6         1.1
Unemployment Rate (%)                      5.6         3.7         4.8         5.6         3.0         4.4
  Q1:07                                    5.8         3.7         4.5         6.1         2.9         4.2
  Q2:06                                    5.9         3.9         4.7         6.4         3.0         4.9
Real Personal Income ($Mil)           30,053.1   220,350.4   259,680.9   115,348.7   271,749.4    45,529.6
  Q/Q Percent Change                       0.3         0.2         0.4         0.3         0.3         0.1
  Y/Y Percent Change                       4.0         3.5         4.8         3.7         3.5         2.8
Building Permits                           501       6,280      23,103      12,015      10,804       1,157
  Q/Q Percent Change                     -39.9        15.1        -0.2        14.3        12.0        31.3
  Y/Y Percent Change                     136.3       -27.8       -16.7        -9.2       -17.6        -7.3
House Price Index (1980=100)             665.3       547.4       339.4       322.3       477.6       232.5
  Q/Q Percent Change                       0.0         0.8         0.8         0.2         0.7        -0.2
  Y/Y Percent Change                       4.6         4.7         7.1         6.3         3.7         4.4
Sales of Existing Housing Units (000's)   10.4        92.8       231.2       113.6       124.0        30.4
  Q/Q Percent Change                     -10.3       -19.7        -5.2        -2.7       -12.9       -17.4
  Y/Y Percent Change                      -7.1       -21.1        -4.5        -9.3       -15.3       -13.6

NOTES:
Nonfarm Payroll Employment: thousands of jobs, seasonally adjusted (SA) except in MSAs; Bureau of Labor Statistics (BLS)/Haver Analytics.
Manufacturing Employment: thousands of jobs, SA in all but DC and SC; BLS/Haver Analytics.
Professional/Business Services Employment: thousands of jobs, SA in all but SC; BLS/Haver Analytics.
Government Employment: thousands of jobs, SA; BLS/Haver Analytics.
Civilian Labor Force: thousands of persons, SA; BLS/Haver Analytics.
Unemployment Rate: percent, SA except in MSAs; BLS/Haver Analytics.
Building Permits: number of permits, NSA; U.S. Census Bureau/Haver Analytics.
Sales of Existing Housing Units: thousands of units, SA; National Association of Realtors®.


Metropolitan Area Data, Q2:07

                              Washington, DC MSA   Baltimore, MD MSA   Charlotte, NC MSA
Nonfarm Employment (000's)              2,437.6             1,314.2               843.7
  Q/Q Percent Change                        1.7                 2.0                 1.4
  Y/Y Percent Change                        1.8                 0.4                 2.6
Unemployment Rate (%)                       3.0                 3.8                 4.7
  Q1:07                                     3.2                 4.2                 4.6
  Q2:06                                     3.1                 4.1                 4.7
Building Permits                          7,311               1,644               6,312
  Q/Q Percent Change                       14.4                -5.6                12.0
  Y/Y Percent Change                       -5.5               -27.5                -6.6

                              Raleigh, NC MSA      Charleston, SC MSA  Columbia, SC MSA
Nonfarm Employment (000's)                498.9               294.9               365.8
  Q/Q Percent Change                        2.1                 1.1                 0.6
  Y/Y Percent Change                        2.5                 2.9                 1.3
Unemployment Rate (%)                       3.7                 4.2                 4.7
  Q1:07                                     3.6                 4.9                 5.5
  Q2:06                                     3.7                 5.0                 5.4
Building Permits                          4,214               2,197               2,313
  Q/Q Percent Change                        3.7               -54.3                38.2
  Y/Y Percent Change                       21.3                -3.0                14.0

                              Norfolk, VA MSA      Richmond, VA MSA    Charleston, WV MSA
Nonfarm Employment (000's)                783.6               638.3               151.9
  Q/Q Percent Change                        2.8                 1.5                 2.2
  Y/Y Percent Change                        1.2                 1.7                 1.0
Unemployment Rate (%)                       3.1                 3.0                 4.1
  Q1:07                                     3.3                 3.2                 4.5
  Q2:06                                     3.2                 3.1                 4.6
Building Permits                          1,574               2,136                  70
  Q/Q Percent Change                      -25.5                18.1                -6.7
  Y/Y Percent Change                      -19.3               -12.7               -19.5
For more information, contact Matthew Martin at 704-358-2116 or e-mail Matthew.Martin@rich.frb.org.


OPINION
When Disclosure is Not Enough
BY BORYS GROCHULSKI

Amid the recent spike in mortgage defaults, the
Federal Trade Commission reported this summer
that consumer disclosure forms used in mortgage
lending fall short in conveying vital information to borrowers,
and that improvements were both desirable and achievable.
This might well be true. Better-informed consumers will
often make better purchasing decisions, and loan disclosures
currently in use very well may be less than perfect. But for
reasons I will explain, even the fullest of consumer disclosures won’t get to the heart of a perhaps more fundamental
problem facing the U.S. mortgage market.
It is quite clear that mandatory, government-enforced
disclosures play an important and positive role in consumer
lending. Market forces alone are probably not enough to
determine the proper form of disclosure, as consumer
credit markets are not free of search costs and asymmetric
information. These so-called market frictions impede the
efficiency of the laissez-faire outcome and can justify
government intervention. What exact shape and form this
intervention should take is an important question.
However, even if borrowers perfectly understand
the terms of the contract and the trade-offs involved in all
mortgage products available, there still exists another
force pushing borrowers toward taking too much risk: an
expectation of a taxpayer-funded government bailout in the
event of an adverse economy-wide shock.
An important example of such a shock is a housing
market slowdown. If the government is expected to
offer a bailout to borrowers in the case of a collapse in
property values, we face the so-called moral hazard problem,
in which borrowers take on too much risk. Under this
scenario, if the property values grow, borrowers win the
prize of appreciated home values; if they collapse,
taxpayers lose. If a bailout is likely, borrowers have an
incentive to take on risky mortgage products (putting zero
money down, keeping the monthly payment as low as
possible, and buying into as big a house as possible) so
as to maximize their capital gain in the good outcome.
The lenders are happy to oblige, as the losses that result
in the bad outcome will be sustained by the bailout.
No amount of disclosure can change this.
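A stylized calculation makes the incentive concrete; the loan sizes, price moves, and probabilities below are hypothetical and not drawn from this piece.

    # Stylized sketch of the one-sided payoff an expected bailout creates.
    # All dollar figures and probabilities are hypothetical.

    def expected_payoff(upside, downside, p_up, bailout_expected):
        """Borrower's expected gain: with a bailout expected, the borrower keeps
        the upside but is shielded from the downside; otherwise the borrower
        bears the loss as well."""
        downside_borne = 0 if bailout_expected else downside
        return p_up * upside - (1 - p_up) * downside_borne

    safe_loan  = dict(upside=10_000, downside=10_000, p_up=0.5)   # modest stakes
    risky_loan = dict(upside=60_000, downside=80_000, p_up=0.5)   # zero down, bigger house

    for expected in (True, False):
        safe  = expected_payoff(**safe_loan,  bailout_expected=expected)
        risky = expected_payoff(**risky_loan, bailout_expected=expected)
        print(f"bailout expected={expected}: safe={safe:,.0f} risky={risky:,.0f}")
    # With a bailout expected, the risky loan looks better (30,000 vs. 5,000);
    # without one, it does not (-10,000 vs. 0).

The asymmetry, not the particular numbers, is the point: an expected bailout truncates the borrower's downside and so tilts the choice toward the riskier loan.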
How can this problem be dealt with? Ex post, i.e., once
enough borrowers are under water, it is too late to prevent
moral hazard. The government cannot abandon distressed
primary-residence homeowners. However, measures could be
taken to eliminate the moral hazard issue going forward.
Just instituting the “no-more-bailouts” policy will not
work, for the public can correctly perceive that this policy
will likely be abandoned next time enough households are
in dire straits, and moral hazard will continue. The problem


can, however, be eliminated at the ex-ante stage, i.e., before
households get into risky borrowing, with direct controls
put on the amount of risk that households can take.
An outright ban of some of the riskiest mortgage
products is almost certainly not a part of an efficient
solution. There may always be a borrower for whom, when
properly disclosed and priced, a “Ninja” mortgage actually is
optimal. What would, however, be the cost of finding out who
is a suitable borrower for a risky loan and who is not before the
deal is made? If this cost is not too high, relative to the benefit of mitigating the moral hazard problem, then perhaps a
suitability check for some of the risky mortgage products
could be instituted. After all, mandatory suitability checks are
already in place in other markets affected by the government’s
general inability to commit to not bailing out ex post.
The way we regulate medications in this country is
instructive. For many medications, particularly those risky
ones with strong and variable side effects, consumers must
obtain a prescription before purchasing. If consumers were
allowed to just read a disclosure and make their own
medication choices, they might take unnecessary risks and
later end up in the emergency room. The treatment
that a self-medicated patient would receive in an emergency
room is akin to a government bailout — a guarantee of help
even when the consumer took on excessive risk.
In a sort of preemptive strike, we require licensed
intermediaries (doctors) to determine which prescription
medications consumers can use partly because the
government cannot commit to not bailing out consumers
who recklessly self-medicate. This is an explicit restriction
on consumer freedom of choice in this particular market,
but one that has been deemed necessary because of the
alternative-scenario consequences.
Could the commitment problem in the mortgage
market be solved in a similar way? We might well consider
suitability checks for some mortgage products. Perhaps for
certain exotic loans, we might require the lender, or an
independent third party, to check and certify the suitability
of the loan for the borrower before the loan is made.
To be sure, we would then face other costs and problems.
A sound cost-benefit analysis of this solution is needed.
If, however, a government bailout is perceived by the
public as a real possibility, a mandatory suitability check
may be necessary to prevent moral hazard. Disclosures are
important, but we should not expect even perfect ones
to be sufficient.
RF
Borys Grochulski is a research economist with the
Richmond Fed. The views expressed here are his own and
not necessarily shared by the Federal Reserve System.



NEXT ISSUE
Private Equity

Interview

Many people associate the private equity industry with the
sometimes ruthless way firms go about getting results, and
the considerable profits pocketed by managers. But private
equity is not just about the splashy deals and headline-grabbing
returns. Studies find that private equity firms often improve
the companies they invest in. And most deals are relatively
small. However, even those who believe in the importance of
private equity worry that some of the firms’ practices may be
weakening the very attributes that have made them successful.

We talk with Christopher Ruhm of the
University of North Carolina, Greensboro,
a former senior staff economist for
the Council of Economic Advisers whose
research centers on early childhood
education.

Massively Multiplayer Online Games
Online games like World of Warcraft and Second Life have
attracted millions of players. Now, economists are looking
at virtual worlds for insights into real-world policies.
Unlike mathematical models or small-scale experiments, virtual
worlds provide venues for scenario-testing that might
otherwise be impractical, unwise, or unethical, and there
is no need for abstract assumptions about human behavior.
For economic policymakers in particular, massively multiplayer
online games may become an invaluable research tool.

Mechanism Design
Transactions often don’t yield the best possible outcome when
one party has more information than the other. Mechanism
design is about understanding how the rules of the game can be
set up to lead to a more desirable result, knowing that people
will typically act for their own gain. The theory received much
attention with this year’s Nobel Prize for economics, but its
applications have been around for a long time. We look at
research by economists, including those at the Richmond Fed,
who use concepts in mechanism design to study financial
contracts and institutions.

Revenue Sharing
In 2007, the New York Yankees spent $189 million on talent.
The Tampa Bay Devil Rays spent $24 million. To address this
discrepancy, which arguably distorts on-field play, baseball has
devised revenue-sharing plans, giving poorer teams the
resources to compete with richer teams. But research suggests
that revenue sharing has failed to restore competition in
baseball. The only salient effect appears to be a significant
reduction in player salaries.

Economic History
Rice cultivation built South Carolina into
the wealthiest colony in the New World.
But mechanization ultimately drove efficient production elsewhere and eventually
eroded the culture of the lowcountry
rice planters.

Federal Reserve
Central banks around the globe have long
cooperated with each other. In recent years,
collaborative efforts have included coordinating objectives on exchange rates and in
monetary policy. But cooperation is
difficult and sometimes controversial, given
political considerations as well as the
complexities of global finance. The scope
of future cooperation between central
banks remains in question.

Visit us online:
www.richmondfed.org
• To view each issue’s articles
and web-exclusive content
• To add your name to our
mailing list
• To request an e-mail alert of
our online issue posting
• To check out our online
weekly update


RECENT Economic Research from the Richmond Fed

Economists at the Federal Reserve
Bank of Richmond conduct
research on a wide variety of monetary
and macroeconomic issues. Before that
research makes its way into academic
journals or our own publications,
though, it is often posted on the Bank’s
Web site so that other economists can
have early access to the findings.
Recent offerings from the Richmond
Fed’s Working Papers series include:

“Moral Hazard and Persistence”
Hugo Hopenhayn and Arantxa Jarque, December 2007
“Avoiding the Inflation Tax”
Huberto M. Ennis, December 2007
“The Anatomy of U.S. Personal Bankruptcy under Chapter 13”
Hülya Eraslan, Wenli Li, and Pierre-Daniel G. Sarte, October 2007
“Notes on the Inflation Dynamics of the New Keynesian
Phillips Curve”
Andreas Hornstein, August 2007
“A Literature Review on the Effectiveness of Financial
Education”
Matthew Martin, June 2007
“Bank Runs and Institutions: The Perils of Intervention”
Huberto M. Ennis and Todd Keister, April 2007
“Heterogeneous Borrowers in Quantitative Models of
Sovereign Default”
Juan Carlos Hatchondo, Leonardo Martinez, and Horacio Sapriza,
March 2007
“Risky Human Capital and Deferred Capital Income Taxation”
Borys Grochulski and Tomasz Piskorski, January 2007

You can access these papers and more at: www.richmondfed.org/publications/economic_research/working_papers/

Federal Reserve Bank
of Richmond
P.O. Box 27622
Richmond, VA 23261

Change Service Requested

PRST STD
U.S. POSTAGE PAID
RICHMOND VA
PERMIT NO. 2

Please send address label with subscription changes or address corrections to Public Affairs or call (804) 697-8109