
SPRING/SUMMER 2008
THE FEDERAL RESERVE BANK OF RICHMOND

VOLUME 12, NUMBER 2/3, SPRING/SUMMER 2008

FEATURES

Unsteady State: The ongoing evolution of mainstream economics
Is economics too set in its ways to consider alternative explanations for how individuals and firms make decisions? It’s a fair question, but it would be unfair to suggest that it is going unanswered.

Economist, Study Thyself: The way economists are trained has come a long way in the past 20 years. Has it come far enough?
Economics is all the rage, and a major in economics is in high demand. But don’t get carried away. There is no shortage of questions about the direction of economics.

Econblogs: Economists think out loud online
Hundreds of economic blogs have sprung up on the Internet, many written by academics. What do blogs add to public discourse about the economy?

Is Rational Man Extinct? Searching for Homo Economicus
The debate over whether there ever was such a creature has broken into the mainstream media discussion about how economists view the world. It’s a discussion that has been at least 50 years in the making and probably won’t end soon.

Going Nuclear: The future looks brighter for a once-maligned industry
After years of aversion by many, nuclear power seems to be making a comeback, which has the potential to lead to big changes in the energy industry in the Fifth District.

DEPARTMENTS

President’s Message/Financial Stability and the Fed
Federal Reserve/The Pragmatic Evolution of the Monetary Standard
Jargon Alert/Productivity
Research Spotlight/Does Unemployment Insurance Discourage Work?
Policy Update/New Farm Bill Extends Menu
Short Takes/News from the District
Around the Fed/The Costs and Benefits of Disclosure
Interview/Charles Holt
Economic History/Liquid Gold
Book Review/Good Capitalism, Bad Capitalism, and the Economics of Growth and Prosperity
District/State Economic Conditions
Opinion/More Theory, Please

Our mission is to provide authoritative information and analysis about the Fifth Federal Reserve District economy and the Federal Reserve System. The Fifth District consists of the District of Columbia, Maryland, North Carolina, South Carolina, Virginia, and most of West Virginia. The material appearing in Region Focus is collected and developed by the Research Department of the Federal Reserve Bank of Richmond.

DIRECTOR OF RESEARCH: John A. Weinberg
SENIOR EDITOR: Stephen Slivinski
MANAGING EDITOR: Kathy Constant
STAFF WRITERS: Doug Campbell, Betty Joyce Nash, Vanessa Sumo
EDITORIAL ASSOCIATE: Julia Ralston Forneris
REGIONAL ANALYSTS: Matthew Martin, Sonya Ravindranath Waddell
CONTRIBUTORS: Khalid Abdalla, William Perkins
DESIGN: Beatley Gravitt Communications
CIRCULATION: Walter Love, Shannell McCall

Published quarterly by the Federal Reserve Bank of Richmond, P.O. Box 27622, Richmond, VA 23261
www.richmondfed.org

Subscriptions and additional copies: Available free of charge by calling the Public Affairs Division at (804) 697-8109. Reprints: Text may be reprinted with the disclaimer in italics below. Permission from the editor is required before reprinting photos, charts, and tables. Credit Region Focus and send the editor a copy of the publication in which the reprinted material appears.

The views expressed in Region Focus are those of the contributors and not necessarily those of the Federal Reserve Bank of Richmond or the Federal Reserve System.

ISSN 1093-1767


PRESIDENT’S MESSAGE
Financial Stability and the Fed
This issue of Region Focus
features a special section
that explores the economics profession today, its role
in society, and questions about
its future. Such a discussion is
timely, appropriate, and healthy
for a dynamic discipline that’s
been much in the news lately.
Still, the basic lessons of economics endure: Markets tend to
organize economic activity efficiently, and government intervention can sometimes have unexpected and undesirable
consequences.
Given the recent events in the economy, and especially in
financial markets, it’s also useful to draw on research and
experience of the last 30 years. While that body of knowledge provides a solid foundation for policy, the wide-ranging
root causes of disruptions can be difficult to determine.
And when events require swift and direct policy responses
in real time, the job gets even tougher. The Fed’s role as lender of last resort often presents the institution with difficult choices when financial disruptions unfold.
In an episode of financial disruption, central bank lending may prevent a bank run and put off costly closure or
liquidation. (Bank runs occur when depositors fear that a
bank’s assets can’t cover its liabilities, and depositors cash
out en masse.) But if the financial sector is just coping with
deteriorating fundamentals, central bank lending distorts
economic allocations by artificially supporting the prices of
some assets and liabilities of some market participants.
Government support in this latter case can intensify the
“moral hazard” problem inherent in any financial safety net.
Applying this framework to recent policy actions can
help provide some perspective. As the slowdown in housing
markets and the associated decline in home prices began, it
became clear that the securities backed by mortgages
originated in 2006 and early 2007 would perform significantly worse than anticipated. This realization affected the
future prospects of any institution or financial instrument
with mortgage-related exposure. The recent instances of
run-like behavior, such as those that afflicted Bear Stearns in
the week leading up to its acquisition, seemed to reflect
increased concern about the quality of these sorts of
financial products. In short, it appeared to be what we
would classify as a deterioration in market fundamentals,
not a liquidity crisis.
Perhaps most important to the current debate is the fact
that market expectations of central bank response in times
of stress can affect the robustness of the system. In the short term, governments and central banks may relieve
financial market strains, but the intervention itself may
affect future choices of financial institutions. These new
expectations could make future crises more likely.
If banks and other financial institutions assume central
bank support in the future, then they are less likely to put in
place the necessary and appropriate safeguards. That
assistance interferes with market discipline and distorts
market prices.
If intervention is assumed, then there’s scant incentive
for banks to take costly alternative action to prevent adverse
consequences. But there is an alternative. New research
by economists at the Richmond and New York Fed banks
considers a scenario in which there is absolute certainty
that no government or central bank assistance will be forthcoming. In such a world, banking contracts would likely
include provisions that allow for suspensions of payment.
Such contracts would prevent the type of run that may occur because of the perceived quality of a bank’s assets. This sort of contract actually has its roots in the 19th-century U.S. banking panics.
The Fed’s lending policy can play a role in the stability of
financial markets. As we learn more about the causes and
nature of financial instability, I believe we should strive
for policy that is informed by lessons about price stability
learned in the 1980s. That’s when the Fed committed
itself to a long-term goal of maintaining a low and stable
inflation rate. We will achieve better outcomes if we can
establish credibility for a pattern of behavior consistent
with that objective.

JEFFREY M. LACKER
PRESIDENT
FEDERAL RESERVE BANK OF RICHMOND


FEDERAL RESERVE
The Pragmatic Evolution of the Monetary Standard
The following is an excerpt from The Monetary Policy of the Federal Reserve: A History by Robert L. Hetzel, Copyright © Robert L. Hetzel, 2008. Reprinted with the permission of Cambridge University Press.

The views expressed in this excerpt are those of the author and not necessarily those of the Federal Reserve Bank of Richmond or the Federal Reserve System. Robert Hetzel is a senior economist and policy adviser in the research department of the Federal Reserve Bank of Richmond.

Vast, horrific disasters marked the 20th century, but also widespread, beneficent progress. In the first half, two world wars almost ended Western civilization. In the second half, democracy spread and living standards rose. Throughout, monetary instability interacted with social upheaval and political disorder. Inflation and deflation created feelings of powerlessness in the face of impersonal forces that promoted a search for scapegoats. Hyperinflation and depression contributed to the rise of Nazism in Germany. The stability of the deutschemark then accompanied the German postwar growth miracle.

In the United States, deflation and depression in the 1930s produced a decade of untold human misery. The Great Inflation of the 1970s spawned wage and price controls, which trampled on due process. The feeling of government’s loss of control, symbolized by gas lines, helped propel Ronald Reagan into power. After Paul Volcker led the Fed to accept responsibility for inflation in 1979, an increase in monetary stability accompanied an increase in economic stability.

The success of the 21st century will depend upon how well societies learn the lessons of the 20th century. The grand monetary experiment of the last century was replacement of a gold standard with a fiat money standard. The failure of central banks to understand their new responsibility to provide a nominal anchor for prices lay at the heart of the spectacular monetary failures of that century. What nominal anchor and what monetary standard are in place at the start of the current century?

The Volcker-Greenspan Monetary Standard

The U.S. monetary standard has evolved pragmatically
rather than by conscious design. The current standard arose
out of the consistent effort by the Federal Open Market
Committee (FOMC) under Paul Volcker and Alan
Greenspan to re-anchor inflationary expectations
unmoored by the experience with stop-go policy.
Consistency under duress achieved credibility. Credibility
laid the foundation for the current nominal anchor: an
expectation of low, stable trend inflation unaffected by
macroeconomic shocks.
Something must “anchor” the public’s expectation of the
future value of money. For the gold standard, it was the
commitment to maintain the par value of gold. Under the
gold standard as it existed in the late 19th century, money
received its value from the Bank of England’s commitment
to maintain in the future a fixed pound price of an ounce of
gold. For the contemporaneous money price of gold to be
viable, the public had to believe that the Bank would maintain that value in the future.
To achieve the stability in the expected future price level
requisite for contemporaneous stability of the price level,
the public must believe that the central
bank will behave consistently. Over the
quarter century of the Volcker-Greenspan era, the Fed did not follow
a rule in the sense that it never departed from consistent procedures for
setting the funds rate. Nevertheless,
the achievement of near price stability
derived from an overall consistency of
behavior that emerged out of an effort
to restore the expectational stability of
the earlier commodity standard.

Stop-Go Monetary Policy and the Loss of a Nominal Anchor

Experience with a commodity standard created an expectation of price stability that persisted into the second half of the 20th century. The primacy attached to price stability by the early William McChesney Martin FOMC sustained that expectation into the 1960s. Subsequently, stop-go policy opportunistically exploited it and, in time, destroyed the nominal anchor provided by the expectation of price stability.

Keynesians emphasized discretionary manipulation of aggregate demand. Because they assumed the existence of an inertia in inflation independent of monetary policy, they believed that, subject to the inflation-unemployment trade-offs of the Phillips curve, the central bank could manipulate aggregate nominal demand to smooth fluctuations in real output. The exercise of discretion destroyed the prior nominal expectational stability.

Sherman Maisel, a member of the Board of Governors from 1965 until 1972, expressed the Keynesian view in 1973:

There is a trade-off between idle men and a more stable value for the dollar. A conscious decision must be made as to how much unemployment and loss of output is acceptable in order to get smaller price rises. Some price increases originate on the cost side or in particular industries. These cannot be halted by monetary policy, which acts principally on the overall aggregate demand for goods and services. … [E]xperience … shows that without some type of government intervention in the price-wage bargains struck by labor and industry, the trade-off between inflation and unemployment is unsatisfactory.

Starting with the Kennedy and Johnson appointments to the Board of Governors, Keynesian views became increasingly prevalent within the FOMC. According to these views, monetary policy should aim for full employment, almost universally assumed to occur at a 4 percent unemployment rate or less. This figure benchmarked potential output. By 1970, elimination of the resulting presumed negative output gap (actual minus potential output) became a national and an FOMC objective. Furthermore, a nonmonetary view of inflation led the FOMC to believe that monetary policy could be stimulative without increasing inflation as long as the output gap was negative. The inflation that did occur with unemployment in excess of 4 percent had to arise from cost-push inflation. Failure to accommodate such inflation would require high unemployment.

The loss of expectational stability began in 1966 when the FOMC, unlike in 1957, did not move in a sustained way to eliminate nascent inflation. Bond yields began a long, irregular climb to the low double-digit figures reached in the early 1980s. They fell briefly during the 1970 recession but resumed rising in spring 1971. The Nixon administration wanted rapid money supply growth to stimulate output sufficiently to reduce the unemployment rate to 4.5 percent by summer 1972. Arthur Burns, FOMC chairman, campaigned for wage and price controls as the price of stimulative monetary policy. In their absence, inflationary expectations, Burns contended, would counter the stimulative effects of expansionary policy. On Aug. 15, 1971, Nixon delivered the controls Burns wanted and Burns obliged with expansionary monetary policy.

Charles Walker, treasury undersecretary, later summarized the forces leading the Nixon administration to adopt wage and price controls:

[I]nflationary expectations … began to come back on us last winter after we had them under some control. Interest rates were going down, and then [they] shot back up again. ... [L]abor tended to leapfrog into the future and get three-year contracts to guard against additional inflation. Inflationary expectations are what really got us.

Keynesian aggregate demand management relied on inertia in actual and expected inflation as the lever with which increases in aggregate nominal demand lowered unemployment. By the end of the 1970s, that apparent inertia disappeared. The public’s response to price controls offered an early example. Initially, their imposition did assuage inflationary fears and permit stimulative monetary policy. However, as George Shultz, treasury secretary in the Nixon administration, wrote in 1978:

Once the suspicion of permanence sets in, gamesmanship develops between the private and public sectors. It becomes apparent that the controls process is not a one-way street in which the government does something to the private sector; rather, it is a two-way street, with the government taking an action, the private sector reacting to it, the government reacting in turn, and so forth. It is a continual process of interplay and interrelations through which those “controlled” develop ways of doing whatever they really want to do.

Apart from wartime, before 1965, the United States had never experienced sustained high inflation. Experience with a commodity standard had conditioned the public to expect stationarity in prices. However, the sustained rise in inflation produced by stop-go monetary policy changed expectations. As the public learned that policy did not provide for stationarity in either the price level or the inflation rate, an increase in expected inflation increasingly offset the stimulative effect of the expansionary policy followed in the go phases of stop-go policy. By 1979, the Fed found itself operating in the world described by Robert Barro and David Gordon (in 1983) and Finn Kydland and Edward C. Prescott (in 1977), where the public believes that the central bank
possesses an incentive to raise inflation to lower unemployment below its sustainable value. Forward-looking
expectations on the part of the public offset the stimulative
effect of monetary policy on the unemployment rate.
Herbert Stein, Council of Economic Advisers chairman
in the Nixon administration, foresaw in 1974 the environment that Volcker inherited upon becoming FOMC
chairman in 1979:
If policy or external events slow down the growth of demand, price and wage increases abate little if at all, as everyone is looking across the valley to the next surge of inflation. Because price and wage increases persist at a high rate, employment suffers, and governments are driven or tempted to prop up demand, validating the expectation of continued or ever-accelerating inflation.

In 1980, Paul Volcker observed:
[T]he idea of a sustainable “trade-off ” between inflation
and prosperity … broke down as businessmen and individuals
learned to anticipate inflation, and to act in this anticipation. … The result is that orthodox monetary or fiscal
measures designed to stimulate could potentially be thwarted
by the self-protective instincts of financial and other markets.
Quite specifically, when financial markets jump to anticipate
inflationary consequences, and workers and businesses act on
the same assumption, there is room for grave doubt that the
traditional measures of purely demand stimulus can succeed in
their avowed purpose of enhancing real growth.

Alan Greenspan made the same point in congressional
testimony in 1993:
The effects of policy on the economy depend critically on
how market participants react to actions taken by the
Federal Reserve, as well as on expectations of our future
actions. … [T]he huge losses suffered by bondholders during
the 1970s and early 1980s sensitized them to the slightest
sign … of rising inflation. … An overly expansionary monetary policy, or even its anticipation, is embedded fairly soon
in higher inflationary expectations and nominal bond yields.
Producers incorporate expected cost increases quickly into
their own prices, and eventually any increase in output
disappears as inflation rises.

A New Nominal Anchor
By summer 1979, the United States had lost the nominal
anchor provided by a residual expectation of inflation
stationarity. The bond rate fluctuated widely at a level that
exceeded 10 percent until December 1985. The persistent
effort to change the inflationary expectations of the public,
unmoored in the prior period of stop-go monetary policy,
formed the crucible in which Volcker and Greenspan forged
a new monetary standard. At the time, the change to a preemptive policy of raising the funds rate in the absence of
rising inflation engendered fierce criticism. The abandonment of aggregate-demand management in favor of
stabilizing inflationary expectations was a departure for
unknown shores.
Volcker and Greenspan had to reduce the expectation of
high inflation manifested in the high level of bond rates.
Furthermore, financial markets had come to associate inflation shocks (relative price shocks that pass through to the
price level) and positive growth gaps (above-trend real output growth) with increases in trend inflation. After the
initial disinflation that brought inflation down to 4 percent
in 1983, the FOMC still had to convince markets that a go
phase would not follow a stop phase. It had to forgo expansionary policy early during economic recovery when
inflation had fallen but unemployment had not yet returned
to full employment. The Volcker-Greenspan expected-inflation/growth gap policy emerged in 1983 when the FOMC
raised the funds rate in response to rising bond rates despite
the existence of high unemployment and falling inflation.
Greenspan reconfirmed the policy during the “jobless recovery” from the 1990 recession when the FOMC lowered the
funds rate only gradually to work down the inflationary
expectations embodied in long-term bond rates.
As a consequence of responding to the increases in bond
rates produced by positive growth gaps, the FOMC replaced
an output-gap target with a growth-gap indicator. It raised
the funds rate in response to sustained above-trend growth
rather than waiting until a perceived negative growth gap
approached zero and inflation rose. The more expeditious
movement in the funds rate eventually convinced markets
that the FOMC would keep real growth in line with potential growth promptly enough to prevent increases
in inflation. As a result, in response to shocks, market
participants began to move the forward real interest
rates embodied in the yield curve continuously in a way
effectively estimated to return real output to potential. The
alternation of intervals of stimulative and restrictive monetary policy disappeared. Ironically, allowing the price system
to work rather than attempting to improve upon it produced
more rather than less economic stability.
RF


JARGON ALERT
Productivity
BY KHALID ABDALLA

For an economist, the word “productivity” can have
several meanings. But if you’re reading about productivity in your daily newspaper, you’re probably reading about
“labor productivity.” It is defined as the average value of output produced for every hour worked by the nation’s
employees. It is the most widely used measure of the overall
productivity of the economy. However, as a measure, labor
productivity is a blunt tool: It can show us the trends in productivity, but it can’t tell us much about how those trends
came about.
To understand what labor productivity measures,
consider the example of an aluminum factory that produces
$1,000 worth of aluminum a day and employs 10 workers who each work 10-hour days. Note that the number of labor
hours that go into producing that aluminum each day is 100. Dividing the value
of the aluminum produced by the number
of labor hours required to produce it
yields the labor productivity — in this
case, $10 an hour.
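In symbols, the calculation is just the definition of labor productivity applied to the example above:

$$\text{labor productivity} \;=\; \frac{\text{value of output}}{\text{labor hours}} \;=\; \frac{\$1{,}000}{10 \times 10\ \text{hours}} \;=\; \$10\ \text{per hour}$$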
In the United States, economy-wide labor productivity is measured by the Bureau of Labor Statistics (BLS), and figures are released every quarter. These data are closely followed by stock markets
and policymakers. Since labor productivity growth is an
indicator of economic growth, the growth of labor productivity over the previous quarter is particularly important.
One way a firm can improve labor productivity is by increasing the amount of capital it invests per worker. This is called “capital deepening.” Capital comprises plant and equipment, so capital deepening can be
achieved through expanding plant size or buying more
equipment. With more capital to work with, workers can
produce more and this could lead to higher firm revenues.
(Of course, there are limits to the labor productivity growth
that this can achieve as there is a limit to the amount of
capital each worker can efficiently utilize.)
A second measure of productivity, called “total factor
productivity” — or TFP, for short — is a broader measure.
TFP takes into account the amount of capital employed in
production in a more explicit way by measuring the productivity of the combination of labor and capital. When TFP
rises, labor productivity rises as well. The reverse, however,
is not necessarily true.
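One standard way to make the relationship between the two measures precise is the growth-accounting framework, which the article describes only informally. It writes output $Y$ as produced from capital $K$ and labor hours $L$, with a capital share $\alpha$ and TFP level $A$:

$$Y = A\,K^{\alpha}L^{1-\alpha} \quad\Longrightarrow\quad \frac{Y}{L} = A\left(\frac{K}{L}\right)^{\alpha}$$

A rise in $A$ (TFP) raises labor productivity $Y/L$ directly, but $Y/L$ can also rise through capital deepening alone (a higher $K/L$) with $A$ unchanged, which is why rising labor productivity does not necessarily imply rising TFP.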
TFP can be viewed as a measure of the level of overall
technology in an economy. We often think of technology as
it pertains to items such as computers and cars, for instance.
One might be tempted to think of TFP in the same narrow terms. However, economists discuss TFP in broader terms, defining it as all factors other than labor and capital that affect production. TFP can be influenced by elements such as the regulatory environment and managerial talent, as well as by factors more traditionally associated with technology, such as the level of sophistication of equipment designs.
From 1996 to 2006, economists recorded a significant
rise in the growth of labor productivity in the United States.
According to a 2007 study by the Congressional Budget
Office, between 1996 and 2006 the average rate of annual
labor productivity growth was 2.9 percent, compared to an
average rate of 1.4 percent from 1974
to 1995.
In recent years, labor productivity
growth has been largely driven by
robust TFP growth. But the sources of
this high TFP growth are hard to pinpoint. One theory attributes the
acceleration in TFP growth between
2001 and 2006 to the boom in information technology (IT) investment in the
1990s. These investments provided
firms with an immediate labor productivity boost due to the
effects of capital deepening. After that initial period, firms
may have developed better business practices built around
the new IT capital. These new practices could have led to an
increase in the growth rate of TFP, which in turn, translated
into higher labor productivity growth in the post-2001
period.
While labor productivity growth was strong for much of
the preceding decade, the future trajectory of labor productivity growth remains to be seen. Future trends in labor
productivity are particularly important because of the direct
relationship between labor productivity and labor compensation. In the long run, economic theory predicts that wage
growth will follow labor productivity growth. The intuition
behind this is simple: If workers are producing more, then
firms will have to increase wages to compensate workers for
their increased productivity.
However, there is debate about whether this relationship
between wages and productivity actually holds in practice.
Some point to studies which show that U.S. wage growth has
been lagging productivity growth since the mid-1970s.
Others counter by pointing out that, among other things,
many of these studies examine only growth in take-home
pay, and fail to take into account growth in the levels of noncash benefits (such as employer-provided health care) which
often constitute a major part of worker compensation. RF


RESEARCH SPOTLIGHT
Does Unemployment Insurance Discourage Work?
BY KHALID ABDALLA

“Moral Hazard versus Liquidity and Optimal Unemployment Insurance” by Raj Chetty. Journal of Political Economy, April 2008, vol. 116, no. 2, pp. 173-234.

Established under the Social Security Act of 1935, unemployment insurance (UI) is one of the largest government labor programs in the United States. In most states, UI programs replace 50 percent of a claimant’s preunemployment wage, up to a maximum benefit level, for up to six months. In a new paper, economist Raj Chetty of the University of California at Berkeley presents an evaluation of the efficiency of the UI system.

Chetty begins his paper by noting, “One of the classic empirical results in public finance is that social insurance programs such as unemployment insurance reduce labor supply.” Various studies have found that a 10 percent increase in UI benefits is associated with increases in the average duration of unemployment of between 4 percent and 8 percent. The long-established explanation for this finding is that UI benefits create an incentive for workers to remain unemployed. This incentive stems from the fact that receipt of UI benefits is conditional on a worker remaining unemployed. In the language of economics, UI benefits are said to induce “moral hazard” among workers. Such behavior is welfare-reducing — making it undesirable from a policy perspective.

However, Chetty argues that the standard view of the UI program overstates the effect of moral hazard. UI does not increase unemployment durations solely due to moral hazard; rather, there is a second channel through which UI causes longer unemployment durations: the “liquidity effect.” The liquidity effect is directly tied to the observation that many workers have limited liquid net worth at the time of job loss. These workers are unable to “smooth consumption” over the course of their unemployment. Instead, they have to make cuts in their expenditures, some of which might prove quite difficult. As a result, liquidity-constrained workers face greater pressure to find employment quickly than unconstrained workers do.

Receipt of UI benefits, however, improves constrained workers’ liquidity, allowing them to smooth consumption more easily. Consequently, they may spend more time looking for jobs that match their particular skills. In contrast to the moral hazard effect, the liquidity effect is socially beneficial.

If private credit and insurance markets were free of distortions, then liquidity-constrained workers could tap them for liquidity. However, when private market imperfections exist, the UI program can fill the gap by providing liquidity to constrained workers. In such a case, UI-induced increases in unemployment durations are due to both the liquidity and moral hazard effects, and the ratio of the two determines the extent to which UI is optimal.

“To the extent that it is the liquidity effect, UI reduces the need for agents to rush back to work because they have insufficient ability to smooth consumption; if it is primarily the moral hazard effect, UI is subsidizing unproductive leisure,” Chetty writes.
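Schematically, the decomposition at the heart of the paper can be written as follows (a simplified rendering, with notation chosen here rather than taken from the paper), where $s$ is job-search effort, $b$ is the UI benefit level, and $a$ is unconditional cash-on-hand:

$$\frac{\partial s}{\partial b} \;=\; \underbrace{\frac{\partial s}{\partial a}}_{\text{liquidity effect}} \;+\; \underbrace{\left(\frac{\partial s}{\partial b}-\frac{\partial s}{\partial a}\right)}_{\text{moral hazard effect}}$$

A lump-sum severance payment raises $a$ without changing $b$, which is why, as described below, it can isolate the liquidity term.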
Chetty takes advantage of changes in benefit levels across U.S. states to compare the effect of changes in benefit levels on the unemployment durations of constrained and unconstrained households. He finds that a 10 percent increase in UI benefits is associated with a 7 percent to 10 percent increase in unemployment durations within the constrained group. The unconstrained group, on the other hand, is far less affected by increases in benefit levels. The differential between the constrained and unconstrained groups indicates that the liquidity effect is in play.

However, Chetty notes that while this result is indicative of the existence of a liquidity effect, it doesn’t reveal its magnitude. To determine the magnitude of the liquidity effect, Chetty turns to another type of unemployment compensation: lump-sum severance payments. The effect of lump-sum payments on unemployment durations is entirely due to the liquidity effect, because receipt of the payment is not conditional on the worker remaining unemployed; lump-sum payments therefore do not induce moral hazard. He finds that workers who received lump-sum payments had longer unemployment durations than those who didn’t receive the payments. Because moral hazard is unlikely to be driving this difference, Chetty concludes that the liquidity effect is the cause.

He writes, “Using data from the United States, I estimate that the liquidity effect accounts for 60 percent of the marginal effect of UI benefits on durations at current benefit rates. This estimate implies that a benefit equal to 50 percent of the preunemployment wage is near optimal in a UI system that pays constant benefits for six months.”

Chetty’s findings are at odds with much of the previous literature on unemployment insurance. His provocative paper will likely stimulate further research on this important topic, research that will be of interest to academics and policymakers alike.

RF


POLICY UPDATE
New Farm Bill Extends Menu
BY BETTY JOYCE NASH

It’s hard to come up with an example of something in
your life that the new farm bill won’t affect in one way
or another. Everybody eats at this table.
The Food, Conservation and Energy Act of 2008, passed
despite a presidential veto in May, will require about
$307 billion to pay for its “programs, plans, institutes, partnerships, initiatives, assistance, authorities, grants, and
opportunities.” The biggest chunk, $209 billion, will go
toward nutrition programs. It tweaks the food stamp
program and changes its name, provides fresh fruits and
vegetables for poor children who receive government-funded school lunches, and doubles money for the federal
purchase of commodities such as cheese.
There is $35 billion for various supports to commodity
farmers, keeping the 1930s-era subsidy structure mostly
intact. And the bill authorizes $25 billion for conservation,
according to the Congressional Budget Office.
Farmers in the Fifth District raise livestock and grow cotton, soybeans, and corn, as well as a wide range of specialty
crops like sweet potatoes. In North Carolina, livestock
(broilers and hogs mostly) makes up two-thirds of all
agricultural production. Horticulture and greenhouse production round out the state’s agricultural picture.
For the first time, fruit and vegetable growers will get
federal help with marketing efforts and the safe handling of
products. The United States Department of Agriculture also
will spend new money on fresh food from local farms for
school lunches. There’s also money in this bill to market
local and organically grown food. In an example of how
far the bill ranges beyond the farm, money will also go to
install broadband in remote areas and to lend money to rural
businesses.
The bill’s continued subsidy system disappoints agricultural economists who track farm policy, like David Orden, of
the Global Issues Initiative of Virginia Tech’s Institute for
Society, Culture and Environment. After all, the idea behind
the 1996 “Agricultural Market Transition Act” was to reduce
subsidies. Tobacco farmers’ quotas were eventually bought
out, ending that industry’s federal support. (See Region Focus,
summer 2005.)
“[It keeps] the direct payments, some $5 billion in payments,” Orden says. The payments are controversial because
crop farmers are currently reaping high commodity prices.
But livestock farmers, of course, face rising feed costs.
Current farm income is about 50 percent higher than its
10-year average, says agricultural economist Barry Goodwin
of North Carolina State University. The value of agricultural
assets has risen over the past decade, contributing to
agricultural landowners’ wealth. Average farm household net worth was almost $900,000 in 2006, yet there was no limit
placed on adjusted gross income for payment eligibility,
Goodwin notes. The Environmental Quality Incentives
Program (EQIP) will get $3.4 billion more, a 27 percent
increase. The money goes to farmers who use environmentally safe practices. Chesapeake Bay farmers will receive
$88 million a year, double the current funding, for a wide
range of conservation efforts to staunch runoff and its
ensuing damage.
Other pieces of the bill include grants and loans for rural
water and sewer systems, farmers markets, and agricultural
research. And, for asparagus farmers, $15 million to compensate for losses due to competition from imports.
The energy industry will burn through $320 million in
loan guarantees for biorefineries that use products such as
switchgrass, corn husks, and cobs. While the bill reduced
the tax credit for ethanol blenders by six cents, it extended
the tariff on biofuel imports, like sugar-based ethanol from
Brazil. There’s even money to buy and store sugar from U.S.
growers for biofuel — never mind that sugar could be
bought for half the price if not for import barriers and
domestic production controls.
Such provisions worry Orden because the bill contains
little to position the United States favorably for a future
Doha round of World Trade Organization talks, which broke
down in July over tariff disputes. For instance, the WTO has
challenged the direct payments program because it forbids
farmers from planting fruits and vegetables on land removed
from production. A recent ruling in a case brought by Brazil
against the U.S. cotton program determined that such a prohibition was not consistent with WTO subsidy rules.
A new plan that covers commodity crops would cushion
participating farmers against low yields and falling prices.
But the plan, the Average Crop Revenue Election (ACRE),
could prove expensive. It would pay out when revenues for a
particular crop in a state fall below a trigger amount. That
amount will be calculated based on a moving average of prior
years using national-average prices and state-level yields. In
2006 and 2007, commodity prices reached record levels, so
future payouts could be huge.
On the research side, the bill creates the National
Institute of Food and Agriculture, and authorizes
$78 million to study organic food production. There’s also
$22 million to help farmers switch to organic farming and
money to train farmers and ranchers who are disadvantaged
or just starting out.
It’s a lavish soup-to-nuts buffet that expands the reach of
traditional farm bills. Agricultural policy will never be just
meat and potatoes again.
RF


SHORT TAKES

PUBLIC TRANSPORTATION
Gas Prices Boost Light Rail

(Photo: The Charlotte Area Transit System’s Lynx light rail line has exceeded its ridership projections by 40 percent since it opened in November 2007. Photography: Charlotte Area Transit System.)

Charlotte’s new light rail line, a rarity in sprawling southeastern cities, is rolling like a juggernaut. In a case of
perfect timing, Charlotte’s 9.6-mile Lynx Blue Line opened in November 2007, just as gas prices climbed. It has exceeded
passenger forecasts by 40 percent so far.
Pump prices have prompted a 2.8 percent national decline
in vehicle miles traveled so far this year. In fact, an analysis by
Cambridge Energy Research Associates suggests that, if
petroleum prices stay at or near current levels, gasoline
demand in the United States may have peaked. Consumers
are driving less and are also choosing more fuel-efficient cars
based on gasoline prices that began rising two years ago. Car
sales in the United States have declined since mid-2005, and
hybrid vehicle sales have increased by more than a third from
2006 to 2007.
Another byproduct of higher fuel prices is that more
riders are seeking out public transportation. The Greater
Richmond Transit Corporation (GRTC) reports that more
suburban dwellers are riding the bus. Vanpools have also
become more popular. Last April, nearly 3,000 more people,
15 percent more, rode vanpools than in April 2007, according
to the GRTC. And in Norfolk, Va., 32 percent more people
rode the bus in the first quarter of 2008 than in the same
period in 2007. The city broke ground on its light rail project
last fall.
Charlotte’s rail line has become a model. “Since we have
opened, from December [2007] through the end of April, we
are averaging daily about 13,000, just during the week,” says
Jean Leier, spokesman for the Charlotte Area Transit System
(CATS). The light rail system in Baltimore had almost
17 percent more riders in the first quarter of 2008 than it did
in that quarter of 2007.
While this spark of public interest may help the benefit
side of the public transportation ledger, light rail remains a
heavily subsidized way to travel. Light rail is tough for economists to accept because these systems cost big money to
build, money that typically comes from all taxpayers through
federal grants as well as state and local taxes.
Public transportation doesn’t pay for itself. “Even if every
person in a city rode light rail, the subsidy rate per rider
would still greatly exceed that of the automobile because
the fares for light rail cover only about 25 percent
of the operating cost of an additional passenger — the
remaining 75 percent is subsidized,” says Thomas Garrett, an
economist at the St. Louis Fed who has studied light rail
transportation.
Part of the problem is that driving has seemed cheap by
comparison. But drivers don’t pay the full cost of that either.
They do pay federal and state gas taxes if they drive, and that helps fund highway construction and maintenance. Yet driving involves hidden costs. Drivers pay in fuel, time, and car
depreciation, but not costs imposed on others such as pollution and congestion. That leads to a higher-than-optimal
number of cars on the road much of the time.
Many midsized cities were developed on the assumption
that car use would remain prevalent, and it’s been hard to
design and implement successful rail projects. Charlotte
planned commercial projects around its proposed rail line.
“There is a lot of density being built around the line,” Leier
says, including mixed-use developments and entertainment venues. The Time Warner Cable Arena, home of the Bobcats, Charlotte’s National Basketball Association franchise, sits
adjacent to the Blue Line. The Courtside, a 17-story building
less than two blocks from two light rail stations, has sold out
its 107 residential units. Other projects are under way.
Norfolk’s light rail route will serve the Eastern Virginia
Medical Center through the downtown, with stops at
Harbor Park, the minor league baseball stadium, among
others. Private developers are responding with transit-oriented project proposals, says James Toscano, spokesman
for Hampton Roads Transit.
For the people in Charlotte and Norfolk, these light rail
projects offer a convenient alternative to driving and parking. In general, the benefits of light rail projects like these are
concentrated, with the costs dispersed among many, many
taxpayers, according to Garrett and co-author Molly
Castelazo: “The many who stand to lose will lose only a
little, whereas the few who stand to gain will gain a lot.”
— BETTY JOYCE NASH

DRILL THOSE HILLS

Fuel Price Spike Lures New Investment
Energy firms have pumped up natural gas and oil exploration in fossil-fuel-rich West Virginia because of rising
fuel prices and improved drilling technology.


“There’s lots of excitement among major players,” says
Charlie Burd, executive director of the Independent Oil and
Gas Association of West Virginia.
Potential for natural gas within a layer of rock called the
Marcellus shale has further fueled that excitement.
Dominion has rights to drill on about 1.9 million acres, and
has leased about 205,000 of those acres in West Virginia and
western Pennsylvania to Antero Resources Corp. for $552
million and a 7.5 percent royalty. Chesapeake Energy Corp.
in 2005 bought Columbia Natural Resources, a natural gas
exploration firm with assets in Appalachia, for $2.9 billion.
This Marcellus shale formation begins in New York and
extends through Pennsylvania down the eastern spine of the
Appalachian Mountains. It holds natural gas that, until
recently, was considered too expensive to extract. But yields from a similar formation in Texas have focused attention on this West Virginia shale.
A geologist with the West Virginia Geological and Economic Survey, Lee Avary, is fielding double the usual number of inquiries from energy firms and from many landowners curious about the drilling rights contracts they’re being offered.
This untapped natural gas is good news because the state
lies so close to the large Northeast markets, Avary notes.
Appalachian natural gas is typically shipped to northeastern
utility companies, while its oil is shipped south for refining.
Another way to gauge increased activity is to count oil and
gas rigs. They’re expensive, and volatile prices make the
drilling rigs hard to plan and place. West Virginia had
32 rigs by the end of 2007, compared to 14 at the end of
2000, according to Baker Hughes, a petroleum industry
services firm.
In the Mountain State’s early history, oil and natural gas were a complete mystery — a nuisance, in fact — for salt
miners. A candle flame could cause gas “vents” to flare, and
these “burning springs” were described by Thomas Jefferson
as early as 1781. Deeper drilling yielded oil, and by 1859, 200
barrels a day came out of the ground. Oil production peaked
at 16 million barrels in 1900.
In 2006, West Virginia produced about 200 billion cubic
feet of natural gas and 1.6 million barrels of oil.
— BETTY JOYCE NASH

FRESH FOOD
More Shoppers Seek Local Flavors

(Photo: The spinach growing in the foreground won’t have far to go once it’s ready to harvest — Trail’s End Farm in Montpelier, Va., ships only within a 30-mile radius. Photography: Trail’s End Farm.)

Have you ever gotten to the grocery store only to realize
that the fresh fruits and vegetables weren’t so fresh?
That’s just one of many reasons people have become
“locavores,” adopting a lifestyle of buying and eating locally
grown produce.
Though the definition of “local” varies, it typically
describes food grown within a radius of 50 to 150 miles.
These foods are becoming easier to find. In 1994, there were 1,755 farmers markets in the United States; by 2006,
that number had reached 4,385. Ellwood Thompson’s Local
Market in Richmond, Va., sells produce within 24 to 48
hours of harvest. Customers know its origin, and farmers
aren’t forced to use as many preservatives since they ship
locally.
Trail’s End Farm, owned and operated by Sherri Cantrell
and her family, is located in Montpelier, Va. One of its
biggest challenges is keeping up with demand. “If you grow
it, you can sell it,” she says.
It’s hard to talk about eating locally without addressing
the perception that it is better for the environment. While
that’s a popular way of thinking, it’s not always true. A recent
article by Carnegie Mellon University economists
Christopher Weber and H. Scott Matthews found that
83 percent of carbon emissions associated with food are
from the production phase, while transportation represents
only 11 percent.
However, travel distance affects not only the environment but also food quality. Food may travel 1,500 miles
before it lands on your dinner table, says Nancy Creamer
of North Carolina State University. Buying locally may
create jobs within the farming and food sectors, and keep
potential revenue within a community.
At a 2007 Virginia Food Security Summit in
Charlottesville, Kenneth Meter, president of the Crossroads
Resource Center, reported that Virginians spend $8.9 billion
on food imported from outside the state. If the state were able to increase purchases from local farms by just 15 percent, Virginia farms could earn $2.2 billion of new income.
Matt Benson, an economist with the Virginia
Cooperative Extension, says the trend of buying and eating
locally will continue, but its success hinges on creating systems that benefit both farmers and consumers. “If we could
get area restaurants to buy from local farmers, that would be
a start,” Benson says.
Though it’s difficult to name one specific reason eating
and buying locally has become more popular, one thing is for
sure: It’s hard to beat the freshness factor. With the rise of
farmers markets, it’s becoming easier than ever to enjoy the
“local flavor.”
— JULIA RALSTON FORNERIS


AROUND THE FED
The Costs and Benefits of Disclosure
BY BETTY JOYCE NASH AND KHALID ABDALLA

“Should Bank Supervisors Disclose Information About Their
Banks?” Edward Simpson Prescott, Federal Reserve Bank of
Richmond Economic Quarterly, Winter 2008, vol. 94, no. 1,
pp. 1-16.

Bank supervisors monitor banks for “safety and soundness.” If investigations detect problems, supervisors can act to reduce a bank’s risk, which limits taxpayer liability. The supervisors collect, on- and off-site, a wide body
of information, such as details on problem loans. They use
this information to rate banks, and results remain private
and confidential as required by regulatory policy.
Why not let banks voluntarily disclose or require supervisors to share useful information that, incidentally, costs
about $3 billion (in 2005) to collect? If banks could disclose their risk ratings, would better information lead to
more efficient market prices of bank securities and avoid
costly, duplicate collection efforts?
Richmond Fed economist Ned Prescott built a model
to investigate whether there was a good reason to require
disclosure. He found that public disclosure of bank ratings
by supervisors can create an incentive for banks to withhold
information so they can get better ratings and gain market
favor. So, mandatory disclosure may hurt the ability of the
supervisor to collect that information in the first place.
(In the model, allowing banks to make exam results public is
the same as requiring a supervisor to share the information.)
Prescott also shows that allowing a bank to voluntarily
disclose its exam report is no better. If a bank did not
disclose its report voluntarily, the markets would assume it
withheld the information because it had a bad rating since,
if it had a good rating, it would have disclosed the information. As a result, voluntary disclosure can impair supervisors’
ability to gather information in the same way that
mandatory disclosure can — by creating incentives for
banks to withhold it. His findings demonstrate that there
are good reasons for supervisors not to share some of this
information.


“What Is the Optimal Inflation Rate?” Roberto M. Billi and
George A. Kahn. Federal Reserve Bank of Kansas City
Economic Review, Second Quarter 2008, pp. 5-27.

The Federal Reserve Act calls on Fed policymakers to
maintain price stability and maximum employment.
The optimal long-run inflation rate is the rate that best
fulfills this dual mandate. Kansas City Fed economists
Roberto Billi and George Kahn argue in a new paper
that for the Fed to carry out its mandate, its long-run inflation target cannot be zero percent per year.
According to the authors, if the inflation rate is at zero
percent, an adverse shock could easily push the inflation rate
below zero. A negative inflation rate — known as deflation
— can be particularly harmful to an economy. A positive —
but low — inflation rate could serve as a buffer against any
adverse shocks that could push the inflation rate into deflationary territory.
The authors cite studies that show an upward bias of as
much as 1 percent in the Consumer Price Index annual inflation rate and as much as 0.6 percent in that of the Personal
Consumption Expenditures (PCE) price index. As a result,
an annual inflation rate of zero percent as measured by these
indices would mean that the economy is undergoing a deflation. Billi and Kahn note that when the inflation rate is very
close to zero, the federal funds rate is likely to be near zero
as well. Thus, the ability of the Fed to lower the federal funds
rate would be restricted. This could constrain the Fed’s ability to stimulate a slumping economy.
With these issues in mind, Billi and Kahn use a macroeconomic model of the U.S. economy to calculate the
optimal annual inflation rate. They find that the optimal
annual inflation rate is between 0.7 percent and 1.4 percent
when measured using the PCE price index. This estimate is
lower than previously published estimates that had specified
the optimal annual inflation rate to be about 2 percent.
“How Do EITC Recipients Spend Their Refunds?” Andrew
Goodman-Bacon and Leslie McGranahan. Federal Reserve
Bank of Chicago Economic Perspectives, Second Quarter 2008,
pp. 17-32.

First introduced in 1975, the Earned Income Tax Credit
(EITC) is one of the largest federal government assistance programs targeted to lower-income households.
Designed to encourage work force participation, the program distributed $40 billion to 22 million families in 2004.
In a new study, Chicago Fed economist Leslie McGranahan and former associate economist Andrew Goodman-Bacon investigate the spending patterns of
EITC-eligible households during February and March, the
period in which EITC benefits are disbursed. The authors
found that these households increase average monthly spending on vehicles in February by 35 percent relative to non-EITC families. The EITC families also spent
more on other transportation costs. “Given the crucial role
of access to transportation in promoting work, this leads to
the conclusion that recipient spending patterns support the
program’s prowork goals,” the authors conclude.
RF


Unsteady State: The ongoing evolution of mainstream economics
BY DOUG CAMPBELL

In the spring of 2003, a dozen economists quietly gathered in a hotel conference room in downtown St. Louis
to talk about the state of their profession. They shared
a general malaise. In their view, academic economics had
become too narrow and too rigid, and scholarly articles too
abstract, technical, and disconnected from the real world.
“We had a sense that economists were failing in an important sense to bring economic insight to bear on public
understanding and public policy,” recalls Dan Klein, a
professor at George Mason University in Fairfax, Va., who
organized the gathering.
Out of this meeting was born a new economics journal —
Econ Journal Watch, with its premiere issue in 2004.
Published three times a year and edited by Klein, the journal
consists mainly of refereed “Comments” essays that critique
articles in other economics journals, sometimes questioning
their data, other times their premises or their logic. The
stated mission is to watch “the journals for inappropriate
assumptions, weak chains of argument, phony claims of
relevance, and omissions of pertinent truths.”
To be clear, Klein and his fellow journal organizers belong
to a specific ideological strain. (And there are plenty in the
profession who do not share their malaise. After all, the
mainstream is still, well, “the mainstream.”) Klein calls them
the “Smith-Hayek-Friedmans,” after Adam Smith, author of
The Wealth of Nations and generally regarded as the founding
father of economics; Friedrich Hayek, a Nobel Prize winner
known for his defense of free markets and contributions to
what became known as the “Austrian School” of economics;
and Milton Friedman, another Nobel Prize winner whose
work became synonymous with the neoclassical “Chicago
School” and whose essays galvanized public interest in economic principles.
Those who follow in this tradition are pretty close to
being mainstream economists, though perhaps even more
free-market tilting and not as technically oriented as those
who preside over the field’s top journals. It is not surprising
that their journal is at heart a critique of the economic
orthodoxy. But it is only one of many critiques, some from
the far end of the ideological spectrum and others rather
close to the middle.


Klein and his cohorts want to know why more economists aren’t addressing the Big Questions. Where are the
plain-spoken economists of yore who helped guide public
opinion? As Klein puts it: “There’s this lingering question of
people of my ilk — why isn’t there a Milton Friedman
today?”
Questions from other camps also abound. As is natural
during turbulent times such as these, many questions
focus on macroeconomics — the study of economy-wide
phenomena. Income inequality is widening and more
domestic jobs are being lost to free trade. The recent credit
market turmoil provides numerous examples of borrowers
and lenders making poor choices. Is economics too set in
its ways to consider alternative explanations for how
individuals and firms make decisions?
It’s a fair question. But it would be unfair to suggest that
it is going unanswered. As it is, many view the supposed failings of high-level economics as greatly exaggerated. Is
macroeconomics too theoretical? Perhaps in some cases, but
it’s unlikely you can devise workable policy proposals without first establishing a solid theory about how people will
react to those new policies. Too much math? Well, the fact is
that economics is a quantitative field. Especially for the purposes of conducting macroeconomic policy, quantitative
judgments are essential. Helen Tauchen, associate chair of
the economics department at the University of North
Carolina, Chapel Hill, says: “The inherently dynamic nature
of economic decisions, the statistical difficulties in using
nonlaboratory data, and the complication of handling interactions among strategic agents all require nontrivial
mathematical approaches.”
In this issue of Region Focus, we describe how economics
is trying to get at the Big Questions — the way the field is
embarking on a reorganization, how its members are communicating with each other and nonspecialists, and how
their research focuses are shifting.
By no means is this an exhaustive exploration of the state
of economics, and the following historical summary is just
that — a heavily abridged and simplified review to help place
these articles in historical context. We aim instead to capture the uniqueness and — most of all — the enthusiasm


that permeate the economics discipline today. In fact,
debate among economists is in some ways livelier than ever,
with universities experiencing a heyday in applications and
enrollment; blogs providing informal venues for discourse;
and exciting new research frontiers beginning to produce
real results.


A Brief History of Economic Thought
In the beginning, there was Adam Smith. The “classical
model” of the economy that is attributed to Smith — as well
as David Ricardo and John Stuart Mill — assumed that markets exhibited perfect competition; that people make
decisions based on real, not nominal, values; and that these
people are basically the same in their preferences and economic behavior. Obviously, this was an oversimplification
that limited the model’s reach. For instance, in the classical
model there are no business cycles — the historical boom-bust sequence of economic fluctuations. Output is
determined by changes in aggregate supply, which in turn is
often adversely influenced by government interference.
Hence, classical economists were advocates of a "laissez-faire," or hands-off, approach.
While the next 200 years were eventful, the classical
model maintained its dominance. But with the Great
Depression came great change in the prevailing economic
paradigm. In 1936, John Maynard Keynes published The
General Theory of Employment, Interest, and Money. Few works
have so shaken up their disciplines. Among the differences
between Keynes and his predecessors was that he provided a
model which encompassed both the macroeconomy — an
aggregate description of how the economy works — and the
microeconomy. He also put short-term conditions at the
forefront, famously remarking, “In the long run we are all
dead.”
The key to what became known as the Keynesian model
was aggregate demand. (Over the years, you see some clear
differences in beliefs between Keynes and the practitioners
who call themselves Keynesians.) Keynesians relied on the
so-called IS-LM model, which showed how demand was
impacted by changes in investment and savings (IS) and
changes in liquidity and money (LM). In this model, shifts in
consumption levels as well as investment can have an effect
on demand.
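For readers who want the algebra, the textbook IS-LM apparatus boils down to two equations. The notation below is ours — a conventional classroom rendering, not something printed in this article:

\begin{aligned}
\text{IS:} \quad Y &= C(Y - T) + I(r) + G \\
\text{LM:} \quad M/P &= L(Y, r)
\end{aligned}

Here Y is national income, r the interest rate, T taxes, G government spending, and M/P real money balances. Equilibrium is the pair (Y, r) that satisfies both equations at once, which is why a fiscal change (shifting the IS curve) or a monetary change (shifting the LM curve) moves income and interest rates together.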
Keynes himself thought people formed their expectations based on “animal spirits” and not economic
fundamentals. As a result, aggregate demand tended to move
erratically along with the mood of the marketplace.
Keynesians also believed policymakers had several key
tools with which to bring about changes in consumption
and, by extension, aggregate demand. Fiscal policy — raising
or cutting taxes — is one way that Keynesians believed the
economy could be fine-tuned.
Keynes also provided an answer to why the Great Depression occurred: High expectations about the future had fueled a stock market bubble and the economy's general overproduction of goods. The overproduction in turn reduced investment and popped the stock market bubble.
Wall Street’s crash lowered wealth and spurred low expectations about the future of the economy, both of which had
the effect of further reducing investment and consumption.
In sum, aggregate demand collapsed. To reverse the
situation, Keynesians advocated stimulating demand via
government spending.
Keynesians ruled the policy world for at least two
decades after World War II. But then the monetarists, led
by Milton Friedman, entered the picture. The monetarists
from the University of Chicago held that changes in the
money supply were the real driver of business cycles because
of their ability to change aggregate demand.
Where Keynesians believed that prices and
wages were somewhat “sticky” because markets were not perfectly competitive,
monetarists believed that expectations
about the future were stickier. These “sticky
expectations” were the main culprit in
upsetting the process of getting supply and
demand back into equilibrium. It was this
backward-looking nature of expectations
that allowed a loosening of monetary policy
to have (temporary) stimulative effects on
real production and consumption in the
economy. But that effect would wear off as
expectations eventually caught up with
increases in realized inflation. Thus, the
central bank’s main job should be to avoid
causing inflation by tightly controlling the
money supply. From monetarists came the
maxim: “Inflation is always and everywhere a
monetary phenomenon.”
In the mid-1970s, Robert Lucas articulated his "rational expectations" hypothesis, which has endured as arguably the most influential contribution to macroeconomic theory in the decades since. Lucas tended to
agree with monetarists, but he added the
notion that people form their expectations
of the future by using all available information — they are forward-looking more
often than they are backward-looking.
He also suggested that they are unlikely to
make predictable, systematic errors. While
a monetarist would have assumed people
would react to inflation only upon
experiencing it, a disciple of rational expectations believes people will see that
expansionary monetary policy could lead to
higher inflation, and thus immediately
incorporate that information into their
financial behavior.
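The hypothesis has a compact formal statement. In our notation (again, not the article's), the public's forecast of next period's price level equals the true conditional expectation given today's information set \Omega_t:

p^{e}_{t+1} = E\left[\, p_{t+1} \mid \Omega_t \,\right]

The forecast error p_{t+1} - p^{e}_{t+1} therefore averages zero and is uncorrelated with anything in \Omega_t — which is precisely the sense in which people make no predictable, systematic errors.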
The famous example is a football game —
data show that throwing passes leads to
more touchdowns than simply running the


ball. So should a team simply throw the ball all the time? Of
course not, because the defense would respond with new
formations to quash a pass-only offense. The Lucas critique
at heart pointed out what should have been obvious:
People’s behavior will change as policy changes.
Even when markets contain much imperfect information, or when firms and people face constraints on their borrowing, rational expectations theory provides a useful framework for understanding the economy. More to the point, it remedies the main problem with previous economic theories: behavioral relationships estimated from past data break down once people adjust to new policies.
Closely associated with the rational expectations
approach is “real business cycle” theory, developed by eventual Nobel Prize winners Finn Kydland and Edward C.
Prescott, which held much sway during the 1980s.
So-called RBC models emphasized the importance of the
supply side of the economy in determining output. They also
drew heavily from microeconomic principles — the rational
individual responding to incentives who tries to maximize
the “utility” of his marginal decisions over time as well as the
tendency of markets to move toward equilibrium. In RBC
models, prices and wages change rapidly.
The New Keynesians arrived in force by the late 1980s to
build upon the neoclassical/rational expectations/RBC
approaches. New Keynesians come in several forms, but in
general they believe that sticky (or slow-changing) prices and
wages are the key to understanding the effects of monetary
policy, which in turn is central to economic output. New
Keynesian models also take into account the possibility of
both demand- and supply-driven recessions.

Where Are We Now?
For macroeconomists, a leading notion is that they have
achieved a “new neoclassical synthesis,” a term coined in a
1997 paper by former Richmond Fed economist Marvin
Goodfriend and Robert King, a Richmond Fed visiting
scholar. In the 1960s, Goodfriend and King argued, the original synthesis included the acceptance of the common
optimization tools of microeconomics, a belief in the power
of sticky prices, and the need to provide useful macroeconomic policy advice.
The new synthesis marries Keynesian short-run demand
policies with classical let-the-market-decide microeconomic
policies. It combines the most compelling parts of
Keynesian and classical models with rational expectations,
monetarist, RBC, and New Keynesian theories. “There are
new dynamic microeconomic foundations for macroeconomics,” Goodfriend and King wrote. “These common
methodological ideas are implemented in models that range
from the flexible, small models of academic research to the
new rational-expectations policy model of the Federal
Reserve Board.”
One thing that should be clear at this point is that the
dominant economic paradigm has shifted significantly over
the years, sometimes abruptly, and that at any given time many
economists disagree with the prevailing economic paradigm.


The economy is, at this writing, experiencing a downturn of as yet undetermined length and magnitude.
Macroeconomic models may do very well at theoretically
evaluating the effects of various policies, but how confident
is anyone, including the people who build the most widely
used models, that they can really help forecast or understand
the economy?
At a more fundamental level, today’s questions have
centered on the perceived rigidity of the economic orthodoxy. Last year, the New York Times looked at how some
economists felt like outcasts after raising doubts about the
uniform virtues of free markets. Alan Blinder, a former
Federal Reserve Board governor, was quoted as saying that
“there is too much ideology” and that economics was too
often “a triumph of theory over fact.” Economics blogs
spent weeks debating an article in The Nation that spotlighted the “heterodox” wing of economics and described
the mainstream as smug and inflexible to new, possibly better ideas. In an April op-ed piece in the Boston Globe,
economist Richard Thaler and legal scholar Cass Sunstein
used the mortgage crisis as an example of the failure of economic orthodoxy. After the fact, it’s clear that credit was
extended to all sorts of people who shouldn’t have received
any. In response, Thaler and Sunstein favor the emerging
field of “behavioral economics,” in which “the robot-like
creatures who populate standard economic theories are
replaced with real human beings.”
Some of the criticism is to be expected, both in terms of
its timing (accompanying the downturn) and from its
sources. For example, John Willoughby, chairman of the
economics department at American University in
Washington, D.C., wonders why so many economists seem
to ignore growing bodies of research. “The rational expectations, dynamic programming models seem to me to bear
very little connection to what economists actually do when
trying to stabilize the economy,” Willoughby says. “There
are a lot of interesting things being done in behavioral and
experimental and game theory that challenge the notion
that there’s one sort of steady state to which the economy
is heading — not that most economists strictly believed
that but even as a theoretical framework I think that’s
breaking down.”
On the other hand, someone like Alan Blinder is hardly
out of the mainstream. Nor is Thomas Nechyba, chairman
of Duke University’s economics department, who worries
that macroeconomics in particular has become too theoretical. “There is a new paradigm in the more micro-based way
we are doing macro. But if it can’t succeed in explaining
actual data, the stylized facts that are out there, and do it in
more than a calibrated model with replicated facts — I think
it’s going to be in trouble.”
Tom Humphrey, who retired from the Richmond Fed in
2004, is a historian of economics who remains engaged in the
profession. Humphrey says he takes a relatively optimistic
view. By no means is economics in crisis, he says, and one
should not be overly restrictive in defining what a "mainstream" economist thinks. Even a diehard neoclassical economist might agree that in the short run people can behave
irrationally and make mistakes.

Watchdogs
One of the traditional mechanisms that define the intellectual currents in economics is the journals. As in other
academic disciplines, article submissions are vetted by other
economists before acceptance. The big journals — American Economic Review, Journal of Political Economy, Quarterly
Journal of Economics, and Econometrica to name a few — naturally tend to accept papers that agree with the worldview of
the referees. That’s not an easy thing to change so it may
take awhile for generally accepted paradigms to shift as well.
But what can accelerate the shift is an open, intellectual
exchange of the ideas, theories, and methods that appear in
the leading economics journals. At least that is what Klein
and his cohorts at Econ Journal Watch hope.

Q&A: General Equilibrium Models

General equilibrium models are the preferred tool of many macroeconomists today. To get a better understanding of these models, we asked Richmond Fed economist Kartik Athreya to explain.

What's a standard general equilibrium model?
General equilibrium refers to situations in which the desires of consumers and producers for all commodities under study are simultaneously reconciled. A standard general equilibrium model is the "competitive" one, where consumers and producers meet in markets in which both parties assume that the prices of goods are beyond their control. A competitive general equilibrium occurs when we've found a set of prices that leads households to demand precisely the amount that firms wish to produce at those prices.
At its heart, a general equilibrium model is a collection of two objects: One is a set of assumptions about the behaviors of households and firms, and the other is an "equilibrating" institution, which is how the actions of individual actors restrict each other. The behavior assumed for households is that they are utility maximizing — they make themselves as well-off as possible given their constraints. For firms, it's profit maximization. All general equilibrium models are going to have these two ingredients. The big achievement of competitive equilibrium theory was to show that "usually" — if households and firms took prices as given when optimizing and paid no attention to anything but these prices — supply would equal demand in all markets.

What's a dynamic stochastic general equilibrium (DSGE) model?
It's any general equilibrium model in which the actors must make decisions over time in an uncertain environment. Firms look forward to the future and households think about retirement — that's the dynamic part of the model. "Stochastic" refers to the fact that economic actors in the model face uncertainty. And equilibrium in this case refers to the presumption that supply equals demand in markets for goods traded both in the present as well as in the future.
In models where prices equilibrate competing interests, people's expectations of the future values of prices must be specified. In standard DSGE models, these expectations are assumed to be correct — not always, but on average.
In the context of monetary policy, people have started employing these models because they think expectations of future inflation are something important to guide the behavior of actors. These models take a big step toward escaping the Lucas critique (which states that relying on historical data is misleading because people will change their behavior based on changes in policy) because the actors are modeled as always reacting optimally to policy changes.

What do you feed into these models?
In the model, the attitudes of households and the capabilities of firms will be given mathematical representations that are summarized in a set of numbers that we call "parameters." For example, the way that people value future consumption relative to current consumption, or how averse to risk households are. In assigning numerical values to parameters, we let agents operate under current policies and then ask, "What numbers must be chosen for the parameters such that the equilibrium behavior of the model matches what we see in the real world?" This strategy is called calibration.

What do you get out of these models?
You predict outcomes for all the objects that the actors in the model care about. For households, the goal of the model is to deliver predictions of how much people will consume and work at different dates and under different circumstances, and what prices they will face. For firms, it's often how much they will produce and invest.

How big is a typical DSGE model?
They're small in the sense that I can describe a model to you in five or six equations. For most models, a single page would summarize them, and their solutions can be obtained in minutes, if not seconds, on many computers. They're big in the sense that they presume that individual actors are acting as if they perform fantastically complicated computations. The old "non-equilibrium" models were actually much bigger. The internal consistency required of the current models makes their computation grow rapidly more demanding as they get "larger" and has so far prevented most of them from getting too big.
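To make the sidebar's two ingredients concrete — optimizing households and firms plus a price that reconciles them — here is a minimal computational sketch in Python. The model and parameter values are our own toy assumptions, not a Richmond Fed model: one good, one household that supplies labor and owns the firm, one firm that hires labor, and a search for the wage at which labor supplied equals labor demanded.

# Toy competitive general equilibrium (assumed, illustrative parameters).
# Household: picks consumption c and hours n to maximize log(c) - chi*n,
#            subject to the budget c = w*n + profits (it owns the firm).
# Firm:      picks n to maximize A*n**alpha - w*n, taking the wage w as given.
# Equilibrium: a wage w* at which labor supplied equals labor demanded.

A, alpha = 1.0, 2.0 / 3.0    # technology level and curvature (assumed)
chi = 2.0                    # work disutility; calibration would choose chi so
                             # that equilibrium hours match data (here n* = 1/3)

def labor_demand(w):
    # Firm's first-order condition: alpha * A * n**(alpha - 1) = w.
    return (alpha * A / w) ** (1.0 / (1.0 - alpha))

def labor_supply(w):
    # Household's first-order condition: chi = w / c, so c = w / chi;
    # the budget constraint then pins down the hours it must supply.
    n_d = labor_demand(w)
    profits = A * n_d ** alpha - w * n_d
    return 1.0 / chi - profits / w

lo, hi = 0.1, 5.0            # bisect on the wage: excess labor demand
for _ in range(60):          # falls as the wage rises
    w = 0.5 * (lo + hi)
    if labor_demand(w) > labor_supply(w):
        lo = w
    else:
        hi = w

print(f"equilibrium wage  w* = {w:.4f}")               # about 0.96 here
print(f"equilibrium hours n* = {labor_demand(w):.4f}") # analytic answer: alpha/chi

This toy has a closed-form answer (n* = alpha/chi), so the price search is overkill, but the structure — preferences, technology, and a hunt for market-clearing prices — is exactly the recipe the Q&A describes, and "calibration" amounts to choosing numbers such as chi so that the model's equilibrium matches what we observe.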


Klein does not think his publication has spurred the leading journal editors
to reexamine their product. What he thinks is that his journal’s very existence and continued financial and intellectual
support is testament to the willingness of the economics
discipline to embrace new and improved ideas. And while
the field of economics in 2008 may not have its own Milton
Friedman, Klein thinks it’s a good sign that more people are
at least talking about the absence of such a figure.
He says: “Clearly today there is more empirical work
going on, and I think model building has come down a
notch; so-called theory is continuing to come down in
prestige and that’s a good thing … so I think that I’m ready
to believe that things are getting better. I sure hope so.”
If economics is itself a market, then the best models
should rise to the top. Today, there are more ways to percolate new ideas than ever — from a widening array of
journals, to blogs, to curricula in college classrooms, and to
a surprising run of New York Times best-selling economics
books. Then again, the process of rising can take some time.
In 1970, it would have been difficult to find an economist

who believed the Keynesian paradigm would be dead 10
years later. As for today’s paradigm? Perhaps we’ll know in
10 more years.
RF

READINGS
Friedman, Milton. The Optimum Quantity of Money and Other
Essays. Chicago: Aldine, 1969.
Goodfriend, Marvin, and Robert G. King. “The New
Neoclassical Synthesis and the Role of Monetary Policy.”
Federal Reserve Bank of Richmond Working Paper 98-5,
June 1997.
Keynes, John Maynard. The General Theory of Employment,
Interest, and Money. London: MacMillan Press, 1936.
Lucas, Robert E. “Expectations and the Neutrality of
Money.” Journal of Economic Theory, April 1972, vol. 4, no. 2,
pp. 103-124.

Economist, Study Thyself
The way economists are trained has come a long way in the
past 20 years. Has it come far enough?
BY DOUG CAMPBELL

A major in economics, once as popular as an 8 a.m.
lecture, lately finds itself in high demand. Universities across the nation report a growing number
of undergraduates entering their programs in economics.
At the graduate level, competition for admission to the
top schools is just plain brutal.
Let’s turn to the empirical evidence: According to the
Digest of Education Statistics, the number of economics
majors at U.S. universities jumped 22.5 percent between 2001
and 2006; the number of master’s students was up 37.5, while
the number of doctorates grew by a much tamer but still
strong 9.3 percent. To be sure, an economics degree is by no
means dominant on most campuses — it still represents only
about 1.6 percent of all bachelor's degrees conferred in the
United States. On the other hand, growth in economics degrees is almost 4 percentage points higher than growth in total
degrees. And the popularity of economics appears to have
come at the expense of some other traditionally popular


degrees — the number of sociology bachelors, for example,
actually dropped 5.7 percent between 2001 and 2006.
And now, in the parlance of the discipline, some stylized
facts from the Fifth Federal Reserve District, which reaches
from South Carolina to Maryland: At Duke University, one
in four undergraduates majors in economics. At George
Mason University, applications skyrocketed after faculty
member Vernon Smith won the Nobel Prize in economics.
Clemson’s pool of economics majors has increased 65 percent in the past four years alone; Wake Forest University’s
doubled in just the past year.
But don’t get carried away. For while it’s true that economics is enjoying a period of perhaps unsurpassed
popularity on college campuses, there is no shortage of questions about its direction. Chiefly, some faculty members
worry that the core curriculum — particularly at the graduate level — is becoming too technical, too theoretical, and
fails to address relevant policy questions. A Ph.D. program


can teach students how to build an impressively complicated mathematical model — so is it really just training people
how to be good at math and theory, and ignoring practical
applications that might help end poverty, grow employment,
and improve the general welfare? After all, if an economist
can’t address those questions, what’s the point of being an
economist?
“This is a concern I’ve had as long as I’ve been in the profession: As we get more math, we get less interesting,” says
Doug Pearce, economics chair at North Carolina State
University.
But for every academic economist who feels that way,
there almost certainly is a counterpart who is less discouraged. Peter Murrell, economics chair at the University of
Maryland, agrees that first- and second-year graduate
courses tend to lay the math on thick, but “beyond that, and
especially at the dissertation stage, we are producing students who are studying some unbelievable topics.” Indeed,
graduates from the most technical economics programs in
the United States who can also devise answers to practical
questions are in high demand at research institutions.
In their influential 1987 paper, “The Making of an
Economist,” David Colander and Arjo Klamer rebuked
graduate education in economics at the top schools for a
perceived overemphasis on technique and an avoidance of
practical applications. Recently, Colander revisited this
topic with the idea of evaluating whether any change had
happened. As his surveys show — and our interviews with
department chairs across the Mid-Atlantic confirm — much
has changed in academic economics over the past 20 years.
There is still plenty of math and theory, of course, but there
are more practical applications than ever.

Big Major On Campus
When people talk about the on-campus popularity of economics, they are usually referring to the undergraduate level.
Among academic observers, the consensus is that students
who formerly saw value in a variety of other social science
degrees now view economics as more worthwhile.
Some attribute the growing cachet of an economics
major to the "Freakonomics" phenomenon. Stephen Dubner
and Steven Levitt’s popular 2005 book turned on a new generation to the fun and virtues of economic analysis. But
department chairs interviewed for this article discounted
the Freakonomics effect, arguing that growth in the discipline began at least a decade earlier, and that it’s still a rare
18-year-old who has read the book.
Granted, economics is sometimes looked at as the poor
man’s business degree. To the question: “What can I do with
an economics major?” an economics blogger joked:
“Anything you could do with a business degree only for less
money.” But the money isn’t bad for recent graduates.
According to the National Association of Colleges and
Employers, economics graduates got average starting salary
offers in 2007 of $47,782, compared with $35,092 for history
graduates.

The benefits of an education in economics are fairly clear.
At the introductory level, the math is basic and the lessons
practical. It’s a useful background when it comes to landing
a job. “Businesses increasingly realized that people studying
economics have two valuable skills,” says Raymond Sauer,
economics chair at Clemson University. “They develop their
analytical skills and skills for working with data. If you can
think about data, analyze it, and communicate your findings
to management, that’s a valuable set of skills that are relatively scarce among other degrees.”
The popularity varies by school, of course. At Duke, economics chair Thomas Nechyba attributes the growth and
appeal of economics in part to the school’s lack of a business
degree. Meanwhile, West Virginia University has only 100
economics majors; director William Trumbull believes that
the existence of a strong business program lures away many
would-be economics majors.

Doctoral Doubts
Graduate economics is likewise experiencing a heyday in
terms of enrollment. Bolstered in large part by a surge of
international students — for whom the value of a U.S.
economics degree is huge — department chairs say that
admission standards are extremely high right now. But
whereas there is little debate about the real-world value of an
undergraduate economics degree, the same thing can’t be
said at the graduate level.
The overarching concerns are twofold and related: First,
there is worry that the high-level math that graduate
students endure during their first two years is unnecessarily
grueling and, sometimes, unconnected to the curriculum
that follows. Second, there is unease that economics risks
losing its connection to real-world problems because of its
focus on theory and complex models. This second concern is
most acute in the subfield of macroeconomics, which studies forces that affect the entire economy, such as inflation
and growth. (By contrast, microeconomics is chiefly interested in individual decisions and markets within the wider
economy.)
These are long-standing perceptions, well articulated 20
years ago by economics journalist Robert Kuttner, who complained that economics departments were "graduating a
generation of idiot savants, brilliant at esoteric mathematics
yet innocent of actual economic life.”
The math that graduate economics students take in their
first two years is not to be trifled with. Andrew Foerster, who
begins his third year at Duke University’s graduate program
this fall (and who worked two years as a research associate
with the Richmond Fed), sees good and bad in the system.
It may have the effect of unnecessarily warding off some
otherwise perfectly capable would-be economists, he says,
and the disconnect between undergraduate and graduate
curriculum is conspicuous. “It’s certainly grueling, but perhaps not always unnecessary,” Foerster says. “It’s a lot more
mathematical and less graphical … it’s certainly a transition,
and one that I think a lot of people who are good students


have a difficult time making.” But with math, Foerster says,
students are better prepared to engage in economic discourse at the highest levels.
At the University of South Carolina, economics chairman
Randolph Martin says he is impressed with the depth of
knowledge displayed by today’s young economists. But he
wonders whether some programs go overboard in their
preparations. “Sometimes I wonder if a question is worth all
this gunpower they’re throwing at it?” Martin says. “I don’t
want to underplay the tools that they’re taught … but even
with the young turks in the applied kinds of areas, I wonder
whether their work has some relevance to the world and not
just pure theory or at such a high-level of analytics that you
don’t know what you get out of it.”
Robert Whaples is economics chair at Wake Forest
University, which doesn’t have a graduate program. But
Whaples is an economic historian who pays attention to the
economic zeitgeist and he is concerned about the direction
of graduate education, particularly as it applies to macroeconomics. In a review of The Making of an Economist, Redux,
Colander’s follow-up to his 1987 work, Whaples laments that
the very principles of economic thought tend to be forgotten at the graduate level. “You thought that economics was
all about Milton Friedman vs. John Maynard Keynes? Think

again. Mundane issues like monetary and fiscal policy aren’t
abstract enough,” Whaples writes. “The payoff in economics
is for novelty and cleverness. … The incentives are to show
that you are ‘smart,’ not necessarily that you are wise or
learned.” (Though, to be fair, there is still a large amount of
work being done at top graduate programs on monetary and
fiscal policy that is helping economists to illuminate and reconcile the views of Keynes, Friedman, and others.)

The Ivory Tower Problem
Beyond technique and methodology, there is the second
related problem: ensuring that what gets taught at the
graduate level has at least some application to the real world.
For example: At Georgetown University, former economics
chair Matt Canzoneri notices a general trend in academia
away from cultivating economists who want to make policy.
What they want is to publish, which — no coincidence — is
the way to tenure and general peer recognition. “Here and in
other institutions over the last 10 years, there’s been
more emphasis on theory and math and econometric
modeling, and we’re losing all the applied policy type people,” Canzoneri says. “The ‘Brookings’ style person is
disappearing from academia and the rewards are going to
those who publish in refereed journals … that's a trend that I'm not too happy with."

Ph.D.-Granting Economics Programs in the Fifth District
American University
Washington, D.C.
Chairman: John Willoughby
Graduate Students: About 100 Ph.D. in residence
Full-time Faculty: 21 professors
Departmental Paradigm: A split between heterodox
and mainstream
George Washington University
Washington, D.C.
Chairman: Robert Phillips
Graduate Students: 18 M.A., 97 Ph.D.
Full-time Faculty: 29 professors
Georgetown University
Washington, D.C.
Chairman: James Albrecht
Graduate Students: About 65 Ph.D. in residence
Full-time Faculty: 28 professors
Johns Hopkins University
Baltimore, Md.
Chairman: Joseph Harrington
Graduate Students: 54 in residence
Full-time Faculty: 14 professors


University of Maryland
College Park, Md.
Chairman: Peter Murrell
Graduate Students: 130 in residence
Full-time Faculty: 37 professors
North Carolina State University
Raleigh, N.C.
Chairman: Doug Pearce
Graduate Students: About 140
Full-time Faculty: 21 professors

Duke University
Durham, N.C.
Chairman: Thomas Nechyba
Graduate Students: 81 Ph.D. in residence
Full-time Faculty: 38 professors
Departmental Paradigm: An emphasis on crossing
subdisciplinary boundaries in the social sciences
University of North Carolina, Chapel Hill
Chairman: John Akin
Graduate Students: 95 in residence
Full-time Faculty: 23 professors

University of North Carolina, Greensboro
Chairman: Stuart Allen
Graduate Students: 13 in residence
Full-time Faculty: 14 professors

Clemson University
Clemson, S.C.
Chairman: Raymond Sauer
Graduate Students: 56 Ph.D. in residence
Full-time Faculty: 25 professors (with new slots being added)
Departmental Paradigm: A blend of the Chicago and Virginia school traditions

University of South Carolina
Columbia, S.C.
Chairman: Randolph Martin
Graduate Students: 12 Ph.D. in residence
Full-time Faculty: 15 professors

University of Virginia
Charlottesville, Va.
Chairman: William Johnson
Graduate Students: 100 in residence
Full-time Faculty: 32 professors

Virginia Tech
Blacksburg, Va.
Chairman: Hans Haller
Graduate Students: 22
Full-time Faculty: 15 professors

George Mason University
Fairfax, Va.
Chairman: Don Boudreaux
Graduate Students: 160 Ph.D. in residence
Full-time Faculty: 35 professors
Departmental Paradigm: You name it — from Austrian to Public Choice to Experimental

West Virginia University
Morgantown, W.Va.
Chairman: William Trumbull
Graduate Students: 50 in residence, with up to 12 graduating each year
Full-time Faculty: 19 professors
Departmental Paradigm: Tends toward free-market orthodoxy

NOTE: Figures are estimates or based on information accurate as of June 2008 and may depend on a department's affiliation with other departments. Except as specified, graduate student figures include both Ph.D. and master's programs.

The issue is not so pressing with microeconomics, which has blossomed in recent decades. But in macroeconomics, there is a large disconnect between what undergraduates and graduate students learn about economics. The problem, however, may not be because macro has become less rooted in reality while micro has not. The problem could be that economists have yet to find a better way to present the insights of necessarily dynamic macro models to undergrads.

At the undergraduate level, students learn basic Keynesian economics — about aggregate supply and aggregate demand, and the famed IS-LM model, which shows how changes in investment-savings and liquidity-money supply affect national income. These are useful lessons that teach students about models and how to use them in studying policy questions. But they are somewhat outdated.

In graduate school, Keynes is quite literally dead, and suddenly students are transported to the world of Robert Lucas and rational expectations, paving the way to the main tool of macroeconomists: dynamic stochastic general equilibrium models (see the "Q&A: General Equilibrium Models" sidebar in the preceding article). The result is a double whammy — the jarring intellectual transition that students endure as they move to the graduate level, and then the ensuing observation that dynamic stochastic general equilibrium models have their own problems. For while these models strive to more accurately portray how the economy really works, they sometimes tend to fall short and the complexity can frustrate students.

Here is how one student whom Colander surveyed put it: "The macro courses are pretty worthless, and we don't see why we have to do it, because we don't see what is taught as a plausible description of the economy."

Meanwhile, an interesting side effect of the waning interest in graduate macroeconomics is the relative dearth of Ph.D. macroeconomists in the job market. At West Virginia University, chairman Trumbull says that he has constant difficulty finding suitable candidates for macro slots. "You've got to be doing numerical analysis, computable general equilibrium stuff, and we don't have that [among faculty members]," Trumbull says.

Forward Thinking
All of this seems to point to a discipline in trouble. But if you take a step back, it's easy to see that the debates going on inside economics are no more heated than in other fields. And they are useful debates. A survey of economics departments in the Mid-Atlantic shows that, on these campuses at least, academic economists are constantly reevaluating their approaches to training the next generation of economists.
American University’s John Willoughby likes to describe
his program as one that aims to present the vast array of economic perspectives. American’s is one of a handful of
departments that does not scorn “heterodox” economists —
those who tend to break from mainstream thought on everything from the virtues of free trade to the rationality of
individuals. At the graduate level, students can choose
between the mainstream theory track or the heterodox
theory track, and every doctoral student must take at least
one class in the other track.
“There is a disconnect at the highest levels,” he says.
“So many graduate students who go into economics have
received a monolithic view of what economics is, and they
are less prepared for the real variety that exists.”
Willoughby’s definition of monolithic might differ from
some other department chairs. American is unique in its
employment of many radical economists. But other economics programs in the Mid-Atlantic can hardly be
characterized as monolithic. Georgetown’s Canzoneri is
proud of the saltwater/freshwater diversity of his faculty,
referring to the historical split between the coastal (more
steeped in Keynesian economics) and the inland (monetarism and New Classical) schools. At Clemson, the
emphasis is squarely on applied policy economics, with
“almost no effort to train people as economic theorists,”
chairman Sauer says. George Mason is the “most methodologically diverse Ph.D.-granting institution in the
English-speaking world,” says chairman Don Boudreaux.
“We have armchair theorists, Austrians, and even experimental economists who aren’t sure the demand curve slopes
downward unless they test it in a lab, and public choice people who produce multiple regressions.”
As for the core curriculum, it is inarguably true that the
first year or two of graduate economics education is loaded
with skull-cracking math. But after that, it is important to
note, there is a shift to encouraging creativity. In their first
years, students are equipped with the tools necessary to conduct high-level economics. Then, they can be unleashed to
grapple with the ultimate goal: to generate new knowledge,
as Joseph Harrington, economics chair at Johns Hopkins
University, put it. To do that, students need to be able to not
only answer questions, but to also ask the right questions. “It
can be a considerable challenge to get students accustomed
to posing a question, when almost all of their educational
experience has involved being given a question and then told
to answer it,” Harrington says. “The intent is to reach a balance between the teaching of mathematical methods
essential to economic analysis and the development of a
mind for independent inquiry.”
It is in fact something of a movement. At the University
of North Carolina, Chapel Hill, there is no backing away
from the emphasis on math in early graduate education, but
there is a recognition that other talents need to be developed too. “Mathematical ability and training are very
important for Ph.D. economists but other skills are as


important,” says Helen Tauchen, director of graduate studies
and associate chair at UNC. “In particular, the best economists are also creative, have excellent economic intuition,
and can work independently.” Toward that end, the Ph.D.
program was recently revised so that students start writing
research papers and thinking about dissertation research
topics sooner.
Likewise at the University of Virginia, faculty members
noticed that many students were having difficulty in transitioning to the research portion of their studies, maybe
because they had spent the first part so immersed in learning methodological tools. “So we have recently changed our
program to try to get students into the activity of writing, of
doing research, of thinking about good research questions
and how to attack them as early as the second year of the
program,” says William Johnson, economics chair at
Virginia. “It’s too early to tell whether this is working, but we
are optimistic.”
George Mason’s Boudreaux says that some 20 years
ago, his attitude about university economics was
decidedly pessimistic. But today he holds the opposite view
— he brims with enthusiasm that most academic economists have learned the lesson that, no matter how powerful
their tools, they won’t be able to predict the future. “At
George Mason, we don’t even try to do that, it’s not
even possible,” Boudreaux says. Instead, his faculty tends
toward empirical analysis and stays away from teaching
abstract modeling.
A growing sentiment is that the “too technical/too theoretical” critique of graduate economics may be outdated.
Peter Murrell, economics chair at the University of
Maryland, acknowledges that as recently as 1990, he might
have agreed with the detractors. But today, Murrell sees universities as unleashing highly skilled practitioners on highly
practical topics. “This is a very good time to be in economics education,” he says. “Not only is there powerful interest
in the field, but I think economics is more interesting than
ever before. The types of topics we attack, the way we can
produce fundamental application lessons for public policy
— it’s a great time to be an economist.”
Hearing of such approaches, David Colander finds himself pleased. Granted, macroeconomics remains a problem
spot, he believes. By no means does he — or most academic
educators in general — believe that macroeconomics has
taken a wrong turn in the way it is taught. Instead, Colander
recommends that the core macro curriculum be limited to
courses on institutions and how they work, as well as introducing dynamic stochastic general equilibrium models —
but leaving the use of such models to upper-level classes for
students headed into macroeconomics.
Colander readily admits that his 1980s research on graduate economics education probably had little influence in
changing how economists are made. But he believes that
“The Making of an Economist” struck a chord, or expressed
a near universal concern among academic economists. Today,
the focus is on helping to equip economists with proper and


effective tools for attacking real problems. The math
remains intense, Colander agrees, but because the admission
process at top graduate schools is so rigorous, few students
can’t handle it.
“Economists are still economists. What they do is model,
and that hasn’t changed,” Colander says. “But economics is
reasonable and does change, and it’s changed more toward
what we need, with more empirical work and loosened up
theory. That happened on its own, not because of a report.”
At least, that’s his theory.
RF

READINGS
Colander, David, and Arjo Klamer. “The Making of an
Economist.” Journal of Economic Perspectives, Fall 1987,
vol. 1, no. 2, pp. 95-111.
Colander, David. The Making of an Economist, Redux.
Princeton, N.J.: Princeton University Press, 2007.

Econblogs
Economists think out loud online
Hundreds of economic blogs have sprung up on the Internet, many written by academics. What gives? How did economics become so popular?

BY BETTY JOYCE NASH

[Illustration sidebar — My Blog List: Marginal Revolution, The Big Picture, Calculated Risk, Freakonomics, Greg Mankiw's Blog, Mish's Global Economic Trend Analysis, Economist's View, Real Time Economics, Paul Krugman, Robert Reich's Blog]

Dani Rodrik launched a blog in 2007 and now he's in too deep to quit. "I still get the thought that maybe I should stop," he says. "It does take time."
But the Harvard economist finds the blog — short for Web log — useful because it serves as a reference catalog for his ideas. "I now constantly Google my own blog for ideas that I knew I had at some point," he says. "Previously, the ideas would have come and gone. The first good thing is that I have them a little more developed, and, secondly, I can actually recover them."
Some 113 million blogs range from engineering to poetry to diapers to sunsets, you name it. Economists' blogs occupy an impressive niche in this new social media universe. The authors of the best-selling Freakonomics, for instance, write a blog hosted by the New York Times that bobs around in the

top 60 of all blogs, according to the authority of Web log
traffic, Technorati. And the top 10 economics blogs appear
in that list’s top 5,000, according to economist Aaron Schiff,
who uses Technorati data to rank economics blogs on his
Web site. He chalks the popularity of the econblogs up to
the zeitgeist into which books such as Freakonomics, Tim
Harford’s The Undercover Economist, and a raft of others
have tapped. “The public is increasingly realizing that economics has a lot of useful things to say about their daily
personal and business lives,” Schiff notes. “And economists
are becoming better at communicating in relatively plain
language.”


Find and Link
Over the past decade, Web logs have evolved from mere
collections of links into vehicles of expression that use
graphics, audio, and even video. Many bloggers — authors of
Web logs — invite readers to post comments, and that
creates a forum for worldwide public conversations.
“Now, thanks to the Web and blogs, the public can
participate easily,” Schiff says. In the two-way exchange,
both sides learn. Comments, however, can occasionally be inappropriate and take time to monitor. That's why
economist Greg Mankiw disabled the comments feature on
his blog in 2007.
Rodrik’s blog attracted immediate attention, most likely
because he’s a well-known academic. The blog attracted a
post from Harvard colleague Mankiw, an early and widely
read blogger and also a high-profile academic. The new blog
went from about five hits a day to roughly 6,000.
Blogs can form bridges across disciplines and connect
readers from disparate backgrounds. Rodrik records
thoughts on his blog at least five days a week, and sometimes
links to empirical research, often inspiring swift commentary of high quality.
“I’m also struck by how I get pushback,” he says. “I’m
known for a certain kind of views. I hear from certain
readers who are critics of those views, which is great — it
shows me that I’m not just preaching to the converted.”
As bloggers post comments and link to academic
papers, readers can shortcut to the expanding body of
economic research. Blogs’ historical antecedents lie in
letters, conferences, pamphlets, journals, seminars, informal
lunches, and watercooler chats.
But the immediacy and range of this particular channel is
unprecedented. “In the past I think it was very hard for
specialists in a field to communicate with nonspecialists,”
Schiff says. “This has changed dramatically in the past 10
years or so, and I think it’s a great thing.”

Explanatory Economics
Blogs may offer the best way to follow unfolding economic
events, says Tyler Cowen. He co-authors the blog Marginal
Revolution with his George Mason University colleague
Alex Tabarrok. Marginal Revolution was one of the first of
its ilk in 2003 because “we saw there was a scarcity of excellent economics blogs and thought we could make our mark,”
Cowen says.
And they have: It often ranks first or second among economics blogs on Schiff's Web site, along with Freakonomics.
Economics blogs can penetrate complicated news stories
about the economy because economists just “understand it
better than most journalists,” Cowen says.
While the prose in economics papers can be obscure and
hard to follow, economics bloggers explain difficult concepts
and place research in context.
Economics research in particular lends itself to blogging
because there’s a bottom line. “With economics, you state
the main empirical result in a paragraph, link to the paper, to


some definite claim,” Cowen notes. “It’s a dialogue, people
link back and forth, add to each other’s points. So there’s
this open window into the world of economics that you
don’t get in other fields.” Most of his readers are not economists, he says, yet they offer important insights. And Cowen
ranges widely on the blog — from food to country music, for
instance — complete with revenue-producing links to
Amazon.com.
“I find [the blog] keeps me very sharp especially because
you have open comments. If you say something wrong, you
get zapped immediately.”
Even a cursory review demonstrates that blog posts can
touch nerves, yet remain civil — even friendly. Some veer
toward ideology, and that defines a certain readership, from
free-market blogs to liberal Paul Krugman’s blog at the
New York Times.
“Blogs need to distinguish themselves from one another,
and one way to do that is by ideology,” Schiff notes. “I would
say that Freakonomics and Marginal Revolution are pretty
neutral,” he observes. “On the other hand, Paul Krugman is
very political and Greg Mankiw somewhat less so.”
This dissemination of economic thought and the accompanying controversy seem positive. Economist John
Whitehead says he catches heat on the blog Environmental
Economics that he writes with co-author Tim Haab. While
his “geeky” research ideas don’t spike traffic, his posts about
global warming economic policies do. Take the debate about
whether carbon taxes will reduce greenhouse gas emissions
more effectively than cap-and-trade policies. “The party line
[in economics] is that carbon taxes are superior for dealing
with climate change,” he says, adding that he supports a cap
on carbon emissions and the trading of those allowances.
“I get ripped pretty hard from economists about that,” he
says. “Every time I mention cap and trade I get a flood of
comments.”
Policy economists, of course, find the blog an essential
tool. On Mother’s Day, Diane Rogers started the Economist
Mom blog, “where analytical rigor meets a mother’s intuition.” She wanted to go beyond conventional research
papers, conferences, and issue briefs to bring discussions
about fiscal responsibility to a wider audience. “It’s such a
big and important issue for the future of our economy, the
economy our children will inherit.” Rogers works for a nonprofit advocacy group in Washington, D.C.
The popularity of these econblogs can only enhance
economic education. Every day, Cowen receives 70-some
blog-related e-mails. “This notion that you can wake up
every day and read the top minds in the field talking to each
other … I think it’s phenomenal and it’s all free. People still
underestimate what a breakthrough this is, for economists
and the world of ideas in general.”

The Podium
Blogs enhance economics instruction, professors say, with
timely examples that textbooks can’t provide. Try it. Sit
down with an economist via blog for cyber conversations


about taxes or global warming or gas prices or strategies in
wine gifting or ways to divide housework. Those two latter
ideas come via Tim Harford, an economics columnist with
the Financial Times who also writes a blog.
Readers can enjoy lively debates, sometimes accompanied by YouTube videos. Harford and behavioral economist
Dan Ariely of the Massachusetts Institute of Technology
conducted such an online exchange last spring about the
assumptions of irrationality in economics. A subsequent
video post showed Ariely debating a picture of Harford
pasted above a sofa.
Blogs replace the office door for economist Craig
Newmark of North Carolina State. He used to clip and post,
but now does so virtually on his blog, appropriately titled
Newmark’s Door, started in 2002.
“One thing I’ve found recently is that I’ve had more than
a few students tell me that they are learning from my blog,”
he says. (Students are often surprised that he blogs. Go
figure.) It’s no accident that many economics bloggers also
teach. “People who teach feel they have something to
communicate,” Newmark says.
While Newmark blogs purely for pleasure, he says the
blog earns him and his wife about $10,000 a year. He gets
some 400 hits a day, but that was bumped up in January and
February to 650 for reasons that are unclear to him, he says.
On the downside, blogs can use up valuable research
time. A successful blog takes effort to prepare and maintain
because it requires more than an occasional post. Instead,
blogs need regular updates. People “visit,” if not every day,
several times a week. For that reason, comments and
responses take on a familiar, informal tone. “If you look at it
as a snapshot, you miss a big part of what is going on,”
Cowen notes.

Blogging and Big Ideas
OK, so maybe this generation of blogging economists won’t
extract a deep enough insight to win the Nobel Prize in
economics in 30 years, but you never know. The effects of
blogs on traditional academic research are unquantifiable.
But research can circulate via blogs, and the collegial nature
of the virtual economics department inspires research.
Since we don’t know how great minds detect the germ of
an idea, a blog is as good a way as any to generate inspiration,
says economist William Trumbull, who heads the economics
department at West Virginia University. “Where, for example, did John Nash get his ideas for game theory?” Trumbull
asks. “It could have been some chance thing, a snippet of
a conversation he overheard. It could have been no more
significant than something you’d read on a blog.”
Academic currency, however, is measured by the number
of publications in traditional refereed journals. Blogs seem
unlikely to affect that content, says economist Daniel
Hamermesh, “except to the extent that the time people
spend writing the stuff reduces the quantity and quality of
their research.” Hamermesh guest blogs for Freakonomics.
Yet blogs popularize research, explain it, bring it to a

wider audience with a mere plug and a link, and can also
broadcast ideas that may interest nonacademic publishers.
Marginal Revolution has led Cowen to a book contract and
a column in the New York Times.
And all this would be impossible without an audience,
the readers who participate in this social and quasi-academic enterprise. "People tell you about new stuff,"
Cowen says, and that sets his mind in motion. Plus, he reads
widely to keep Marginal Revolution fresh and lively; while he
doesn’t spend more than an hour or so actually writing a
post, he’s up late reading. But he’d be doing that regardless.
Posts and ensuing comments provide value and insights.
It’s more than just a new channel. It changes the way people
think and track ideas, and could ultimately influence and
affect scholarship, for better or worse.
“University professors spend a lot of time talking about
ideas,” Newmark says. “If you go to lunch, 50 percent of the
talk is about ideas; now we can widen that conversation.”
While he doesn’t want to exaggerate that impact, “it has
more than zero effect.”
But publication in academic journals remains the
priority. “I think any exposure you might get through blogging is just an additional side benefit,” Schiff notes.
Blogs could affect research choices and that’s not necessarily negative. “Ultimately academics will care about
getting published in journals and the opinion of other economists matters more than blogging,” Schiff adds. To Cowen,
blogs enhance research. “So many academics and economists work on little things that nobody cares about,” he says.
“If this brings a shift from that, then that’s for the better.”
If a paper is unsound, experts instantly weigh in. That’s an
immediate and public check that differs from the mysterious referee process.
“It definitely gets people to work on more popular topics.
For me that’s a good thing,” he continues. “It gets people
to write more clearly, [for] more people than your 20
specialists.”
Blogging also hones research instincts. Whitehead, now
that he’s blogging, reads more, including other blogs.
“It used to be I’d have a geeky research paper and be at a
loss at the end about how to sell it in terms of policy and
practical applications,” he says. “Now, I always seem to have
a handle on what makes the paper halfway important or
what policy it can be applied to.”
Blog technology could also speed publishing.
Professional organizations could sponsor blogs enabling
real-time discussions on papers rather than formal comments years after publication. “Research could be a whole lot
more efficient,” Whitehead says. Already, journals publish
online as soon as they’re ready. But the Internet could speed
the discussion and research part of it.
There is a trend toward open publishing, such as
the online journal The Economists’ Voice, edited by Nobel
Prize winner Joseph Stiglitz, along with co-editors Brad
DeLong and Aaron Edlin. The journal’s editors publish
articles, often by prominent intellectuals, that are often
picked up by major newspapers.
Bloggers see their Web sites as complements to scholarship rather than substitutes. “I can’t really see any negative
effects from that,” Schiff says. “And the obvious positive
effect is that it exposes people to things that otherwise
might only get published in academic journals.”
Still, it’s not clear that blogging can enhance a career.
Most, if not all, economics bloggers write from the lofty
position of tenure. But not all bloggers in every academic
discipline do. Rodrik probably wouldn’t blog if he were seeking tenure at a top academic institution. “I guess I’m
sufficiently established that I don’t give a damn,” he says.
The blogging wave may have crested but there’s always room for another voice, however difficult to discern among
the cacophony. “For someone like me who is less well known
[than Krugman or Rodrik],” Schiff says, “it takes a much
longer time to build up a readership.”
As technology evolves, so will the blog, its authors, and
dynamic audience. The whole enterprise may embody the
ideal of the influential economist Friedrich Hayek, who
believed in the power of decentralized, unplanned activity
— “spontaneous order.” While there’s no coordination
per se, it’s kind of a market where rules emerge, Craig
Newmark says.
Perhaps it’s not surprising that some of the Austrian
thinker’s devotees have a blog called Café Hayek.
RF

Searching for Homo Economicus

BY STEPHEN SLIVINSKI

The audience that gathered in the ornate concert hall
for that night’s ceremony probably noticed the
similarities of the two guests of honor standing next
to each other on stage. Both wore tuxedos accented by
white bowties and vests as was appropriate for the occasion. Both wore glasses and were about the same height.
But the audience probably noticed a difference too. The
guest of honor standing on the right sported a ponytail that
reached almost halfway down the back of his tuxedo jacket
— a rare sight at a ceremony like this.
Delving into each man’s biography, the spectators might
have noticed more differences. The man on the left was born
in Tel Aviv and studied psychology as an undergraduate
because it struck him as more practical than philosophy.
The ponytailed man was an economist born in Wichita,
Kan., who, before pursuing the study of economics, started
out his academic life in electrical engineering because he
wanted to avoid the harder math classes required of physics
students.
Yet there was an overriding similarity that evening, and it
was the reason for the tuxedos. Both men were about to be
awarded the Nobel Prize in Economics.
The date was Dec. 10, 2002. The man on the left was
Daniel Kahneman; the man on the right was Vernon Smith.
Both are regarded as academic pioneers for their use of laboratory experiments as a way to test the basic premises of
modern economics. Yet the conclusions that each came to
over decades of their own research appear at odds with each
other. At issue is a fundamental question that cuts
to the root of economic methodology: Do people act rationally in a market setting and what does that mean for the
study of economics?

Or, to put it another way: Did Homo economicus ever walk
the earth and, if so, is he now extinct?
Homo economicus is a metaphorical species of human
who is able to, as economists say, optimize. He exhibits
rationality in the economic sense by making decisions, even
in uncertain situations, based mainly on self-interest and a
strong grasp of the alternatives at hand. The mathematical
and analytical models that are the stock in trade of modern
economics rely on the prevalence of this form of human for
markets to reach equilibrium.
The group of researchers who call themselves “behavioral
economists,” like Kahneman, believe people don’t often act
that way in reality and have run multiple experiments to try
to prove it. On the other side of the debate are Smith and his
colleagues — the “experimental economists” — who have
been able to show that markets can reach equilibrium when
subjected to the right sort of tests in a laboratory. Yet, if
people are indeed fundamentally irrational in the economic
sense, would they really be able to make the kinds of decisions which help bring the market to equilibrium?
The debate about whether there ever was such a creature
as Homo economicus has recently broken into the mainstream
media discussion about how economists view the world. It’s
a discussion that has been at least 50 years in the making and
probably won’t end soon.

Efficient Markets and Irrational Men
Vernon Smith notes that his brand of experimental economics began with a bout of insomnia. He was teaching at
Purdue in 1955, and in the middle of one particular night he
began to think about an experience he had at Harvard as a
graduate student.

Economist Edward Chamberlin had run a series of experiments with various groups of Harvard students when Smith
was pursuing his Ph.D. Chamberlin would tell some students
in this experiment that they were buyers and the rest sellers.
He would then give them a card with a number on it. For the
sellers, that value represented the minimum selling price for
the unit of good they needed to sell; for the buyers, it stood
for the maximum buying price. On paper, these values corresponded to places on a hypothetical supply or demand
curve. Then Chamberlin let the students circle the room and
negotiate whatever contract they wanted. Once a bargain
had been struck between a buyer and a seller, the transaction
was recorded on the classroom blackboard.
What Chamberlin had in mind was an experimental test
of competitive equilibrium theory, which suggests a market
will converge on a single price where supply and demand
overlap. Instead, his experiments produced trades at substantially different prices, and the observed average price
was actually lower than equilibrium theory would predict.
The paper Chamberlin published on the experiments
went virtually unnoticed by the economics profession. But
Vernon Smith had taken part in one of these experiments
and thought there might be something more to them.
“So, there I was, wide awake at 3 a.m., thinking about
Chamberlin’s silly experiment,” Smith recounted in a 1991
essay. “The thought occurred to me that the idea of doing an
experiment was right, but what was wrong was that if you
were going to show that competitive equilibrium was not
realizable … you should choose an institution of exchange
that might be more favorable to yielding competitive equilibrium. Then when such an equilibrium failed to be
approached, you would have a powerful result.”
Smith’s experiment made two main changes to
Chamberlin’s design. The first was in structure: Smith decided to use a “double auction” mechanism in which buyers and
sellers called out their bids and the successful trades were
recorded by the moderator, an arrangement that more closely mimicked a real-life commodity or stock exchange.
He also tried the experiment with the same group of people
for multiple rounds to allow them to learn from their previous experience.
A competitive equilibrium emerged from this more
structured market environment. Smith initially didn’t
believe the results so he tried it with another set of students.
And then another. Over the following several years, he found
himself producing experimental results that exhibited stunning consistency and robustness. Competitive equilibrium
theory was being vindicated.
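
The institution Smith chose is easy to caricature in a few lines of code. The sketch below is a deliberately crude double auction, not Smith’s actual protocol: traders with invented induced values and costs shout random quotes, constrained so no one trades at a loss, and a trade occurs when a quote crosses the other side of the market. Even this stripped-down version hints at why the institution matters, since every trade is forced into the zone between buyers’ values and sellers’ costs.

```python
# A toy double auction (illustrative only, not Smith's design): each
# trader may buy or sell one unit, bids never exceed the bidder's induced
# value, asks never fall below the seller's induced cost, and a trade
# occurs when a new quote crosses the best standing unit on the other side.
import random

random.seed(1)
buyer_values = [10, 9, 8, 7, 6, 5]   # induced maximum willingness to pay
seller_costs = [4, 5, 6, 7, 8, 9]    # induced minimum acceptable price
# The hypothetical supply and demand schedules here cross near a price of 7.

def run_session(buyers, sellers, quotes=200):
    buyers, sellers = list(buyers), list(sellers)
    prices = []
    for _ in range(quotes):
        if not buyers or not sellers:
            break
        if random.random() < 0.5:             # a random buyer quotes
            v = random.choice(buyers)
            bid = random.uniform(0, v)         # never above own value
            if bid >= min(sellers):            # crosses the lowest cost
                prices.append(round(bid, 2))
                sellers.remove(min(sellers))
                buyers.remove(v)
        else:                                  # a random seller quotes
            c = random.choice(sellers)
            ask = random.uniform(c, 12)        # never below own cost
            if ask <= max(buyers):             # crosses the highest value
                prices.append(round(ask, 2))
                buyers.remove(max(buyers))
                sellers.remove(c)
    return prices

prices = run_session(buyer_values, seller_costs)
print("trade prices:", prices)
print("mean price:", round(sum(prices) / len(prices), 2))
```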
Meanwhile, a political scientist named Herbert Simon at
Carnegie Mellon University published a 1955 Quarterly
Journal of Economics article titled, “A Behavioral Model of
Rational Choice.” With this essay, Simon opened up a line of
inquiry that for years to come would challenge the foundation of classical economics.
“Traditional economic theory postulates an ‘economic
man,’ who, in the course of being ‘economic’ is also ‘rational,’ ” wrote Simon. “Rather, I shall assume that the concept
of ‘economic man’ … is in need of fairly drastic revision.”
The means by which Simon did this was to bring into the
analysis some insights from psychology. He posited that
humans have natural limits on their cognitive ability. So
instead of supposing a rational man who can instantly reason
to the optimal solution to a problem, Simon thought economists should define the agents within their models as
exhibiting “bounded rationality.” This uniquely human form
of rationality is one in which a person arrives at a solution
that may not be perfect in a computational sense but is simply good enough to satisfy them. “Because of the
psychological limits of the organism … actual human rationality-striving can at best be an extremely crude and
simplified approximation to the kind of global rationality”
that is often implied in economics models, Simon wrote.
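
Simon’s distinction is easy to state computationally. In the hypothetical sketch below, an optimizer must examine every option before choosing, while a satisficer stops at the first option that clears an aspiration level; the payoffs and the threshold are invented for illustration.

```python
# Optimizing vs. Simon's "satisficing," on an invented list of payoffs.
options = [62, 71, 55, 83, 90, 68, 77]   # payoffs, encountered in order
aspiration = 80                           # the "good enough" threshold

best = max(options)                                        # global rationality
good_enough = next(x for x in options if x >= aspiration)  # bounded rationality

print(f"optimizer picks {best}, satisficer stops at {good_enough}")
```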
Simon received a Nobel Prize in Economics for this
approach in 1978, making him the first noneconomist to win
that prize. But the research program that eventually became
known as behavioral economics didn’t really come into its
own until psychologist Daniel Kahneman and his co-author
Amos Tversky (a cognitive psychologist based at Stanford
University before his death in 1996), began to make their
mark on the economics profession.
One of the first high-profile articles their collaboration
produced appeared in the journal Econometrica in 1979 — a
contribution that would turn out to be the most-cited
article in that journal’s history. In it, the authors proposed a
new way to look at how people make decisions. They too
suggested that people do not weigh risky choices the way a
computer (or Homo economicus) would.
They tested this insight with a series of experiments in
which participants were asked if they would accept certain
gambles. The result was that people’s answers tended to
diverge from what they would be if the respondents were
optimally assessing the true risks of each gamble. That’s
because, Kahneman and Tversky posited, people don’t
think in terms of traditional probability theory. People
instead think in terms of the prospects for losing what they
already have.
“If you think in terms of major losses, because losses
loom much larger than gains — that’s a very well-established
finding — you tend to be very risk-averse,” Kahneman told
Forbes in 2002.
“I’ll give you an example: Suppose someone offered you a
gamble on the toss of a coin. If you guess right, you win
$15,000; if you guess wrong, you lose $10,000. Practically no
one wants it. Then I ask people to think of their wealth, and
now think of two states of the world. In one you own [your
current assets] minus $10,000 and in the other you own
[your current assets] plus $15,000. Which state of the world
do you like better? Everybody likes the second one. So when
you think in terms of wealth — the final state — you tend to
be much closer to risk-neutral than when you think of gains
and losses.”
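
Kahneman’s coin-toss example can be made concrete with the kind of value function he and Tversky proposed. The sketch below is a simplified illustration rather than their model in full: it ignores probability weighting and borrows the parameter estimates from their later work, in which losses are weighted roughly 2.25 times as heavily as gains.

```python
# Why a 50/50 gamble of +$15,000 / -$10,000 is rejected despite its
# positive expected value. The value function v(x) = x**a for gains and
# -lam * (-x)**a for losses uses the estimates from Tversky and
# Kahneman's later work (a = 0.88, lam = 2.25); probability weighting
# is omitted for simplicity.

def value(x, a=0.88, lam=2.25):
    """Prospect-theory value of a change relative to the status quo."""
    return x ** a if x >= 0 else -lam * (-x) ** a

gain, loss, p = 15_000, -10_000, 0.5
expected_dollars = p * gain + p * loss                # +2,500
prospect_value = p * value(gain) + p * value(loss)    # about -1,360

print(f"expected dollar value: {expected_dollars:+,.0f}")
print(f"prospect-theory value: {prospect_value:+,.0f}")
```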
Kahneman’s conclusions spawned a host of articles that sought to displace the old assumptions about rationality in
economics. The collection of observations was grouped
loosely under the umbrella of what came to be known as
“prospect theory.”
After the publication of the Econometrica article,
Kahneman began collaborating with economist Richard
Thaler, currently of the University of Chicago, on a few
experiments that were meant to flesh out the boundaries of
prospect theory. What they and their colleagues discovered
would stand for about 20 years as one of the more enduring
insights of behavioral economics. New research, however,
has begun to call into question the robustness of some of
these results.

The Endowment Effect
Imagine that you decide to participate in one of these
behavioral economics experiments. When you show up at
the lab, you are given either a ballpoint pen or a coffee mug.
Which one you get is decided by purely random chance.
Then you’re asked if you’d like to trade what you’ve been
“endowed” — that’s economist-speak for what you’ve been
given. In this case, say it’s the mug. If you decide to give up
the mug, you’ll get the pen which, you are told, is of equal
value.
Behavioralists predict, based on the many versions of this
experiment they’ve conducted, that you probably won’t
trade the mug for the pen. But it’s not because the mug is
inherently nicer than the pen. In fact, when the option to
take home the mug is given to those who have the pen, most
of them decide not to trade either.
According to standard economic theory, that shouldn’t
happen. Since the goods were randomly distributed, there
should be much more trading in these experiments than
actually occurs.
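
The benchmark behind that prediction is simple: if tastes are independent of the random endowment, roughly half of all subjects should be holding the good they like less and should therefore want to trade. A quick simulation, with invented numbers, makes the point.

```python
# The rational-trading benchmark for the mug/pen experiment: with tastes
# drawn independently of the random endowment, about half the subjects
# should want to swap. Observed trading rates fall far short of this.
import random

random.seed(0)
N = 10_000
mismatches = sum(
    (random.random() < 0.5) != (random.random() < 0.5)  # taste vs. endowment
    for _ in range(N)
)
print(f"predicted trade rate: {mismatches / N:.1%}")  # close to 50%
```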
Behavioral economists call this the “endowment effect.”
It predicts that the subjects in the experiment would have an
inherent aversion to losing what they already have. Parting
with the endowed good is perceived by the mug holders as a
loss greater than the potential gain from acquiring another
good of equal value. If true, this could tarnish some of the
classic notions about the efficiency of markets and the ability of people to trade rationally within them. A world in
which some trades don’t occur simply because too many
people are scared of parting with their goods would be one
with many suboptimal economic outcomes.
Economists Charles Plott of Caltech — a pioneer in
experimental economics — and Kathryn Zeiler of the
Georgetown University Law Center, were able to
duplicate the results of these experiments (particularly
one by Kahneman and Thaler, but also one by their
occasional co-author, Jack Knetsch, currently of Simon
Fraser University). But when they did so, they began to
notice some interesting things.
For instance, in the original experiments, subjects were
told to raise their hand when they wanted to trade their
good for the other good. When Plott and Zeiler ran the
same experiment, they noticed that subjects were looking to
others for cues. “When we asked them after the experiment
how they made their decision, many of them said they
looked around the room to see what other people were
doing,” says Zeiler. So, Plott and Zeiler decided to rerun the
experiment and introduce a secret ballot in which players
mark their decision to trade or not on a note card.
They didn’t take for granted any other element of the
original experiments either. They even played around with
the procedures by which the good was handed to the experiment’s participants. In the original experiment, the subjects
were told, “I’m giving you the mug. It is a gift. You own it.
It is yours.” But Plott and Zeiler speculated that might
have signaled a certain high level of value for the mug.
Besides, the subjects might not know if the pen they might
get as a result of the trade is really any good. So, Plott
and Zeiler simply told the participants: “The mug is yours.
You own it.”
They also adjusted for other possible factors that might
have skewed the original results. The participants got to
inspect the other good before they made their choice.
None of these options were given to the participants in the early endowment effect experiments of Kahneman and Knetsch.
“Once you control for these other things that might be
causing the gaps — even if you leave in place all conditions
necessary to trigger prospect theory — you don’t see gaps
anymore,” says Zeiler. “If endowment effect theory was
correct, we should still see those exchange asymmetries.”
It’s a good example of how rules and institutions can
change an experiment’s outcome. In fact, that’s a crucial
element in the debate between behavioralists and experimentalists. The experimentalist camp’s main critique is that
modern behavioralists are interested mostly in uncovering
deviations from the textbook versions of rationality, not in
discovering whether there is something unique about markets that helps people reach socially beneficial outcomes.
For instance, some behavioral experiments don’t give the
subjects an opportunity to learn from their mistakes in the
context of a market mechanism or a trading situation that is
repeated more than once. Yet markets in the real world provide no shortage of educational experiences and repeat
encounters.

Rediscovering Homo Economicus
“In principle, as I see it, experimental market economics and
behavioral economics are complementary,” writes Vernon
Smith in his most recent book. The man who sought to
make economics a more experimental enterprise in the first
place instead suggests that the goal of experiments should be
to more closely approximate real-world markets.
In many of Smith’s own experiments, the markets in the
laboratory reach a competitive equilibrium even though the
traders don’t consciously realize how optimal their behavior
really is. As he wrote in 1991, “subjects are not aware
that they are achieving maximum profits collectively and
individually, in equilibrium, and, in fact, deny this when
asked.”
Humans do seem to optimize, in the aggregate, over a
long time period. Experimental research provides solid evidence that a structured market environment is important to
this process. In the real world, laws and trading procedures
are essential for markets to function well. And experiments
can give us critical insight about how best to structure
those rules.
Progress needs market participants who can learn from

experience too. “People can make a lot of cognitive ‘errors’
on the way to creating a new market,” writes the once-ponytailed Smith. (Eyewitness accounts confirm he opted
for shorter hair sometime in 2007.) “What are important
about individual choices are the decisions that cause people
across time and generations to change tasks, locations, and
directions in an effort to better themselves in response to
market prices.”
In other words, there is still a little Homo economicus in all
of us. We just have to know how to lure him out of hiding. RF

READINGS
Kahneman, Daniel, and Amos Tversky. “Prospect Theory: An
Analysis of Decision under Risk.” Econometrica, March 1979, vol. 47,
no. 2, pp. 263-291.

Simon, Herbert. “A Behavioral Model of Rational Choice.”
Quarterly Journal of Economics, February 1955, vol. 69, no. 1,
pp. 99-118.

Plott, Charles, and Kathryn Zeiler. “Exchange Asymmetries
Incorrectly Interpreted as Evidence of Endowment Effect Theory
and Prospect Theory?” American Economic Review, September 2007,
vol. 97, no. 4, pp. 1449-1466.

Smith, Vernon L. Bargaining and Market Behavior: Essays in
Experimental Economics. New York: Cambridge University
Press, 2000.
Smith, Vernon L. Rationality in Economics: Constructivist and
Ecological Forms. New York: Cambridge University Press, 2008.

Going Nuclear
The future looks brighter for a once-maligned industry
BY VANESSA SUMO

Photo: South Carolina recently ranked third among the 31 U.S. states with nuclear capacity, making it the state with the most nuclear capacity in the southeastern United States. South Carolina’s V.C. Summer is among the nuclear plants planning to add a new reactor.

The partial meltdown of a reactor core at the
Pennsylvania Three Mile Island nuclear power plant
in 1979 was a watershed event. Although it resulted
in no deaths or injuries, it is considered the most serious
accident in the domestic nuclear power industry’s operating
history. No new plants were proposed in the United States
after that incident, and the plant construction that was
underway saw cost overruns exceed 250 percent, according
to a Congressional Budget Office (CBO) study.
Poor performance, safety concerns, and the high cost of
constructing a nuclear plant relative to other sources of
power continued to plague the industry for several years.
But the industry’s fortunes seem to be turning. With the
demand for electricity in the United States expected to grow
20 percent by the end of the next decade, the country needs
more power generation capacity. That could be satisfied by
building coal, natural gas, or nuclear plants — power sources
that can provide electricity around the clock. Growing concerns over global warming, however, are prompting
policymakers, investors, and even environmentalists to take
a fresh look at nuclear power.
Unlike plants that generate electricity by burning fossil
fuels, nuclear power does not produce carbon dioxide, a primary greenhouse gas which many consider to be at alarming
levels already. As a result, expanding nuclear power is often
regarded as a vital component in a portfolio of solutions to
the problem of global warming.
In choosing the type of plant to build, companies certainly are looking at the possibility that lawmakers may
decide to limit carbon dioxide emissions. That would effectively put a price on this greenhouse gas and increase the
cost of electricity generated by using fossil fuels.
There are other factors, too, that will help make nuclear
power a more attractive bet. New licensing procedures,
investment incentives under the Energy Policy Act of 2005,
and significant technological improvements in the latest
generation of advanced nuclear reactors are giving companies the confidence to invest in new nuclear plants.
The most visible sign of renewed interest is that since
2007 about nine companies have filed applications with
the Nuclear Regulatory Commission to build new nuclear
reactors, the first applications in 30 years. Most of these
companies operate in the Fifth District, including
Dominion, Duke Energy, Progress Energy, South Carolina
Electric & Gas, and UniStar Nuclear Energy — a joint company formed by Constellation Energy and the EDF Group, a
European energy company.
The industry seems to be in a good position to make a
comeback. But while favorable conditions are giving the
industry hope, the task ahead is still a daunting one. “I think
there is certainly a brighter prospect today,” says Eugene
Grecheck, vice president for nuclear development at
Dominion. “But that needs to be tempered by the fact that
we still [have] a lot of work to do.”

The Economics of Nuclear Power
More than 100 nuclear reactors currently provide about
one-fifth of the total electricity generated in the United
States. Nuclear plants produce electricity using the heat
generated by nuclear fission — a process that splits the
nucleus of a heavy element, causing a carefully controlled
chain reaction that releases a tremendous amount of energy.
Once the plant is built, nuclear power can be a relatively
cheap source of electricity. The average cost of producing
electricity from nuclear power in 2006 was about 34 percent
to 66 percent lower than that of electricity generated from
fossil fuels, according to the Energy Information
Administration. While the cost of operating and maintaining a nuclear power plant is higher than the upkeep required
for plants using fossil fuels, nuclear fuel — typically uranium
— is cheaper than coal or natural gas.
The existing fleet’s performance has also improved significantly over time. “Fortunately we’ve figured out how to
run our plants well,” says David Modeen, director of external affairs at the Electric Power Research Institute, a think
tank. In the past, protective shutdowns — called “trips” —
frequently occurred that forced plants to go temporarily
offline. Today, a much more fluid and refined control system
has dramatically brought down a nuclear reactor’s average
number of trips a year. “Most plants don’t trip in a year, they
just run,” says Modeen. As a result, the industry’s plants have
recently been running at about 90 percent of capacity, up
from about 60 percent two decades ago.
Industry executives certainly think that the nuclear
plants’ safety technology has improved greatly since the
Three Mile Island accident. “We’ve had a long period of
demonstrated safe and reliable energy supply from our
current nuclear fleet,” says Joe Turnage, senior vice president
for strategy at UniStar. “As an investor, you would not
proceed to put the capital in these projects unless you had a
compelling case that your performance and safety track
record are assured,” he says.
The latest generation of nuclear reactors is expected to
continue to improve that record. For instance, some of the
new designs include “passive safety” systems that use natural
forces such as gravity and natural circulation to prevent an
accident in the event of malfunction. No operator intervention is required. A simplified design also makes these
reactors easier to operate and less prone to mechanical
errors.
But while its performance is encouraging, the price tag
for building a nuclear plant and the uncertainty around that
estimate might give investors some pause. Each plant can
cost several billions of dollars and is more expensive than
building one that runs on gas or coal — alternatives that
can also generate electricity 24 hours a day. How much
more expensive may be hard to pin down. “The history of
the industry on cost forecasting is not too good,” noted
MIT economist Paul Joskow at a 2006 conference
on nuclear power. Nuclear plants built in the 1970s
through the early 1990s cost much more than was anticipated, mostly because of regulatory delays, safety scares, and
poor designs.
For investors to jump into nuclear power, cost estimates

must be credible. And after the numbers are crunched,
the total cost of constructing and operating a nuclear
power plant must be lower than conventional fossil fuel
alternatives.
A 2003 MIT study finds that, under most conditions, a
new nuclear power plant would be more expensive than a
coal or a natural gas plant. But assuming a high natural gas
price scenario, nuclear plants may be able to compete with
natural gas plants as long as the cost of building a nuclear
plant falls by 25 percent, construction time is cut by one year,
and the cost of financing it becomes as low as funding a coal
or gas plant.
Life gets easier for nuclear, however, if carbon dioxide
emissions are priced. Electricity generated from a coal or
natural gas plant emits high levels of carbon dioxide, which is thought to be harmful to the environment. However, the
cost of these side effects is not reflected in the price of
electricity. But if emissions were to be priced through a tax
or a cap and trade system, the cost of electricity generated
by burning fossil fuels could go up significantly.
The MIT study finds that at a price of $100 to $200 per
ton of carbon emitted, nuclear becomes more attractive
than coal and can even be cheaper than natural gas. A more
recent 2008 CBO study likewise finds that at a charge of $45
per metric ton of carbon dioxide (about $165 per ton of
carbon), nuclear power would be the least expensive choice
for building new base-load plants. However, if this price
falls below $5, conventional coal plants would be the lowest
cost source of generating capacity. Between these two
prices, natural gas would have an advantage over both
nuclear and coal.
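
Those cutoffs amount to a simple decision rule. The function below merely restates the CBO thresholds described above; it is not a cost model, and the break-even points would shift with fuel prices and construction costs.

```python
# The 2008 CBO comparison, restated as a decision rule: which base-load
# technology is cheapest to build at a given carbon dioxide charge.
def cheapest_base_load(co2_charge_per_metric_ton: float) -> str:
    if co2_charge_per_metric_ton < 5:
        return "conventional coal"
    if co2_charge_per_metric_ton < 45:
        return "natural gas"
    return "nuclear"

for charge in (0, 20, 45, 100):
    print(f"${charge:>3}/metric ton CO2 -> {cheapest_base_load(charge)}")
```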
Emission charges would certainly make nuclear power a
more attractive investment than conventional fossil fuel
generation. But even in the absence of a price on carbon
emissions, the CBO study notes that the investment incentives under the Energy Policy Act of 2005 “would most likely
lead to the planning and construction of at least a few
nuclear plants in the next decade.” Of these incentives, the
industry thinks that the loan guarantee program, which covers 80 percent of construction costs, is particularly valuable
because it brings down the cost of financing a new plant —
a major hurdle in wooing investors to what is perceived to be
a relatively risky project. If the first round of nuclear plants
to take advantage of this benefit can demonstrate that construction can be completed with only a few snags, then the
uncertainty of building the next round of plants — and the
financing cost — may fall substantially even without loan
guarantees.
An improved licensing process for the new generation of
nuclear plants removes another key uncertainty. In the past,
nuclear plants were required to get two separate licenses —
one to build and another to operate. That meant a fully constructed plant could wait years before it operated
commercially. Today’s process combines those two licenses
and grants approval before a major commitment to construction and a huge amount of expenditure has been made.


Powering Up

Baseload Plants
Function: The backbone of the electric power grid; designed to operate continuously except during breakdowns or scheduled shutdowns.
Power Source: Coal, nuclear, hydroelectric, or natural gas.

Intermediate Plants
Function: To supplement baseload plants, these plants operate during the daytime, and are turned on as needed during the evening.
Power Source: Coal or natural gas.

Peaking Power Plants
Function: To provide additional electricity only during periods of peak demand.
Power Source: Natural gas or hydroelectric.

SOURCE: European Nuclear Society

Another important difference that allows the construction of nuclear plants to proceed more smoothly this time
around is that the plant designs are now highly standardized.
“Out of the 104 plants currently operating in the United
States, almost all of them are custom designed,” says
Grecheck. Companies today can take a largely completed
design and simply make minimal changes to adapt it to a
specific site. They can also learn from the construction
experience of a similar plant in the country or abroad.
UniStar’s chosen reactor design, for instance, is being constructed in Finland and France. “Ours is probably serial
number 5,” says Turnage. “We did not want first-of-a-kind
engineering risk. We did not want to be serial number 1.”
The economics of nuclear power may also be improved
by various state policies. Many of the proposed nuclear
plants are located in states that regulate the rates which
power companies charge. Rate regulation may provide these
companies some guarantee that their customers will pay back
the cost of building a “traditional plant.” In contrast, a “merchant plant” that relies on the market for setting its rates
places the risk squarely on investors rather than its customers. So which type of plant will most likely be built?
“There will be a mix,” says MIT professor John Deutch,
one of the authors of the 2003 MIT report. “But with the
size of capital [needed], regulated plants will be easier to
finance.”
Some states with rate regulations, such as Georgia,
Florida, North Carolina, South Carolina, Mississippi,
and Kansas, are also allowing utilities to recover the cost
of new nuclear plants while construction is in progress. But
even in states that have no rate regulations, like California
and Maryland, lawmakers are considering limits on carbon
dioxide emissions that would give nuclear a definite
advantage.

Public Attitudes
Concern over global warming is perhaps the biggest driver in
the renewed interest in nuclear power among policymakers.
Studies have found that a sizeable reduction in greenhouse
gas emissions cannot be achieved without a shift toward less carbon-intensive technologies.
Electricity companies are looking at the entire spectrum of possible generation sources to meet the growing demand for electricity, not just nuclear. Renewable energy is another carbon-free alternative. But unlike solar and wind power, which produce electricity only when the sun is shining or the wind is blowing, nuclear power can provide constant base-load electricity much like coal and natural gas plants.
As a result, a number of environmentalists have spoken out in favor of
nuclear power to meet the growing
demand for electricity. “The only technology ready to fill the
gap and stop the carbon dioxide loading of the atmosphere
is nuclear power,” noted environmentalist and The Whole
Earth Catalog creator Stewart Brand in a 2005 article.
Public opinion toward nuclear power has inched up over
the last two decades, says MIT professor Stephen
Ansolabehere, who conducted a survey of people’s attitudes
toward nuclear power and other power sources. In 2007, the
survey found that about 39 percent felt that the United
States should reduce the use of nuclear power. That’s down
from about 47 percent in 2002. Oil is still the most disliked
power source but its popularity has dropped even more,
which might have helped increase support for nuclear
power. That the accident at Three Mile Island happened
almost three decades ago seems to have pushed it further
from people’s minds. “As generations replace each other, you
forget about what events shaped people’s impressions,” says
Ansolabehere.
However, it seems that people are warming up to nuclear
power not because of concerns about climate change.
Indeed, the survey finds that the issue of global warming is
uncorrelated with people’s preferences about nuclear power
— or just about any other energy source. “People don’t
really connect global warming and nuclear power,” says
Ansolabehere. And when people were asked which energy
source they thought contributed most to global warming, a
strikingly high percentage answered “nuclear power.” So
while policymakers, investors, and others who are very much
engaged in this issue tend to agree that nuclear is an important part of the solution to stabilizing greenhouse gas
emissions, public attitudes seem to lag behind.
The survey also revealed that people were willing to pay
only about $5 more a month on their energy bill to help mitigate global warming. Ansolabehere says that this is about a
fifth of what is needed to reduce greenhouse gases under the
Kyoto Protocol, an international agreement on climate
change. This suggests that most would rather stick to cheaper but dirtier electricity than switch to a cleaner but more
expensive source like nuclear power.
Ansolabehere’s survey also finds that nuclear power is
viewed as somewhat harmful by the public. Safety is still an
important concern, but the management and disposal of
radioactive nuclear waste is the biggest reservation. When
presented with solutions to this issue, support for nuclear
power expansion goes up.
While the results of the survey suggest that the public
seems to have gotten it wrong in terms of the relationship
between nuclear power and greenhouse gas emissions, they
seem to be spot-on in identifying the waste problem as one
of the most important challenges facing the industry.

A Nuclear Power Renaissance?
Without new investments in nuclear power plants, the country’s capacity for generating electricity from this power
source will quickly decline after 2030, said Joskow at a 2006
nuclear power conference. But if the industry is to expand in
such a way that it can continue to play an important role in
future electricity supply and at the same time make
a significant contribution to stabilizing greenhouse gases, a
number of concerns must be addressed.
What to do with spent fuel from nuclear plants that will
remain highly radioactive for thousands of years is perhaps
foremost among these concerns. “The perceived lack of
progress towards successful waste disposal clearly stands as
one of the primary obstacles to nuclear power expansion
around the world,” noted the 2003 MIT report. Efforts to
find solutions have been mostly focused on the planned construction of a permanent disposal facility at Yucca Mountain
in Nevada, but that has been much delayed.
Today’s cost estimates for building new nuclear plants
have also been climbing — at least double what the industry
quoted just a few years ago. The cost of copper, steel, concrete, and manufactured components that go into these
plants has been rising. Moreover, a significant expansion of
nuclear power in the next few decades would only exacerbate the scarcity of materials and skilled labor —
electricians, plumbers, pipe fitters and the like, not just engineers — who are necessary to build and run nuclear plants.
“With the potential for multiple companies moving forward
with plans for new nuclear, the availability of critical materials and qualified workers could pose a challenge,” says South
Carolina Electric & Gas spokesman Eric Boomhower.
The long hiatus in construction in the industry has not
helped. “We’ve lost much of the infrastructure in the United States that existed 30 years ago to support nuclear plants and we’re going to have to rebuild that,” says Turnage.
Companies will be looking all over the world to find what
they need, but supply overseas is tight too. For instance,
Turnage says that there is only one company in the world
which makes ultralarge forgings that are used in construction of the reactor pressure vessels — that’s where the
nuclear fuel is contained. He’s starting to see a response,
however, from companies that are eager to supply their
needs. Turnage says that because of the size of their order for
turbine generators, Alstom, a big player in the power business, is investing in a plant in Chattanooga, Tenn.
None of the companies that have filed for a license have
actually committed to building a plant. While waiting for
their license, companies are working hard to get more
certainty on what the costs of building these new plants will
be. “All the stakeholders in approving such a large investment would like to be confident that we understand what
the costs are going to look like,” says Grecheck. These companies are also ordering materials that require a long lead
time, applying for a loan guarantee, looking at future market
conditions, and seeking approval from state agencies before
they can begin construction. If all these elements come
together as planned and on schedule, the first new nuclear
plants in a generation could start operating by the middle of
the next decade.
Despite the challenges and a long to-do list, some say that
we’re already seeing a veritable renaissance for nuclear
power. But many in the industry say that they still have a
long way to go.
“I kind of cringe when I hear ‘renaissance,’ ” says Modeen
of the Electric Power Research Institute. “It’s a
heartfelt respect for the technology, somewhat humbled
by the daunting task before us and we want to do
it well. It’s not going to be easy.” Given the industry’s
history, Modeen would love to see those first few plants built
very well and go from there. He understands that even one
accident could upset the hard work that has been put in to
overcome the public’s long aversion to nuclear power.
“Society is not to the point [with nuclear power] like we are
with plane crashes. People will still be flying planes tomorrow,” says Modeen. For the modern nuclear power industry,
an old adage seems appropriate: It’s best to proceed slowly
but surely.
RF

READINGS
Ansolabehere, Stephen. “Public Attitudes Towards America’s
Energy Options: Insights for Nuclear Energy.” MIT Center for
Advanced Nuclear Energy Systems, MIT-NES-TR-008, June 2007.
Brand, Stewart. “Environmental Heresies.” Technology Review,
May 2005.

Deutch, John (Co-Chair), Ernest J. Moniz (Co-Chair), et al.
“The Future of Nuclear Power: An Interdisciplinary MIT Study.”
Cambridge, MA: Massachusetts Institute of Technology, July 2003.
Electric Power Research Institute (EPRI). “Generation
Technologies for a Carbon Constrained World.” EPRI Journal,
Summer 2006.

Congressional Budget Office. “Nuclear Power’s Role in Generating
Electricity.” May 2008.


INTERVIEW
Charles Holt
The field of experimental economics became more
widely known with the awarding of the Nobel Prize in
economics to the groundbreaking experimentalist
Vernon Smith in 2002. It has many practitioners today,
but one of the most respected — and busiest — is
Charles Holt of the University of Virginia.
In addition to racking up an impressive track record of
market experiments, Holt has helped bridge the gap
between the laboratory and the real world. He designed
a new type of auction that was used by the Federal
Communications Commission (FCC) this year to lease
critical segments of the electromagnetic spectrum.
Recently, he has been involved with designing a market
for carbon emissions among the northeastern states
that would improve on the less-than-successful
experience with carbon markets in Europe.
As an early advocate of the experimental approach to
economics, Holt was active in the formative years of the
Economic Science Association — the professional
organization for experimental economists — and
counts Vernon Smith as one of his professional
influences and close friends. Holt also co-authored the
first comprehensive textbook in the experimental
economics field.

✧
RF: When did you know you were interested in studying
economics?
Holt: The football coach in my high school history class, in
Blacksburg where I grew up, made a comment about how he
couldn’t understand why baseball teams traded one player
for another. He wondered: Didn’t they know the one player
was getting the short end of the stick? I remember thinking:
Gee, what if one team has extra first basemen and another
has an extra pitcher. They could make a swap and both
teams would win more games and be better off. So, I think
I was interested in economics early on.
Later, there was a required religion class in my Northern
Virginia boarding school, and we were having a discussion
on usury. I remember asking why, if there’s a shortage of
funds, the people who have the money couldn’t charge a
high interest rate. I remember all these people looking at me
like I was saying something very sinful.


Then I went to Washington and Lee. I majored in
economics and politics. One year we read the book, The
Calculus of Consent, which was written by James Buchanan
and Gordon Tullock. They were at the University of Virginia
and had written the book the year before. So it had come
over the mountains and into the classroom one year after it
was written and long before it had really gotten widespread
attention.
So, I remember being interested in politics and
economics and wondering which one I wanted to go to graduate school for. I realized that the things I was learning in
economics will still be established principles in 20 years,
while the things I was learning in politics seemed more fluid
and undeveloped.
RF: Who were your influences in grad school?
Holt: I went to grad school at Carnegie Mellon. It was a tiny
program — I took maybe eight classes. Two of them were
taught by people who later won Nobel Prizes.
Robert Lucas was one of them. That was a class of six or seven people. When he started class he would say something like: “This is a pattern of employment participation across different countries, and here’s a correlation. I want to try to explain it. So let’s do an econometric exercise.” In the next class he would say, “I’m going to show you the starting point of a model that doesn’t quite explain what I want it to. So I’m going to throw it away and start again.” It was like an ongoing insight into his research process. He took his time. But what I liked was that he would focus on what he thought was an important empirical pattern and didn’t go off into the theoretical world without thinking about what was really going on.

Ed Prescott, who was one of my thesis advisors, was another one. Ed was the same way. He would always ask about what empirical regularity you were trying to explain. He was focused. He took great joy in doing research.

Carnegie Mellon was largely an engineering school, and the business school had a practical feel to it. The department heads would basically tell the economics faculty that if they wrote a paper with a graduate student, they would just assume the faculty member did all the work so they shouldn’t worry about sharing their ideas. They would still be getting full credit for them. As a result, they did a lot of joint work with graduate students, and I was a direct beneficiary of that.

I remember working on a paper with the president of the university, Dick Cyert. He had been a co-author with Herbert Simon and Franco Modigliani, and he was involved in the early days of behavioral economics. Cyert and I were working with Morris DeGroot, the statistician who was my other thesis advisor, and we would meet on Saturday mornings to work on our paper. The great thing I remember about that was these guys had a lot of confidence. We wrote a couple of papers together, and they each got rejected. But it didn’t faze them at all. Another thing I remember about that experience was that they spent a lot of time on the process. We read the papers out loud before we sent them to journals. Every sentence had to be perfect.

Stephen Slivinski interviewed Holt at his office on the University of Virginia campus on June 26, 2008.

Charles Holt
➤ Present Position
A. Willis Robertson Professor of Political Economy, University of Virginia
➤ Previous Faculty Appointment
University of Minnesota (1977-1983)
➤ Education
B.A., Washington and Lee University (1970); M.A. and Ph.D., Carnegie Mellon University (1977)
➤ Selected Publications
Author or co-author of papers in such journals as the American Economic Review, Games and Economic Behavior, American Political Science Review, and Southern Economic Journal. Co-author (with Douglas Davis) of the textbook, Experimental Economics (1993)
➤ Awards and Offices
Past president of the Southern Economic Association and the Economic Science Association; Honorary Professor, University of Amsterdam

RF: When did you become interested in economics experiments?

Holt: After grad school, I went to the University of Minnesota and taught there. I became interested in experiments at that time. I had written a thesis on auctions that compared revenues raised in different types of auctions. When journal editors would see the word “auction” in the title of a paper that was submitted, they’d want to get a theorist’s point of view. So I was receiving these papers to referee. I got one from Vernon Smith that I thought was very interesting. I got another from Charlie Plott, and I remember thinking it was very interesting too. What struck me was that you would see patterns in the data that were consistent with what you would see in the theoretical models. Then the Economic Science Association started having meetings in Tucson in the ’80s. I went to the very first ones of those, and I kept going. They had a big influence on me.

RF: You designed an auction to help Georgia apportion irrigation rights in 2000. Tell me about that.

Holt: There was a drought that year and Georgia had some money from the national tobacco settlement. The officials there didn’t want to hold hearings and decide which land would be taken out of production. So, they decided on a program to pay farmers not to irrigate. It was a voluntary program in which the farmers would bid on how much they would want to receive — a lump-sum payment on a per-acre basis — for not irrigating that season.

Ron Cummings, Susan Laury — both from Georgia State University — and I started running experiments with students as soon as the law passed. We came up with some designs that the Georgia Environmental Protection Division (EPD) steered us away from for political reasons, and that was perfectly sensible. We ended up with a multiround auction where the provisional winners — those who were asking for the least amount of money not to farm — were announced after each round of bidding.

In the experiments to test the auction design, we used students but we let them talk to each other and collude. It was a very uncontrolled situation for a normal laboratory experiment. That’s because we knew the farmers who were going to be involved in the auction had cell phones and probably knew each other. So we wanted to create that kind of environment.

The EPD officials from Atlanta would come to some of the experiments and just sit down to watch the process. I think it gave them a good idea of how a multiround auction would play out. We also did a field test in southern Georgia where we set up bid stations in different towns about 50 miles apart. Everything was run through the Web to a site in Atlanta where the EPD officials could watch the bids coming in.

Then they asked us to run the actual auction for them. We set up bid sites in seven or eight different locations around the Flint River Valley in southern Georgia, and the EPD officials would watch the bids from Atlanta and decide if they wanted to go to another round of bidding. Because the bidders never knew if the first round would be the last round, they knew they had to be serious about their bids.

The farmers would fill out their bids by hand on paper — they were basically contracts — we would review them, and then enter them into the computer. My bid site was a grade school dining hall that had those little, low seats. And these farmers were big guys so it was amusing to watch.

The people who were more willing to not irrigate during that growing season posted the lowest bids, and they would be included in the next round of the auction. You, of course, want the most valuable land and crops to be irrigated, so the farmers of those crops — like peach trees, for instance — would either bid high or not participate in the auction at all.

The auction took place one Saturday morning. It was over and done about three weeks before the deadline for planting. That’s the great thing about auctions: They’re fast and they’re fair.

“For me, experiments provide a hands-on connection between the beauty of economic theory and actual human behavior.”

RF: What sort of auction did you design recently for the FCC and what was unique about it?

Holt: My co-author Jacob Goeree [of Caltech] and I contacted Martha Stancill at the FCC. We sent her an idea on how to set up a simple combinatorial auction that doesn’t have a zillion possible combinations of licenses and so it would be easy for bidders to understand how pricing works. This type of auction is one in which you can bid on a collection of licenses. Say you have one national license and multiple regions across the country. The goal is to let the bidding determine if the license gets awarded as a single national one or a bunch of regional ones. In general, in a combinatorial auction the number of possible combinations gets large very quickly — it’s an exponential function. That complexity deterred officials from using them for a decade. The procedure we suggested was simple enough that you could do it with a paper and pencil if you needed to. That simplicity also gave the officials confidence that they could answer questions about it in a press conference if they had to.

For a company like Verizon, it might be more valuable to have a collection of licenses in a region. So, if in a particular round of bidding, the highest regional bids add up to more than the national bid, you would provisionally declare the regional bidders the winners. But you would reveal those regional bids so the national bidder would know how high they have to go to knock out the regional bidders. Conversely, if the highest bid is a national bid, as it was sometime during the actual FCC auction, then the difference between the national bid and the sum of the regional bids is how much higher the regional bidders have to go to knock out the national bidder. In a case like that, our procedure would take the difference between the bids and allocate it to the regional level. So, each region had a price which was their current high bid, plus a share of the increase needed to beat the national bid. These prices provided information to the bidders during the auction about how high they needed to go to get into the action.

One idea behind that procedure is that it helps the bidders on the regional level solve a coordination problem. Each person would prefer someone else to raise their bid to knock off the national bidder. What this does is push everybody up together. The FCC gave this procedure a name (Hierarchical Package Bidding), an acronym (HPB), and decided to use it for a large band (the C block) of the 700 MHz spectrum auction held this past spring.

There was no set number of rounds either. The FCC procedure has always been to let the auction keep going until there are no more bids coming in. So the process lasted a couple of months, from January to March, and consisted of over 100 rounds. This auction raised about $19 billion.
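
The pricing step Holt describes can be sketched in a few lines. The function below illustrates only the logic of feeding the national bid’s lead back to the regions; it is a toy, not the FCC’s actual HPB rule, and it splits the shortfall equally where the real procedure may apportion it differently.

```python
# Toy version of the pricing feedback Holt describes: when the national
# package bid exceeds the sum of the regional bids, each region is shown
# a price equal to its current high bid plus a share of the shortfall.
def regional_prices(national_bid, regional_bids):
    shortfall = national_bid - sum(regional_bids.values())
    if shortfall <= 0:
        return dict(regional_bids)           # the regions are already ahead
    share = shortfall / len(regional_bids)   # equal split, for illustration
    return {region: bid + share for region, bid in regional_bids.items()}

bids = {"east": 40.0, "central": 35.0, "west": 15.0}
print(regional_prices(100.0, bids))
# each region must reach its bid plus about 3.33 to beat the national bid
```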

RF: You’ve been involved with helping to design a
regional greenhouse gas emission auction among the
northeastern states recently. How has that differed
from the European experiment with carbon permit
trading?
Holt: Carbon trading was tried in Europe, but there were
problems with how things were implemented. They took the
current emissions levels and then divided them by a certain
number of allowances — each allowance was good for one
ton of carbon dioxide. From the current level, the officials
would then scale down the total level of emissions to a
specific target and release that number of allowances into
the market. Since the electric power companies were one of
the groups who needed many of these allowances, they
argued to their governments that, if they had to pay for their
allowances, they would simply raise the price of electricity.
That’s never a popular thing to do, so they were given the
allowances for free, and approximately proportional to their
past emissions.
Some of the companies that had extra allowances in the
Eastern European countries would turn around and sell
them and suddenly the prices were very high for a while.
Those companies got windfall profits from that. And the
price of electricity rose anyway — if you cut back on output,
price, of course, will tend to rise. This created a backlash
against the entire cap-and-trade process.

RF_SUM_08

9/17/08

10:11 AM

Page 35

Here in the United States, a number of northeastern
states, led by New York, set up an arrangement where they
would cooperate to regulate carbon dioxide emissions in
that region. They wanted to focus on the electric power
generators.
Early on, a decision was made — and I think this was a
very important decision — to require at least 25 percent of
the allowances to be allocated by auction instead of simply
giving them away. Then New York announced early that
they would allocate 100 percent of their allowances by auction, and everyone is anticipating that most, if not all, of the
allowances the other states issue will be allocated by auction
too. This is the opposite of the European approach where
about only 5 percent of the allowances were auctioned. Now
the Europeans are interested in what the Regional
Greenhouse Gas Initiative in the northeast states — or
RGGI, pronounced “Reggie” — is doing. They send representatives to the RGGI meetings, and I think they will go to
100 percent auctions fairly soon.
The states in RGGI are doing this with the understanding that the proceeds of the auctions could be spent on a
variety of things, such as strategic energy initiatives or conservation programs. Or, if the price of electricity rises, the
proceeds could provide relief for low-income consumers.
RF: How will the RGGI system work and what were
your contributions to its development?
Holt: The RGGI administrators will set the cap on emissions, and electricity providers will bid on the number of
allowances, each of which equals a set number of tons of carbon. The administrators' initial goal is to set a fairly loose cap so there are no surprises in the auctions —
no bottlenecks or dramatic run-ups in price. Then, over
time, they would gradually tighten the emissions cap over the next 15 years so that firms can scale their emissions down in a
planned, coordinated way. This also gives conservation
programs time to come into effect.
The Georgia and FCC auctions were meant to be held
only once. The RGGI auctions will be held quarterly in an
ongoing fashion, beginning this September. That actually
takes a little bit of pressure off the auction design process —
if one design doesn’t work so well, you can try another one
later. But I think it’s important, for the success of the
program, for the first several auctions to go well.
The RGGI auction design team included environmental
economists Karen Palmer and Dallas Burtraw from
Resources for the Future in D.C., Jacob Goeree
from Caltech, and Bill Shobe from the University of
Virginia, who had run an innovative clock auction for
nitrogen oxide emissions allowances for the state of Virginia
several years before. My role was to set up the laboratory
experiments. There was a concern in the RGGI meetings
about possible collusion in the auction process. So, in many
of our experiments, we focused on the possibility that firms might collude, either tacitly or even explicitly — we would let

subjects talk to each other in a chat room to see what effect
that had.
For instance, one of the possible auction types we tested
was a clock auction. That’s when you announce a price — in
this case, you start low — so demand is much higher than
supply. Then the auctioneer notches the price up in increments, and each time you do that demand falls a little bit.
You stop when demand equals supply.
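A minimal sketch of that ascending clock in Python may help. The demand schedules and numbers below are hypothetical stand-ins, and real clock auctions add activity and closing rules that are omitted here:

```python
# A minimal sketch of an ascending clock auction. Each bidder is modeled as
# a function mapping the announced price to a quantity demanded; the demand
# schedules and values here are hypothetical, not RGGI parameters.

def clock_auction(supply, demand_fns, start_price, increment):
    """Notch the price up until total quantity demanded no longer exceeds supply."""
    price = start_price
    while sum(f(price) for f in demand_fns) > supply:
        price += increment  # each notch up shaves a little off demand
    return price

# Three identical bidders whose demand shrinks as the price rises.
bidders = [lambda p: max(0, int(10 - 2 * p)) for _ in range(3)]
clearing_price = clock_auction(supply=12, demand_fns=bidders, start_price=1.0, increment=0.25)
print(clearing_price)  # 2.75, the first price at which demand falls to supply
```

Note that the quantity demanded is the only lever bidders control here; the clock itself sets the price.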
In the experiments with opportunities for open discussion, you could look at the chat room transcripts and see
what the participants were thinking. You would see them
say things like, “Well, in the last auction we all started off
demanding a lot. And when the price rose, we all cut back
our demand. In the end, we had to accept the result. So, with
the next round, why don’t we just start with what we got in
the end of the last round? Instead of letting the price go up,
let’s agree to stop the auction right at the beginning.” So,
many of the clock auction experiments stopped right away
because of that collusion. Also, the discussion focused on
only one dimension — the quantity dimension — and not
on the price dimension because that was determined by
the clock.
For the September auction, the RGGI administrators
have opted for a sealed-bid uniform-price auction. In that
one, the bids are submitted secretly to the auctioneer and ranked from high to low, and the allowance price is set at the point where supply equals demand.
When we tested it, we discovered that this design was somewhat more resilient to collusion than the clock auction, both
when chatting was permitted and when it was not.
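A companion sketch of the sealed-bid uniform-price rule, again with hypothetical single-unit bids (actual RGGI auctions involve reserve prices and multi-allowance lots):

```python
# A minimal sketch of sealed-bid uniform-price clearing, assuming each bid
# is for a single allowance. Bids are hypothetical dollars per allowance.

def uniform_price_clearing(bids, supply):
    """Rank bids from high to low, accept the top `supply` of them, and set
    one market-clearing price for every winner (here the lowest accepted
    bid, one common convention for where supply meets demand)."""
    ranked = sorted(bids, reverse=True)
    winners = ranked[:supply]
    clearing_price = winners[-1] if winners else None
    return winners, clearing_price

bids = [8.2, 7.9, 7.9, 7.5, 7.1, 7.0, 6.8, 6.6, 6.5, 6.4, 6.1, 5.9]
winners, price = uniform_price_clearing(bids, supply=10)
print(price)  # 6.4 -- every accepted bid pays the same uniform price
```

Because every winner pays the same price regardless of what it bid, shading one's own bid is less attractive, which is consistent with the design's greater resilience to collusion in the experiments.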
It’s important to realize what you can take away from
these experiments. In the real world, collusion is illegal.
There will also be lots of bidders. Brokers can buy
allowances in our scenario and then sell them to different
companies. That’s going to make collusion a lot harder too.
RF: How can experiments be used to teach economics?
Holt: Economic research can be fun and exciting if you
follow your interests. For me, experiments provide a hands-on connection between the beauty of economic theory and
actual human behavior. The auctions and games I use in
research are great for adding excitement to economics
classes, which otherwise can be dauntingly theoretical.
Teams of students in my classes design their own experiments and use the lab software to run them on the other
students, followed by a presentation of the results in the
next class. Those who have been in the auction or market have seen the economic process from the inside and learned lessons the hard way, and class discussions are often lively and focused as a result. There's no better way to teach notions like opportunity cost or sunk cost than when some of the
students have earned 40 percent less than some of their
classmates who priced correctly. In case you’re wondering,
I pick one person at random afterward and pay them some
small fraction of their earnings.
RF


ECONOMIC HISTORY
Liquid Gold
BY CHARLES GERENA

More than a century ago, West Virginians tapped into their vast reserves of coal and turned it into a valuable form of oil.

Before the federal government built locks and dams on the Kanawha River to make it more navigable for coal barges like this one, coal oil production provided a way for mining companies along the river to make money. PHOTOGRAPHY: WEST VIRGINIA AND REGIONAL HISTORY COLLECTION, WEST VIRGINIA UNIVERSITY LIBRARIES


More than three decades ago,
President Gerald Ford met
with coal producers at the
White House to discuss his proposals to support their industry’s growth.
At a dinner held on March 21, 1975,
he extolled the virtues of coal as an
alternative to oil, whose supply had
been cut off for five months by
OPEC the year before.
“Coal represents one immediate
and dependable answer to the question of how we solve our energy needs
in this nation,” said Ford. “It represents an American answer, not one
based on uncertain resources in faraway lands with different ideas and
diverse interests. It represents our
hope for the future. Coal is America’s
ace in the hole.”
Since then, the federal government
has supported research and development of what is called “coal liquefaction” technology. Yet no American
liquefaction plants are in commercial
operation today.
You have to look as far back as the
mid-19th century to find a time when
coal was liquefied into a usable
form on a broad scale. One product
called “coal oil” was widely sold as a
lubricant to keep machines moving
and as a lamp fuel to
keep communities out
of darkness. Among
those who benefited
from the boom were
coal mining companies
along West Virginia’s
Kanawha River and
one of its tributaries,
the Elk River.
The boom would
last only a decade,
however. The emergence of petroleum as
a competing product and the disruptions caused by the Civil War during
the 1860s would eventually undermine
the market for coal oil. Still, the development of this commodity was an
important step in the development of
the nation’s energy supply. Its story
offers some lessons for those who view
coal liquefaction as a path away from
dependence on petroleum.

There’s Black Gold in Them Hills
If the United States is the “Saudi
Arabia of coal” as some have called it,
West Virginia would be one of the reasons for that designation. As early as
1742, coal was found in what was then
considered western Virginia. After the
turn of the 19th century, it heated
people’s homes, stoked the furnaces
of salt manufacturers, and powered
the steamboats that traversed the
Ohio River.
Still, through the mid-1800s commercial mining was limited in the
southern counties of the future
West Virginia. “Here [in the Kanawha
River Valley], in their attempts to
match capital, labor supply, and
transportation facilities to abundant
resources, mine operators experienced
advantages and obstacles commonly
encountered by other Southern industrialists,” wrote West Virginia
historian Otis Rice in the November
1965 issue of the Journal of Southern
History.
The problem was that it would be
decades before the Kanawha River
Valley had the canals and railroads to
ship coal in large quantities. “For
Kanawha coal the only outlet was the
Kanawha River, which was navigable
for only about six months each year,”
noted Rice. “Because of the hazardous
state of the [river] and the lack
of other transportation facilities,
Kanawha Valley coal producers made
very few attempts prior to 1850 to ship
coal out of the valley.”
Rice added, "As a result, the richness of the Kanawha resources, revealed by the findings of William Barton Rogers in the Old South's most thorough geological survey, stood out in sharp contrast with the valley's share of the nation's expanding coal market."

Playing Catch-Up
Transportation improvements helped open up coalfields in southern West Virginia starting in the 1880s, boosting the state's coal production. West Virginia would eventually catch up to Pennsylvania, the Appalachian Basin's leading coal producer, partially due to the latter's decline in anthracite mining.
[Chart: coal production, short tons (in millions), Pennsylvania and West Virginia, 1875-1995]
SOURCE: The COALPROD Database: Historical Production Data for the Major Coal-Producing Regions of the Conterminous United States, U.S. Geological Survey, March 2003
While most of the region’s coal was consumed locally,
Pennsylvania became the nation’s leading coal producer.
So, West Virginians knew they were sitting on an abundant natural resource. But how would they extract the value
of that “black gold” in a broader marketplace? Coal oil would
provide the answer.
By the mid-19th century, the Industrial Revolution had
transformed England and was spreading across the Atlantic
to the United States. All of the machinery that kept factories
humming needed lubrication, so people turned to oils
derived from animal fat and vegetables.
Coal oil turned out to be a good replacement for these
lubricants. It kept the parts of a machine moving, while lacking the acidity of organic lubricants that ate at metal and
accelerated wear and tear.
Another significant market for coal oil also emerged: illumination. People used a variety of substances in lamps and
street lights, from oils extracted from animal fat and plants
to concoctions like grain alcohol and camphene — a mixture
of alcohol and turpentine, plus camphor oil to improve its
odor. The downside of many chemical fuels was their
volatility, while oils derived from lard congealed if they
weren’t kept warm. One of the leading animal fat-derived
fuels, whale oil, became scarcer and more expensive as
fisheries along the East Coast were depleted of sperm
whales.
Coal already played a role in the illumination market, but
not in liquid form. So-called “manufactured gas” was
extracted from coal and used in lighting throughout the 19th
century. However, its use was limited to businesses, municipal streetlights, and wealthy households since it was
expensive. Also, burning manufactured gas produced soot
and a strong odor, which is one of the reasons why natural
gas and electricity would supplant it in the next century.
By the 1850s, coal oil stepped into the spotlight. “The
building blocks and more importantly, the economic incentives, were in place,” wrote lighting expert Daniel Mattausch
in a recent magazine article on the history of lamp fuels.
“With increasing frequency, inventors investigated three
materials, raw petroleum, cannel coal, and the tar left over
from the production of illuminating gas.”
Inventor-entrepreneurs on both sides of the Atlantic
found they could modify the production process for manufactured gas to yield a liquid byproduct that had several
advantages over existing lamp fuels. Derived from cannel
coal that had a high hydrogen content, coal oil burned
brightly and produced less residue than either whale or lard
oil. In addition, it resisted cold weather like whale oil and
was much cheaper — 50 cents a gallon versus $2.00 to $2.50
per gallon for the most desirable whale oils. Camphene sold
for about the same price as coal oil, but the latter wasn’t as
volatile.
Most importantly, coal oil was a value-added product that
enabled Cannelton and other communities near the


Kanawha and Elk rivers to take advantage of the cannel coal
deposits surrounding Charleston, W.Va. “There was still
a lot of bulk there” to transport, notes Mattausch in an
interview. But “it was a lot better than shipping tons of coal.”

Boom and Bust
As lamps that optimally burned coal oil were invented and
put on the market, coal oil cut into sales of whale oil. By
1860, dozens of coal oil plants were in operation in big cities
like Boston, New York, Cincinnati, and Pittsburgh.
In West Virginia, more than 40 companies secured charters for mining cannel coal (also known as “candle” coal)
along the Kanawha and its tributaries. “All but two or three
of these groups were formed after the discovery of cannel
coal at Cannelton, and the acts by which they were incorporated show clearly that the vast majority of them expected
to engage in the mining of cannel coal and in the manufacture of coal oil and other cannel-coal derivatives,” wrote
historian Otis Rice.
Not all of the mines yielded what their investors hoped,
while others found more profitable types of coal to mine and
sell. Still, at least six refineries produced coal oil in West
Virginia, including four in Kanawha County.
Despite using wasteful methods of liquefying coal,
noted Rice, the Cannelton factory managed to extract two
gallons of oil from each bushel of cannel coal. (According
to several accounts, that was about the best yield any producer could obtain.) Some of the oil was shipped to a
refinery in Maysville, Ky., about 180 miles northwest of
the factory, and some was likely shipped to Boston for sale
under contract.
Perhaps some version of coal oil would have continued to


fuel the Industrial Revolution and eventually fill the tank of
every automobile on the road today. Instead, petroleum
assumed that pivotal role in the nation’s economy.
Petroleum was skimmed off of ponds and scooped from
holes in the ground for centuries. (West Virginia salt miners
considered the oily substance a nuisance when they found
it.) The flammable liquid was used in weapons; as an additive
to mortar, paints, and adhesives; and as a remedy for itchy
skin and a variety of other ailments.
During the 19th century, lamps burned petroleum, but
only on a limited basis since it produced a lot of smoke and
a strong odor. Also, supplies were generally limited to locations where petroleum seeped out from underground
reservoirs.
Then a former railroad conductor named Edwin Drake was hired to drill a well in Titusville, Pa. He was searching
for “rock oil” as an alternative to whale oil and struck pay
dirt in August 1859. The subsequent rush led to the creation
of a petroleum-derived competitor to coal oil: kerosene.
The coal oil industry may have paved the way for the
petroleum industry. The backers of the Titusville oil well
sought the profits that coal miners were making from producing lamp fuel. Later, much of the refining and distribution infrastructure built for coal oil would be used for petroleum.
The Civil War shut down mining operations in the
Kanawha River Valley during the early 1860s, but mining
expanded significantly with postwar transportation
improvements.
“The Chesapeake & Ohio Railroad in 1873 and the
Norfolk & Western in 1881 opened up the southern West
Virginia smokeless coal reserves,” says C. Stuart McGehee,
chair of the history department at West Virginia State
University. “The Kanawha River was not channeled with
locks until the 1930s to allow serious barge traffic, now
mostly to electric power plants on the Ohio River." Indeed, by that time, other profitable uses for coal had emerged
besides oil for illumination.
What about the commercialization of the incandescent
light bulb around 1880? "Edison's light was a poor competitor with the gas and kerosene lights," says historian
Dan Mattausch. “It was much more expensive; it was something wealthy people showed off. Gas lights put out several
times the output of an electric light bulb and cost a fraction
of the price.” Eventually, though, electricity obviated the
need for coal oil, kerosene, or any fuel for lights.

Dethroning Petrol
The rise and fall of coal oil in West Virginia illustrates how
market demand can drive the development of new commodities. As the supply of one product approaches exhaustion and
becomes relatively expensive, companies are motivated to
find alternatives.
Today, hydrogen is often mentioned as the country's next-generation energy source. But it's hard for any new commodity to compete with petroleum, which has been a
relatively cheap transportation fuel for so long and has an
extensive production and distribution infrastructure developed around it.
There are a couple of historical examples of periods when other countries used coal to make oil. In both cases, the countries did so mainly because they were shut out of the more routine channels of supply for oil. Nazi Germany used coal liquefaction technology to keep its warplanes in the
air during World War II. South Africa used gasoline and
diesel fuel extracted from coal when the world turned its
back on the country during its age of apartheid.
If coal were turned into liquid fuel on a mass scale, the end product could utilize the same infrastructure as petroleum and be usable in the same vehicles by adding hydrogen to it
or removing carbon. However, using indirect liquefaction —
the same method South Africa has used — would produce
more than twice the amount of carbon dioxide as the production of diesel fuel.
Other countries are discovering there might be important investment opportunities for this sort of technology.
The Shenhua Group in China, the world’s largest coal producer, is building a plant in Inner Mongolia — 375 miles west
of Beijing — based on the Germans’ direct liquefaction
process. West Virginia University is working with the company to study the environmental and economic impact of
the plant.
With crude oil prices breaking new records, there could
be more talk in the United States of liquefying coal on a large
scale. Of course, the price of coal and the cost of building
liquefaction plants compared to the financial and ecological
costs of sticking with petroleum will ultimately be the deciding factors.
In the meantime, Dan Mattausch has some advice for
those who are pursuing coal liquefaction. “It won’t lead
where you think it’s going to go. The people working with
coal oil had no concept that they were preparing the way for
something they had never heard of.”
RF

READINGS
Beaton, Kendall. “Dr. Gesner’s Kerosene: The Start of American
Oil Refining.” Business History Review, March 1955, vol. 29,
no. 1, pp. 28-53.
Kasey, Pam. “Coal Oil Has Rich History in W.Va.” The State Journal,
July 27, 2006.
Kovarik, Bill. “Henry Ford, Charles F. Kettering and the Fuel of the
Future.” Automotive History Review, Spring 1998, vol. 32, pp. 7-27.


Mattausch, Daniel W. “A New Light, Part 1: The Origins of Paraffin
Oil Lighting.” The Rushlight, March 2008, vol. 74, no. 1, pp. 2-20.
Rice, Otis K. “Coal Mining in the Kanawha Valley to 1861: A View
of Industrialization in the Old South.” Journal of Southern History,
November 1965, vol. 31, no. 4, pp. 393-416.


BOOK REVIEW
GOOD CAPITALISM, BAD
CAPITALISM, AND THE
ECONOMICS OF GROWTH AND
PROSPERITY
BY WILLIAM J. BAUMOL, ROBERT E.
LITAN, AND CARL J. SCHRAMM
NEW HAVEN, CONNECTICUT: YALE UNIVERSITY PRESS, 2007
321 PAGES
REVIEWED BY STEPHEN SLIVINSKI

This intriguing book — aimed mainly at the educated nonspecialist — rests on two ideas. The first is that capitalism hasn't emerged in one specific form everywhere that it has taken root. Some countries exhibit forms of "bad capitalism" and others "good capitalism." The second is an effort to define the best sort of good capitalism and to identify which policy prescriptions can help bad capitalist systems transition to the good variety. Or, for that matter,
ensure that the good capitalist economies don’t backslide
into the other category.
The characteristic that makes all of the types of economic systems described in this book worthy of the title
“capitalism” is the right of individuals to hold private property. That, however, is where the similarities end. The
mechanisms that direct the productive energies and investments in each capitalist system are what differentiate one
form from another. Yet, teasing out which countries fit into
each category can sometimes be a tricky task, and one that
doesn’t really have a precise answer over time. Indeed, most
developed countries — including the United States —
exhibit characteristics of at least two of the four forms of
capitalism described in the book.
Looking at bad forms of capitalism, the first type is given
the name “state-guided capitalism.” A country subject to
this system is one in which government, not private
investors, decides which industries and firms will be winners
in the marketplace and public policy is designed to “direct
economic traffic,” in the authors’ description. Modern-day
China is one of the best examples, although as the authors
point out, even China does not exhibit a pure form of this
sort of top-down capitalism and is slowly moving away from
this model.
The second bad form is “oligarchic capitalism.” It’s similar to the state-guided version but the key element here is
that most of the property and wealth is held by a few firms
or owners. Another subtle difference between the two is
that in oligarchic systems, the stated goal of public policy is
patronage — think of Russia in the years immediately following the collapse of the Soviet Union, for instance. In a


state-guided system, however, the goal is presumably to
maximize economic growth.
In turning to good forms of capitalism, the authors first
describe what they call “big firm capitalism.” Here, the
economy is dominated by established large enterprises.
What makes the firms in this model different from those in
an oligarchic capitalist system is that ownership is widely
dispersed among many private shareholders.
“Entrepreneurial capitalism” is the second form of
good capitalism. In it, small innovative firms play the
most significant role in the economy. Dramatic innovation
distinguishes it from the incremental innovation that
characterizes big-firm capitalism.
The authors don’t jump to the conclusion that nations
should aspire only to entrepreneurial capitalism. Instead,
they suggest that the optimal capitalist system is a hybrid of
both the entrepreneurial and big-firm versions. As they
note, “no advanced economy can survive only with entrepreneurs (just as individuals cannot survive by eating just
one type of food). Big firms remain essential to refine and
mass-produce the radical innovations that entrepreneurs
have a greater propensity to develop or introduce.”
The book excels when it lays out the case for a new taxonomy of capitalism and how the world can be viewed more
coherently in light of it. Additionally, many of the authors’
policy prescriptions are sensible — such as tax reforms that
encourage more risk-taking, or lowering barriers to trade
and immigration to encourage economic competition.
But others might strike the reader as inconsistent with
the arguments made elsewhere in the book. For example,
the authors make a compelling case that an oversupply of
regulation is what trips up developing economies interested
in working their way into the ranks of the developed world.
Yet they don’t really grapple with the fundamental notion
that once the government’s power to regulate markets is let
out of the cage — even in a good capitalist economy — it is
often difficult to sufficiently leash it. This is a shortcoming
mainly because some of their proposals rest on the idea that
government policy can be used to keep the mix of big firms
and small firms at an optimal level. But this mix is the result
of a spontaneous process of market interactions, not one
that can be predicted by policymakers or even the firms
themselves.
Nor is it clear that a government entity could ever possess the ability to know what that optimal economic mix is
or to keep the policymaking process free from undue influence by one coalition or another. In the final analysis, the
lessons that economists have learned about how decisions
are made in legislative bodies or how economic organizations emerge should influence the reader’s appraisal of the
policy approaches proposed in this book.
RF


DISTRICT ECONOMIC OVERVIEW
BY SONYA RAVINDRANATH WADDELL

Economic activity in the Fifth
District grew at a somewhat
softened pace in the fourth
quarter of 2007 and the first quarter of
2008 as weakness in housing and retail
sales offset growth in other sectors
of the service-providing industries.
Employment and income continued to
grow, but at a slower pace.


Healthy Labor Markets
District labor market conditions
remained generally healthy at the end
of 2007 and into 2008. Payroll employment in the Fifth District grew 1.0
percent over the year ending in the
first quarter of 2008, a rate double the
0.5 percent national payroll growth
over the same period. Although household unemployment ticked up 0.3
percentage point over the six months
spanning the end of 2007 and the
beginning of 2008, Fifth District joblessness settled at 4.5 percent by the
end of March 2008 — a solid 0.6 percentage point below the national
jobless rate of 5.1 percent.
Employment growth over the year
reflected solid growth in the service-providing industries, although employment growth within this sector
was mixed. While education and
health services payrolls grew 3.0 percent, employment in the financial
activities and information sectors

declined 0.8 percent and 0.4 percent,
respectively. Other indicators of service-sector activity were also mixed
over the last quarter of 2007 and the
first quarter of 2008. According to our
surveys, while revenue growth in nonretail services firms was steady or
growing over the period, retail firms

experienced sizeable drops in sales, particularly of big-ticket items. Similarly, retail price growth picked up, particularly in the first three months of 2008, while overall service-sector price growth remained steady.
The biggest job losses in the Fifth
District were in the goods-producing
industries — manufacturing firms
shed 28,700 jobs for a 0.3 percent
decline in payrolls and mining and
construction shed 5,000 jobs for a 0.6
percent decline. Other manufacturing
indicators from our survey also
describe a cooling of activity as new

orders and shipments fell and demand for certain products weakened. Toward the end of the first quarter, however, there was some indication of potential turnaround in the manufacturing sector with growth in export demand.

Real Estate Weakens
Residential real estate activity weakened further in the final quarter of 2007 and first quarter of 2008. Permit issuance declined 34.5 percent over the year ending in March 2008, with declines of more than 20 percent reported in all Fifth District jurisdictions. Housing starts and home sales also declined across all jurisdictions, with 30.3 percent and 25.6 percent drops, respectively, in the Fifth District as a whole over the year.
Declining sales activity coincided with cooling house price growth. After many quarters of declining house price growth, Fifth District house prices fell outright over the first quarter of 2008 (0.1 percent) and since the first quarter of 2007 (0.3 percent). Although North Carolina, South Carolina, and West Virginia still saw growth in house prices over the quarter, prices in the District of Columbia, Maryland, and Virginia all declined, with D.C. and Maryland house prices experiencing declines of between 1.0 and 2.0 percent over the quarter and the year.

Households Faring Well
Household financial conditions remained solid in the last quarter of 2007 and the first quarter of 2008, as personal income continued to grow at a 1.0 percent annualized rate between the fourth quarter of 2007 and the first quarter of 2008 — approximately on par with national income growth. In addition, although mortgage delinquency rates rose in the final quarter of 2007, delinquency rates fell in every jurisdiction of the Fifth District in the first quarter of 2008.

Economic Indicators

                              1st Qtr. 2008    4th Qtr. 2007    Percent Change (Year Ago)
Nonfarm Employment (000)
  Fifth District              14,005           13,979           1.2
  U.S.                        137,917          138,031          0.9
Real Personal Income ($bil)
  Fifth District              949.0            946.8            2.2
  U.S.                        9,998.9          9,970.7          2.7
Building Permits (000)
  Fifth District              36.1             35.7             -24.2
  U.S.                        226.7            262.8            -26.1
Unemployment Rate (%)
  Fifth District              4.4%             4.3%
  U.S.                        4.9%             4.8%


[Charts: Nonfarm Employment (change from prior year), Unemployment Rate, and Real Personal Income (change from prior year), Fifth District vs. United States; Nonfarm Employment and Unemployment Rate (change from prior year) for the Charlotte, Baltimore, and Washington metropolitan areas; Building Permits (change from prior year), Fifth District vs. United States; FRB-Richmond Services Revenues Index; FRB-Richmond Manufacturing Composite Index; House Prices (change from prior year), Fifth District vs. United States. Panels cover first quarter 1997 through first quarter 2008 (the Services Revenues Index panel is labeled through fourth quarter 2008).]

NOTES:
1) FRB-Richmond survey indexes are diffusion indexes representing the percentage of responding firms reporting increase minus the percentage reporting decrease. The manufacturing composite index is a weighted average of the shipments, new orders, and employment indexes.
2) Metropolitan area data, building permits, and house prices are not seasonally adjusted (nsa); all other series are seasonally adjusted.

SOURCES:

Real Personal Income: Bureau of Economic Analysis/Haver Analytics.
Unemployment rate: LAUS Program, Bureau of Labor Statistics, U.S. Department of Labor,
http://stats.bls.gov.
Employment: CES Survey, Bureau of Labor Statistics, U.S. Department of Labor, http://stats.bls.gov.
Building permits: U.S. Census Bureau, http://www.census.gov.
House prices: Office of Federal Housing Enterprise Oversight, http://www.ofheo.gov.
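To make note 1 concrete, here is a minimal sketch of the diffusion index calculation in Python. The survey responses and the equal composite weights are hypothetical placeholders, since the survey's actual weights are not given here:

```python
# A minimal sketch of a diffusion index: the percentage of responding firms
# reporting an increase minus the percentage reporting a decrease.
# Responses and composite weights below are hypothetical examples.

def diffusion_index(responses):
    """responses: iterable of 'increase', 'decrease', or 'no change'."""
    responses = list(responses)
    n = len(responses)
    share_up = 100.0 * sum(r == "increase" for r in responses) / n
    share_down = 100.0 * sum(r == "decrease" for r in responses) / n
    return share_up - share_down

shipments = diffusion_index(["increase"] * 40 + ["no change"] * 35 + ["decrease"] * 25)
new_orders = diffusion_index(["increase"] * 30 + ["no change"] * 40 + ["decrease"] * 30)
employment = diffusion_index(["increase"] * 20 + ["no change"] * 60 + ["decrease"] * 20)

# The manufacturing composite is a weighted average of the three indexes;
# equal weights here are placeholders, not the survey's actual weights.
composite = (shipments + new_orders + employment) / 3
print(shipments, new_orders, employment, composite)  # 15.0 0.0 0.0 5.0
```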

For more information, contact Sonya Ravindranath Waddell at (804) 697-2694 or e-mail sonya.ravindranath@rich.frb.org

STATE ECONOMIC CONDITIONS
BY SONYA RAVINDRANATH WADDELL

District of Columbia
Economic conditions in the District of Columbia were
mixed in the last quarter of 2007 and into the first
quarter of 2008. Real estate conditions weakened as house
prices, permitting activity, housing starts, and existing home
sales fell and foreclosures grew over both quarters.
Nonetheless, despite the rise in the household unemployment rate, payrolls in the area grew and income growth
remained steady.

[Chart: U.S. and D.C. Employment Growth Since Jan. 2001, index (Jan. 2001 = 100), District of Columbia vs. United States. SOURCE: Nonfarm Payroll Employment, BLS/Haver Analytics]

Conflicting reports from the payroll and household
employment surveys indicated mixed conditions in the D.C.
labor market. Firms in the district added 4,900 jobs in the
fourth quarter of 2007 and 1,200 jobs in the first quarter of
2008, for 0.9 percent growth over the two quarters. Payroll
growth in professional and business services and government employment fueled much of the increase, although
government employment fell slightly in the first quarter of
2008. Despite the payroll increases, however, household
unemployment ticked up to 6.2 percent in the beginning of
2008 after remaining unchanged at 5.7 percent at the end of
last year. By the end of the first quarter of 2008, unemployment settled at 6.1 percent.
Housing market conditions in the District of Columbia
deteriorated further over the two quarters. Its House Price
Index fell 1.0 percent in the final quarter of 2007 and a
further 1.8 percent in the first quarter of 2008. This is the
first time that the jurisdiction has seen two straight quarters
of house price decline since 1997. In addition, new residential construction fell over the period as residential
permitting activity and housing starts dropped over the six
months. Furthermore, existing home sales in the district fell
13.0 percent in the fourth quarter and 5.0 percent in the first quarter, marking the third and fourth consecutive quarters of decline.

On a more positive note, households remained in decent financial condition overall, as real personal income growth advanced at a 1.8 percent annualized rate in the first quarter of 2008. In addition, although overall mortgage delinquency rates rose in the final quarter of 2007 (for the third consecutive quarter), they fell back in the first quarter of 2008. This decrease was spurred entirely by drops in the percentage of mortgages more than 30 days past due. The percentage of mortgages more than 90 days past due continued to creep up in the first quarter, for the seventh consecutive quarter. Meanwhile, the rate of foreclosures continued to climb to its highest level since 2001.

Maryland
According to recent data, Maryland's economy showed
signs of continued weakness in the real estate market,
although household employment and financial conditions
remained solid. Residential permitting activity, house
prices, and home sales all fell over the six months
spanning the end of 2007 and the beginning of 2008,
while mortgage delinquency rates and foreclosures were
up. On the other hand, payrolls grew at a healthy clip,
unemployment rates remained steady, and household
balance sheets were buttressed by growth in real
personal income.
[Chart: U.S. and MD Employment Growth Since Jan. 2001, index (Jan. 2001 = 100), Maryland vs. United States. SOURCE: Nonfarm Payroll Employment, BLS/Haver Analytics]

Labor markets in Maryland continued to advance toward the end of 2007 and into 2008. Firms added 6,400 jobs to the state economy (0.2 percent growth) in the last quarter of 2007, and 11,700 jobs (0.4 percent growth) in the first quarter of 2008. The latest period marks the 20th consecutive quarter of payroll growth in the state. The strongest and

most consistent growth over the six months was in the professional and business services, educational and health
services, and leisure and hospitality sectors. Only the manufacturing sector shed jobs over both quarters.
Unemployment rates in the state remained steady, between
3.5 percent and 3.6 percent throughout 2007 and into 2008.
At 3.5 percent unemployment in the first quarter of 2008,
Maryland tied with Virginia for the lowest unemployment of
all Fifth District jurisdictions.
Housing market conditions were less rosy. According to
the House Price Index, house prices fell 0.1 percent in the
last quarter of 2007 and 1.2 percent in the first quarter of
2008, for the second and third consecutive quarters of
decline. This marks the first time since 1994 that the state
has seen three straight quarters of house price decline.
House prices in the first quarter of 2008 were 1.3 percent
lower than year-ago levels, marking the first quarter of year-over-year decline since 1997.
In addition, according to the National Association of
Realtors, home sales have fallen for four consecutive quarters, although the last quarter’s decline of 0.6 percent is far
less than the previous three quarters of declines that each
exceeded 10.0 percent. The number of house foreclosures continued to rise over the last quarter of 2007 and into the first quarter of 2008, for the sixth and seventh consecutive quarterly increases.
Despite the clear contraction, the housing market
showed some positive signs. Although mortgage delinquency rates were up over the last quarter of 2007, peaking
at 5.7 percent (the highest rate since 2002), they fell to 5.2
percent over the first three months of 2008. Furthermore,
looking forward, residential permitting activity edged up 4.8
percent over the first three months of 2008 after having
dropped 25.3 percent in the fourth quarter of 2007. Finally,
real personal income continued to grow at an annualized 1.4
percent rate in the fourth quarter of 2007 and 1.5 percent
rate in the first quarter of 2008, indicating that household
financial conditions remain solid.

North Carolina
North Carolina's economy showed signs of both weakness and rebound toward the end of 2007 and into
2008. Labor market conditions softened somewhat in the
first quarter of 2008, with reduced growth in payrolls,
a jump in unemployment, and no income growth.
Nonetheless, housing market conditions showed signs
of improvement, with growth in house prices, increases
in residential permitting activity, and drops in mortgage
delinquencies and foreclosures.

[Chart: U.S. and NC Employment Growth Since Jan. 2001, index (Jan. 2001 = 100), North Carolina vs. United States. SOURCE: Nonfarm Payroll Employment, BLS/Haver Analytics]

Employment surveys indicated that while labor market
conditions improved in the final quarter of 2007, they softened heading into 2008. Payroll employment grew 0.6
percent (25,000 jobs) in the fourth quarter of 2007, but only
0.2 percent (9,600 jobs) in the first quarter of 2008. The
first-quarter increase marked the smallest number of jobs
added to the state economy in a quarter since early 2005.
The biggest loss in the first three months of 2008 was
in manufacturing (4,500 jobs). In addition, 400 professional
and business services jobs were lost in the first quarter of
2008 after 6,300 jobs were added in that sector in the final
quarter of 2007.
Household unemployment data painted a similar picture
to payroll data. State unemployment remained steady at
4.7 percent in the last quarter of 2007, but jumped up to
5.0 percent in the first quarter of 2008, for the highest
quarterly unemployment since 2005. Household earnings
reflected the employment data; real personal income rose
0.5 percent (1.8 percent annualized) in the last quarter of
2007 and was flat over the first quarter of 2008.
Data on real estate conditions, while mixed, indicated a
slight improvement in housing market conditions heading
into 2008. Although home sales dropped for the fourth consecutive quarter, house prices, according to the House Price
Index, continued on their 17-year growth streak and rose
1.1 percent in the fourth quarter of 2007 and 0.8 percent in
the first quarter of 2008. Although residential permitting
activity was down 19.9 percent in the fourth quarter (the
third straight quarter of decline), activity edged up 1.2 percent in the first quarter of 2008.
Finally, both mortgage delinquencies and foreclosures
fell in the first quarter of 2008. Mortgage delinquencies
declined 0.7 percentage point after three quarters of
increase, and the percentage of mortgage foreclosures initiated
edged lower by 0.04 percentage point after two quarters
of increase.

South Carolina
Economic conditions were mixed in South Carolina in the
six months spanning the end of 2007 and the beginning
of 2008. Payroll employment figures indicated a weakening
in the labor market, although unemployment declined over
the first three months of 2008. Conditions in the real estate
market varied, with house prices and home sales growing
while permit levels fell.

[Chart: U.S. and SC Employment Growth Since Jan. 2001, index (Jan. 2001 = 100), South Carolina vs. United States. SOURCE: Nonfarm Payroll Employment, BLS/Haver Analytics]
South Carolina’s labor markets weakened somewhat over
the last three months of 2007 and first three months of
2008. Firms added 200 jobs over the fourth quarter of 2007,
but shed 2,000 net jobs in the first quarter of 2008.
The starkest losses were felt in the construction sector
(5,000 jobs) and the trade, transportation, and utilities
sector (1,700 jobs). The first quarter of 2008 marked the
largest loss of jobs in a single quarter in South Carolina since
early 2003.
Although the number of employed persons in the state
declined over the first quarter, so did the number of
unemployed persons, and the steeper drop in the number of unemployed led to a 0.3 percentage point drop in the unemployment rate (after a 0.3 percentage point rise in the fourth
quarter of 2007). South Carolina, therefore, ended the first
quarter of 2008 with the same 5.8 percent unemployment
rate recorded in the third quarter of 2007.
Household balance sheets seemed to be improving, as
mortgage delinquencies rose 0.3 percentage point in the
final quarter of 2007, but then dropped 0.9 percentage point
in the first three months of 2008. Foreclosures also declined
in the first quarter of 2008 after two quarters of increase.
In the aggregate, real personal income levels continued
along their five-year growth streak at a moderate pace, with
a 0.9 percent annualized increase in the final quarter of 2007


and a 0.4 percent increase in the first quarter of 2008. In
per-capita terms, however, real quarterly personal income
levels fell over the period by 0.5 percent and 1.1 percent
annualized in the end of 2007 and the beginning of 2008,
respectively.
In the real estate market, conditions were mixed. On the
one hand, house prices, according to the House Price Index,
continued along their 17-year growth streak, with 1.8 percent
house price growth in the final quarter of 2007, and 0.9 percent growth in the first quarter of 2008. Similarly, home
sales edged up 3.1 percent in the first quarter after two quarters of decline. On the other hand, permit levels fell 12.8
percent in the fourth quarter and a further 3.9 percent in the
first quarter. This left South Carolina with its largest year-over-year decline in permit levels in any quarter since 1989.

Virginia
In Virginia, economic conditions remained somewhat
shaky in the last quarter of 2007 through the first quarter
of 2008. The Commonwealth experienced tepid payroll
growth and increases in unemployment. Real income and
mortgage delinquencies showed signs of improvement, but
with high delinquency rates, increases in foreclosure rates,
declining house prices, and falling home sales, Virginia’s
housing sector remained in a somewhat weakened state.

[Chart: U.S. and VA Employment Growth Since Jan. 2001, index (Jan. 2001 = 100), Virginia vs. United States. SOURCE: Nonfarm Payroll Employment, BLS/Haver Analytics]

Firms added 1,800 jobs in the final quarter of 2007 and
4,700 in the first quarter of 2008. Although this is the 18th
consecutive quarter of payroll increases, these are some of
the lowest quarterly payroll increases that the state has seen
since the 2001-2003 period. The goods-producing sector
continued to shed jobs as the construction and
manufacturing sectors lost 1,700 jobs and 1,800 jobs, respec-

9/11/08

11:51 AM

Page 45

tively, in the first quarter of 2008. The tepid payroll growth, then, can partially explain a steadily increasing household
unemployment rate that reached 3.2 percent in the final
three months of 2007 and 3.5 percent in the first three
months of 2008. Virginia is still tied with Maryland,
however, for the lowest unemployment in the Fifth District.
Real personal income fell slightly in the fourth quarter of
2007 but rebounded in the first quarter of 2008, buttressing
household balance sheets. Similarly, mortgage delinquencies
fell in the first quarter after three quarters of increase. Still,
fourth-quarter delinquencies hit their highest point since
2001. In addition, the percentage of mortgages past due by
90 days or more rose to 1.22 percent — a series high by a
wide margin. Foreclosure rates continued to grow, hitting a
record high of 0.7 percent in the first quarter of 2008.
The housing market remained sluggish although there
were signs of possible turnaround. The quarterly decline in
home sales flattened to zero in the first quarter of 2008 —
home sales were down 24.8 percent since the first quarter of
2007. Meanwhile, according to the House Price Index,
Virginia house prices fell 0.2 percent for the third consecutive quarter of decline, marking the first quarter of
year-over-year decline since 1995. Looking forward, however,
permitting activity grew 9.1 percent in the first quarter, after
two straight quarters of decline.

West Virginia
Economic conditions in West Virginia showed some signs
of improvement in the six months spanning the end of
2007 and the beginning of 2008. Although there was little
change in employment and some weakness in the state’s
housing markets, the state did see some encouraging indicators, including positive real income growth, reduced
delinquency and foreclosure rates, and relatively steady
house price appreciation.
[Chart: U.S. and WV Employment Growth Since Jan. 2001, index (Jan. 2001 = 100), West Virginia vs. United States. SOURCE: Nonfarm Payroll Employment, BLS/Haver Analytics]

Data on West Virginia labor markets indicated that conditions have changed little since the third quarter of 2007. Firms added 1,100 jobs in the fourth quarter of 2007 and 400 jobs in the first quarter of 2008, for just under 0.2 percent growth and 0.1 percent growth, respectively. Relatively small gains and losses were felt across sectors in

both quarters, with only the construction and manufacturing sectors experiencing declines in both quarters while
educational and health services and leisure and hospitality
increased in both quarters. Not surprisingly, then, household
unemployment surveys indicated that unemployment rates
remained at 4.6 percent for both quarters — down from
4.7 percent in the third quarter of 2007.
The housing market was steady in West Virginia,
although there was a strong decline in residential permitting
activity in the first quarter of 2008. According to the House
Price Index, house prices increased 0.8 percent in the
first quarter of 2008 after edging up 0.5 percent in the fourth
quarter of 2007. Home sales were also up in both the
fourth quarter and in the first quarter. On a less positive
note, first-quarter residential permitting activity was down
after growth in the final quarter of 2007. In fact, the last
quarter marked the largest quarterly decline in permitting
activity since 1997.
Household financial conditions in West Virginia were
similar to, though slightly better than, those in the third
quarter of 2007. Real personal income grew slightly
(0.4 percent annualized) in the final quarter of 2007 and flatlined in the first quarter of 2008, as did per-capita income.
After three quarters of increase, mortgage delinquency rates
fell 1.1 percentage points in the first quarter. Foreclosure
rates also fell slightly.

For the latest in Fifth District economic conditions,
check out our new regional Snapshot publication at
www.richmondfed.org/research/regional_conditions/snapshot


State Data, Q1:08

                                                     DC         MD         NC         SC         VA         WV
Nonfarm Employment (000's)                           700.8      2,630.3    4,187.3    1,958.0    3,770.1    758.3
  Q/Q Percent Change                                 0.2        0.4        0.2        -0.1       0.1        0.1
  Y/Y Percent Change                                 1.2        1.0        1.6        1.0        0.4        0.3
Manufacturing Employment (000's)                     1.6        130.8      531.4      247.6      273.3      57.9
  Q/Q Percent Change                                 -4.0       -0.3       -0.8       -0.4       -0.7       -1.1
  Y/Y Percent Change                                 -5.9       -1.6       -2.8       -1.2       -2.9       -2.7
Professional/Business Services Employment (000's)    156.0      402.6      508.1      223.8      648.5      61.3
  Q/Q Percent Change                                 0.1        0.5        -0.1       -2.6       0.3        1.4
  Y/Y Percent Change                                 1.6        1.9        3.4        0.7        1.4        2.1
Government Employment (000's)                        233.4      482.3      702.8      341.0      691.8      145.0
  Q/Q Percent Change                                 -0.1       0.5        0.0        0.7        0.6        -0.1
  Y/Y Percent Change                                 1.2        1.3        1.9        1.9        1.4        -0.2
Civilian Labor Force (000's)                         331.3      2,994.0    4,541.5    2,138.0    4,099.5    812.7
  Q/Q Percent Change                                 1.1        0.1        0.2        -0.5       0.4        0.3
  Y/Y Percent Change                                 1.9        0.6        0.8        0.4        1.8        0.7
Unemployment Rate (%)                                6.1        3.5        5.0        5.8        3.5        4.6
  Q4:07                                              5.7        3.6        4.7        6.1        3.2        4.6
  Q1:07                                              5.7        3.6        4.5        5.8        2.9        4.4
Real Personal Income ($Mil)                          30,834.3   221,698.7  261,262.1  117,411.6  272,116.5  45,715.0
  Q/Q Percent Change                                 0.5        0.4        0.0        0.1        0.4        0.0
  Y/Y Percent Change                                 2.0        1.2        1.3        1.7        0.8        0.6
Building Permits                                     153        3,662      16,119     7,060      8,247      849
  Q/Q Percent Change                                 -7.8       4.8        1.2        -3.9       9.1        -32.0
  Y/Y Percent Change                                 -81.7      -32.9      -30.4      -32.8      -14.5      -3.6
House Price Index (1980=100)                         652.0      532.5      346.4      327.8      471.4      235.8
  Q/Q Percent Change                                 -1.8       -1.2       0.8        0.9        -0.2       0.8
  Y/Y Percent Change                                 -1.5       -1.3       4.0        3.8        -0.1       2.5
Sales of Existing Housing Units (000's)              7.6        68.0       181.6      94.4       102.0      29.6
  Q/Q Percent Change                                 -5.0       -0.6       -2.2       3.1        0.0        7.2
  Y/Y Percent Change                                 -34.5      -38.6      -25.6      -16.9      -24.8      -11.9

NOTES:
Nonfarm Payroll Employment, thousands of jobs, seasonally adjusted (SA) except in MSAs; Bureau of Labor Statistics (BLS)/Haver Analytics. Manufacturing Employment, thousands of jobs, SA in all but DC and SC; BLS/Haver Analytics. Professional/Business Services Employment, thousands of jobs, SA in all but SC; BLS/Haver Analytics. Government Employment, thousands of jobs, SA; BLS/Haver Analytics. Civilian Labor Force, thousands of persons, SA; BLS/Haver Analytics. Unemployment Rate, percent, SA except in MSAs; BLS/Haver Analytics. Building Permits, number of permits, NSA; U.S. Census Bureau/Haver Analytics. Sales of Existing Housing Units, thousands of units, SA; National Association of Realtors®.



Metropolitan Area Data, Q1:08

                              Washington, DC MSA    Baltimore, MD MSA    Charlotte, NC MSA
Nonfarm Employment (000's)    2,41.5                1,307.9              862.1
  Q/Q Percent Change          -1.0                  -1.9                 -1.2
  Y/Y Percent Change          1.0                   0.7                  2.0
Unemployment Rate (%)         3.4                   3.9                  5.4
  Q4:07                       3.0                   3.6                  4.7
  Q1:07                       3.1                   3.9                  4.6
Building Permits              4,388                 1,232                3,583
  Q/Q Percent Change          -10.1                 -8.1                 -6.0
  Y/Y Percent Change          -31.3                 -29.3                -36.4

                              Raleigh, NC MSA       Charleston, SC MSA   Columbia, SC MSA
Nonfarm Employment (000's)    522.8                 297.8                368.7
  Q/Q Percent Change          -0.9                  -0.9                 -1.1
  Y/Y Percent Change          3.9                   2.0                  2.0
Unemployment Rate (%)         4.1                   4.7                  5.1
  Q4:07                       3.5                   4.8                  5.3
  Q1:07                       3.6                   4.6                  5.3
Building Permits              3,117                 1,333                1,036
  Q/Q Percent Change          8.8                   7.9                  -10.1
  Y/Y Percent Change          -23.3                 -6.4                 -38.1

                              Norfolk, VA MSA       Richmond, VA MSA     Charleston, WV MSA
Nonfarm Employment (000's)    768.0                 631.0                149.1
  Q/Q Percent Change          -1.5                  -1.1                 -0.9
  Y/Y Percent Change          0.8                   0.6                  0.4
Unemployment Rate (%)         4.1                   3.9                  4.6
  Q4:07                       3.3                   3.2                  3.7
  Q1:07                       3.2                   3.1                  4.7
Building Permits              1,363                 1,671                39
  Q/Q Percent Change          10.6                  28.4                 5.4
  Y/Y Percent Change          -35.5                 -7.6                 -48.0

For more information, contact Sonya Ravindranath Waddell at (804) 697-2694 or e-mail sonya.ravindranath@rich.frb.org



OPINION
More Theory, Please
BY JOHN A. WEINBERG

This issue of Region Focus features a special section on the "state of modern economics." Some of the articles report on debates regarding the direction of the discipline. Are economists asking the right questions? Are they employing the right methods? Is their work relevant to the general public? In short, has the economics profession lost its way?

These are all healthy questions to ask. Economists shouldn't blindly proceed, simply assuming that they are heading in the right direction. Occasionally, a little self-reflection is necessary.

But, I think, the hand-wringing that some economists have been doing is overwrought. I am sympathetic to the argument that much economic research should ultimately have policy relevance. After all, I work as an economist in the Federal Reserve System. That doesn't mean, however, that this research must eschew formal, mathematical modeling. Quite the opposite.

Economics has a long and storied past. The classical economists of the 18th and 19th centuries had valuable insights, many of which remain relevant today. Those insights, however, can be made more precise by the use of contemporary formal methods. Such methods also permit us to find new implications or limitations to the classical economists' ideas.

Indeed, the formal methods of modern economic theory are essential to policymaking. To take just one example, let's consider central bank lending. In the wake of the credit market turmoil that began in 2007, the Fed and other central banks expanded their provision of credit to the financial system. It is impossible to understand the arguments for or against such actions without reference to a theory of how financial markets function and under which conditions they may malfunction. Moreover, without the tools of formal analysis we cannot determine whether the theory on which we are basing our policy choices "makes sense," or exactly what assumptions are needed to make its logic correct.

Of course, the relevance of a theory relies on its ability to explain some observable facts — that is to say, data. This requires the use and refinement of formal quantitative tools. For policymakers, the desirability of various choices often comes down to questions of magnitude. How big of a change will a certain policy choice produce?

Consider the issues of subsidies and taxes. Economists know that when you subsidize something, you are likely to get more of it, and when you tax something, you are likely to get less of it. That's helpful to know, but it often isn't sufficient. For instance, you might wish to subsidize a certain activity — say, education — if you believe it yields positive externalities. But, first, you want to know how large those externalities will be — and if there would be a more efficient way to achieve them. To determine this, we must turn to contemporary tools of data analysis.

For macroeconomists, the "New Keynesian" framework has become a workhorse model for policy analysis. It is rich enough to generate quantitative predictions about how key macroeconomic indicators are likely to perform under alternative settings. This, of course, is key to central bankers and other policymakers. But even as this framework is used extensively, researchers are studying its limitations. This will ultimately lead to even better models and better policy.

In the introductory article to the special section, the question is asked: "Why isn't there a Milton Friedman today?" While this question might seem to suggest that modern economists have gotten bogged down in mathematical minutiae to the detriment of speaking to the public, I think the answer actually lends support to my argument. It's true that most people know Friedman from his Newsweek columns, his books Capitalism and Freedom and Free to Choose, and his television appearances. He wanted to directly address the public, and he did so eloquently. But you can't divorce his popular work from his academic work. Friedman was first and foremost a great economist. It was his technical, scientific contributions that informed his policy views and popular writings, not the other way around. Without Friedman the mathematically inclined economist, there likely would not have been Friedman the influential policy analyst.

We can question whether the economics profession adequately rewards speaking to the public — there is, in my opinion, an important role for economic education — but that public communication will be most valuable if it reflects recent advances in economic theory and quantitative analysis. The discipline is making significant strides in understanding a wide range of economic phenomena. We should not abandon that work. Rather, the trick is to effectively communicate it to a broad audience. It's not an easy task, but one well worth pursuing.
RF

John A. Weinberg is senior vice president and director of research at the Federal Reserve Bank of Richmond.


NEXT ISSUE
Homeownership

Increasing the homeownership rate in the United States has been a primary policy goal for decades. But despite various government policies that subsidize investment in owner-occupied real estate, the rate of homeownership hasn’t budged much in the past 30 years. Meanwhile, few stop to question whether tilting the playing field in favor of homeowners has any downside. Is it possible that the homeownership rate in America is actually too high?

Interview

We talk to Joseph Gyourko of the Wharton School of Business at the University of Pennsylvania about urban economics and the housing market.

Carbon Trading

The debate over the best way to reduce carbon emissions has boiled down to a choice between a carbon tax and a cap-and-trade system that uses market forces to determine which companies should curtail their carbon emissions the most. Both approaches have advantages and drawbacks. Yet, while economists debate the issue, state governments in the Northeast have embarked on an ambitious effort to institute a carbon permit trading market in their region. The results could have dramatic implications for the future of environmental policy in the Fifth District.

Book Review

Russell Roberts’ new book The Price of Everything: A Parable of Possibility and Prosperity shows that fiction can be an effective way to illustrate basic economic principles.

Jargon Alert

Not sure of the best way to motivate your employees? Then you have a “principal-agent problem.”

Immigrant Entrepreneurs
A recent study noted that a quarter of all the companies
started with venture capital in the United States over the past
15 years had at least one founder who was born outside of the
country. Is there something inherent in the experience of many
immigrants that makes them uniquely suited to becoming
entrepreneurs?

Visit us online:
www.richmondfed.org
• To view each issue’s articles
and web-exclusive content
• To add your name to our
mailing list
• To request an e-mail alert of
our online issue posting


Savings and Retirement

How financially prepared are Americans when it comes to retirement? The Federal Reserve Bank of Richmond’s 2007 Annual Report attempts to answer this in the feature essay, “Are We Saving Enough? Households and Retirement.”

Authors John Weinberg, senior vice president and director of research, and economics writer Doug Campbell
examine the extent to which Americans are financially
prepared for retirement. In the essay, they describe the
results of careful studies on American saving habits. Review
of the research and data shows that most families nearing
retirement are saving adequately — or doing the best they can
given their lifetime expected incomes. However, estimates
about savings adequacy depend on the assumption that
entitlement program benefits will be forthcoming.
Population aging and the movement of baby boomers into
retirement challenge that assumption and may portend a
future in which some people would need to consume less
than they would otherwise. The authors describe the trade-offs involved with various approaches, and conclude that the
sooner we settle on a solution to our federal retirement
problems the better off all generations are likely to be.

The annual report also covers the region’s economy, as well as the Bank’s operational and financial activity,
and takes a special look at the Richmond Fed’s involvement
in Fifth Federal Reserve District communities.

The 2007 Annual Report is available on the Bank’s Web site at
www.richmondfed.org, or by contacting the Bank’s Public
Affairs Department at (804) 697-8109.

Federal Reserve Bank
of Richmond
P.O. Box 27622
Richmond, VA 23261


Please send address label with subscription changes or address corrections to Public Affairs or call (804) 697-8109.