
Let’s Make It Clear:

How Central Counterparties Save(d) the Day*
BY CYRIL MONNET

The bankruptcy of Lehman Brothers in 2008
will certainly be featured in history books as
one of the greatest financial failures so far,
but it will also be recorded as yet another
episode of the historically successful performance of
clearing arrangements in ensuring the resiliency of
markets. Recognizing the usefulness of safe and sound
clearing and settlement procedures, the Federal Reserve
has recently supported the attempt to shift the clearing of
some contracts to a central counterparty. In this article,
Cyril Monnet outlines the arguments in favor of central
counterparty clearing, the economic rationale for trade
clearing through a central counterparty, and some possible
limits to the advantages of clearing trades through a
central counterparty.

Following the bankruptcy of Lehman Brothers in September 2008, market participants were worried that Lehman’s positions of more than $500 billion would take ages to unwind. Lehman’s creditors did not know whether they would be able to recover all of the funds from their positions with Lehman or whether they would have to write them down. The uncertainty surrounding the unwinding process put the market in a frenzy. However, the unwinding of Lehman’s positions was concluded in the following month. In the process, the major clearinghouses, LCH.Clearnet in the UK and DTCC in the U.S., restored some market confidence in only a few days, after their actions made it clear that unwinding Lehman’s positions would be a smooth process.

Cyril Monnet is a senior economic advisor and economist in the Philadelphia Fed’s Research Department. This article is available free of charge at www.philadelphiafed.org/research-and-data/publications/.
*The views expressed here are those of the author and do not necessarily represent the views of the Federal Reserve Bank of Philadelphia or the Federal Reserve System.

Lehman’s bankruptcy will certainly be featured in history books as one of the greatest financial failures so far, but it will also be recorded as yet another episode of the historically successful performance of clearing arrangements in ensuring the resiliency of markets. Clearing
and settlement systems had previously
come under severe stress during the
stock market crash of 1987. However,
as Ben Bernanke noted about that
particular crisis, before he was
appointed Chairman of the Federal
Reserve, clearing and settlement
systems, with the help of the Federal
Reserve, played a pivotal role in easing
liquidity conditions.1
Recognizing the usefulness of safe
and sound clearing and settlement
procedures, the Federal Reserve has
recently supported the attempt to shift
the clearing of some contracts, such
as credit default swap contracts,2 to a
central counterparty. In October 2008,
the Federal Reserve Bank of New
York stated that setting up a central
counterparty for credit default swaps
was one of its priorities for addressing
both operational and market design
concerns for over-the-counter (OTC)
derivatives.3
To make sense of this policy, we
need to understand the arguments in
favor of central counterparty clearing.
What is the economic rationale for introducing trade clearing through a central counterparty? How did market participants come to use central counterparty clearing in the first place? And are there limits to the advantages of clearing trades through a central counterparty?

1 In the 2008 crisis, the Fed also played a crucial role by taking on the credit default swaps of the insurer AIG. We can only speculate on the outcome had the Fed allowed AIG to default on its obligations.

2 A credit default swap (CDS) is an insurance contract whereby the buyer receives insurance on a credit instrument’s failure to pay — for example, a bond or loan — in exchange for a series of payments to the seller.

3 See New York Fed, 2008.
THE CLEARING AND
SETTLEMENT SYSTEM
In the 1987 movie “Wall Street,”
the opening scene shows the trading
room of an investment bank, and
the brokers are scrambling for trades.
Many of the brokers are shouting about
hot leads and talking on the phone
to clients, advising them to dump or
buy certain stocks. The scene, which
shows both the chaos and drama of
the trading room, underlines this
aspect of Wall Street: Brokers can
make a fortune by just taking a few
hundredths of a cent for each trade they
conduct.
But what happens once the
brokers hang up the phone? Then it is
time for the much less glamorous world
of clearing and settlement, also known
as the back office. And a central
counterparty (CCP) is one piece of
the larger clearing and settlement
puzzle. (See the Glossary of Terms for
definitions of some of the terminology
used in this article.)
To understand where clearing,
settlement, and CCPs fit into the
trading process, I will now take
you through the different stages of
a typical trade. A series of figures
will accompany my explanation. For
simplicity, consider an example with
three traders: Ace (A), Bull (B),
and Conservative (C), who wish to
place bets on the financial viability
of Direstraits, Inc. (D). Why do the
traders want to place these bets?
Broadly, there are two reasons: Some
traders may be hedging their exposure
to Direstraits; for example, one of
Direstraits’ lenders might want to
limit its losses in the event of a loan
default. Other traders may have (or
believe they have) information about
Direstraits’ prospects. These traders
are called speculators because they
seek to exploit price movements to
make large gains in a very short time.
In the first stage, the trading stage,
all traders agree on the terms of their
trade. To be concrete, I will use the
following contract (a simplified credit
default swap, or CDS): If Direstraits
goes bankrupt, the CDS seller agrees
to pay the buyer $5. No seller would
make this promise for free. I will
therefore assume that the buyer must
pay the seller $1 today (the price of
the contract). Often, it is convenient
to use the term counterparty when we
don’t want to be specific about whether
we’re talking about buyers or sellers.
For example, Bull is Ace’s counterparty
and Ace is Bull’s counterparty.
In my example, Ace sells two
contracts to Bull (Ace agrees to
pay $10 to Bull if Direstraits goes
bankrupt), Bull sells four contracts to
Conservative (Bull agrees to pay $20
to Conservative if Direstraits goes
bankrupt), and Conservative sells
three contracts to Ace (Conservative
agrees to pay $15 to Ace if Direstraits
goes bankrupt). It is important to note
that, at this stage, no cash changes
hands; negotiation on the terms of the
contract is all that’s going on. In the
movie, this is when brokers speak (or
rather scream) on the phone and write

their trades on tickets. What do they
do with these tickets? They send them
for clearing.
In the second stage, the clearing
stage, the terms of the trades (as
specified on the tickets) are written
down in three formal contracts that
Ace, Bull, and Conservative must
verify. Once the terms are approved,
the contracts become legally binding.
The traders can add other clauses,
such as the obligation to pledge
collateral. For instance, the contract
may require a seller to put $1 of
cash in a margin account for each
$2 it promises to pay in the event of
Direstraits’ bankruptcy.
If traders carry out their
transactions through a CCP, there
is a third stage: the CCP clearing
stage. In this stage, the three original
contracts are replaced by six new
contracts. The essential terms of the
original contracts stay the same, but
the CCP becomes the buyer to every
seller and the seller to every buyer (this
process is also known as novation).
In our example, if Direstraits goes
bankrupt, Ace now has to pay $10
to the CCP, and the CCP has to pay
$10 to B, etc. The CCP may also add
clauses, such as an additional collateral
requirement.
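To make the novation step concrete, here is a minimal Python sketch (the data layout and the function name are illustrative assumptions, not anything from the article). It replaces each bilateral contract with two CCP-facing contracts, so the three original contracts become six:

    # A minimal sketch of novation: each bilateral contract (seller, buyer,
    # amount due if Direstraits goes bankrupt) is replaced by two contracts
    # in which the CCP is the buyer to every seller and the seller to every buyer.
    def novate(contracts, ccp="CCP"):
        novated = []
        for seller, buyer, amount in contracts:
            novated.append((seller, ccp, amount))  # the seller now owes the CCP
            novated.append((ccp, buyer, amount))   # the CCP now owes the buyer
        return novated

    originals = [("Ace", "Bull", 10), ("Bull", "Cons", 20), ("Cons", "Ace", 15)]
    for contract in novate(originals):
        print(contract)  # six contracts, each with the CCP on one side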
Finally, at the settlement stage,
obligations must be executed per the
agreed terms. Here, cash changes
hands from the buyer of the contract
to the seller. Also, in our example,
settlement occurs if Direstraits
goes bankrupt during the life of the
contract. Depending on the contract
specifications, the settlement stage
can extend months after the contract
is cleared. Figure 1 illustrates the
payments due once the contracts
are cleared and in the case in which
Direstraits goes bankrupt, with and
without CCP clearing.
FIGURE 1
A. No CCP Clearing; B. CCP Clearing. Each panel shows the payments due once the contracts are cleared and the payments due if Direstraits goes bankrupt during the life of the contract, without (A) and with (B) CCP clearing.

CCP clearing is therefore only an additional step between clearing and settlement. However, this step is
not without consequences, and the
next section will explain its use and
how traders came to introduce CCP
clearing.
EVOLUTION OF CLEARING
AND SETTLEMENT
MECHANISMS TOWARD
CENTRAL COUNTERPARTY
CLEARING
Trade is at the heart of an
exchange-based economy. However,
counterparties to a trade may not
make their promised payments; goods
may be of dubious quality or, at worst,
they may not be delivered. The risk
that a counterparty will not fulfill its
end of the contract is called counterparty risk. As in the example above, I
will concentrate on counterparty risk
when Direstraits goes bankrupt.
There are obvious ways to reduce
these counterparty risks, but they
cost time and money, and they may
also limit the choices available to a
counterparty. One way is to rely on the
reputation of trading partners. However, building a reputation takes time,
and relying on reputation restricts

competition because it is hard for a
new entrant without an established
reputation to compete. Also, reputation will not guarantee performance by
a trader that is under enough financial stress; that is, a trader may be so
desperate that it doesn’t really have the
luxury to think about tomorrow.
A second way to limit counterparty risk is to impose a collateral
requirement. Pledging a sufficiently
large amount of collateral can limit
counterparty risk or even completely
eliminate it, if, for instance, the margin
account covers all future payments.
Unfortunately, the funds in the margin
account are not available for other
investments that might be more profitable.4 Also, traders could monitor their
counterparties, but this requires a lot
of time and resources.

4 Collateral also involves a significant cost (and benefit). As Gary Gorton points out, “For the party calling for collateral, collateral becomes a form of funding. Because [interest] is paid on collateral, firms receiving collateral can fund themselves...when issuing debt in the market would cost them much more. This is one reason that the scramble for cash in the form of collateral calls is very important. In fact, it is difficult to convey the ferocity of the fights over collateral” (p. 66).
Conscious of the importance of
managing counterparty risk, market participants introduced several
modifications to clearing arrangements
aimed at reducing counterparty risk in
the mid to late 19th century. Improvements have occurred incrementally;
however, James Moser, in his study,
outlines three particular steps in the
historical evolution of clearing and
settlement mechanisms. While what
follows is a broad historical description
based largely on his work, some of the
earlier and simpler arrangements are
still used.
Direct Settlement. The first
settlement mechanism is direct settlement. This is the most casual method
of settling trades, since settlement is
limited to the original counterparties.
An example of direct settlement is
when you pay cash to buy a newspaper
at a kiosk. Direct settlement works well
in that case, because if you can’t pay,
you don’t get the newspaper (and conversely, if you don’t get the newspaper,
the merchant doesn’t get the money).
In our trading example, direct settlement is represented in Figure 2. The
arrows denote the flow of payments
due, in the case in which Direstraits
goes bankrupt. Under our scenario,
Ace pays $10 to B, and so forth. However, this assumes everything is going
according to plan.
In reality, in the event that Direstraits does go bankrupt, Bull (for example) has a choice: Either Bull makes
the promised payment to C, or Bull
can choose to default. It is important
to note here that B’s financial condition is not part of our simple CDS
contract. In particular, in our simple
example, Conservative can’t opt out of
the contract, even if B’s ability to pay
deteriorates.
This has several consequences.
First, C’s expected losses may accumulate if B’s financial condition declines.
Second, Bull might gamble on resurrection, that is, take a big risk in the
slim hope of recovery. To limit losses,
the contract may require that Bull
place money in a margin account with

C, depending on B’s financial condition, for example, as measured by its
credit rating. While collateral limits
losses, it introduces another problem:
If Direstraits does not go bankrupt
during the life of the contract, Conservative may be tempted to delay or
refuse to return B’s collateral. So direct
settlement is prone to counterparty
risk, and collateral may not work very
well with direct settlement.

Ring Settlement. The second
mechanism for settling trades is called
ring settlement. The purpose of a ring
settlement is to allow multilateral
netting — that is, the canceling of
payments of offsetting trades — by extending the set of counterparties that
can settle a single contract.
Let’s see how our traders might
organize a ring, in which the net
obligations replace the obligations of
the original contracts. In the event
that Direstraits goes bankrupt, Bull
has promised to pay out $20, and it has
been promised $10, so its net payment
is $10. Ace receives a net payment of $5 (it promised to pay $10 and has been promised $15) and similarly for Conservative. Therefore, Bull pays Ace and Conservative $5 each (Figure 3).

FIGURE 2
Direct Settlement: if Direstraits goes bankrupt, Ace pays Bull $10, Bull pays Conservative $20, and Conservative pays Ace $15.

FIGURE 3
Ring Settlement: after netting, Bull pays $5 each to Ace and Conservative.
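To verify the netting arithmetic in one place, here is a minimal Python sketch (the function name and data layout are illustrative). It confirms that Bull’s net obligation is $10 while Ace and Conservative are each owed $5:

    from collections import defaultdict

    def net_positions(contracts):
        # Each trader's receipts minus payments across all contracts.
        net = defaultdict(int)
        for payer, payee, amount in contracts:
            net[payer] -= amount
            net[payee] += amount
        return dict(net)

    # Gross obligations if Direstraits goes bankrupt.
    gross = [("Ace", "Bull", 10), ("Bull", "Cons", 20), ("Cons", "Ace", 15)]
    print(net_positions(gross))  # {'Ace': 5, 'Bull': -10, 'Cons': 5}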
Ring settlement requires standard
or fungible — that is, easily substitutable — contracts to allow one member to substitute for another. In our
example, Ace, Bull, and Conservative
can form a ring, since they all trade
the same contract (albeit a different number of contracts). The main
benefit of netting is that it reduces the
cost of open positions and, thus, the
costs of a counterparty defaulting. If
Direstraits defaults, Bull has to find
only $10, while Ace and Conservative
do not need any cash at all.
Ring settlement has three main
drawbacks. First, each member must
monitor all of the others, since any
member may be a substitute for the
original counterparty. Second, since
ring members may have to monitor
each other’s positions, traders cannot
keep their positions secret; that is,
they cannot trade anonymously.
This is a problem because revealing
information about your position allows
other traders to copy your trades or to
profit by trading against you. Finally,
rings can be fragile and susceptible
to systemic failure, in the sense that
the failure of one member may cause
the failure of other members and, in
turn, the collapse of the whole ring
arrangement. To control default risk,
ring arrangements often require traders
to maintain margin requirements.
Central Counterparty. The final
settlement mechanism is through a
CCP, using CCP clearing. To improve
on the ring, a CCP replaces each
existing bilateral contract with two
contracts and becomes the sole
counterparty (Figure 4).
CCP clearing preserves trading
anonymity, since only the CCP knows
the overall positions of a trader.5 Since
traders do not reveal their information
to other traders, they can profitably
conduct more trade. Therefore, a CCP
fosters market liquidity. A CCP can
also foster liquidity by standardizing
the contracts it clears.6
The CCP, however, is not immune
to the failure of one trader to pay at
the settlement date. For example,
Bull might not be able to pay $10
to the CCP when Direstraits goes
bankrupt, but the CCP still has to
satisfy its obligation to pay $5 to Ace
and Conservative. To cover potential
losses, CCPs use three instruments:
margin requirements, position limits,
and default funds. Position limits are
limits on the number of positions that
a trader can take.7 A CCP can also use
default funds. The CCP may require
traders, before they trade, to pledge
$2 each to a fund the CCP manages.
The CCP then has $6 available in case Bull does not pay. If Bull pays, the CCP returns the $2 contributions to Ace, Bull, and Conservative. As a consequence, traders face
lower counterparty risk as long as the
CCP manages its risk well. Another advantage of a CCP is that it reduces monitoring costs: contrary to the clearing mechanisms previously described, whereby traders had to monitor each other, here only the CCP has to monitor traders.

5 Of course, this requires that the CCP not reveal each member’s positions.

6 See the article by Randall Kroszner.

7 See the paper by Yaron Leitner for a theory of position limits.

FIGURE 4
CCP Clearing: each original bilateral contract is replaced by two contracts with the CCP; after netting, Bull pays $10 to the CCP, and the CCP pays $5 each to Ace and Conservative.
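The article does not spell out the order in which these resources are drawn on; a common arrangement is that the defaulting trader’s margin is used first and the mutualized default fund second. A sketch under that assumption, using the stylized numbers above (the $5 margin figure is invented for illustration):

    def can_cover(shortfall, defaulter_margin, default_fund):
        # Stylized default waterfall: the defaulter's margin absorbs losses
        # first, then the default fund covers whatever remains.
        remaining = max(shortfall - defaulter_margin, 0)
        return remaining <= default_fund

    # Bull's net obligation is $10; the default fund holds $6 ($2 from each
    # of the three traders); assume Bull posted $5 of margin.
    print(can_cover(shortfall=10, defaulter_margin=5, default_fund=6))  # True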
As in the settlement ring, the
CCP works best if contracts are
completely standardized. This is the
case in our example, since all traders
make payments in case Direstraits goes
bankrupt. However, netting is limited
if contracts are only imperfectly
substitutable. To see this, suppose
traders also care about whether
Endgame, Inc. (E) goes bankrupt.
In particular, suppose Bull sells two
CDS to Ace against the event that
Endgame goes bankrupt (say, Bull
promises to pay $10 to Ace), while
other contracts remain in place. When
the bankruptcies of Direstraits and
Endgame are not perfectly correlated
events, a CCP will not be able to fully
net all positions.
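The limit is easiest to see by computing net positions separately for each reference entity, since obligations triggered by different events cannot offset one another. A short sketch (data layout illustrative):

    from collections import defaultdict

    def net_by_event(contracts):
        # Net each trader's position separately for each triggering event.
        net = defaultdict(lambda: defaultdict(int))
        for payer, payee, amount, event in contracts:
            net[event][payer] -= amount
            net[event][payee] += amount
        return {event: dict(positions) for event, positions in net.items()}

    trades = [
        ("Ace", "Bull", 10, "Direstraits"),
        ("Bull", "Cons", 20, "Direstraits"),
        ("Cons", "Ace", 15, "Direstraits"),
        ("Bull", "Ace", 10, "Endgame"),  # the added, imperfectly substitutable CDS
    ]
    print(net_by_event(trades))
    # {'Direstraits': {'Ace': 5, 'Bull': -10, 'Cons': 5},
    #  'Endgame': {'Bull': -10, 'Ace': 10}}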
Netting is not totally excluded,
even if contracts are imperfect
substitutes. Rather than the exposure
itself, the dollar value of the exposures
can be netted. For instance, if two
contracts contain obligations in
different currencies, it is impossible
to net the two contracts directly.
However, it is possible to net them

$

5

Cons

once the obligations have been
converted into a single currency.
This process, however, is left to the
discretion of the rules governing a
CCP or in the master agreement of a
particular industry.8
Unfortunately, limited netting
possibilities increase counterparty risk
for the CCP. If the CCP can net all
positions, it needs collateral only from
Bull, who owes $10.9 However, when
netting is limited, a CCP may have to
impose larger margin requirements,
larger contributions to the default
funds, or stricter position limits.
Finally, in addition to reducing
counterparty risk, the CCP can also
produce useful information for traders
and can do so without compromising
anonymity. If all trades have to be
cleared through the CCP, the CCP has
access to the specifics of all contracts.
Therefore, it can gather information
and release aggregate statistics on the
price or quantities of the contracts
traded. This is valuable because prices
collectively sum up the information
of all traders. For example, each
trader may know something about the
prospect of default by Bull. Someone
observing a rising price for the credit
default swaps may infer that traders
have raised their forecast of the
likelihood of default.

8 See, for instance, the protocols set forth by the International Swaps and Derivatives Association.

9 Also note that Ace and Conservative are each owed $5, so that with full netting, their financial condition does not affect counterparty risk.
LIMITS TO CCP CLEARING
CCPs are, however, not immune
to failing on their obligations: If many
of its counterparties default, a CCP
may not have enough resources to
cover all its positions. In this case,
a CCP is not financially viable. For
example, when Lehman Brothers failed
in September 2008, markets were
under particular stress. All of Lehman’s
positions had to be unwound, leaving
market participants speculating on
what the outcome would be. Were
CCPs in jeopardy? Could they cover
all of their obligations following
Lehman’s default without tapping
into their default funds? Fortunately,
CCPs around the world successfully
conducted the unwinding process in a
timely manner. (See The Performance
of Central Counterparties Clearing
Following Lehman’s Failure.) Given the
resilience of markets that operate with
a CCP, many authorities have recently
advocated in favor of extending the
use of CCP clearing to other markets.
To fully evaluate these proposals, we
need to take account of the limits of
CCPs, in particular, the difficulties
of clearing over-the-counter (OTC)
trades.
We can contrast two types of
markets in which CCP clearing
can take place: centralized markets
and OTC markets. In a centralized
market, contracts are very uniform,
since the terms (products, quality, and
settlement date) are fixed, and the
only missing information to buy or sell
a contract is its price.10 All traders
look at their computer screens to get
price quotes, and they can buy or sell
contracts with the push of a button
(literally), without even knowing the
identity of the seller or buyer.
One problem with standardized
contracts is that they are not tailored
to the needs of each trader. Traders
looking for specifically tailored
contracts will access an OTC market.11
Since the terms are idiosyncratic,
traders have to make phone calls or
send e-mails to other traders to find
out how much a specific contract costs.
One drawback of an OTC market is
that it is not transparent; the terms of
the contract remain largely undisclosed
to other participants. The lack of
transparency impairs the information
aggregation process that prices would
normally perform.
There are two main limits
to a CCP operating on an OTC
market. First, Darrell Duffie and
Haoxiang Zhu, in their study, show
that multilateral netting is the main
advantage of a CCP in reducing
counterparty risk. But as we saw
earlier, multilateral netting can be
limited, or even impossible, when the
contracts traded are not uniform.
Also, if an OTC trader defaults on
its promise to pay the CCP, the CCP
faces a large replacement cost risk.
The less standardized the contract,
the larger the cost. To understand this,
suppose once again that Bull buys a
CDS from Conservative, but suppose the contract specifies that if Direstraits goes bankrupt, Conservative should pay 10 Swedish krona (and not $10). In the unlucky event that Conservative himself defaults, the CCP still has to fulfill its side of the contract to B. Therefore, the CCP has to find another trader willing and able to provide 10 Swedish krona if Direstraits defaults. This may be difficult and
expensive if the Swedish currency is
not commonly traded in the U.S. This
is an example of the replacement cost
risk that a CCP faces, and the more
specific a contract is, the higher the
replacement cost risk will be.

10 To some extent, the degree of standardization is a policy variable, since the government can, for example, outlaw or tax nonstandardized agreements.

11 See the study by Darrell Duffie, Nicolae Garleanu, and Lasse Pedersen, and the one by Ricardo Lagos and Guillaume Rocheteau.
This is similar to the loss of a hedge
by a trader. A hedge is a position
with another trader in order to offset
the risk originating from an initial
trade. For example, wheat producers
can hedge against the fluctuations of
wheat prices by selling the promise to
deliver wheat at a given price. If the
buyer of the hedge fails, sometime
before the hedge matures but after
some information on aggregate wheat
production is revealed, the wheat
producer may find it impossible to
convince another trader to buy his
hedge.
To cover these costs, a CCP
operating on an OTC market
will naturally increase collateral
requirements and the contributions
to its default funds. However, the cost
could be so high, and the collateral
so costly to pledge, that OTC traders
known to always fulfill their promises
(low-risk traders) may reduce their
trades or simply opt out of the CCP
clearing arrangement altogether.

The Performance of Central Counterparties Clearing Following Lehman’s Failure

As reported in the Bank of England’s
Financial Stability Report (October
2008), the London-based clearinghouse
LCH.Clearnet was exposed, through
Lehman’s interest rate swap portfolio, to
the risk of sharp market movements across a wide range
of products. Indeed, the total notional value of the
portfolio was $9 trillion, encompassing a total of 66,390
trades across five major currencies. The unwinding
process was achieved through the competitive auctioning
of the Lehman OTC interest rate swap portfolio. The
default was managed well within the margins posted by
Lehman, and LCH.Clearnet did not have to resort to its
default fund.
The Depository Trust and Clearing Corporation
(DTCC), the largest clearing agent for the U.S.,
announced in October 2008 that it had successfully
closed out over $500 billion in market participants’
exposure from the Lehman Brothers bankruptcy. The
unwinding process was carried out by netting Lehman’s
positions and liquidating any remaining positions, by asset
class. The largest of Lehman’s positions was in securities
based on mortgages, amounting to $329 billion. DTCC’s
Fixed Income Clearing Corporation (FICC) had plans to
launch a CCP that could net mortgage-backed securities.
Although it was not in operation at the time of Lehman’s
bankruptcy, the FICC put the idea to work and netted
out $300 billion in Lehman trades related to mortgage-backed securities, or 90 percent of the outstanding value.
Lehman also held trades for $190 billion in
government securities and $5.85 billion in equities,
municipal bonds, and corporate debt. Subsidiaries of
DTCC processed $3.8 billion in options exercises and
assignments that were expiring and arranged for the
release of $1.9 billion in securities with Lehman’s bank to
satisfy Lehman’s open trades. The remaining positions
were liquidated in the market. The unwinding process
was therefore conducted swiftly and without resorting to
DTCC’s subsidiaries’ default funds.*
Lehman's bankruptcy also highlights the role of
information anchor that a clearing agent can play for
OTC markets. With Lehman's bankruptcy, market
participants speculated that the CDS market had
exposure of as much as $400 billion for payments on a
Lehman default. However, as DTCC announced in a
press release on October 11, 2008, the payment
calculations performed by the DTCC Trade Information
Warehouse relating to the Lehman Brothers bankruptcy
indicated that the net fund transfers from net sellers of
protection to net buyers of protection were expected to be
in the range of $6 billion. At the end of the unwinding
process, DTCC calculated and bilaterally netted all
amounts due on credit default swaps written on Lehman
for $72 billion. This resulted in approximately $5.2 billion
owed from net sellers of protection on Lehman to net
buyers of protection.

*Source: DTCC Annual Report 2008.

If only higher risk traders use CCP
clearing, the CCP may become
financially unsound, unless it raises its
collateral requirements, thus deterring
even more traders from CCP clearing.
In the end, only very high-risk traders
may be willing to use the CCP, which
obviously limits the insurance benefits
the CCP should provide. Also, if only
high-risk traders use CCP clearing, the
aggregate price that the CCP would
announce would not reflect all trades
and would therefore limit the diffusion
of the information. The bottom line
is that the participation of low-risk
traders in markets that trade over-the-counter and use CCP clearing is
important to ensure that the market is
efficient and safe.

In an article with co-authors
Thorsten Koeppl and Ted Temzelides, I
examine one solution to the problem of
inducing low-risk traders to participate
in CCP clearing. Clearly, they will
participate only if the costs of using
CCP clearing are sufficiently low. To
reduce the cost incurred by low-risk
traders, a CCP can either limit the
participation of high-risk traders — for
example, through stringent position
limits — or shift the cost elsewhere.
Therefore, the CCP has to use another
source of finance to keep contributions
to the default fund and margin
requirements low and position limits
relatively high.
The CCP can achieve this by
establishing CCP clearing that is
common to both centralized and OTC
markets. Suppose the CCP operates in
both an OTC market and a centralized
market in which traders must clear
through the CCP. Then the CCP could
increase the default fund contributions
of traders in the centralized market
and use it to finance a lower default
fund for OTC market trades. The
fund’s contributions for OTC traders
can be adjusted so that they are willing
to clear through the CCP. While this
hurts traders on centralized exchanges,
one has to recognize that many
participants are active in both types
of markets, so that the overall gains
from introducing a single CCP clearing
arrangement can be positive.
For example, according to our
analysis, it may be most efficient for a
clearinghouse to clear both CDS index
swaps,12 which are standardized and
could easily be traded on a centralized
exchange, and bespoke CDS, which
are very idiosyncratic. Collateral
requirements or default funds might be
set somewhat higher for those trading
index swaps so that they can be set
lower for those trading bespoke CDS.

12 In contrast to a simple CDS, a CDS index swap gives insurance on a fixed basket of credit entities. In a simple CDS, the buyer gets insurance on any credit entity of his choice. A CDS index swap is therefore a much more standardized product than a plain CDS.
To summarize, a CCP operating
in several markets could subsidize its
risk management activities in the OTC
market using its clearing activities
conducted in a centralized market.
In the end, this could induce low-risk traders to participate in the CCP clearing arrangement in the OTC market.
CONCLUSION
Given the large growth in trades
of credit default swap contracts in
the last decade, regulators and some
market participants have pressed for
the establishment of CCP clearing
in this market. In the last year or so,
considerable progress has been made,
and industry participants have taken
a number of steps: Multiple CDS CCP
platforms are now close to starting or
have already started operations. For
example, NYSE Euronext through
Liffe’s BClear platform has been
operating in Europe since October
2008. ICE Clear US has been clearing
agricultural swap contracts since
February 2009, and in March 2009,
the Fed approved its application to
become a member of the Federal
Reserve System, which moves it a step
closer to operating as a CCP for CDS
transactions. Also in March 2009,
CME Group and its associated joint
venture, CMDX, announced that they
have received regulatory approvals

from the Federal Reserve and the
Securities and Exchange Commission
for clearing and trading credit default
swaps through CME Clearing and
the CMDX platform. Finally, the Swiss-German futures exchange Eurex is also
planning to launch a CCP for CDS in
Europe.
In April 2009, at a meeting
hosted by the New York Fed, market
participants also supported broadening
the use of CDS CCPs to include a
wider set of firms and CDS products.
They also agreed to report all CDS
trades not cleared through a CCP
to a central trade repository. CCPs
and their members agreed to release
information about their activities
as they go live. In November 2008,
the Depository Trust and Clearing
Corporation began releasing weekly
data about aggregate volume on the
CDS market.
I have tried to shed light on
the economic forces that lead to
CCP clearing and, to some extent,
explain the recent push toward the
establishment of CCP clearing in CDS
markets. I have also highlighted some
of the difficulties of CCPs for OTC
markets. Despite the clear benefits
of CCP clearing, it is not obvious
that this clearing arrangement fits all
financial instruments. Some degree
of uniformity in traders’ risk profile
and instruments appears to be needed
to extract all of the benefits of CCP
clearing. Whether we will observe a
specialization of CCPs in clearing only
a certain kind of trade remains to be
seen. BR


GLOSSARY OF TERMS
The Bank for International Settlements, an international organization that fosters communication and cooperation
among central banks, has explained a number of terms relevant to central counterparty clearing arrangements. The
glossary has been published by the BIS’ Committee on Payment and Settlement Systems (CPSS) and can be found at
http://www.bis.org/publ/cpss00b.htm.
Central counterparty (CCP): an entity that is the buyer to every seller and the seller to every buyer of a specified set of contracts, e.g., those executed on a particular exchange or exchanges.

Clearing: the process of transmitting, reconciling, and, in some cases, confirming payment orders or security transfer instructions prior to settlement, possibly including netting and the establishment of final positions for settlement. Sometimes the term is used (imprecisely) to include settlement.

Counterparty: the opposite party to a financial transaction, such as a securities trade or swap agreement.

Default funds (also called loss-sharing pools): cash, securities, or possibly other assets that are provided by the participants in advance and are held by the system to ensure that commitments arising from loss-sharing agreements can be met.

Margin: margin has at least two meanings. In the futures/commodity markets, margin is a good faith deposit (of money, securities, or other financial instruments) required by the futures clearing system to ensure performance. In the equities markets, margin is a sum of money deposited by a customer when borrowing money from a broker to purchase shares. The money deposited with the broker is the difference between the purchase value of the shares and the collateral value of the shares.

Master agreement: an agreement that sets forth the standard terms and conditions applicable to all or a defined subset of transactions that the parties may enter into from time to time, including the terms and conditions for closeout netting.

Multilateral netting: an arrangement among three or more parties to net their obligations. The obligations covered by the arrangement may arise from financial contracts, transfers, or both.

Netting: an agreed offsetting of positions or obligations by trading partners or participants. The netting reduces a large number of individual positions or obligations to a smaller number of obligations or positions. Netting may take several forms that have varying degrees of legal enforceability in the event of default of one of the parties.

Novation: satisfaction and discharge of existing contractual obligations by means of their replacement by new obligations (whose effect, for example, is to replace gross with net payment obligations). The parties to the new obligations may be the same as those to the existing obligations or, in the context of some clearinghouse arrangements, there may additionally be substitution of parties.

Position limit: a restriction on the number of contracts or share of a contract’s open interest that a single entity may hold.

Settlement: an act that discharges obligations in respect to funds or securities transfers between two or more parties.


REFERENCES

Bank for International Settlements. “A Glossary of Terms Used in Payments and Settlement Systems,” Committee on Payment and Settlement Systems (2003).

Bernanke, Ben. “Clearing and Settlement During the Crash,” Review of Financial Studies, 3:1 (1990), pp. 133-51.

Duffie, Darrell, Nicolae Garleanu, and Lasse Pedersen. “Over-the-Counter Markets,” Econometrica, 73 (2005), pp. 1815-47.

Duffie, Darrell, and Haoxiang Zhu. “Does a Central Clearing Counterparty Reduce Counterparty Risk?,” Stanford University (2009).

Gorton, Gary. “The Panic of 2007,” Federal Reserve Bank of Kansas City, Jackson Hole Conference (2008).

Koeppl, Thorsten, Cyril Monnet, and Ted Temzelides. “Optimal Clearing Arrangements for Financial Trades” (2009).

Kroszner, Randall. “Can the Financial Markets Privately Regulate Risk? The Development of Derivatives Clearing Houses and Recent Over-the-Counter Innovations,” Journal of Money, Credit, and Banking (1999), pp. 569-618.

Lagos, Ricardo, and Guillaume Rocheteau. “Liquidity in Asset Markets with Search Frictions,” Econometrica, 77:2 (March 2009), pp. 403-26.

Leitner, Yaron. “Inducing Agents to Report Hidden Trades: A Theory of an Intermediary,” Federal Reserve Bank of Philadelphia, Working Paper 09-10 (April 2009).

Moser, James. “Contracting Innovations and the Evolution of Clearing and Settlement Methods at Futures Exchanges,” Federal Reserve Bank of Chicago, Working Paper 98-26 (1998).

New York Fed (2008). newyorkfed.org/newsevents/news/markets/2008/an081031.html

How Much Is That Home Really Worth?
Appraisal Bias and House-Price Uncertainty*
BY LEONARD NAKAMURA

With house prices often below the face value
of mortgages these days, the expected return
on many mortgages has tumbled, since one
of the major forces supporting mortgages, the
collateral, has weakened. One source of these mortgage
problems has been the validity of the home appraisal,
which is supposed to be an objective and expert dollar
valuation of the house that should help make a mortgage
less risky. Unfortunately, the appraisal process can go
awry and often has. As Leonard Nakamura shows in
this article, appraisals have been biased upward, making
mortgages riskier. Now a reverse risk is at work: The bias
is going the other way, causing home valuations to be
underestimated, possibly making new mortgages harder
to obtain. In addition to problems of bias, Nakamura
discusses the appraisal process, how it’s supposed to work,
and how it can go awry.

When housing prices fall and mortgage borrowers lose their jobs and fall behind on mortgage payments, an important question arises: How much is any given house worth if it were to be sold? In the not-too-distant past, say, 2005, when house prices were still spiraling upward, the answer was almost always “more than the amount borrowed.” However, more recently, a typical answer has been, “not so much.” With many house prices below the face value of mortgages, the expected return on many of these mortgages has tumbled, since one of the major forces supporting mortgages, the collateral, has weakened.

Leonard Nakamura is an assistant vice president and economist in the Philadelphia Fed’s Research Department, where he is also head of the Regional and Applied Microeconomics section. This article is available free of charge at www.philadelphiafed.org/research-and-data/publications/.

*The views expressed here are those of the author and do not necessarily represent the views of the Federal Reserve Bank of Philadelphia or the Federal Reserve System.
As we now know, that situation
fed the creation of a major world
financial crisis. As we pick ourselves
up from the crisis, we see that one
source of these mortgage problems
has been the validity of the home
appraisal, which is supposed to be an
objective and expert dollar valuation
of the house that should help make a
mortgage safer and more marketable.
Unfortunately, the appraisal process
can go awry and often has. As we
shall see, appraisals have been biased
upward. This made mortgages riskier,
since too much was lent out on homes.
One of the safeguards, the appraisal,
failed to perform its role of limiting
mortgages to the underlying value of
the houses.
Now a reverse risk is at work: The
bias is going the other way, causing
home valuations to be underestimated,
and this may make new mortgages
harder to obtain. If so, this could delay
improvement in housing markets,
which, in turn, could cause house
prices to fall more than they otherwise
would, possibly causing additional
losses to mortgage lenders.
One way in which an appraisal
can go awry is that the information
upon which the house is valued
may be very thin; recent nearby
comparable house sales may be so few
that the price at which the house is
likely to be resold may be difficult to
predict precisely. A second reason the
appraisal process can go awry is that
all parties may not want a genuinely
independent appraisal.
As we reform our system of
mortgage lending, one piece we might
wish to focus on is the appraisal
system. Indeed, some steps have
already been taken in this direction.
HOW APPRAISALS ARE
SUPPOSED TO WORK
A standard part of a home mortgage is an appraisal, an independent
evaluation of the home’s value. After
the seller and buyer have agreed on
a price, the mortgage lender usually
requires an appraisal. This is an estimate of the value of the house, made
by a professional appraiser and based
on local market conditions; the appraiser examines nearby recently sold
houses and compares them in terms of
characteristics such as size, location,
and condition with the house to be
mortgaged.
A typical appraisal costs $250 to
$400. In a boom year like 2005, when
there were more than 7 million new
home mortgages on purchases of new
and existing one- to four-unit family
homes and a similarly large number
of refinancings, roughly $4 billion was
spent on appraisals in the U.S. These
appraisals are part of an underwriting process whose aim is to determine
whether lenders accept mortgage
applications. This is a serious process
with trillions of dollars at stake. In that
same year, more than 5 million mortgages were denied, representing nearly
$1 trillion of loans applied for.1

1 National mortgage data are from the HMDA National Aggregate Report, 2005, available at http://www.ffiec.gov/hmdaadwebreport/NatAggWelcome.aspx. It is difficult to know from these data how many of the denials were due to appraisals, but the limited data suggest that appraisals were responsible for only a small proportion of denials.
The appraisal further certifies to
the mortgage originator and — if the
loan is securitized — to the ultimate
lender the value of the collateral for
the mortgage. The appraisal addresses
the lender’s worries about whether the
loan will be repaid. In the past, mortgages have generally been relatively
safe loans because the borrower’s home
backs the promise to repay.2 A house
as collateral has two advantages for the
lender: First, the borrowing household is usually loath to lose its home:
Moving is costly and so is the loss of
concomitant personal ties to neighbors
and schools. So if a family can make
the payments, it generally will. Second,
even if the household cannot make
the payments, the house can be resold,
and the loan usually can be mostly or
entirely repaid out of the proceeds.
The typical mortgage loan’s safety
is connected to the down payment
made by the borrower; this fact is well-established by empirical research on
U.S. data. Briefly, the down payment
provides an equity stake for the borrower — a commitment of dollars by
the borrower that the borrower loses if
he or she defaults — as well as security
for the mortgage lender.3 One cause
of recent mortgage losses has been
house values that have fallen below
the amount borrowed, a case in which
the borrower’s home equity stake has
disappeared as a consequence of borrowing too much and the price of the
house falling.
While most homeowners will
continue to pay their mortgages even
after their home equity has disappeared, many find themselves unable
to keep up with payments, often as a
result of unemployment or illness, and
some of them will eventually lose their
homes to foreclosure. In addition, in
recent years, a significant proportion
of homes were bought by investors,

many of whom are more likely to
default as home equity is lost.4 During the recent housing boom, housing
market participants lost sight of the
importance of the down payment, in
many cases because house prices kept
rising so consistently. If house prices
rise continuously, the down payment
may not matter. If a house is purchased
without a down payment, the mortgage
loan is worth the same as the house,
and the lender has no margin of safety.
But if the house price goes up 20
percent, the margin of safety will have
reappeared, and the loan will turn
out to be safe. In the U.S., during the
six years from the end of 1999 to the
end of 2005, house prices rose at an
annual rate of 11.3 percent (according
to the Case-Shiller U.S. house-price
index). During that period, on average,
house-price appreciation created more
than a 20 percent margin of safety in
two years’ time. During this period, it
appeared as if mortgages made with
no down payments were reasonable
investments. By contrast, in the longer
period from 1970 to 1999, house prices
appear to have risen about 5 to 6 percent annually.
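The 20 percent figure follows from compounding the quoted 11.3 percent rate over two years:

    # Two years of 11.3 percent annual appreciation compound to roughly a 24
    # percent gain, more than the 20 percent cushion of a standard down payment.
    rate = 0.113
    print(round((1 + rate) ** 2 - 1, 3))  # 0.239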
In more normal times, when
prices aren’t rising quite so quickly, the
precise value of the home on the market, and whether it will be sufficient to
repay the mortgage, is crucial information for the mortgage lender, since it
influences both the likelihood that the
mortgage will lose value and how the
mortgage lender will approach legal
options if the borrower falls behind in
payments.

2 In the 1980s, the savings and loan crisis also had mortgage lending at its root, but this had less to do with mortgage defaults and more to do with unusually high long-term interest rates.

3 For a fuller discussion of the risks of loans and the value of the down payment, see Ronel Elul’s Business Review article.

4 See, for example, Shane Sherland’s working paper on default rates of subprime mortgages and Yuliya Demyanyk and Otto Van Hemert’s forthcoming article on the decline in mortgage lending standards in recent years.
One way the lender attempts
to gauge the underlying value of the
house at the time the loan is made is
from the sale price of the house: What
the borrower is willing to pay for the
house is usually a good measure of its
worth. But the buyer may have overpaid. Worse yet, the buyer may have
deliberately overpaid to a partner, with
the pretend “transaction” intended
to fraudulently extract money from
the lender. In a classic “land flip,”
criminal A sells a house to criminal
B at an inflated price, and the two
then abscond with the cash lent by the
mortgage lender.
To collect more information
about the underlying value, the
lender obtains an appraisal of the
house’s value, that is, an estimate by a
professional appraiser, based on prices
paid for local comparable houses. This
additional information may be needed
because the borrower may have overbid
for the house, in which case the lender
may be leery of financing it. Moreover,
even if the borrower has paid the right
price for the house, other sales testify
that the market for houses in that
neighborhood is active, and that if the
house needs to be sold, the market is
not so thin that an additional house
for sale will result in a large drop in
price.
How is the information from the
appraisal used in the mortgage? First,
if the information from the appraisal
does not give the lender confidence
in the appraisal valuation, the lender
may refuse to make the mortgage loan.
For example, if the comparable houses
used in the appraisal are in a different
neighborhood from the house being
appraised, the loan may be refused.
Second, a conservative rule is used to
determine the value of the house for
the purposes of the mortgage. The lender bases its approval decision on whichever is lower, the appraised value or the transaction price. The standard conventional prime mortgage must have a loan-to-value (LTV) ratio of 80 percent
to qualify for a low interest rate; the
valuation used for this purpose is the
lower of the appraised value or the
transaction price.5
Suppose a prospective home buyer reaches a purchase agreement to buy a house for $100,000. The buyer has $20,000 with which to make a down payment, so she just qualifies for the lowest interest rate, borrowing $80,000. However, suppose the appraisal comes in at $95,000. In calculating the loan-to-value ratio, the mortgage lender will set the value of the house at the lesser of the appraisal valuation ($95,000) or the sale price ($100,000). Thus, the mortgage document records a house value of $95,000 and a loan of $80,000, so the loan-to-value ratio is 84 percent, too high to qualify for the best interest rate.
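The valuation rule and the resulting ratio can be checked in a few lines of Python (the function name is an illustrative assumption; the dollar figures and the 80 percent cutoff are from the example above):

    def loan_to_value(loan, sale_price, appraised_value):
        # The LTV calculation uses the lesser of the sale price and the appraisal.
        return loan / min(sale_price, appraised_value)

    ltv = loan_to_value(loan=80_000, sale_price=100_000, appraised_value=95_000)
    print(f"{ltv:.1%}")  # 84.2% -- above the 80 percent cutoff for the best rate
    print(ltv <= 0.80)   # False: the loan does not qualify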
Appraisals are also used by lenders
when the borrower wants to refinance
an existing mortgage or take out a
second mortgage, also called a home
equity loan. Whenever mortgage rates
have fallen, as they did dramatically
in 2003, households have refinanced
their homes to take advantage of
lower interest rates. Many households
have taken these opportunities to do cash-out refinancing, where they increase the size of the mortgage loan and reduce their implied home equity. Freddie Mac has estimated that from 2002 to 2008, over $1 trillion in cash was taken out of prime mortgages. While in many cases this cash was used to improve the properties — improvements that may raise the properties’ value
and thus only partially reduce home
equity — research shows that many
of these cash-outs were used to
finance consumer expenditures or
to reduce other debts.6 The high
loan-to-value ratios resulting from
cash-out refinancing are by no means
limited to low- and moderate-income
populations; many examples come
from expensive houses in wealthy
neighborhoods.

5 In its guide to mortgage originators (known as underwriters), Fannie Mae states, “For a purchase mortgage, the LTV ratio is calculated by dividing the amount of the mortgage by the lower of the appraised value or the sales price of the property” and that “an LTV ratio greater than 80 percent requires credit enhancement, such as primary mortgage insurance.”

6 See the article by Alan Greenspan and James Kennedy.
APPRAISALS, MORTGAGES,
AND LOCATION
Location and Valuation. Let
us briefly explore the relationship
between location and value that
underlies the appraisal and justifies the
real estate motto: location, location,
location. One way that houses differ
from mass-produced goods is that each
house’s value is in part based on its
unique location. Location affects various attributes of the house, in particular its distance to other locations, such
as work sites, shopping, transportation,
and leisure amenities. Houses together
constitute neighborhoods, united by
schools, social networks, building
codes, and political units. Houses close
to one another are relatively substitutable, and their prices will tend to move
together; houses distant from one
another are not such easy substitutes
for one another, and their prices may
not move together.
Put another way, a house consists
of a structure and a piece of land. The
structure can be valued at its replacement cost, which is likely to be similar
from one location to the next. As a
result, structures are more like mass-produced goods than unique goods.
The value of the land, which differs by
location, can differ very substantially
from place to place.7
Economists group the determinants of land valuation into amenities
and work opportunities. Although
labor economists often see work as
the main determinant of wages, urban
economists see amenities and work opportunities as jointly determining both
wages and land prices.8 In particular,
the greater the amenities and the
higher the productivity of nearby work
opportunities, the greater the price of
land. By contrast, greater amenities
tend to lower the wage rate because
workers may be willing to work for
lower pay to live in a nice location.

7 This distinction is not absolute, of course, and structures can become highly idiosyncratic, while plots of land within a single development or homes within an apartment building may be quite similar. Moreover, a structure may be unsuited to its location, in which case the structure does not add its full value to the land. In this case, it is inappropriate to value a house as the sum of its value as a structure and a piece of land, which can be seen as an upper limit on the value of the house.

8 See Gerald Carlino’s Business Review article and the chapter by Glenn Blomquist.
The House Sale. A homeowner
will typically have a general idea of
what the house is worth. However, exactly what the house will fetch on the
market from an actual sale may depend
on many factors. The potential buyers
of a given unit have some knowledge
of the house’s value to themselves as
specific households relative to other
units. In addition, they may know the
prices of recently purchased nearby
units and the offering prices of nearby
for-sale units. They then bargain with
the seller over the particular unit, and
a sale may take place.9 The price paid
will depend on bargaining skill, the
availability of substitute units, characteristics of the particular unit, and the
buyer’s and seller’s tastes for the amenities offered by the particular unit. For
example, committed sellers, that is,
those who must sell because they are
moving to another city or have already
agreed to purchase another house, are
more likely to accept a sale price below
the expected value than sellers who
are waiting to see what their home will
fetch.

9 A formal model that describes a housing market in this way is set forth in the article by Daniel Quan and John Quigley.
All of this matters to the mortgage
lender because the fact that a house
has sold at a given price may not be a
strong guarantee that the house can
be resold at that same price, should a
resale prove necessary. In a foreclosure
sale, that might mean that the lender
will not be fully repaid for the loan. To
get a better fix on the underlying value
of the house, the mortgage lender
turns to an appraiser.
The Appraisal. In making a
home appraisal, the appraiser typically
presents the lender with sales data on
recent comparable house sales. As part
of this process, the appraiser will note
whether these sales are indeed recent
and closely comparable. All this information helps the lender know how
accurate the appraisal is likely to be.
The lender wants to know how much
the house in question is likely to sell
for if a resale is necessary, that is, how
much the collateral is worth. If a lot of
similar houses have been sold in the
neighborhood for similar prices, the
lender can be reasonably sure that the
house can be resold, if necessary, for a
price close to the sale price. However,
when there aren’t many comparable
sales, it is possible that no other buyers
will be found for this particular house
at or near the sale price.
In a typical appraisal, the appraiser is expected to give an appraised value and to document the
basis of the valuation. Appraisers are
subject to state regulation; typically,
they have certification that they have
met both education and experience
requirements. In addition, appraisers
are expected to be objective and not
be swayed by the participants in the
transaction. Yet the participants have
an important stake in the success of
the transaction.10
From the buyer’s perspective, the
down payment represents the difference between the sale price and the
amount the buyer must borrow. If the
house costs $180,000 (the median sale
price in the fourth quarter of 2008)
and the buyer can put $36,000 down
after meeting transaction costs, a down payment of 20 percent of the house value, the amount the buyer
needs to borrow is $144,000. However,
if the resale value of the house is really,
say, $160,000, from the perspective of
the lender, $20,000 has been lost due
to the borrower’s overpaying for the
house, and the effective down payment
is only $16,000, or 10 percent of the
house value.
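To make the arithmetic concrete, here is a minimal sketch in Python. The figures are the hypothetical ones from the example above, and the function name is ours, not part of any underwriting system:

def effective_down_payment(sale_price, down_payment, resale_value):
    # Loan amount is what the buyer borrows at the original sale price.
    loan = sale_price - down_payment
    # Equity is measured against what the house could actually resell for,
    # not against the (possibly inflated) sale price.
    equity = resale_value - loan
    return equity, equity / resale_value

# Hypothetical figures from the example: $180,000 sale, $36,000 down,
# but a true resale value of only $160,000.
equity, share = effective_down_payment(180_000, 36_000, 160_000)
print(equity, round(share * 100, 1))  # 16000 10.0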

10 "Today, many appraisers feel that their ethics are under assault from clients who expect favorable assignment results in return for future business…Even so, the pressures appraisers feel today are little different from those of the past…" See Bruce M. Closser's article.

Discrepancies between the sale
price and the appraised value thus
create a problem for the lender. If the
appraisal comes out to be less than the
agreed sale price, the down payment
may be insufficient for the loan, and
the loan may be canceled or lose its
prime status.
Because each house is unique,
there is no perfect estimate of its
underlying true value. What the lender
and the borrower both want to know
is: What would the house sell for if it
were sold again? The answer to that
question can only be an estimate,
subject to some uncertainty.
HOUSE APPRAISALS ARE
SYSTEMATICALLY BIASED
Empirical Evidence Shows
That Appraisals Have Been Biased
Upward in the U.S. Modern studies
of the accuracy of home mortgage
appraisals in the U.S. began with
an article by Man Cho and Isaac
Megbolugbe, economists at Fannie
Mae’s Office of Housing Research,
who studied the 1993 Fannie Mae loan
acquisition file, which contained over
600,000 home-purchase mortgages.
They found that in this group of
prime mortgages, only 5 percent
had appraisals that were lower than
the transaction price, while over
30 percent had appraisals that were
exactly the transaction price. The
other 65 percent were above the
purchase price. On their face, these data
suggest that appraisals may be biased.
Too many mortgage appraisals are
exactly at the transaction price, and
the distribution is highly asymmetric
(Figure 1).
FIGURE 1
Appraisal Bias

Percent of appraisals, by the percent by which the appraisal exceeds the transaction price:

  Below -10 ............  0.4%
  Between -10 and -5 ...  0.7%
  Between -5 and -1 ....  2.4%
  Between -1 and 0 .....  1.6%
  Exactly 0 ............ 30.7%
  Between 0 and 1 ...... 25.0%
  Between 1 and 5 ...... 27.8%
  Between 5 and 10 .....  6.9%
  Above 10 .............  5.5%

Appraisal bias is defined as appraised value less transaction value as a percent of transaction value. When the bias is positive, the appraised value is greater than the transaction value, and there is no impact on the mortgage loan-to-value ratio. On the other hand, when the bias is negative, the appraised value is less than the transaction price, and the mortgage loan-to-value ratio will be higher (see text).
Source: Cho and Megbolugbe, 1996, Table 1, p. 48

Similar evidence is found in the article by Terry Loebs, published by the Collateral Assessment and Technologies Committee, a group founded by real estate information companies. The article takes a sample of 2.9 million home appraisals, from

1977 to 2004, and finds that the
appraisal price is greater than or equal
to the transaction price more than 97
percent of the time.
The reason for this asymmetry
is that appraisals below the sale price
have a different impact from appraisals
above the sale price. Specifically, the
home valuation, for the purposes of
calculating the loan-to-value ratio, is
equal to the lower of the sale price or
the appraisal. An appraisal above the
sale price does not affect the loan-to-value ratio, but one below the transaction price does. If the loan-to-value
ratio rises, this may influence whether
the mortgage lender makes the loan.
To quote Cho and Megbolugbe, “The
way to ensure the deal is to appraise
slightly high. The appraiser asks for or
receives the transaction price and then
adds a bit to it. Since the mortgage
lenders employ the lesser of the sale
price or the appraisal, whichever is

lower, in determining the loan value,
no further information is added because of the appraisal.”
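A minimal sketch of this valuation rule, reusing the hypothetical loan and price figures from the example earlier in the article:

def loan_to_value(loan, sale_price, appraisal):
    # The home value used by the lender is the lesser of the two estimates.
    return loan / min(sale_price, appraisal)

loan, price = 144_000, 180_000
print(round(loan_to_value(loan, price, 180_000), 3))  # 0.8   (appraisal at price)
print(round(loan_to_value(loan, price, 185_000), 3))  # 0.8   (above price: no effect)
print(round(loan_to_value(loan, price, 170_000), 3))  # 0.847 (below price: LTV rises)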
It is clear that, in some cases,
when the appraiser reports that the
appraised value of the house is below
the transaction price, the seller lowers
the price, and so the transaction
price and the appraised value of the
house come out exactly the same. In
addition, it is possible that when the
appraised value is below the sale price,
the borrower may withdraw from the
sale, since the mortgage becomes
harder to complete.11
This would account for some of the bias and some of the large proportion of appraisals in which the bias is exactly zero. However, as shown in Figure 1, 25 percent of mortgages were between zero and 1 percent above the purchase price, while only 1.6 percent were between zero and 1 percent below the purchase price. If this resulted from the transaction price being changed or the mortgage being denied, it would imply that roughly one-fourth of all mortgages were being changed or lost over a 1 percent difference in appraisals. This seems unlikely on its face and is not confirmed by professionals.12

11 Note a further asymmetry here. An appraisal that is too low may cause the mortgage to be turned down and may allow the borrower to back out of the transaction. An appraisal that is too high doesn't affect the mortgage contract directly and doesn't allow the seller to renegotiate.
Why Are Appraisals Likely to
Be Biased Upward? What appears to
be occurring is that the parties directly
involved in the transaction have a mutual interest in a somewhat upwardly
biased appraisal. A difficulty with the
underlying contract is that if a house’s
value is taken to be the lesser of the
sale price or the appraisal, and both
are good but imperfect estimates of the
true value of the house, the lesser of
the two will be biased low.
If the house value were taken to be the average of the two values, and both the appraisal and the sale price reflected the underlying value of the house but with some error, the house value would be unbiased. The lesser of the two values, however, is always going to be less than the average of the two and hence biased downward.
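A short simulation illustrates the point; the true value, noise level, and sample size below are all hypothetical:

import random

random.seed(1)
TRUE_VALUE, NOISE_SD, N = 200_000, 10_000, 100_000

lesser_total = average_total = 0.0
for _ in range(N):
    # Sale price and appraisal: independent, unbiased, noisy estimates.
    sale = random.gauss(TRUE_VALUE, NOISE_SD)
    appraisal = random.gauss(TRUE_VALUE, NOISE_SD)
    lesser_total += min(sale, appraisal)
    average_total += (sale + appraisal) / 2

print(round(average_total / N))  # close to 200,000: the average is unbiased
print(round(lesser_total / N))   # several thousand below: the lesser is biased low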
As we have seen, the appraiser typically errs by setting the appraised value at or above the sale price, so that the loan-to-value ratio is unaffected. This appears to be what happens overall; only in relatively few cases (perhaps 5 percent) are the appraisals below the sale price. Such a practice deprives most appraisals of independent value as measures of the value of the house. Only if the appraiser is convinced that the home buyer has substantially overpaid for the house will the appraiser signal this to the lender by setting an appraisal below the sale price.

12 A further indication of the bias is that house-price indexes that were created using both sale prices and refinancing appraisals are now widely considered to be biased relative to house-price indexes constructed using only sale prices, despite the fact that a lot of observations are lost when refinancing appraisals are ignored. Indeed, Andrew Leventis has written a paper on how to eliminate the bias from the Federal Housing Finance Agency's (formerly the Office of Federal Housing Enterprise Oversight, or OFHEO) house-price index while continuing to incorporate information from appraisals.

With this criterion for estimating
house value, the mortgage contract
gives the appraiser too much power
to accidentally prevent house sales
from concluding. This creates a strong
incentive for the appraiser to bias the
appraisal upward and for the other
parties — the mortgage lender and the
real estate broker — to want to hire
biased appraisers.
Note that typically the buyer is
not a “victim” of an appraisal that
is biased high. If the appraisal is too
low, and if the seller will not lower the
price, the buyer will have to come up
with a larger down payment.
Appraisals for Refinancings May
Be Even More Biased. A somewhat
different issue arises with appraisals
to refinance mortgages because
there is no sales transaction, since
the homeowner stays in place. Thus,
when a house is refinanced, there is
no sale price with which to compare
the appraisal. However, there may be
a “target” price the borrower is hoping
for. In any case, it is generally believed
that, in recent years, the appraisals
for refinancing have been more biased
than those for home-purchase loans.

According to Loebs' report, refinance transactions had a somewhat greater appraisal bias (5.6 percent) than purchase transactions (3.6 percent), when median values are compared. Apparently, this was a particular problem during the recent subprime boom. Many subprime mortgage loans were refinancings of what had been prime mortgages. When borrowers who had originally had good credit and prime mortgages ran into financial
difficulties, perhaps because of job loss,
illness, or divorce, these borrowers
were faced with a choice: They could
sell their homes and pay off the
mortgage, or they could refinance.
But, as mentioned before, homeowners
generally will avoid having to move
if at all possible. Such borrowers
were encouraged to refinance their
mortgages with a subprime loan while
taking cash out. The cash-out would
then permit the borrower to become
current on the new but more expensive
and larger mortgage and thus to
remain in their homes rather than be
forced to sell or be foreclosed on.13 As
we can see in Figure 2, as long as home
prices kept rising ever faster, through
late 2005, foreclosure rates were
kept artificially low, even though the
underlying mortgages were increasingly
risky.
13 See the article by Kristopher Gerardi, Adam Hale Shapiro, and Paul Willen for a discussion of the history of borrowers who wind up using subprime loans.

14 An article by Yongheng Deng, John Quigley, and Robert Van Order provides the best evidence of the size of this default impact, and Ronel Elul's Business Review article provides a more accessible qualitative view.

15 As of mid-2009, reports say that half of all subprime loans are either in foreclosure or delinquent, that is, at least 30 days behind in payment.

FIGURE 2
Foreclosure Rates Remained Low As Long As House Prices Kept Rising
[Chart: annualized foreclosure rate, percent (left-hand scale), and house-price inflation rate, Case-Shiller index, percent (right-hand scale), quarterly, 1990-2009.]
Sources of data: (1) Foreclosure rate: Mortgage Bankers Association, mortgage foreclosures started, quarterly, seasonally adjusted, and annualized, Haver Analytics; (2) House-price inflation: S&P/Case-Shiller U.S. National House Price Index, seasonally adjusted, quarterly, at annual rates, Haver Analytics.

FIGURE 3
Refinancings Rose as Mortgage Rates Fell to New Lows in 2002 and 2003
[Chart: mortgage originations, 1-4 family, total and refinance, billions of dollars (left-hand scale), and 30-year mortgage interest rate, percent, Freddie Mac (right-hand scale), quarterly, 1991-2009.]
Sources of data: All data, Freddie Mac, Primary Mortgage Market Survey, Haver Analytics. (1) Mortgage rates: 30-Year Fixed Rate Mortgage Interest Rate, percent; (2) Total Mortgages and Refinancings: Mortgage Originations, 1-4 Family: Total and Refinance, billions of dollars, nominal.

These subprime mortgages made sense as long as house prices kept rising; however, they became highly risky when house prices began falling.
They were even more tempting during
the period from 2003 to 2005, when
long-term interest rates, and mortgage
interest rates in particular, were
unusually low. In Figure 3, we see that
beginning in 2003, 30-year mortgage
rates (as measured by Freddie Mac)
fell below 6 percent for the first time
in over 30 years. As a consequence,
the rate of mortgage originations
rose to about $1 trillion a quarter!
Refinancings drove these record rates
of originations.
Consequences of Bad Appraisals. If appraisals are not trustworthy,
lenders may wind up lending too much
money relative to the home’s value.
When this happens, defaults are more
likely to occur.14 Unfortunately, there
has been very little academic work on
the impact of biased appraisals despite
the importance of the subject.
The lone published academic article, by Michael Lacour-Little and Stephen Malpezzi, uses a small data set from a single thrift institution in Alaska in the 1980s to show that appraisal bias was positively associated with more frequent default.
If, indeed, appraisal bias has
been larger for subprime loans, then
since we know that subprime loans
have experienced a very high rate
of delinquency and loss,15 there
may be a substantial relationship
between appraisal bias and poor loan
performance. But, in general, one
might expect a relationship between
appraisal bias and subsequent loan
performance, not only because
appraisal bias may be evidence of poor
lending practices but also because
appraisal bias may permit weak or
fraudulent loans. Disentangling the
role of appraisal bias in the recent
housing crisis is an important avenue
for research.
A FEEDBACK LOOP IN
APPRAISAL ACCURACY
The Current Situation.
Beginning in 2008, we have entered
a period of high home foreclosures in
which many homeowners have lost
their homes due to nonpayment of
their mortgages. A large proportion of
all house sales in 2009 appear to have
been homes that had been foreclosed
in the 12 months before sale, as much
as 20 percent, according to zillow.com.16 While this report is difficult to
verify, it is clear that total foreclosures
— whether they are soon sold or
not — are indeed very substantial.
According to the Mortgage Bankers
Association, as we can see in Figure 2,
the annual foreclosure rate has risen
to over 5 percent. Since in 2007,
according to the American Housing
Survey, there were an estimated 50
million mortgages held by households
who occupy their own homes, and that
number is unlikely to have fallen much
by 2009, that implies over 2 million
foreclosures. With total single-family
home sales running less than 5 million
annually in 2009, this suggests that the
zillow.com rate of foreclosures is by no
means implausibly high.
16 Zillow.com is a website that seeks to aggregate information about home sales. This estimate is taken from Dan Levy, "U.S. Underwater Mortgages May Reach 30%, Zillow Says," Bloomberg News, August 11, 2009.

Why does the proportion of foreclosure sales matter? Because they could be reducing even further the appraised value of homes. In some areas, many of the house prices available for comparison in appraisals may be from foreclosure or otherwise distressed sales. Many of these houses are being sold at foreclosure auctions. This may cause some of them to be sold below their usual market price and may cause a downward drag on estimates of house prices.

While auctions are often a good way to sell objects, it is not clear that they fetch the best price in real estate, where information costs are high and obtaining financing is often difficult. Indeed, there is some evidence that an impatient seller does not get the best price.17 A study by John Campbell, Stefano Giglio, and Parag Pathak, using data from Massachusetts, finds that foreclosed homes sell for nearly 30 percent less than they otherwise would. If foreclosure sales, on average, produce low prices, this may make it more difficult to ascertain the true underlying value of homes.

17 See the article by David Genesove and Christopher Mayer.

Uncertainty and foreclosure may now be causing house appraisals to be biased too low. Under current arrangements, low mortgage appraisals will tend to cause too few mortgage loans to be approved. This, in turn, lowers the demand for homes and may cause the price of homes to sink lower than it otherwise would.

In addition to these distressed-market price distortions, the volume of sales affects the accuracy of appraisals. This is a network effect that generates economies of scale18 — the more participants, the better — that can create a feedback loop: Fewer sales mean less accurate appraisals, thereby making lenders leery of lending, which further reduces sales. William Lang and I developed a model of home sales and appraisals back in 1993 in which a reduction in completed home-sale transactions can feed on itself.
A Possible Vicious Cycle. The
reason for this particular feedback loop
is that if the pace of home sales slows,
the appraisal becomes less precise.
This makes the mortgage riskier, making it more likely the lender will reject
it. If the home mortgage application
is rejected, the transaction may fall
through and thus no sale will be made.
This further reduces the precision with
which the underlying value of houses
in that neighborhood is known and
possibly induces more mortgage rejections.
Our model identifies two somewhat separate issues. One is that small changes in, say, borrowers' risk, which may cause a given loan to be rejected, can lead to large and persistent changes in the market equilibrium. The feedback effect can cause mortgages to become much riskier and therefore make a real estate market face lower transactions and lower prices for a sustained period of time. The second issue is that these effects may be inefficient because they are caused by a market failure and therefore may call for some form of public intervention.

18 A classic economy of scale exists when a firm's per unit costs fall as output increases. When this is true, the most efficient way to produce is to have a single firm produce for the entire market. Another kind of economy of scale is a network economy of scale: The more participants there are, the more valuable participation is.
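To make the feedback loop concrete, here is a minimal sketch, not the Lang-Nakamura model itself; the noise level, the lending rule, and all numbers are hypothetical assumptions for illustration:

import math

def normal_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

PRICE = 200_000
for n_comps in (50, 20, 10, 5, 2):
    # Appraisal noise shrinks as the number of comparable sales grows.
    sd = 40_000 / math.sqrt(n_comps)
    # The lender approves only if the appraisal leaves a 10 percent
    # cushion below the price with roughly 95 percent confidence.
    required_appraisal = 0.9 * PRICE + 1.645 * sd
    p_reject = normal_cdf((required_appraisal - PRICE) / sd)
    print(f"{n_comps:3d} comps: rejection probability = {p_reject:5.1%}")

Under these assumptions the rejection probability climbs steeply as comparable sales thin out, and each rejection in a thin market removes a sale that would have served as a comparable the following year, which is the self-reinforcing mechanism described above.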
The problem is that one person’s
transaction provides information
(about the local value of homes) that
is useful to others’ ability to complete
their own transactions on nearby
homes. In an ideal world, the buyers
who come later would be able to compensate earlier buyers for providing this
information. But there is no simple way
for a potential buyer to compensate
an earlier buyer. As a result, the number
of transactions will typically be lower
than would occur if some system of
compensation were feasible.
This type of market failure is
called an externality: An activity external to a given economic action affects
the value of the action. Other, more familiar examples of externalities are air
pollution (such as carbon emissions)
and pollination by insects such as bees.
When an externality occurs, existing
markets may not be efficient, and it
is possible that a government policy
intervention could improve economic
outcomes. For example, the Internet
presents a network externality: The
greater the number of people who use
the Internet, the more valuable the
Internet becomes. Government assistance helped establish the first Internet
link-up, and we can argue that this
was a good use of public funds because
the first users of a network such as the
Internet do not gain as much value as
those who use it once there is widespread adoption.


But these externalities cut both
ways. Growing networks add value to
all users, but shrinking networks fall
in value. When a given technology
becomes less used, it may become less
efficient for all users. Anyone who has
recently rented a shopworn videocassette of a classic movie has experienced
this effect.
Similarly, a mortgage loan may
be denied because the lender thinks
there is a chance the borrower may
default on the loan. But if the loan is
close to being acceptable, perhaps the
lender would make the loan if the borrower paid a small amount extra. Now
because future buyers and sellers (and
lenders) would benefit from the sale
going through (because it would shed
light on the value of properties in that
neighborhood), this information might
be worth enough to warrant paying the
additional amount the lender would
require to make this loan. That is,
society as a whole might be better off if
the mortgage was accepted, although
private incentives lead to the mortgage
being rejected.
Quantitative Importance of
Appraisal Information Externalities.
Empirical papers, some gathered in
two issues of the Journal of Real Estate
Finance and Economics, have served
to confirm a number of the points
raised in this model. For example, Paul Calem showed that in white households, the mortgage denial rate rises as the number of home sales decreases.
It does appear that fewer transactions
are associated with a higher rate of
loan rejection. However, an interesting
variation can occur.
The model we have been discussing supposes that borrowers and lenders are individual players in a large,
competitive market, rather than dominant players, so that the price information provided by a transaction is not of
much value to either party: It can be
used by any lender or borrower. As a

result, neither party has an incentive
to go the extra mile to conclude the
deal because of the information value
alone. However, if one lender is a predominant lender in an area, the lender
may take future potential transactions
into account: In a neighborhood where
deals are few, the lender may push
through a mortgage for the sake of providing more information, knowing that
by encouraging future transactions, the
lender may be recompensed for making
a slightly excessively risky loan. That
is, the externality can be internalized by
the lender.19
To the extent that this occurs,
the externality may be mitigated by
the marketplace, and public intervention may not be justified. However,
monopoly lending will itself tend to be
a problem: Ignoring the informational
externality, monopolists tend to charge
higher rates and make fewer loans than
would competitive lenders.
A more recent paper by McKinley
Blackburn and Todd Vermilyea
presents a test of the relevance of
these informational externalities using mortgage loan data primarily
from 1998. To test for information
externalities, they use a sample of over
2,000 mortgage loans that comes with
detailed data about the borrowers.
They confirm the existence of these
informational externalities and
estimate that 10 percent of the tracts
in their sample are materially affected
by the externality. This is in addition
to the economies of scale by lender
that Avery and co-authors found.
In essence, what Blackburn and Vermilyea show is that the probability that a lender will turn down a particular mortgage application varies with the average number of home sales successfully completed in the census tract. Unlike previous studies, their study has detailed data about the mortgage applicant and the mortgage application, including detailed credit information about the applicant and the applicant's income, employment history, race, sex, and marital status. These effects on denials apply to census tracts with 20 or fewer home sales in the previous year, or about 10 percent of census tracts. In addition, more denials occur when the lender has fewer than eight sales in a given tract.

19 A paper by Robert Avery, Patricia Beeson, and Mark Sniderman argues that all of the externality was internal to the lender. However, this paper had the weakness of not having detailed information about the borrowers.
One reason that house prices
might fall further than they otherwise would is that after a period of
having appraisals that were biased
upward, we have entered a period in
which appraisals are being performed with less bias but also with less precision. This may well have resulted in
a substantial increase in the number of
mortgage applications denied, applications that would have been accepted
a few years earlier. This in turn may
have made it harder for purchasers to
buy houses, reducing effective demand
and resulting in lower house prices.
APPRAISAL INACCURACY:
CAN SOMETHING BE DONE?
Appraisals have become more
inaccurate for three reasons: bias,
fewer home sales, and foreclosures.
Can the contract be rewritten so that
there is more room for variation in the
appraisal, so that the appraisal will
typically be more informative? This is
a matter for future research, but it is an
urgent question.
Negotiations between Fannie
Mae, Freddie Mac, and the New York
attorney general’s office have resulted
in a “Home Value Protection Program
and Cooperation Agreement,” whose
main aim is to prevent lenders from
influencing appraisals.


The major impact of the new
agreement is to ensure that appraisers
are not chosen by parties whose only
incentive is to make the loan and who
have little regard for the loan’s safety.
Thus, mortgage brokers are excluded
from choosing appraisers, and restrictions are placed on how the "in-house"
appraisers used by mortgage lenders are
chosen; in particular, the process must
be independent of the loan production
staff.
This agreement will tend to ensure that appraisals are arrived at more
objectively. However, it may have the
side effect of making mortgage loans
harder to obtain and may cause some
sound home loans to be rejected.
Moreover, we have emphasized
that if appraisals are unbiased estimates of a house’s value, the house
value — which is based on the lesser
of the sale price and the appraisal
value — is biased downward. So the
downward bias will likely have a larger
impact on causing sound mortgages to
be rejected as appraisals become more
objective.
How to reduce the incentives for
an upwardly biased appraisal is a difficult problem that has not been solved.
The fundamental problem is that a
low appraisal can cause the mortgage
to be rejected, and this may be due
not to the intrinsic value of the house,
but to the fact that the appraisal is an
estimate, and is not exact.
One possible solution to this
problem is to deliberately add a small,
fixed amount, say, 3 percent, to the
appraisal. This would provide a reasonable leeway for the possible error in the appraisal. Then the house value used in determining the loan-to-value ratio would be the sale price or the appraisal plus 3 percent, whichever was lower. In most cases, this would mean that the appraisal (plus 3 percent) was higher than the sale price, and the house
value would be affected only when the
appraisal was substantially below the
sale price. This would largely eliminate
the direct incentive for the appraisal
to be biased upward and permit the
appraiser to honestly value the house
without excessively discouraging home
mortgages. If appraisers become used
to unbiased appraisals, this might also
encourage more balanced appraisals of
refinanced properties.20 However, possible changes to the mortgage contract
like this one need much careful study.
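A minimal sketch of the proposed rule next to the current one; the 3 percent margin is the illustrative figure from the text, and the prices are hypothetical:

def house_value_current(sale_price, appraisal):
    # Current rule: the lesser of sale price and appraisal.
    return min(sale_price, appraisal)

def house_value_proposed(sale_price, appraisal, margin=0.03):
    # Proposed rule: the appraisal gets a fixed margin before the
    # comparison, so small appraisal shortfalls no longer bind.
    return min(sale_price, appraisal * (1 + margin))

# An appraisal 2 percent below a hypothetical $200,000 sale price:
print(house_value_current(200_000, 196_000))   # 196000: LTV rises
print(house_value_proposed(200_000, 196_000))  # 200000: loan unaffected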
Here it would be helpful if more
appraisal data were available. Although
both the appraisal and the sale price
are recorded as part of the mortgage
data required by the lender, many
real estate data sets do not separately
include the appraisal and the sale
price. Rather, what is recorded is the
house value, almost always the sale
price in a home-purchase mortgage
and the appraisal in a refinance. This makes it difficult for most researchers to examine appraisal practices.

20 If the procedures used by the appraisers are the same for home-purchase mortgages as for refinancings, lower bias in the home-purchase mortgage may spill over into the refinanced mortgage. In either case, lenders and others can monitor the bias of appraisers using tools such as automated appraisal systems.
For those who have the data,
Fannie Mae, Freddie Mac, and other
processors of mortgage data have
created proprietary loan valuation
products, called automated valuation
models, to estimate the underlying
value of mortgages, that is, to create
an automated second appraisal that
can be used to further judge the value
of a house. These statistical models do
not provide as good an appraisal as the
local appraiser on the ground could,
but they are highly useful in helping
lenders to gauge the risk in valuations
and to detect appraisal bias.
It would be very helpful if the data
sets that include appraisals — such
as those of Fannie Mae and Freddie
Mac and the other mortgage-lending
government entities such as the
Federal Housing Administration and
the Veterans Administration — were
made broadly available to researchers, analysts, lenders, and appraisers,
subject to standard privacy protections.
These data sets could, for example,
be used to verify that appraisers have
in fact reformed their procedures
and are generally providing unbiased
appraisals.

Note, however, that basing
appraisals on sales of foreclosed homes
is likely to cause a further downward
bias. On the other hand, appraisers
may not be able to find enough sales of
nonforeclosed homes to provide a good
estimate of normal home sales. To the
extent that more data can be made easily and quickly accessible, some of
these problems may be overcome.
The current appraisal process
may make it more difficult for sound
borrowers to conclude home purchases.
If so, that could be limiting the
demand for existing homes, which
could result in house values falling
further. And that could worsen
financial losses and delay a return to
normalcy in home real estate markets.
If we do not act to improve the
appraisal system, we may end up with
the worst of both worlds. That is, we
may experience a period of objective
appraisals that cause more mortgages
to fail, but as the current crisis fades
from memory, end up back in a
situation in which all parties desire
biased appraisals. And that might well
mean that biased appraisals could
eventually reappear and help reflate
another housing bubble. BR


REFERENCES

Avery, Robert B., Patricia E. Beeson, and Mark S. Sniderman. "Neighborhood Information and Home Mortgage Lending," Journal of Urban Economics, 45 (1999), pp. 287-310.

Blackburn, McKinley, and Todd Vermilyea. "The Role of Information Externalities and Scale Economies in Home Mortgage Lending Decisions," Journal of Urban Economics, 61 (2007), pp. 71-85.

Blomquist, Glenn C. "Measuring Quality of Life," in Richard J. Arnott and Daniel P. McMillen, eds., A Companion to Urban Economics. Malden, MA: Blackwell Publishing, 2006, pp. 83-501.

Calem, Paul. "Mortgage Credit Availability in Low- and Moderate-Income Minority Neighborhoods: Are Information Externalities Critical?" Journal of Real Estate Finance and Economics, 13 (July 1996), pp. 71-89.

Campbell, John Y., Stefano Giglio, and Parag Pathak. "Forced Sales and House Prices," Working Paper (March 2009).

Carlino, Gerald. "City Beautiful," Federal Reserve Bank of Philadelphia Business Review (Third Quarter 2009).

Cho, Man, and Isaac F. Megbolugbe. "An Empirical Analysis of Property Appraisal and Mortgage Redlining," Journal of Real Estate Finance and Economics, 13 (July 1996), pp. 45-55.

Closser, Bruce M. "The Evolution of Appraiser Ethics and Standards," Appraisal Journal, 75:2 (Spring 2007), pp. 116-29.

Demyanyk, Yuliya, and Otto Van Hemert. "Understanding the Subprime Mortgage Crisis," Review of Financial Studies (forthcoming).

Deng, Yongheng, John M. Quigley, and Robert Van Order. "Mortgage Terminations, Heterogeneity, and the Exercise of Mortgage Options," Econometrica, 68 (2000), pp. 275-308.

Elul, Ronel. "Residential Mortgage Default," Federal Reserve Bank of Philadelphia Business Review (Third Quarter 2006).

Fannie Mae. Underwriting Residential Mortgages. Cornerstone Series (August 2007).

Ferreira, Fernando, Joseph Gyourko, and Joseph S. Tracy. "Housing Busts and Household Mobility," NBER Working Paper W14310 (September 2008).

Genesove, David, and Christopher Mayer. "Loss Aversion and Seller Behavior: Evidence from the Housing Market," Quarterly Journal of Economics, 116 (November 2001), pp. 1233-60.

Gerardi, Kristopher, Adam Hale Shapiro, and Paul S. Willen. "Subprime Outcomes: Risky Mortgages, Homeownership Experiences, and Foreclosures," Federal Reserve Bank of Boston Working Paper 07-15 (May 2008).

Gordon, Doug. "Accurate AVM Values Help Monitor Subprime Appraisal Risk," presentation at the Mortgage Bankers Association Subprime Lending Conference, May 13, 2004. Available at http://www.mortgagebankers.org/files/present2004/SUBPRIME/AccurateAVMshelpsubprimeappraisalrisk.ppt.

Greenspan, Alan, and James E. Kennedy. "Sources and Uses of Equity Extracted from Homes," Oxford Review of Economic Policy, 24 (2008), pp. 120-44.

Lacour-Little, Michael, and Stephen Malpezzi. "Appraisal Quality and Residential Mortgage Default: Evidence from Alaska," Journal of Real Estate Finance and Economics, 27:2 (2003), pp. 211-33.

Lang, William W., and Leonard I. Nakamura. "Information Losses in a Dynamic Model of Credit," Journal of Finance, 44 (July 1989), pp. 731-46.

Lang, William W., and Leonard I. Nakamura. "A Model of Redlining," Journal of Urban Economics, 33 (March 1993), pp. 223-34.

Leventis, Andrew. "Removing Appraisal Bias from a Repeat-Transactions House Price Index," OFHEO Working Paper 06-1 (February 2006).

Loebs, Terry. "Systemic Risks in Residential Property Valuations: Perceptions and Reality," Collateral Assessment and Technologies Committee, June 2005. Available at http://catcommittee.org/catc/index.php (registration required).

Quan, Daniel C., and John M. Quigley. "Price Formation and the Appraisal Function in Real Estate Markets," Journal of Real Estate Finance and Economics, 4 (1991), pp. 127-46.

Sherland, Shane M. "The Past, Present, and Future of Subprime Mortgages," Board of Governors of the Federal Reserve System, Finance and Economics Discussion Series 2008-63 (2008).


Riding the Revenue Roller Coaster:

Recent Trends in State Government Finance*
BY TIMOTHY SCHILLER

The fall in state tax revenue during
current recession and the one in 2001
highlights an increase in the variability of
this source of revenue that has been observed
over the past two decades or so. But states have sources
of revenue other than taxes. However, while providing a
relatively constant portion of total revenue over the past
several years, these sources have generally not damped
variability in state revenue arising from variability in
taxes. Consequently, variation in state tax revenue
remains an important issue for state government finances.
In this article, Tim Schiller looks at the causes of the
increased variation in state tax revenue during recent
business cycles compared with earlier ones. He also
reviews strategies for coping with fluctuations in state
tax collections.

Tim Schiller is a senior economic analyst in the Philadelphia Fed's Research Department. This article is available free of charge at www.philadelphiafed.org/research-and-data/publications/.

*The views expressed here are those of the author and do not necessarily represent the views of the Federal Reserve Bank of Philadelphia or the Federal Reserve System.

Growth in state government tax revenue slowed around the start of the recession that began in December 2007, then declined in late 2008. Although a decline in state tax revenue is to be expected during a recession, the current decline in state tax revenue has been sharper than the decline in overall economic activity. A similar relationship was observed in the 2001 recession. In fact, in these two recessions, state tax revenue exhibited much more significant weakness than would have been predicted based on previous recessions. This has been the case for the total tax revenue of all states and for the tax revenue of the states in the Third District (Pennsylvania, New Jersey, and Delaware). The fall in state tax revenue in these two recessions highlights an increase in the variability of this source of revenue that has been observed over the past two decades or so.
States have sources of revenue
other than taxes. However, while
providing a relatively constant portion
of total revenue over the past several
years, these sources have generally not
damped variability in state revenue
arising from variability in taxes (see
Nontax Sources of State Revenue). Even
the funds provided to state governments under the recently enacted
federal economic stimulus program
will go only a short way in counterbalancing the falloff in state revenue
occasioned by the current recession.1
Consequently, variation in state tax
revenue remains an important issue for
state government finances.
This article looks at the causes
of the increased variation in state tax
revenue during recent business cycles
compared with earlier ones. The most
important cause has been the shift by
many states, including the Third District states, toward increased reliance
on more variable tax bases — specifically, individual income taxes — and
decreased reliance on more stable tax
bases, such as sales taxes. In addition,
broad changes in the forms of economic activity from which states derive their tax revenue, mostly income and retail sales within their borders, have affected tax collections from these sources. Coping with fluctuations in state tax collections has become increasingly important, and this article reviews strategies for doing so.

1 The American Recovery and Reinvestment Act of 2009 includes approximately $150 billion in total for state governments over the three fiscal years beginning with 2008-09. By comparison, this amount is just a small portion of the nearly $2 trillion in total state revenue collected in fiscal year 2007. Even with the stimulus funds, analysts estimate that states will face large gaps between projected revenues and expenditures in the next several years. See the article by Donald J. Boyd.

Nontax Sources of State Revenue

State governments have sources of revenue other than taxes. They receive revenue from other levels of government (intergovernmental transfers), chiefly the federal government, although some states receive funds from local governments. Some states operate utilities (such as water, electric, and gas) and mass transit systems. States also provide products and services for which they charge fees, for example, education, hospitals, highways, housing, port facilities, waste management, parks and recreation, sale of minerals from public lands, and so forth. States obtain funds through fines, rents, and lotteries. States earn interest on funds held on deposit. States collect contributions from employees for trust funds for state employee pension plans, retiree pensions and medical insurance, and workers' compensation insurance. These contributions and the earnings and capital gains on the funds are sources of state government revenue.

Intergovernmental transfers are a large share of the nontax revenue of the states. This share has been roughly constant at around one-fifth of total revenue over the past 20 years (see the Table). Some of the transfers from the federal to the state governments are programmatic in such a way that they do not vary so as to offset declines or increases in state tax revenue over the course of the business cycle, although some transfers have that effect. Specifically related to the business cycle are federal transfers that have been enacted during past national economic downturns and in the current recession. Although helpful in counteracting shortfalls in state revenue generally, such transfers tend to be based on broad outlines that do not necessarily take individual state conditions into account, and the actual disbursement of funds at the state level often comes late or even after a recession ends.a

Besides intergovernmental transfers, the shares of revenue provided by most other nontax sources listed above have remained roughly constant for the past 20 years or more. However, among other nontax sources of funds, a large and growing share is accounted for by states' insurance trust funds. This share has increased from approximately 18 percent of total revenue in 1987 to 26 percent in 2007. (This amount is included in the "Other" category in the table.) These funds are not available to help states deal with cyclical fluctuations in revenue because they are dedicated to specific purposes, mainly state employee and retiree health-care benefits and pensions. And although investment returns on these funds were high until 2007, recent returns have been low or negative, presenting many states with the need to replenish the funds. So, instead of adding to states' financial strength, these insurance programs are actually financial burdens, and they are becoming more pressing as pension obligations increase.b

Theoretically, the more different sources of funds that states have, the less impact changes in the flow of funds from any single source will have on the total. However, in fact, nontax revenues are positively correlated with tax revenue; that is, they tend to vary together in the same way. This is not too surprising because many sources of nontax revenue are affected by the same national and state economic conditions that affect the sources of tax revenue. Thus, the increased variation in state tax revenue that has resulted from the changes in taxation and the economy discussed in this article has not been mitigated by nontax revenue. Despite nontax sources of revenue, fluctuating tax revenue remains a problem for state governments' fiscal management.

a See the article by Richard H. Mattoon and the article by Daniel Wilson.

b See my previous Business Review article.


CHANGES IN SOURCES OF
STATE TAX REVENUE
The major sources of tax revenue
for the states are individual income
taxes, sales taxes, and corporate
income taxes. Over the past several
decades, the percentage of total tax

revenue raised by individual income
taxes has increased, and the percentage raised by sales taxes has decreased.
Because taxes are based on these
and other economic activity within
a state, tax revenue varies with state
economic conditions. This has always been the case. However, in the past
two decades, state tax revenue has
varied more over the course of the
business cycle than it did in post-World
War II business cycles before the 2001
recession.2 Changes in sources of state
tax revenue over the past 40 years or
so have been the cause of the greater
variation. Perhaps the most important
of these changes has been the shift toward increased reliance on individual
income taxes and less on sales taxes.
Data from the U.S. Census of
Governments provide a consistent
estimate of state tax revenue amounts
and sources. These data are available
for fiscal years from 1961.3 For all
states in total, from 1961 to 2007
(the latest year for which annual data
are available), the tax revenue raised
by individual income tax increased
from 12 percent to 35 percent of total
tax revenue. Sales taxes decreased
from 58 percent to 46 percent. The
corporate income tax was unchanged
at 7 percent. (A range of other taxes,
which varies widely across the states, makes up the balance of total tax
revenue.)
For the three states of the Third
Federal Reserve District, the changes
among tax sources have been greater
than the average among all states.
From 1961 to 2007, individual income
taxes rose from 0 to 32 percent
and 40 percent of total tax revenue
in Pennsylvania and New Jersey,
respectively, following the inception
of the personal income tax in those
states.4 In Delaware, individual income taxes were practically the same portion
of total tax revenue in 1961 — 36
percent — as they were in 2007 — 35
percent. Sales taxes declined from 64
percent to 47 percent in Pennsylvania,
58 percent to 41 percent in New
Jersey, and 24 percent to 16 percent
in Delaware.5 Corporate income taxes
decreased from 13 percent to 7 percent
of total tax revenue in Pennsylvania
but rose from 7 percent to 10 percent
in New Jersey and were practically
the same in both years in Delaware,
moving up from 9 percent to 10
percent. (The corporate income tax is
very variable year to year in all states,
so its percentage for any individual
year must be interpreted cautiously.)
Since about 1960, revenue in the states in the region as well as across the country has gradually shifted toward greater reliance on income taxes and less reliance on sales taxes, and the shift has continued strongly in the past 10 years. Two factors are responsible for these changes in sources of state tax revenue: One reflects a policy choice by state governments; the other is a consequence of changes in the economy that have altered the ways in which workers are compensated and the ways in which consumers spend their money.6

2 See the paper by Richard Mattoon and Leslie McGranahan.

3 The Census Bureau conducts two surveys of state taxes and spending. The quarterly survey covers estimates of revenues received by state revenue departments. The annual survey covers revenues and spending for all state government departments and agencies. The quarterly data are collected by calendar quarter; the annual data are collected for fiscal years (beginning in July for most states).

4 Personal income taxes were first collected in fiscal year 1962 in New Jersey and fiscal year 1971 in Pennsylvania.

5 Delaware does not have a general sales tax but does tax certain items, such as tobacco and motor fuels.

6 See the article by William F. Fox.

TABLE
Percent of Total Revenue

               Intergovernmental      Taxes           Other
                1987     2007     1987    2007    1987    2007
All States      19.8     21.6     47.7    37.6    32.5    40.8
Pennsylvania    19.8     19.6     47.8    37.0    32.4    43.4
New Jersey      15.4     17.5     48.6    44.4    36.0    38.1
Delaware        14.1     16.7     47.8    39.1    38.1    44.2

Source: U.S. Census Bureau, State Government Tax Collections

Both of these factors contributed to increased state revenue from individual income taxes during this period. The policy factor was the implementation or increase in individual income taxes. Many states, including Pennsylvania and New Jersey, instituted individual income taxes, raised rates in existing income taxes, and expanded the range of incomes subject to tax, leading this form of taxation to account for a growing share of tax revenue over the years. This policy-induced change was compounded by changes in the ways in which workers are compensated that affected both the amounts and types of individual income. During the 1990s, capital gains income rose both absolutely and as a share of individual income. This happened for two reasons. One is that individuals sold financial assets during a period of rising prices for stocks and bonds, generating taxable income. The other is that stock options became more common as a form of employee compensation, and the exercise of these options generated taxable income.
While these changes were
boosting state tax revenue from
individual income taxes, several factors
were diminishing the relative amounts
raised by sales taxes. One factor was a
policy change: States exempted some
goods, mainly food and medicine, from
sales taxes. Other changes that tended
to reduce the relative amounts raised
by sales taxes resulted from changes
in consumer spending patterns. One
of these changes was a gradual shift
toward more consumption of services
and less consumption of goods. The
decline of sales tax revenue from this
cause is due to the fact that many
services are exempt from state sales
taxes and that states have difficulty
enforcing compliance with taxation of
services. Another, more recent change
is the growth in shopping across state
borders, which has been facilitated by
the Internet.
VARIABILITY OF STATE TAX
REVENUE HAS INCREASED
Income tends to vary more over
the business cycle than consumption:
People tend to maintain consumption
through borrowing or drawing on their
savings when their income declines
during economic slowdowns, and
they tend to save at least a portion of
their income when it increases during
economic expansions. Consequently,
tax revenue derived from income
varies more than tax revenue derived
from consumption (sales tax). (See
Figures 1 to 4.) Therefore, the shift in
sources of state tax revenue to greater
reliance on the income tax base and
less reliance on the sales tax base has
increased the variation of state tax revenue over the course of the business
cycle. This variability is absolute; that
is, tax revenue in any given period
varies compared to its average over a
number of periods. It is also relative;
that is, variation in tax revenue is
greater than the variation in economic conditions in each state. The overall
variability in tax revenue occurs even
when states have not enacted increases
or decreases in taxes (although many
states, including those in the Third
District, have during the years under
review here).
Variability, as measured by
standard deviation, in the annual
growth rate of individual income taxes
is nearly twice that of sales taxes. As
reliance on the less stable income tax
has grown, states have experienced
a two-thirds increase in the standard
deviation of annual total tax growth
from the 1960s to the early 2000s. For
the Third District states, the standard
deviation of annual growth was less
in the early 2000s than in the 1960s,
but — as is the case for the national
average — the standard deviation
increased from the 1980s to the 1990s
and early 2000s. (These annual data
are not adjusted for occasional legislative changes that raised or lowered
taxes, but other research that takes
these changes into account still finds
increased variability. See below.)
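As an illustration of the variability measure used here, a minimal sketch with invented revenue series (not census data):

from statistics import pstdev

def growth_rates(series):
    # Year-over-year percent changes.
    return [(b - a) / a * 100 for a, b in zip(series, series[1:])]

income_tax = [100, 112, 99, 120, 108]  # hypothetical annual revenue
sales_tax = [100, 104, 101, 106, 104]

print(round(pstdev(growth_rates(income_tax)), 1))  # larger swings in the growth rate
print(round(pstdev(growth_rates(sales_tax)), 1))   # a steadier base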
Besides the increase in absolute
variability, state tax revenue has also
become more variable with respect
to state economic conditions. That
is, changes in measures of economic
activity in a state, such as employment,
output, and income, have become
associated with proportionately larger
changes in state tax revenue in recent
years, mainly the past 10 years, than in
earlier years. (For example, see Figures
5 to 8, which illustrate that state tax
revenue varies more than total income
within a state.) The increased variability in total state tax revenue is
almost wholly due to a large increase in
the variability of income tax revenue.
Research cited earlier (Mattoon and
McGranahan) indicates that changes
in a state’s economic conditions as
measured by state employment or a
composite index of state economic
conditions have been associated with
twice as much change in income tax
revenues in the years since 1998 as
in the years before 1998.7 This research
controls for large changes in taxation
and the timing of collections in individual states. It finds that the increase
in cyclical variability of income tax
revenue since 1998 is measurable in
36 of the 43 states with an income tax
and statistically significant in 10, including two Third District states, New
Jersey and Pennsylvania.
As noted earlier, income taxes
have become a larger share of total
state tax revenue in recent years, and
capital gains have become a larger
portion of income. In combination, these two factors have made capital
gains a larger share of taxable income.
Furthermore, most states, including
Delaware and New Jersey in the Third
District, have progressive income tax
rates, so variations in capital gains
income that move taxpayers across tax
brackets tend to have magnified effects
on the variation of income tax revenue. Because capital gains are more
variable than wage income, especially
over the course of a business cycle, and
because they can have a more than
proportional effect on income taxes,
the increase in their share of total income has been a primary factor in the
increase in the variability of income
tax revenue and total revenue.
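A minimal sketch of the bracket effect, with a hypothetical two-rate schedule rather than any actual state's tax code:

def tax(income, bracket=100_000, low_rate=0.02, high_rate=0.06):
    # A two-bracket schedule: a low rate up to the bracket threshold,
    # a higher rate on income above it.
    return min(income, bracket) * low_rate + max(income - bracket, 0) * high_rate

boom, bust = 140_000, 100_000  # taxable income with and without capital gains
print(round(1 - bust / boom, 2))            # 0.29: income falls 29 percent
print(round(1 - tax(bust) / tax(boom), 2))  # 0.55: tax falls 55 percent

Because the capital gains sit in the top bracket, their disappearance cuts tax revenue proportionally more than it cuts income, which is the magnification described above.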

Source: U.S. Census Bureau, State Government Tax Collections.
Data are actual tax collections not adjusted for changes in tax laws.

www.philadelphiafed.org

8

See the article by Nicholas Jenny.

Business Review Q1 2010 27

FIGURE 4
Annual Change in Tax Revenue
Percent
Sales

70

Pennsylvania

60

Individual Income
Corporate Income

50
40
30
20
10
0
-10
-20
19
87
19
88
19
89
19
90
19
91
19
92
19
93
19
94
19
95
19
96
19
97
19
98
19
99
20
00
20
01
20
02
20
03
20
04
20
05
20
06
20
07

-30

Source: U.S. Bureau of Economic Analysis, State Annual Personal Income: U.S. Census Bureau,
State Government Tax Collections.
Data are actual tax collections not adjusted for changes in tax laws.

FIGURE 5
Annual Change in Tax Revenue and Income: All States
[Line chart of annual percent changes in total tax revenue and personal income, 1987-2007.]
Source: U.S. Bureau of Economic Analysis, State Annual Personal Income; U.S. Census Bureau, State Government Tax Collections. Data are actual tax collections not adjusted for changes in tax laws.

FIGURE 6
Annual Change in Tax Revenue and Income: Delaware
[Line chart of annual percent changes in total tax revenue and personal income, 1987-2007.]
Source: U.S. Bureau of Economic Analysis, State Annual Personal Income; U.S. Census Bureau, State Government Tax Collections. Data are actual tax collections not adjusted for changes in tax laws.
The recession that began in 2007 appears to be having the same negative
influence on state individual income
tax revenues as the 2001 recession.
Furthermore, the current recession has
also brought a larger drop in consumption spending than the 2001 recession.
The decline in consumption spending
has been especially sharp for expensive
durable goods, such as motor vehicles
and home appliances. Consequently,
state sales tax revenues have fallen
more than in the 2001 recession.9
Most state governments are legally
required to balance expenditures and
revenues for each fiscal year.10 Consequently, when actual revenues fall short
of the amounts needed for budgeted
expenditures, there are only a few ways
the gap can be closed.11 First, taxes can
be increased. Second, spending can be
cut. Third, temporary strategies can
be used, such as reassignment of funds
in state accounts. For example, some
states have a limited ability to record
expenditures and revenues in prior or
subsequent fiscal years, most states can
postpone capital expenditures, and
some states might be able to restructure payment schedules for long-term
debt. Fourth, states can use their rainy
day funds: savings accumulated from
prior years and reserved for recourse
when revenues fall below budgeted
amounts.
All of these ways of coping with
gaps between budgeted expenditures
and actual revenues were implemented
among the states as they formulated
budgets in 2009.12

9 See the article by Donald Boyd and Lucy Dadayan.
10 For most states, borrowing may be used to fund capital spending projects, but borrowing cannot be used to fund operating expenditures.
11 See the article by Janet Stotsky.

FIGURE 7
Annual Change in Tax Revenue and Income: New Jersey
[Line chart of annual percent changes in total tax revenue and personal income, 1987-2007.]
Source: U.S. Bureau of Economic Analysis, State Annual Personal Income; U.S. Census Bureau, State Government Tax Collections. Data are actual tax collections not adjusted for changes in tax laws.

FIGURE 8
Annual Change in Tax Revenue and Income: Pennsylvania
[Line chart of annual percent changes in total tax revenue and personal income, 1987-2007.]
Source: U.S. Bureau of Economic Analysis, State Annual Personal Income; U.S. Census Bureau, State Government Tax Collections. Data are actual tax collections not adjusted for changes in tax laws.

According to a survey conducted at mid-year, 22 states
had cut spending (including Delaware,
New Jersey, and Pennsylvania), 11
had raised taxes (including Delaware,
New Jersey, and Pennsylvania), 12
had raised fees (including New Jersey),
12 had used other funds to replace
general revenue, and eight had tapped
rainy day funds.
All of these means of coping
with tax revenues that do not meet
projected amounts have limitations.
12 See the report by the National Conference of State Legislatures.

Tax increases and spending cuts
require legislative or executive action,
or both, and are usually politically
difficult to accomplish. Temporary
strategies are often limited in scope
and, by their nature, are often
insufficient to compensate for large
gaps between current revenues and
expenditures or long periods of low
revenue. Rainy day funds are prudent
and potentially adequate for emergency
situations, but in most states, they have
not been adequate to compensate for
revenue shortfalls during economic
contractions; in fact, estimates of the amounts required for this purpose are
much larger than most states have
amassed heretofore.13 Several strategies
have been suggested for smoothing
state tax revenue or otherwise coping
with its fluctuations.14 These could be
used individually or in combination.
First, states could be more conservative
in planning expenditures so that
they would not be left with spending
programs that would require radical
curtailment when tax collections
decline. Second, states could assign
larger amounts of revenue to rainy day
funds when revenues are high, to be
tapped when revenues decline. Third,
states could designate tax collections
from capital gains income as windfalls,
not to be used to fund large ongoing
spending programs. Fourth, states
could expand the sales tax base in
order to decrease the share of tax
collections derived from other, more
variable sources.
More comprehensive approaches
to state government finances are also
possible. For example, states could
model both tax revenue and expenditure needs over the course of state-specific business cycles (that is, using
economic data, such as employment
and income, at the state level to chart
the business cycle rather than using
time frames and data related to the
national cycle). Ideally, this modeling
would produce a picture of how each of
a state’s different types of taxes varies
over its business cycle and how each
type of spending varies. This information could be used to calculate the
amount needed for a rainy day fund.
It could also be used to reorient taxes
toward less variable sources. Additionally, knowledge of the cyclical variation of tax revenue and expenditure
needs could be used to match more dependable revenue sources with those expenditure categories that are most necessary year to year and to match more variable revenue sources with programs that can be scaled back or postponed with the least adverse consequences.15

13 See the article by Gary Wagner and Erick Elder.
14 See the article by Elaine Maag and David Merriman.
15 See the article by Gary Cornia and Ray Nelson and the one by Russell Sobel and Gary Wagner.
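To make the kind of modeling described above concrete, here is a minimal sketch in Python; the revenue figures, the three-year trend, and the fund-sizing rule are all assumptions of this sketch, not methods drawn from the studies cited.

# Illustrative only: measuring revenue variability by source and sizing a
# rainy day fund from below-trend shortfalls. All figures are invented.
import pandas as pd

revenue = pd.DataFrame(
    {"sales": [100, 104, 107, 109, 104, 112, 118],
     "individual_income": [90, 99, 108, 120, 98, 115, 130],
     "corporate_income": [20, 26, 22, 30, 18, 27, 33]},
    index=range(2001, 2008))                     # hypothetical $ millions by fiscal year

growth = revenue.pct_change().dropna()           # annual percent change by source
print(growth.std())                              # the income taxes vary most in this toy data

total = revenue.sum(axis=1)
trend = total.rolling(3, center=True).mean()     # crude stand-in for the cyclical trend
shortfall = (trend - total).clip(lower=0)        # collections below trend
print("one possible fund target:", shortfall.max())
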
SUMMARY
Over the past two decades or so,
state tax revenue has grown, but it
has become more variable, especially
over the course of the business cycle.


In part, this has been the result of
policy actions such as growing reliance on individual income taxes and
the reduction in the sales tax base.
Additionally, economic changes have
tended to increase the variability in
state tax revenue. Significant changes
have been the growth of nonwage
income, particularly capital gains, as
a share of total taxable income, which
has increased the variability of the
income tax base, and the growth of
service consumption relative to goods
consumption, which has reduced the
revenue-generating potential of state
sales taxes, a relatively stable source of
revenue.

States could implement tax
policies to reverse some of the consequences of these changes by moving
toward greater reliance on more stable
revenue sources. Alternatively, they
could establish procedures for managing funds in order to cope with fluctuating revenues. Or they could do both
of these things. Either approach or
both combined would require an effort
of political will because implementing
these approaches would necessitate
more conservative policies on spending, higher levels of taxation, changes
in the incidence of taxation (the
relative share of total taxes different
population groups or industries pay), or
all of these. BR

REFERENCES

Boyd, Donald J. “What Will Happen to State Budgets When the Money Runs Out?” Fiscal Features, The Nelson A. Rockefeller Institute of Government (February 19, 2009).

Boyd, Donald J., and Lucy Dadayan. “State Tax Revenue Falling Sharply in Fourth Quarter, Early Data Show,” State Revenue Report, 74 (January 2009).

Cornia, Gary C., and Ray D. Nelson. “Rainy Day Funds and Value at Risk,” State Tax Notes (August 25, 2003), pp. 563-67.

Fox, William F. “Three Characteristics of Tax Structures Have Contributed to the Current State Fiscal Crisis,” State Tax Notes (November 3, 2003), pp. 369-77.

Jenny, Nicholas W. “Tax Increases Shore Up State Revenue,” State Revenue Report, 53 (September 2003).

Maag, Elaine, and David Merriman. “Tax Policy Responses to Revenue Shortfalls,” State Tax Notes (November 3, 2003), pp. 393-403.

Mattoon, Richard H. “Should the Federal Government Bail Out the States? Lesson from Past Recessions,” Federal Reserve Bank of Chicago Fed Letter 265 (August 2009).

Mattoon, Richard, and Leslie McGranahan. “Revenue Bubble and Structural Deficits: What’s a State to Do?” Federal Reserve Bank of Chicago, Working Paper 2008-15 (2008).

National Conference of State Legislatures. “State Budget Update: July 2009.”

Schiller, Timothy. “Growing Slowly, Getting Older: Demographic Trends in the Third District States,” Federal Reserve Bank of Philadelphia Business Review (Fourth Quarter 2008), pp. 21-28.

Sobel, Russell S., and Gary A. Wagner. “Cyclical Variability in State Government Revenue: Can Tax Reform Reduce It?” State Tax Notes (August 25, 2003), pp. 569-76.

Stotsky, Janet G. “Coping with State Budget Deficits,” Federal Reserve Bank of Philadelphia Business Review (January/February 1991), pp. 13-24.

Wagner, Gary A., and Erick M. Elder. “Revenue Cycles and the Distribution of Shortfalls in U.S. States: Implications of an ‘Optimal’ Rainy Day Fund,” National Tax Journal, 60 (December 2007), pp. 727-42.

Wilson, Daniel. “Are Fiscal Stimulus Funds Going to the ‘Right’ States?” Federal Reserve Bank of San Francisco Economic Letter 2009-14 (April 2009).


Recent Developments in
Consumer Credit and Payments*
BY MITCHELL BERLIN

On September 24-25, 2009, the Research
Department and the Payment Cards Center
of the Federal Reserve Bank of Philadelphia
held their fifth joint conference to present
and discuss the latest research on consumer credit and
payments. Sixty participants attended the conference,
which included seven research papers on topics such as
securitization and distressed loan renegotiation, consumer
disclosure, data breaches and identity theft, and the
effects of the U.S. financial crisis on global retail lending.
In this article, Mitchell Berlin summarizes the papers
presented at the conference.

Mitchell Berlin is a vice president and economist and head of the Banking section in the Philadelphia Fed’s Research Department. This article is available free of charge at www.philadelphiafed.org/research-and-data/publications/.

In his opening remarks, Mitchell Berlin noted that the flow of high-quality papers on consumer finance and payment issues has increased steadily since the first joint conference of the Research Department and Payment Cards Center in 2001.
SECURITIZATION AND
RENEGOTIATION
In the first paper, Tomasz Piskorski of Columbia University reported
the results of a study (with Amit
Seru and Vikrant Vig) that provided
evidence that frictions impeded the
renegotiation of certain types of
distressed mortgages during the recent
financial crisis.1 In particular, Piskorski
and his coauthors showed that loans
that banks packaged into mortgage-backed securities and placed in trusts
— securitized loans — were foreclosed
on more often than otherwise similar mortgages that remained in bank
portfolios.

1 The conference agenda along with links to most of the papers presented can be found on the Philadelphia Fed’s website at: http://www.philadelphiafed.org/research-and-data/events/2009/consumercredit-and-payments/program.cfm.

*The views expressed here are those of the
author and do not necessarily represent
the views of the Federal Reserve Bank of
Philadelphia or the Federal Reserve System.

The authors examined a large
sample (about 327,000) of first-lien,
nonagency mortgages originated
between 2005 and 2006. This sample
is drawn from the LPS database,
which includes both loans held in the
originating bank’s portfolio and loans
that were securitized. The sample was
restricted to distressed mortgages, that
is, loans that were at least 60 days
delinquent. They also considered a
subsample of high-quality loans: loans
with full documentation (full doc) with
credit scores above 680.
First, the authors presented
descriptive statistics showing that distressed portfolio loans were foreclosed
less often than securitized loans. Then
they used a logit model to estimate the
probability of foreclosure, depending
on whether the loan was securitized or
held in portfolio, but also taking account of loan and borrower characteristics that might affect this probability.
These included the borrower’s credit
score, the loan-to-value (LTV) ratio,
the size of the loan at origination, the
loan maturity, and the original interest
rate, among other variables.
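For readers who want to see the mechanics, here is a minimal sketch of such a logit specification in Python; the data set, variable names, and coefficients are hypothetical stand-ins, not the authors' data or model.

# Illustrative sketch of a logit of foreclosure on securitization status and
# loan/borrower controls; the data and coefficients are synthetic.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 5000
loans = pd.DataFrame({
    "securitized": rng.integers(0, 2, n),        # 1 = sold into a trust
    "credit_score": rng.normal(650, 50, n),
    "ltv": rng.uniform(0.5, 1.1, n),
})
logit_index = (-1.0 + 0.3 * loans.securitized
               - 0.004 * (loans.credit_score - 650) + (loans.ltv - 0.8))
loans["foreclosed"] = rng.binomial(1, 1 / (1 + np.exp(-logit_index)))

result = smf.logit("foreclosed ~ securitized + credit_score + ltv",
                   data=loans).fit(disp=False)
# Marginal effect of securitization evaluated at the means of the other variables
print(result.get_margeff(at="mean").summary())
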
The logit results provided evidence that delinquent portfolio loans
were less likely to be foreclosed and
that the effect was both statistically
significant and economically large.
Evaluated at the mean values for all of
the other variables, the probability of
being foreclosed was between 3.8 and
7 percentage points lower for portfolio
loans than for securitized loans, depending on the precise model specification. These corresponded to relative declines of 18 percent and 32 percent in the mean foreclosure rate.
Piskorski and his coauthors sought to address concerns
that lenders may have learned
information about borrower quality
subsequent to origination but prior
to delinquency. If this were true, the
sample of securitized loans might
disproportionately include loans that
the originating bank had sold after
receiving information, suggesting a
higher probability of default. First,
they replicated their basic results using
a subsample of high-quality loans,
arguably a set of loans for which it is
less likely that underwriters might have
subsequently learned more information
about borrower quality. They also
addressed this concern more directly
by examining a subsample of loans
for which borrowers’ credit scores or
LTVs at the time of delinquency were
available. In these tests, the authors
continued to find that portfolio loans
were less likely to be foreclosed than
securitized loans.
The authors also addressed
concerns that their results were driven
by variation in state
laws that affect the ease of foreclosure.
Specifically, they divided states into
tough and weak states, depending
on whether average foreclosure times
were low or high. They found that
securitized loans were significantly
more likely to be foreclosed in both
tough and weak states, a finding
inconsistent with the view that
differences in state laws were the
source of their results.
The authors also considered
the possibility that the differences
in foreclosure rates were not driven
by a bias against renegotiation for
securitized loans but by banks postponing foreclosure on portfolio loans in order to delay recognizing losses.
Piskorski and coauthors found that
delinquent borrowers resumed making
payments on loans held in portfolio at
a higher rate than for securitized loans,
a finding inconsistent with this view.


LIAR’S LOANS
Ashlyn Aiko Nelson of Indiana
University discussed the results of a
study (with Wei Jiang and Edward
Vytlacil) that explored the effects
on loan performance of origination
channel and level of documentation
for mortgage loans. Nelson and her
coauthors found that loans originated
by brokers and loans that required
little or no documentation by the borrower (low doc loans) were particularly prone to agency problems. Nelson
argued that broker-originated loans
performed badly because brokers had
incentives to make loans to low-quality
borrowers, while low doc loans were
more likely to perform badly because
borrowers overstated income.
Nelson and her coauthors examined the mortgage loans made between
January 2004 and February 2009 by
a large national mortgage bank. The
bank’s files contained a wealth of
data about borrower characteristics,
permitting the authors to take account
of many borrower-specific factors that
might affect loan performance. The
authors divided their sample into four
subsamples: loans originated by brokers
requiring full documentation, loans
originated by brokers requiring low or
no documentation, loans originated by
banks requiring full documentation,

and loans originated by banks requiring low or no documentation.
Nelson explained that the loan
sample was not representative of the
mortgage loan market as a whole,
which raised some questions about
the extent to which the results could
be generalized. The bank made a
disproportionately large share of loans
originated by brokers and a disproportionately large share of low doc loans.
Nonetheless, the authors’ view was
that this bank represented an extreme
example of tendencies that were common to many banks.
In their main econometric treatment, the authors used two models of
loan delinquency: (1) a probit model,
which estimates the probability of
delinquency, taking into account loan
and borrower characteristics; (2) a
duration model, which estimates the
average time to delinquency, taking
into account the same characteristics.
They estimated separate regressions for
each of the four subsamples and found
that both the origination channel and
the level of documentation affected
delinquency. Specifically, they found
that bank-originated full doc loans had
a delinquency rate of 13 percent and
a five-year survival rate of 86 percent,
while the comparable numbers for
bank-originated low doc loans were 18
percent and 68 percent; broker-originated full doc loans were 24 percent
and 65 percent; and broker-originated
low doc loans were 32 percent and 36
percent.
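A generic duration model of this kind can be sketched as follows; this uses a standard proportional hazards specification with synthetic data, and the authors' actual model may differ.

# Illustrative proportional hazards (duration) model of months to delinquency;
# synthetic data, right-censored at five years. Not the authors' specification.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 4000
df = pd.DataFrame({
    "broker": rng.integers(0, 2, n),             # 1 = broker-originated
    "low_doc": rng.integers(0, 2, n),            # 1 = low or no documentation
})
hazard = 0.02 * np.exp(0.5 * df.broker + 0.4 * df.low_doc)
df["months"] = rng.exponential(1 / hazard)
df["delinquent"] = (df["months"] < 60).astype(int)
df.loc[df["delinquent"] == 0, "months"] = 60     # censor loans that survive five years

model = sm.PHReg.from_formula("months ~ broker + low_doc", data=df,
                              status=df["delinquent"])
print(model.fit().summary())                     # positive coefficients = faster delinquency
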
Nelson explained that particular
types of borrowers might have selected
into particular types of loans. To address this issue, the authors estimated
a model to explain which types of borrowers chose particular types of loans.
Broadly, they found that borrowers
using brokers had lower credit quality
and less experience with mortgages. In
contrast, in addition to self-employed
borrowers, low doc borrowers were typically more experienced and of
higher credit quality.
The authors then sought to
examine the separate effects of observable and unobservable differences in
borrower risk on delinquency. According to the authors, many of the low doc
loans looked good on paper but did
poorly, suggesting that the poor performance of low doc loans was due to
factors that did not appear on the loan
applications. In contrast, more than
half of the poor performance of broker
loans was due to observable borrower
characteristics.
To explain the poor performance
of low doc loans, Nelson and coauthors
explored the evidence for falsified loan
application information. In one of
their approaches to this question, they
hypothesized that falsified application
information would reduce the predictive power of the empirical model.
To create an out-of-sample test, they
estimated the model over six-month
periods to predict the probability of
delinquency for the subsequent six-month period. The model’s predictive
power was substantially lower for the
low doc subsamples, a result consistent
with falsified application information.
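The rolling out-of-sample procedure can be sketched in a few lines; the data frame, column names, and the AUC measure of predictive power below are assumptions of this illustration, not details from the paper.

# Illustrative rolling out-of-sample test: fit on one six-month window of
# originations, predict delinquency in the next, and average predictive power.
import statsmodels.formula.api as smf
from sklearn.metrics import roc_auc_score

def rolling_auc(df, formula="delinquent ~ fico + ltv + dti"):
    """Mean AUC of six-month-ahead predictions over rolling windows."""
    months = sorted(df["orig_month"].unique())
    scores = []
    for start in range(0, len(months) - 12, 6):
        train = df[df["orig_month"].isin(months[start:start + 6])]
        test = df[df["orig_month"].isin(months[start + 6:start + 12])]
        fit = smf.logit(formula, data=train).fit(disp=False)
        scores.append(roc_auc_score(test["delinquent"], fit.predict(test)))
    return sum(scores) / len(scores)

# Comparing rolling_auc(full_doc_loans) with rolling_auc(low_doc_loans):
# a lower score for the low doc subsample would be consistent with
# falsified application data degrading the model's predictions.
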
The authors also performed a
more direct test for falsified application information in low doc loans.
They identified the particular borrower
attributes most likely to be falsified:
whether the home was a primary residence, employment information, and
information about income, wealth, and
existing debt. The results for income
provided the most striking evidence of
overstatement.
In the full doc samples, income
was negatively related to delinquency.
However, in the low doc samples, stated income was positively related to delinquency, and the effect was strongest
when the loan had been originated by
brokers without an ongoing relationship with the bank. The authors also attempted to quantify the extent of the
overstatement, comparing stated income with alternative measures of the
borrower’s true income, for example,
average income in the borrower’s ZIP
code. They found that the ratio of
stated income to average income in the
ZIP code was significantly higher for
the low doc sample and estimated that
income was overstated by between 15
and 20 percent.
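The measure itself is simple to compute, as the following sketch with invented numbers shows; the apps table and its columns are hypothetical.

# Illustrative overstatement measure: stated income relative to average
# income in the borrower's ZIP code; the table and numbers are invented.
import pandas as pd

apps = pd.DataFrame({
    "doc_type": ["full", "full", "low", "low"],
    "stated_income": [62_000, 80_000, 95_000, 110_000],
    "zip_avg_income": [60_000, 78_000, 70_000, 82_000],
})
apps["ratio"] = apps["stated_income"] / apps["zip_avg_income"]
print(apps.groupby("doc_type")["ratio"].mean())  # a higher low doc ratio suggests overstatement
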

DISCLOSURE AND PAYDAY LENDING
Adair Morse of the University
of Chicago reported the results of a
carefully designed field study (with
Marianne Bertrand) that attempted to
determine whether payday borrowers’
decisions were affected by particular
types of cognitive bias and whether
their borrowing decisions might be affected by particular types of disclosures
at the point of the transaction.
In a payday loan, borrowers sign
over their next paycheck and pay a fee
($15-17 for each $100 borrowed) in
exchange for a loan. Payday loans are
quite expensive compared with other
types of loans, with annual percentage
rates (APRs) typically exceeding 400
percent, and customers typically borrowing repeatedly. According to the
authors, one explanation for why borrowers use such an expensive source of
borrowed funds is cognitive bias. For
example, borrowers may not realize
how high the APR on the payday loan
is, if they don’t compare it to relevant
alternatives. Alternatively, they may
focus on the cost of borrowing once,
even though they are likely to borrow
repeatedly.
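Simple arithmetic shows how such fees translate into the APRs cited; the two-week term assumed below is typical of payday loans but is an assumption of this sketch rather than a figure from the study.

# Back-of-the-envelope APR on a payday loan: a $15 fee per $100 borrowed
# over an assumed two-week term, annualized simply.
fee_per_100 = 15.0
term_days = 14                                   # assumption; a typical payday term
periods_per_year = 365 / term_days               # about 26 pay cycles
apr = (fee_per_100 / 100) * periods_per_year
print(f"APR = {apr:.0%}")                        # about 391%; a $17 fee gives about 443%
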

The authors considered three different disclosures. In one, the APR on a payday loan was compared with the APRs on other types of loans likely to be familiar to payday borrowers. The second disclosure displayed the dollar cost of repeated borrowings up to three months and compared this with the dollar cost of repeated borrowings on a credit card. The third disclosure illustrated the likelihood that a typical payday borrower will engage in
repeated borrowings, e.g., how many
borrow once, how many renew once or
twice, etc.
The goal of the experiment was
to determine whether a disclosure
reduced either the likelihood or the
amount of subsequent borrowings.
Furthermore, Morse and her coauthor
examined whether the disclosures
worked for particular types of borrowers, for example, whether a borrower’s
level of education or some measure of
the borrower’s degree of self-control
might affect the outcome of a particular disclosure.
The authors performed this experiment with the cooperation of a large
payday lender operating in 10 states.
They had access to all customers entering 77 stores over a two-week period.
Crucially, the experiment was designed
so that the disclosures (including no
disclosure at all) were randomly assigned over borrowers. The goal was
to minimize the possibility that factors
other than the actual disclosures might
affect borrower behavior; for example,
the disclosures were equally distributed
across different days of the week because a Monday borrower might differ
from a Thursday borrower. Morse reported that subsequent empirical tests
by her and her coauthor verified that
the experimental design had successfully randomized across customers.
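One simple way to implement this kind of balanced random assignment is sketched below; the treatment labels and data are hypothetical, not the authors' protocol.

# Illustrative randomization: assign disclosure treatments in shuffled,
# balanced blocks within each weekday. Labels and data are hypothetical.
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)
treatments = ["none", "apr_comparison", "dollar_cost", "renewal_odds"]

borrowers = pd.DataFrame({
    "borrower_id": range(1200),
    "weekday": rng.integers(0, 7, 1200),         # day the borrower walked in
})

def assign(group):
    reps = -(-len(group) // len(treatments))     # ceiling division
    block = np.tile(treatments, reps)[:len(group)]
    return pd.Series(rng.permutation(block), index=group.index)

borrowers["treatment"] = borrowers.groupby("weekday", group_keys=False).apply(assign)
print(borrowers.groupby(["weekday", "treatment"]).size().unstack())  # roughly balanced cells
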
The lender’s records also included
demographic data about each borrower, for example, level of education;
financial data; and information about
past and subsequent transactions with
the customer, provided by the lender.
In addition to these data, participating borrowers also answered survey
questions about the intended use of
the loan and the borrower’s own view
of his or her planning and spending
habits. Using the survey answers, the
authors designed an index of borrower
self-control and a gratification index
measuring whether the loan was for
discretionary expenditures.
The authors’ main result was
that there was a statistically significant and economically large effect on
subsequent borrowing behavior for the
disclosure that added up the costs of
subsequent borrowings in dollars. The
effect was to reduce both the likelihood of further borrowing and the subsequent amounts borrowed. In particular, borrowers receiving this disclosure
were 5.5 percentage points less likely
to borrow in subsequent pay cycles (10
percent less likely to borrow compared
with the control group) and they borrowed nearly $40 less (17 percent less
than the control group). Morse noted
that this effect was large for this type
of experimental study, especially since
the disclosure was made only once.
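These two effect sizes imply control-group baselines that are easy to back out; the arithmetic below is ours, not figures reported by the authors.

# Our arithmetic, not numbers from the paper: backing out the control-group
# baselines implied by the reported effect sizes.
drop_pp = 5.5                # percentage-point decline in reborrowing probability
drop_rel = 0.10              # the same decline, relative to the control group
print(drop_pp / drop_rel)    # implied baseline reborrowing rate: 55 percent

drop_dollars = 40            # "nearly $40 less" borrowed
drop_dollars_rel = 0.17      # 17 percent less than the control group
print(drop_dollars / drop_dollars_rel)  # implied baseline amount: about $235
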
The effects of the other disclosures on the likelihood of subsequent
borrowings were relatively weaker,
both statistically and economically.
In particular, disclosing relative APRs
seemed to have little effect. In both
cases, however, there was evidence of
some reduction in the amounts borrowed.
The authors also found that the
effects differed across different types of customers. The decline in the
probability of borrowing occurred
mainly among individuals without a
college degree. The decline was also
stronger for those borrowers who
reported higher self-control, those who
were not borrowing for discretionary
purposes, and those with lower debtto-income ratios. This last result is
broadly consistent with the authors’
other finding that reduced borrowing
occurred only with a lag. According
to Morse, borrowers may have needed
time to adjust their financial situation,
while others in more financially
strained circumstances may simply
have been unable to adjust, at least in
the time frame considered in the study.
In conclusion, Morse and
her coauthor suggested that the success
of the disclosure in the payday setting
justifies further explorations of policies
that might reduce consumer biases in
other contexts.
IDENTITY THEFT
William Roberds of the Federal
Reserve Bank of Atlanta presented
the results of a theoretical model (with
Stacey Schreft) that examined the
incentives for competing networks
— for example, credit card networks
— to adopt policies to reduce identity
theft. The underlying questions
were whether networks collect too
much information and whether they
adopt appropriate levels of security to
protect that data. In general, Roberds
argued that competing networks
have incentives to collect too much
personally identifiable information
(PII) — for example, name, address, Social Security number, and so forth —
while spending too few resources to
protect it from theft.
In their model, many individuals
are honest and join one of two
competing networks to facilitate
making transactions to purchase
goods. But some individuals are

fraudulent types; that is, they seek to
join a network and then default on
their payments.
The authors identified two types
of identity theft, both involving opening new accounts, rather than stealing an existing customer’s account,
to purchase goods at the customer’s
network. First, skilled identity thieves
can use sophisticated techniques — for
example, hacking the network’s database — to steal PII from one network
to join another network (high-tech
fraud). A second type of identity
theft (low-tech fraud) simply requires
someone to assemble enough information to create a viable identity to join
a network, for example, by stealing a
wallet and impersonating that person.
This type of theft requires no skill, but
it does require time and effort, and it is
more costly for an impersonator to join
a network if he or she must provide
more information.
Networks have two potential
security strategies. The first is to collect more PII about a customer. By
keeping this information on record,
the network can increase the likelihood that fraudulent customers will be
detected if they attempt to impersonate a new customer applying for credit.
The second security strategy is for the
network to spend resources to protect
its database. In particular, it can make
it more difficult for skilled frauds to
steal PII.
Roberds explained that data
security involves an externality: By demanding a lot of PII to join, a network
makes it more difficult for fraudulent
customers to join. But keeping very detailed information about the network’s
own customers in its database makes
it easier for skilled identity thieves
to use stolen data to join the other
network, because this tends to increase
the likely overlap in the types of PII
required to join each network. And
a network’s costly measures to secure its own database from skilled identity
thieves reduce fraud at the competing
network.
To provide a benchmark for
evaluating the choices of competing
networks, the authors performed a
thought experiment. They asked: How
much information and data security
would a benevolent social planner
instruct the networks to choose? This
planner would take into account all
of the costs and benefits to individuals, and these hypothetical choices are
termed the optimal outcome.2 After
calculating the optimal outcome, the
authors examined market outcomes
in successively more general examples and compared these with the optimal outcomes.

2 It is important to note that while there are real costs to identity theft, the optimal level of identity theft is not zero. This is because it is costly to deter theft, and these costs are ultimately borne by individuals.
In the first example, they assume
that firms do not secure their data at
all. In this case, firms collect too much
information. Collecting more information makes it harder for a thief to construct a viable identity to join but also
makes it easier for a thief to steal data
that can be used at the other network.
In this example, networks collect too
much information and data breaches
occur more often than in the optimum,
but interestingly, the prevalence of
identity theft is lower than the optimal
level. The more overlap between the
PII collected by the competing networks, the greater the inefficiency.
In their second example, they
assume that the proportions of skilled
and unskilled frauds are identical.
Again the basic externality arises: Networks collect too much information
and invest too little in data security.
The main insight from this example
is that although identity theft is lower
than in the optimum, it is unskilled theft that is mainly deterred. Networks
make it very difficult for unskilled
thieves to join, but their excessive data
collection and inadequate data security
make skilled identity theft relatively
attractive.
For the most part, these insights
carry over to the most general version
of the model, in which security levels
are freely chosen by the networks. In
this setting, the authors showed that
when networks require substantially
similar types of information, security
levels were too low and networks collected too much data. The authors
argued that, in effect, competing

networks substitute information collection for data security. As in the
simpler examples, there was too much
skilled identity theft and too little
unskilled identity theft compared with
optimal levels. Furthermore, there is
less identity theft in equilibrium than
in the optimum, even though networks
collect too much information.
Finally, the authors examined the
effects of public policies that might
improve market outcomes. Since the
model is quite complicated, they used
simulations to evaluate the effects of
these policies. One possibility is to
increase civil liability for data breaches.
They found that this improved incentives to increase security, but networks
still collected too much information. A
second approach is for some government agency to set minimum data security standards. This nearly attained
the optimal outcome in their simulations. A third approach is to limit data
collection. This policy did as well as imposing civil liability but significantly increased identity theft.

BANKRUPTCY REFORM AND MORTGAGE DEFAULT
Michelle White of the University of California-San Diego reported on the results of an empirical study (with Wenli Li and Ning Zhu) of the effects of the Bankruptcy Reform Act of 2005 on mortgage default. Broadly, they argued that the passage of the Bankruptcy Reform Act was associated with a statistically significant and economically large increase in mortgage defaults. According to the authors, the
act may have contributed to the severity of the subsequent crisis in housing
markets.
White argued that bankruptcy
law helped people save their homes,
at least temporarily, but also reduced
the costs of ultimately defaulting on
the mortgage. Under both Chapter 7
(liquidation) and Chapter 13 (restructuring) proceedings, homeowners can
protect exempted assets, in particular,
homes in which a household’s equity
does not exceed the state-mandated
homestead exemption. Both bankruptcy procedures also give a delinquent
homeowner some time to pay back
missed mortgage payments (arrears),
but Chapter 13 provides a substantially
longer period (three to five years) and
also gives delinquent homeowners who cannot pay back arrears a significant amount of time before foreclosure.
The Bankruptcy Act of 2005
had three main effects that might
affect delinquent homeowners: First, it raised filing costs, thus making it
less attractive for a household to use
bankruptcy to save a home either
permanently or temporarily. Second, the act placed a cap on the homestead exemption at $125,000, a provision
that affected 10 states with high
homestead exemptions. Third, the act
imposed a means test for homeowners
to use Chapter 7. Specifically, it
required consumers with incomes
above the state median to file using
Chapter 13.
The authors’ empirical strategy
was to use a difference-in-difference
approach. In particular, the authors
examined the differential effects of the
change in bankruptcy law on certain
households living in different states.
The empirical tests exploited variation
in consumers’ circumstances, state
median incomes, and state homestead
exemptions to determine whether the
act affected homeowner delinquency.
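Schematically, such a specification regresses default on a post-act indicator, an indicator for households the act binds, and their interaction; the sketch below uses synthetic data and hypothetical variable names, not the authors' model.

# Illustrative difference-in-difference logit: default on post-act, an
# indicator that the act's provisions bind, and their interaction.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n = 20_000
df = pd.DataFrame({
    "post_act": rng.integers(0, 2, n),           # observed after the act took effect
    "binding": rng.integers(0, 2, n),            # exemption cap or means test binds
    "credit_score": rng.normal(680, 60, n),
})
idx = (-3 + 0.2 * df.post_act + 0.1 * df.binding
       + 0.5 * df.post_act * df.binding - 0.002 * (df.credit_score - 680))
df["default"] = rng.binomial(1, 1 / (1 + np.exp(-idx)))

res = smf.logit("default ~ post_act * binding + credit_score", data=df).fit(disp=False)
print(res.params["post_act:binding"])            # the difference-in-difference term
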
The authors used data from
LPS on the performance of first-lien
30-year mortgages. These data also
included information about customer
credit quality, notably credit scores at
the time the loan was originated. By
merging the LPS data with data from
the Home Mortgage Disclosure Act,
the authors could also take account
of information on homeowner
demographic characteristics, most
notably household income. The final
sample included about 381,000 prime
mortgages and 268,000 subprime
mortgages.
Using these data, the authors
examined whether mortgage defaults
increased following passage of the act
for those homeowners for whom the
provisions of the act were actually
binding, in other words, in cases in
which consumers had equity in excess
of the now lower exemptions or where
the consumer was now required to
use his or her excess cash flow to pay
nonmortgage debts. Specifically, they examined a window of three months
before and after passage of the act.
Descriptive statistics showed that
following passage of the act, default
rates were 15 percent higher for prime
loans and 9 percent higher for subprime loans. In addition, default rates
increased even more for homeowners who were subject to the new cap

on the homestead exemption and for
prime homeowners who failed the
means test. However, defaults decreased for homeowners with subprime
loans who failed the means test. The
authors interpreted these findings as
largely consistent with the view that
the act increased mortgage defaults.
The authors then estimated a
logit hazard model, which estimates
the probability of defaulting with the
passage of time, taking into account
the changes in the bankruptcy law that
might affect a particular household,
and controlling for a large number of
demographic variables. They found
that homeowners subject to the cap on
the homestead exemption were more
likely to default following passage of
the act; specifically, White and her
coauthors found that prime mortgage
holders subject to the cap were 53
percent more likely to default and
subprime mortgage holders were 44
percent more likely to default. Prime
mortgage borrowers subject to the
means test were 14 percent more likely
to default, but there was no effect for
subprime mortgage borrowers. White
suggested that the result for subprime
mortgage borrowers may have been due to income having been overstated by these borrowers.
Results were largely the same when the authors reestimated their model using a six-month window before and after passage of the act. Thus, the authors concluded that the effects of the act were not temporary.

CONSUMER PROTECTION
LAWS
Simon Gervais of Duke University
presented the results of a theoretical
model (with Bruce Carlin) that examined the role of the legal system when
customers are poorly informed about
the appropriate type of financial product to buy. In their model, households
depend on brokers to match them to
financial products for which they are
best suited.
The main assumption of the
model is that particular products
are better suited for particular
types of consumers. For example,
a household with moderate savings
and a student in high school might
be better advised to invest its savings
for college in a fixed income product,
rather than a stock index fund, but
the household might not have the
sophistication to know which product
is most appropriate. In Gervais and
Carlin’s model, both brokers and the
producers of financial products must
exert costly effort: At some cost,
brokers can direct consumers to those
products that suit them best, although
there is an unavoidable probability
of a mistake. Similarly, at some cost, the producers of financial products
can develop higher quality products
that are suited to a wider range of
consumers. Crucially, no court can
observe their effort directly, nor can
the court disentangle the reason why
a particular product turned out poorly
for a particular customer: Was it a bad
match or a bad product?
In this setup, Gervais explained
that product quality and effort by
brokers are partial substitutes; that is,
more effort by a broker reduces losses
to consumers and this, in turn, reduces
the producer’s incentive to develop the
highest quality product. In a similar
fashion, when higher quality financial
products are developed, there is less
chance of losses to consumers, and this
reduces brokers’ incentives to identify
the most suitable products for their
customers.
The law’s design must take this
interaction into account. For example,
while legal penalties for a broker will
tend to increase broker effort, thereby
increasing the probability of a good
match for the customer, this will tend
to decrease the provision of quality
products because the firm realizes that
the broker’s effort will make up for the
lack of quality.
First, the authors demonstrated
that without legal penalties, the market leads to a serious underprovision
of effort both by brokers and firms.
Indeed, in their stylized setup, consumers are not willing to buy the product
at all and the market breaks down altogether. Intuitively, without penalties,
consumers who pay a price up-front
expecting an appropriate financial
product will always be disappointed
because producers and brokers always
have an incentive to chisel once they
have been paid.
Gervais then presented their
main results in a version of the model
in which a customer can seek redress
through the legal system only when he or she has followed a broker’s advice.
In this context, the authors showed
that to achieve an efficient outcome,
total legal penalties can’t merely be
compensatory; they must be punitive.
This conclusion is jointly the result of
the substitutability of effort by producers and brokers and of the court’s
inability to assign blame to one or the
other for a bad outcome. When the
law seeks to push, say, the broker to
increase effort by penalizing him or
her when a match turns out poorly,
the producer of the financial product
responds by reducing effort. Intuitively, this means that total penalties
must exceed the losses imposed on the
borrower for having been mismatched
to induce full effort by both brokers
and producers.
The authors then enriched the
basic model to include the realistic
possibility that the firm pays the broker
for each sale. In this setting, they show
that the optimal penalty structure
places higher penalties on the broker
than when brokers do not receive
direct payments from producers.
Gervais then explained how legal
penalties changed if customers were
permitted to seek legal redress even
when they have ignored a broker’s
advice. In this setting, Gervais and
Carlin demonstrated that the optimal
penalties were no longer punitive; customers only received compensation for
having made a poor decision. Intuitively, punitive penalties reward customers for ignoring their broker’s advice
and then seeking redress through the
courts whenever they make a bad
decision. Accordingly, the optimal
legal scheme can’t reward customers for
making bad decisions.
GLOBAL RETAIL LENDING
In the final paper, Jörg Rocholl of
the ESMT European School of Management and Technology presented
the results of an empirical study (with
Manju Puri and Sascha Steffen) of the
effects of the crisis in U.S. mortgage
markets on German banks. Rocholl
and his coauthors used the unique
structure of the German banking
system as a natural experiment for
distinguishing supply-side effects from
demand-side effects.
Rocholl explained that there are
11 Landsbanken in Germany, jointly
owned by state governments and the
savings banks in those states. The savings banks provide financial services
only for the customers in their municipality, primarily small and medium-sized firms, as well as retail customers.
A key feature of the system is that the
Landsbanken can rely on both formal
and informal support from the savings
banks with an ownership share. Thus,
losses at the Landsbanken will impose
losses on the savings banks, which are
significant owners.
In Rocholl’s account, Germany
experienced growth well into 2008 and
avoided the housing bubble occurring in the U.S. and other European
countries. Nor did it experience the
housing bust. But a number of Landsbanken were heavily exposed to risky
U.S. housing assets and had experienced large losses by the third quarter
of 2007.
A key feature of the German
banking market provided the setting
for a natural experiment. Only some
of the savings banks were owners of affected Landsbanken, while others
had no exposure to U.S. real estate
losses. And since the national housing
market was largely homogeneous,
Rocholl and his coauthors argued that
troubles at a savings bank’s Landsbank
may be viewed as a pure shock to the
supply of loans.
Using a difference-in-difference
analysis, the authors compared the
change in lending behavior at affected
and unaffected savings banks before
and after August 2007, when the U.S.
housing crisis began to affect assets
owned by certain Landsbanken. The
authors also had information about
loan applications at these banks,
which made possible a clear distinction
between changes in supply and
demand. For example, if the authors
observed that loan applications were
similar across savings banks, but fewer
loans were booked at affected banks
post-crisis, this is strong evidence that
the underlying source of the change in
lending was supply driven.
The authors had data on all
consumer and mortgage loans by
savings banks in Germany between
July 2006 and June 2008. They also
had data on loan applications and the
bank’s risk rating of the consumer,
as well as information about any
preexisting financial relationships with
the consumer, for example, credit lines
and assets held at the bank.


In the central findings of the
paper, Rocholl and his coauthors
estimated a linear probability model
of loan acceptance rates. They found
that loan acceptance rates at affected
banks declined significantly after
August 2007, while acceptance rates
increased insignificantly at unaffected
banks. The decline at affected banks
was economically large; across all types
of consumer lending, acceptance rates
declined 8.2 percent. The decline
was strongest for customers that were
assigned low credit ratings by the
banks, suggesting a flight-to-quality
effect. The results were consistent
across loan categories, although
the effects were larger for mortgage
loans. The authors argued that this
is because mortgage loans represent a
larger commitment by the bank than
other types of consumer loans.
Rocholl and his coauthors also
estimated a cross-sectional regression
to examine how bank characteristics
affected lending behavior. They found
that the declines were most dramatic
for smaller banks and that for such
banks the declines were particularly
large for mortgage loans. They also
found that the effects were greatest
for banks that were relatively illiquid
entering the crisis.
The authors then examined the
demand for loans. In a regression
framework, the authors found that loan

applications declined at both affected
and unaffected banks. There was
no statistically significant difference
in the trend for these two groups
of banks. The authors suggested that
the decline in applications reflected
a decline in demand, as consumers
became less certain about future
economic conditions, and the decline
was not bank-specific. Nor did the
authors find any significant difference
in the amount of loan requested. This
reinforced the authors’ view that the
declines in lending by affected banks
were driven by the supply shock rather
than effects on demand.
The authors then examined the
effects of relationships in a linear
probability model using a triple
difference approach, in which loan
applicants were further differentiated
according to whether they had an
existing relationship with the bank.
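In schematic form, the triple difference is a three-way interaction in a linear probability model, as sketched below with synthetic data; it is an illustration, not the authors' specification or estimates.

# Illustrative triple-difference linear probability model of loan acceptance;
# synthetic data and made-up coefficients.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 30_000
df = pd.DataFrame({
    "affected": rng.integers(0, 2, n),           # savings bank owns an affected Landsbank
    "post": rng.integers(0, 2, n),               # application after August 2007
    "relationship": rng.integers(0, 2, n),       # existing customer of the bank
})
p_accept = (0.70 - 0.08 * df.affected * df.post
            + 0.05 * df.relationship
            + 0.04 * df.affected * df.post * df.relationship)
df["accepted"] = rng.binomial(1, p_accept)

res = smf.ols("accepted ~ affected * post * relationship", data=df).fit()
print(res.params["affected:post:relationship"])  # positive = relationships cushion the shock
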
Customer relationships with a bank
increased acceptance rates, and the
effect was strongest at affected banks
after August 2007. Thus, pre-existing
customer relationships mitigated the
negative supply shocks at affected
banks, perhaps because lenders
have more information about such
borrowers. BR


RESEARCH RAP

Abstracts of research papers produced by the economists at the Philadelphia Fed

You can find more Research Rap abstracts on our website at: www.philadelphiafed.org/research-and-data/publications/research-rap/. Or view our working papers at: www.philadelphiafed.org/research-and-data/publications/.

IMPOSING EXCESS CASH FLOW
SWEEP COVENANTS IN LOAN
CONTRACTS
With free cash flows, borrowers can
accumulate cash or voluntarily pay down
debts. However, sometimes creditors impose
a mandatory repayment covenant called
“excess cash flow sweep” in loan contracts
to force borrowers to repay debts ahead of
schedule. About 17 percent of borrowers in
the author’s sample (1995-2006) have this
covenant attached to at least one of their
loans. The author finds that the sweep
covenant is more likely to be imposed on
borrowers with higher leverage (i.e., where
risk shifting by equity holders is more
likely). The results are robust to including
borrower fixed effects or using industry
median leverage as a proxy. The covenant
is also more common in borrowers where
equity holders appear to have firmer control, e.g., when more shares are controlled
by institutional block holders, when firms
are incorporated in states with laws more favorable to hostile takeovers, or when equity
holders place higher valuation on excess
cash holdings. These determinants suggest
that the sweep covenant may be motivated
by creditor-shareholder conflicts. Finally,
the author shows that the covenant has real
effects: borrowers affected by the sweep covenant indeed repay more debts using excess
cash flows, and they spend less in capital
investment and pay out fewer dividends to
shareholders.
Working Paper 09-30, “Creditor Control
of Free Cash Flow,” Rocco Huang, Federal
Reserve Bank of Philadelphia

CORPORATE POLITICS: EFFECTS ON
INTERNAL CAPITAL ALLOCATIONS
AND LENDING BEHAVIOR
This study looks inside a large retail-banking group to understand how influence
within the group affects internal capital allocations and lending behavior at the member
bank level. The group consists of 181 member
banks that jointly own a headquarters. Influence is measured by the divergence from one-share-one-vote. The authors find that more
influential member banks are allocated more
capital from headquarters. They are less likely
to decrease lending after negative deposit
growth or to increase lending following positive deposit growth. These effects are stronger
in situations in which information asymmetry
between banks and the headquarters seems
greater. The evidence suggests that influence
can be useful in overcoming information
asymmetry.
Working Paper 09-31, “Internal Capital
Markets and Corporate Politics in a Banking
Group,” by Martijn Cremers, Yale School of
Management; Rocco Huang, Federal Reserve
Bank of Philadelphia; and Zacharias Sautner,
University of Amsterdam
CURRENCY DENOMINATIONS AND
THE PRICES OF EXPORT GOODS:
HOW IMPORTANT ARE THEY?
The authors show that standard alternative assumptions about the currency in which
firms price export goods are virtually inconsequential for the properties of aggregate
variables, other than the terms of trade, in a
quantitative open-economy model. This result
is in contrast to a large literature that emphasizes the importance of the currency denomination of
exports for the properties of open-economy models.
Working Paper 09-32, “How Important Is the Currency Denomination of Exports in Open-Economy Models?”
Michael Dotsey, Federal Reserve Bank of Philadelphia, and
Margarida Duarte, University of Toronto
WORKER FLOWS AND JOB FLOWS:
SOURCES OF DIFFERENCES OVER THE
BUSINESS CYCLE
Worker flows and job flows behave differently over
the business cycle. The authors investigate the sources
of the differences by studying quantitative properties of
a multiple-worker version of the search/matching model
that features endogenous job separation and intra-firm
wage bargaining. Their calibration incorporates micro- and macro-level evidence on worker and job flows. The
authors show that the dynamic stochastic equilibrium
of the model replicates important cyclical features of
worker flows and job flows simultaneously. In particular,
the model correctly predicts that hires from unemployment move countercyclically while the job creation rate
moves procyclically. The key to this result is to allow
for a large hiring flow that does not go through unemployment but is part of job creation, for which procyclicality of the job finding rate dominates its cyclicality.
The authors also show that the model generates large
volatilities of unemployment and vacancies when a
worker’s outside option is at 83 percent of aggregate
labor productivity.
Working Paper 09-33, “Worker Flows and Job Flows: A
Quantitative Investigation,” Shigeru Fujita, Federal Reserve
Bank of Philadelphia, and Makoto Nakajima, Federal
Reserve Bank of Philadelphia
TOO-BIG-TO-FAIL: HOW MUCH WERE
BANKS WILLING TO PAY?
This paper estimates the value of the too-big-to-fail
(TBTF) subsidy. Using data from the merger boom of
1991-2004, the authors find that banking organizations
were willing to pay an added premium for mergers that
would put them over the asset sizes that are commonly
viewed as the thresholds for being TBTF. They estimate
at least $14 billion in added premiums for the eight
merger deals that brought the organizations to over
$100 billion in assets. In addition, the authors find that
both the stock and bond markets reacted positively
to these deals. Their estimated TBTF subsidy is large enough to create serious concern, since recent assisted
mergers have allowed TBTF organizations to become
even bigger and for nonbanks to become part of TBTF
banking organizations, thus extending the TBTF subsidy beyond banking.
Working Paper 09-34, “How Much Did Banks Pay to
Become Too-Big-To-Fail and to Become Systemically Important?” Elijah Brewer III, DePaul University, and Julapa
Jagtiani, Federal Reserve Bank of Philadelphia
CAN WE INSURE AGAINST COLLEGE-FAILURE RISK?
Participants in student loan programs must repay
loans in full regardless of whether they complete college. But many students who take out a loan do not
earn a degree (the dropout rate among college students
is between 33 and 50 percent). The authors examine
whether insurance against college-failure risk can be
offered, taking into account moral hazard and adverse
selection. To do so, they develop a model that accounts
for college enrollment, dropout, and completion rates
among new high school graduates in the U.S. and use
that model to study the feasibility and optimality of
offering insurance against college failure risk. They find
that optimal insurance raises the enrollment rate by 3.5
percent, the fraction acquiring a degree by 3.8 percent,
and welfare by 2.7 percent. These effects are more
pronounced for students with low scholastic ability (the
ones with a high probability of failure).
Working Paper 10-1, “Insuring College Failure Risk,”
Satyajit Chatterjee, Federal Reserve Bank of Philadelphia,
and Felicia Ionescu, Colgate University
WELFARE COSTS OF INFLATION
This paper studies the steady-state and dynamic
consequences of inflation in an estimated dynamic stochastic general equilibrium model of the U.S. economy.
The author finds that 10 percentage points of inflation
entail a steady-state welfare cost as high as 13 percent
of annual consumption. This large cost is mainly driven
by staggered price contracts and price indexation. The
transition from high to low inflation inflicts a welfare
loss equivalent to 0.53 percent. The role of nominal/real
frictions as well as that of parameter uncertainty is also
addressed.
Working Paper 10-2, “The Implications of Inflation
in an Estimated New-Keynesian Model,” Pablo Guerron-Quintana, Federal Reserve Bank of Philadelphia
