

For release on delivery
1:15 p.m. CST (2:15 p.m. EST)
January 3, 1998

"Problems of Price Measurement"
Remarks by
Alan Greenspan
Chairman
Board of Governors of the Federal Reserve System
at the
Annual Meeting
of the
American Economic Association
and the
American Finance Association
Chicago, Illinois
January 3, 1998

For most of the past twenty years, the challenges confronting monetary policymakers centered on addressing the question of how inflation could be brought down with as little economic disruption as possible. Given the progress that has been made in reducing inflation, and the very solid economic performance that this low-inflation environment has helped to promote, a new set of issues is now emerging on the policy agenda. Of mounting importance is a deeper understanding of the economic characteristics of sustained price stability. We central bankers also need to better judge how to assess our performance in achieving and maintaining that objective in light of the uncertainties surrounding the accuracy of our measured price indexes.
In today's advanced economies, allocative decisions are primarily made by markets. Prices of goods and services set in those markets are central guides to the efficient allocation of resources in a market economy, along with interest rates and equity values. Prices are the signals through which tastes and technology affect the decisions of consumers and producers, directing resources toward their highest valued use. Of course, this signaling process, which involves individual prices, would work with or without government statistical agencies that measure aggregate price levels, and in this sense, price measurement probably is not fundamental for the overall efficiency of the market economy. Indeed, vibrant market economies existed long before government agencies were established to measure prices.

Nonetheless, in a modern monetary economy, accurate measurement of aggregate price levels is of considerable importance, increasingly so for central banks whose mandate is to maintain financial stability. Accurate price measures are necessary for understanding economic developments, not only involving inflation, but also involving real output and productivity.

If the general price level is estimated to be rising more rapidly than is in fact the case, then we are simultaneously understating growth in real GDP and productivity, and real incomes and living standards are rising faster than our published data suggest. Under these circumstances, policymakers must be cognizant of the shortcomings of our published price indexes to avoid actions based on inaccurate premises that will provoke undesired consequences.

Clearly, central bankers need to be conscious of the problems of price measurement as we gauge policies designed to promote price stability and maximum sustainable economic growth. Moreover, many economic transactions, both private and public, are explicitly tied to movements in some published price index, most commonly a consumer price index, and some transactions that are not explicitly tied to a published price index may, nevertheless, take such an index into account less formally. If the price index is not accurately measuring what the participants in such transactions believe it is measuring, then economic transactions will lead to suboptimal outcomes.
The remarkable progress that has been made by virtually all of the major industrial countries in achieving low rates of inflation in recent years has brought the issue of price measurement into especially sharp focus. For most purposes, biases of a few tenths in annual inflation rates do not matter when inflation is high. They do matter when, as now, inflation has become so low that policymakers need to consider at what point effective price stability has been reached. Indeed, some observers have begun to question whether deflation is now a possibility, and to assess the potential difficulties such a development might pose for the economy.
Even if deflation is not considered a significant near-term risk for the economy, the increasing discussion of it could be clearer in defining the circumstance. Regrettably, the term deflation is being used to describe several different states that are not necessarily depicting similar economic conditions. One use of the term refers to an ongoing fall in the prices of existing assets. Asset prices are inherently volatile, in part because expected returns from real assets can vary for a wide variety of reasons, some of which may be only tangentially related to the state of the economy and monetary policy. Nonetheless, a drop in the prices of existing assets can feed back onto real economic activity, not only by changing incentives to consume and invest, but also by impairing the health of financial intermediaries, as we experienced in the early 1990s and many Asian countries are learning now. But historically, it has been very rapid asset price declines, in equity and real estate especially, that have held the potential to be a virulently negative force in the economy. I emphasize rapid declines because, in most circumstances, slowly deflating asset prices probably can be absorbed without the marked economic disruptions that frequently accompany sharp corrections. The severe economic contraction of the early 1930s, and the associated persistent declines in product prices, could probably not have occurred apart from the steep asset price deflation that started in 1929.

While asset price deflation can occur for a number of reasons, a persistent deflation in the prices of currently produced goods and services, just like a persistent increase in these prices, necessarily is, at its root, a monetary phenomenon. Just as changes in monetary conditions that involve a flight from money to goods cause inflation, the onset of deflation involves a flight from goods to money. Both rapid or variable inflation and deflation can lead to a state of fear and uncertainty that is associated with significant increases in risk premiums and corresponding shortfalls in economic activity.
Even a moderate rate of inflation can hamper economic performance, as I have emphasized many times before, and although we do not have any recent experience, moderate rates of deflation would most probably lead to similar problems. Deflation, like inflation, would distort resource allocation and interfere with the economy's ability to reach its full potential. It would have these effects by making long-term planning difficult, obscuring the true movements of relative prices, and interacting adversely with institutions like the tax system that function on the basis of nominal values.

But deflation can be detrimental for reasons that go beyond those that are also associated with inflation.

Nominal interest rates are bounded at zero, hence deflation raises the possibility of potentially significant increases in real interest rates. Some also argue that resistance to nominal wage cuts will impart an upward bias to real wages as price stability approaches or outright deflation occurs, leaving the economy with a potentially higher level of unemployment in equilibrium.
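The force of the zero-bound argument can be seen in the standard Fisher decomposition; the notation below is a schematic gloss chosen for illustration, not part of the original remarks:

    r = i - \pi^e, \qquad i \ge 0 \;\Longrightarrow\; r \ge -\pi^e

where i is the nominal interest rate, \pi^e is expected inflation, and r is the implied real rate. If prices are expected to fall 2 percent a year (\pi^e = -0.02), no feasible setting of the nominal rate can push the real rate below +2 percent, however much stimulus the economy might need.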
A deflation that took place in an environment of rapid productivity growth, however, might be largely immune from some of these special problems. For example, in the high-tech sector of our economy today, we observe falling prices together with rapid investment and high profitability. Although real interest rates may be quite high in terms of this sector's declining product prices, rapid productivity growth has ensured that real rates of return are higher still, and investment in this sector has been robust. In practice, firms' decisions depend on an evaluation of their nominal return on investment relative to their nominal cost of capital. In this sense, the choice of a specific, sometimes arbitrary, definition of real output and hence of price by government statisticians is essentially a descriptive issue, and not one that directly affects firms' investment decisions. This is an illustration of where even individual price measurement probably is not always of direct and fundamental importance for private sector behavior.
If such high-tech, high-productivity-growth firms produce an increasing share of output in the decades ahead, then one could readily imagine the economy experiencing an overall product price deflation in which the problems associated with a zero constraint on nominal interest rates or nominal wage changes would seldom be binding. Nevertheless, even if we could ensure significantly more rapid productivity growth than we have seen recently, there are valid reasons for wishing to avoid ongoing declines in the general price level. If increases in both inflation and deflation raise risk premiums and retard growth, it follows that risk premiums are lowest at price stability. Furthermore, price stability, by reducing uncertainty about the future, should also reduce variations in asset values.

But how are we to know when our objective of price stability has been achieved? In price measurement, a distinction must be made between the measurement of individual prices, on the one hand, and the aggregation of those prices into indexes of the overall price level, on the other. The notion of what we mean by a general price level, or more relevantly, its change, is never unambiguously defined.
Issues of appropriate weighting in the aggregation process will presumably always bedevil us. But it is the measurement of individual prices, not their aggregation, that poses the most difficult conceptual issues. At first glance, observing and measuring prices might not appear especially daunting. But, in fact, the problem is deceptively complex. To be sure, the dollar value of most transactions is unambiguously exact, and, at least in principle, is amenable to highly accurate estimation by our statistical agencies. But dividing that nominal value change into components representing changes in real quantity versus price requires that one define a unit of output that is to remain constant in all transactions over time. Defining such a constant-quality unit of output, of course, is the central conceptual difficulty in price measurement.
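The decomposition at issue can be stated compactly; this formulation is a gloss of my own, not language from the remarks:

    V = P \times Q, \qquad \frac{\Delta V}{V} \approx \frac{\Delta P}{P} + \frac{\Delta Q}{Q}

where V is the observed nominal value of transactions, P the price of a constant-quality unit, and Q the number of such units. The left-hand side is measured directly; the split on the right exists only once the constant-quality unit has been defined.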
Such a definition may be clear for unalloyed aluminum ingot of 99.7 percent or greater purity in wide use. Consequently, its price can be compared over time with a degree of precision adequate for virtually all producers and consumers of aluminum ingot. Similarly, the prices of a ton of cold rolled steel sheet, or of a linear yard of cotton broad woven fabric, can be reasonably compared over a period of years.
But when the characteristics of products and services are changing rapidly, defining the unit of output, and thereby adjusting an item's price for improvements in quality, can be exceptionally difficult. These problems are becoming pervasive in modern economies as high-tech and service prices, which are generally more difficult to measure, become ever more prominent in aggregate price measures. One does not have to look only to the most advanced technology to recognize the difficulties that are faced. To take just a few examples, automobile tires, refrigerators, winter jackets, and tennis rackets have all changed in ways that make them surprisingly hard to compare to their counterparts of twenty or thirty years ago.
The continual introduction of new goods and services onto the markets creates special challenges for price measurement. In some cases, a new good may best be viewed as an improved version of an old good. But, in many cases, new products may deliver services that simply were not available before. When personal computers were first introduced, the benefits they brought households in terms of word processing services, financial calculations, organizational assistance, and the like, were truly unique. And, further in the past, think of the revolutionary changes that automobile ownership, or jet travel, brought to people's lives. In theory, economists understand how to value such innovations; in practice, it is an enormous challenge to construct such an estimate with any precision.
The area of medical care, where technology is changing in ways that make techniques of only a decade ago seem archaic, provides some particularly striking illustrations of the difficulties involved in measuring quality-adjusted prices. Cures and preventive treatments have become available for previously untreatable diseases. Medical advances have led to new treatments that are more effective and that have increased the speed and comfort of recovery. In an area with such rapid technological change, what is the appropriate unit of output? Is it a procedure, a treatment, or a cure? How does one value the benefit to the patient when a condition that once required a complicated operation and a lengthy stay in the hospital now can be easily treated on an outpatient basis?
Although we may not be able to discern its details, the pace of change and the shift toward output that is difficult to measure are more likely to quicken than to slow down. How, then, will we measure inflation in the future if our measurement techniques become increasingly obsolete? We must keep in mind that, difficult as the problem seems, consistently measured prices do exist in principle. Embodied in all products is some unit of output, and hence of price, that is recognizable to those who buy and sell the product, if not to the outside observer. A company that pays a sum of money for computer software knows what it is buying, and at least has an idea about its value relative to software it has purchased in the past, and relative to other possible uses for that sum of money in the present.
Furthermore, so long as people continue to exchange nominal interest rate debt instruments and contract for future payments in terms of dollars or other currencies, there must be a presumption about the future purchasing power of money, no matter how complex individual products become. Market participants do have a sense of the aggregate price level and how they expect it to change over time, and these views must be embedded in the value of financial assets.
The emergence of inflation-indexed bonds, while providing us with useful information, does not solve the problem of ascertaining an economically meaningful measure of the general price level. By necessity, the total return on indexed bonds must be tied to forecasts of specific published price indexes, which may or may not reflect the market's judgment of the future purchasing power of money. To the extent they do not, of course, the implicit real interest rate is biased in the opposite direction. Moreover, we are, as yet, unable to separate compensation for inflation risk from compensation for expected inflation.
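The inference being described is the familiar breakeven calculation. Schematically, and in notation chosen here for illustration:

    i \approx r + \pi^e + \rho

where i is the nominal yield, r the indexed (real) yield, \pi^e expected inflation as measured by the reference index, and \rho the inflation risk premium. The spread between nominal and indexed yields identifies only the sum \pi^e + \rho, which is precisely the inseparability just noted; and because \pi^e refers to the published index, any expected mismeasurement in that index carries over into the implied real rate with the opposite sign.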
Eventually, financial markets may develop the instruments and associated analytical techniques for unearthing these implicit changes in the general price level with some precision. In those circumstances, at least for purposes of monetary policy, these measures could obviate the more traditional approaches to aggregate price measurement now employed. They may help us understand, for example, whether markets perceive the true change in aggregate prices to reflect fixed or variable weight indexes of the components, or whether arithmetic or logarithmic weighting of the components is more appropriate.
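For the reader who wants the distinction spelled out, the two polar weighting schemes mentioned here can be written schematically as

    I_t^{A} = \sum_j w_j \, \frac{p_{j,t}}{p_{j,0}}, \qquad I_t^{G} = \prod_j \left( \frac{p_{j,t}}{p_{j,0}} \right)^{w_j}

where the w_j are expenditure weights. The arithmetic (fixed-weight, Laspeyres-type) form I^A holds the basket constant and therefore ignores substitution away from items whose relative prices rise; the logarithmic (geometric) form I^G builds in substitution and generally rises more slowly. Variable-weight indexes update the w_j over time rather than fixing them in a base period.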

But, for the foreseeable future, we shall have to rely on our statistical agencies to produce the price data necessary to assess economic performance and to make economic policy. In that regard, assuming further advances in economic science and provided that our statistical agencies receive adequate resources, procedures should continue to improve. To be sure, progress will not be easy, for estimating the value of quality improvements is a painstaking process. It must be done methodically, item by item. But progress can be made.
In recent years, we have developed an improved ability to capture quality differences by pricing the underlying characteristics of complex products. With an increasingly wide range of product variants available to the public, product characteristics are now bundled together in an enormous variety of combinations. A "personal computer" is, in actuality, an amalgamation of computing speed, memory, networking capability, graphics capability, and so on. Computer manufacturers are moving toward build-to-order systems, in which any combination of these specifications and peripheral equipment is available to each individual buyer.

Other examples abound Advancements in computer-assisted design have reduced the

costs of producing multiple varieties of small machine tools And in services, witness the
plethora of products now available from financial institutions, which have allowed a more
complete disentangling and exchange of economic risks across participants around the world
Although hard data are scarce, there can be little doubt that products are tailor-made for the
buyer to a larger extent than ever Gone are the days when Henry Ford could say he would
sell a car of any color "so long as it's black "
In such an environment, when product characteristics are bundled together in so many different combinations, defining the unit of output means unbundling these characteristics and pricing each of them separately. The so-called hedonic technique is designed to do precisely that. This technique associates changes in a product's price with changes in product characteristics. It therefore allows a quality comparison when new products with improved characteristics are introduced. This approach has been especially useful in the pricing of computers. But hedonics are by no means a panacea. First of all, this technique obviously will be of no use in valuing the quality of an entirely new product that has fundamentally different characteristics from its predecessors. The benefits of cellular telephones, and the value they provide in terms of making calls from any location, cannot be measured from an examination of the attributes of standard telephones.
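To illustrate the mechanics, a hedonic estimate is typically obtained by regressing log price on measured characteristics plus time dummies, with the time coefficients giving the quality-adjusted price change. The sketch below uses synthetic data and variable names of my own choosing; it is an illustration of the technique, not an official statistical-agency procedure:

# Minimal hedonic-regression sketch on synthetic "computer" data.
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Two periods (t = 0, 1); characteristics are speed (MHz) and memory (MB).
t = rng.integers(0, 2, n)                          # period dummy
speed = rng.uniform(100, 300, n) * (1 + 0.5 * t)   # later models are faster
memory = rng.uniform(16, 64, n) * (1 + 0.5 * t)    # ...and better equipped
log_price = (3.0 + 0.004 * speed + 0.01 * memory
             - 0.10 * t                            # true quality-adjusted price fell ~10%
             + rng.normal(0, 0.05, n))

# Regress log price on a constant, the characteristics, and the period dummy.
X = np.column_stack([np.ones(n), speed, memory, t])
beta, *_ = np.linalg.lstsq(X, log_price, rcond=None)

# The period-dummy coefficient is the log price change holding
# characteristics constant, i.e. the quality-adjusted inflation rate.
print(f"quality-adjusted log price change: {beta[3]:+.3f}")

Note that average observed prices in such data can be rising even as the quality-adjusted price falls, because later models carry more of every characteristic; the time-dummy coefficient is what separates the two.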
In addition, the measured characteristics may only be proxies for the overall performance that consumers ultimately value. In the case of computers, the buyer ultimately cares about the quality of services that computer will provide: word processing capabilities, database services, high-speed calculations, and so on. But, in many cases, the number of instructions executed per second and the other easily measured characteristics may not be a wholly adequate proxy for the computer services that the individual buyer values. In these circumstances, the right approach, ultimately, may be to move toward directly pricing the services we obtain from our computers, that is, word processing services, database management services, and so on, rather than pricing separately the hardware and software.
The issues surrounding the appropriate measurement of computer prices also illustrate some of the difficulties of valuing goods and services when there are significant interactions among users of the products. New generations of computers sometimes require software that is incompatible with previous generations, and some users who have no need for the improved computing power nevertheless may feel compelled to purchase the new technology because they need to remain compatible with the bulk of users who are at the frontier. Even if our techniques allow us to accurately measure consumers' valuation of the increased speed and power of the new generation of computer, we may miss the negative influence on some consumers of this incompatibility. Therefore, even in the case of personal computers, where we have made such great strides in measuring quality changes, I suspect that important phenomena still may not be adequately captured by our published price indexes.
Despite the advances in price measurement that have been made over the years, there remains considerable room for improvement. As you know, a group of experts empaneled by the Senate Finance Committee, the Boskin commission, concluded that the consumer price index has overstated changes in the cost of living by roughly one percentage point per annum in recent years. About half of this bias owed to inadequate adjustment for quality improvement and the introduction of new goods, and about half reflected the manner in which the individual prices were aggregated. Researchers at the Federal Reserve and elsewhere have come up with similar figures. Although the estimates of bias owing to inadequate adjustment for quality improvements surely are the most uncertain aspect of this calculation, the preponderance of evidence is that, on average, such a bias in quality adjustment does exist.
The Boskin commission and most others estimating bias in the CPI have taken a microstatistical approach, estimating separately the magnitude of each category of potential bias. Recent work by staff economists at the Federal Reserve Board has added corroborating evidence of price mismeasurement, using a macroeconomic approach that is essentially independent of the microstatistical exercises. Specifically, employing disaggregated data from the national income and product accounts, this research finds that the measured growth of real output and productivity in the service sector is implausibly weak, given that the return to owners of businesses in that sector apparently has been well maintained. Indeed, the published data indicate that the level of output per hour in a number of service-producing industries has been falling for more than two decades. It is simply not credible that firms in these industries have been becoming less and less efficient for more than twenty years. Much more reasonable is the view that prices have been mismeasured and that the true quality-adjusted prices have been rising more slowly than the published price indexes. Properly measured, output and productivity trends in these service industries are doubtless considerably stronger than suggested by the published data. Assuming, for example, no change in the productivity levels for these industries in recent years would imply a price bias consistent with the Boskin commission findings.
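The arithmetic behind that last inference is an accounting identity; in growth rates, with notation of my own:

    g_V = g_P^{true} + g_Q^{true} = g_P^{meas} + g_Q^{meas} \;\Longrightarrow\; g_P^{meas} - g_P^{true} = g_Q^{true} - g_Q^{meas}

Nominal output growth g_V is observed directly, so every point by which inflation is overstated is a point by which real output, and for given hours worked productivity, is understated. If measured productivity in an industry falls about 1 percent a year while true productivity is assumed flat, the implied price bias is about 1 percentage point a year.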
A Commerce Department official once compared a nation's statistical system to a tailor, measuring the economy much as a tailor measures a person for a suit of clothes, with the difference that, unlike the tailor's customer, the person we are measuring is running while we try to measure him. The only way the system can succeed, he said, is to be just as fast and twice as agile. That is the challenge that lies ahead, and it is, indeed, a large one.
There are, however, reasons for optimism. The information revolution, which lies behind so much of the rapid technological change that makes prices difficult to measure, will surely play an important role in helping our statistical agencies acquire the necessary speed and agility to better capture the changes taking place in our economies. Computers, for example, might some day allow our statistical agencies to tap into a great many economic transactions on a nearly real-time basis. Utilizing data from store checkout scanners, which the BLS is now investigating, may be an important first step in that direction. But the possibilities offered by information technology for the improvement of price measurement may turn out to be much broader in scope. Just as it is difficult to predict the ways in which technology will change our consumption over time, so is it difficult to predict how economic and statistical science will make creative use of the improved technology.
Such advances must be taken to ensure that our economic statistics remain adequate to support the public policy decisions that must be made. If the challenge for our statistical agencies is not to lose in their race against technology, the challenge for policymakers is to make our best judgments about the limitations of the existing statistics as we design policies to promote the economic well-being of our nations. In confronting those challenges, both government statisticians and policymakers would benefit from additional research by you, the economics profession, into the increasingly complex conceptual and empirical issues involved with accurately measuring price and quantity.