BANK OF JAPAN CONFERENCE ON MONETARY POLICY
IN A WORLD OF KNOWLEDGE-BASED GROWTH, QUALITY CHANGE,
AND UNCERTAIN MEASUREMENT
Tokyo, Japan
June 19, 1998


1. Introduction
The theme of this conference is a very important one for monetary policy in the United States and globally.
How should monetary policy be conducted in a world of knowledge-based growth, quality change and uncertain measurement? In the United States, the Federal Reserve’s monetary policy objectives are to foster maximum sustainable economic growth and low inflation. To achieve these goals, at a minimum, policy makers need useful estimates of the economy’s growth potential, an assessment of the factors that may be changing this potential growth rate, and an understanding of the relationship between inflation and these possibly changing growth factors.
I found the papers in this session by Bob Gordon and Jim Stock to be very interesting and helpful, but I must
say they are useful reminders of how difficult a central banker’s job is. My comments are organized in the following way. First, I’ll discuss Stock’s analysis of the NAIRU and how those estimates compare with research at
the Federal Reserve Bank of Chicago. Second, I’ll comment on Gordon’s paper and some alternative interpretations of the information technology revolution. And third, I’d like to return to Stock’s discussion of robust
monetary policy in the face of parameter uncertainty, which was also discussed by John Taylor yesterday.

2. Stock paper
In the first part of Stock’s paper, he discusses a method for estimating the nonaccelerating inflation rate of
unemployment, the NAIRU. The NAIRU is defined as the rate of unemployment that leads to a constant rate
of price inflation. If this theory is accurate and monetary authorities know the value of the NAIRU relative to today’s unemployment rate, that knowledge would be useful for hitting an inflation objective.
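To make the concept concrete, here is a minimal sketch in Python of how a constant NAIRU could be backed out of a simple Phillips-curve regression. The data are simulated and the specification is deliberately much simpler than the time-varying model in Stock’s paper, so it should be read only as an illustration of the definition.

    import numpy as np

    # Illustrative only: simulate a toy Phillips curve in which the change in
    # inflation responds to the gap between unemployment and a constant NAIRU.
    rng = np.random.default_rng(1)
    nairu_true, slope_true = 6.0, 0.4                 # invented parameters
    u = nairu_true + rng.normal(0.0, 1.0, 200)        # unemployment rate
    dpi = -slope_true * (u - nairu_true) + rng.normal(0.0, 0.5, 200)  # change in inflation

    # Regress the change in inflation on unemployment, dpi = a - b*u + e, and
    # recover the implied NAIRU as the unemployment rate at which dpi is zero.
    b, a = np.polyfit(u, dpi, 1)
    print(f"estimated NAIRU: {-a / b:.2f} percent")

With real data the Phillips-curve slope is estimated imprecisely, and that imprecision is one source of the wide confidence bands around the NAIRU discussed below.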

The estimation methods here are more advanced than in two earlier papers by Stock and his co-authors that I am aware of (a 1996 NBER conference I attended where Staiger, Stock and Watson presented related results on the NAIRU; and a 1995 Federal Reserve Bank of Chicago Economic Perspectives article by Jim, Mark Watson, and Robert King). But as Jim said, a basic result from this research is that a large degree of uncertainty surrounds our knowledge of the NAIRU. For example, glancing at the estimated NAIRU in Figure 1 for total unemployment from 1961-97, I’m left with the impression that the NAIRU varies somewhat over time. The standard error bands and statistical tests, however, cannot reject the hypothesis that the NAIRU is constant over this period. Nevertheless, I agree with Jim’s decision to consider the time-varying estimates. If this theory is to be useful for policy making, I will take seriously the possibility that the NAIRU moves around somewhat over time. But what are the factors that account for this variation?
Stock’s paper investigates the correlations between the NAIRU estimates and several proxies for the shift toward knowledge-based production. These proxies are the share of income in services and the shares of employment in services, business services, and finance, insurance and real estate. At best, the evidence seems mixed that time-variation in the NAIRU is due to trend movements toward knowledge-based production. (This is Stock’s assessment at the bottom of page 13.)
I want to pursue this issue further in a slightly different context. In his 1968 Presidential address to the
American Economic Association, Milton Friedman defined the natural rate of unemployment in two complementary ways. One way explicitly defines the NAIRU in terms of inflation. This approach is used by
Stock and Gordon. The other definition is that the natural rate of unemployment is that rate of unemployment that the economy would grind out on its own over time. This latter definition doesn’t make any explicit references to the rate of inflation. I want to briefly describe our efforts at the Federal Reserve Bank of
Chicago to estimate a natural rate of unemployment that is consistent with this alternative definition. And
then I’ll relate this alternative measure back to the theme of the conference. This analysis is based upon
research conducted by Ellen Rissman.
The natural rate of unemployment reflects factors that contribute to frictional and structural unemployment, but not cyclical unemployment. Rissman’s analysis starts by dividing industry employment movements into two parts: a cyclical component and an industry component. The cycle variable is unobserved but is estimated from the common influence it exerts on each industry over time. The industry component is the idiosyncratic part of the employment movement that is unrelated to the business cycle component.
Rissman assumes that large amounts of dispersion in these idiosyncratic industry components for any given
time period are associated with a high degree of labor reallocation across sectors and industries. In this scenario, the natural rate rises when this reallocation activity is high (and the activity is not due to the common business cycle component). This analysis is a refinement of David Lilien’s 1982 research on sectoral
dispersion and structural unemployment.
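To fix ideas, the following rough sketch mimics the spirit of this decomposition in Python with simulated data. Rissman’s actual estimates come from an unobserved-components model; here a principal component simply stands in for the common cyclical factor, and all of the series, loadings, and shares are invented.

    import numpy as np

    # Rough stand-in for Rissman's decomposition; all series are simulated.
    rng = np.random.default_rng(0)
    T, N = 160, 8                                   # quarters, industries (hypothetical)
    cycle = np.cumsum(rng.normal(0.0, 0.3, T))      # unobserved common cycle
    loadings = rng.uniform(0.5, 1.5, N)             # each industry's exposure to the cycle
    idio = rng.normal(0.0, 0.5, (T, N))             # idiosyncratic industry shocks
    growth = np.outer(cycle, loadings) + idio       # industry employment growth rates
    shares = np.full(N, 1.0 / N)                    # employment shares (held fixed here)

    # Step 1: extract a common factor from the cross-section of industry growth.
    x = growth - growth.mean(axis=0)
    _, _, vt = np.linalg.svd(x, full_matrices=False)
    factor = x @ vt[0]

    # Step 2: project out the common factor; the residuals are the idiosyncratic parts.
    beta = x.T @ factor / (factor @ factor)
    resid = x - np.outer(factor, beta)

    # Step 3: share-weighted cross-industry dispersion of the idiosyncratic parts.
    # More dispersion signals more reallocation and, in Rissman's framework,
    # a higher natural rate of unemployment.
    dispersion = np.sqrt((shares * resid ** 2).sum(axis=1))
    print(dispersion.round(2)[:8])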
In many ways, the qualitative movements in Rissman’s natural rate of unemployment are similar to Stock’s
estimates. Like Stock’s estimates in figure 1 for total unemployment, Rissman’s natural rate generally rises
through the 1960s until the mid-1970s. Both measures display a double-hump in the mid-1970s and early
1980s (although the timing is not identical). Both measures have generally fallen since the early 1980s to
the present, although Rissman’s measure increased briefly in the early 1990s, due primarily to idiosyncratic government employment shocks that seem related to reduced military spending. On the other
hand, Rissman’s natural rate is often one to one and a half percentage points higher than Stock’s estimates.
And Stock’s measure is smoother than Rissman’s. In any event, it appears that these two measures are capturing similar economic phenomena, but with sufficiently interesting differences to warrant considering
both measures.
With Rissman’s concept of the natural rate, there are two ways the natural rate can change over time that relate to the theme of this conference. First, if the idiosyncratic industry shocks become less volatile over time, then Rissman’s dispersion measure will fall; hence, structural reallocation and the natural rate will fall. Now why might this be the case? I’m going to be speculative here. Perhaps information
technology is making inventory controls more efficient. Fewer stock-outs and less excess inventory building
could mean more stable employment within a sector. Similarly, production methods have undoubtedly
improved somewhat due to IT. In any event, Rissman’s estimated idiosyncratic manufacturing shocks are
substantially smaller since the mid-1980s compared with the 1970s and early 1980s. This accounts for a
substantial fall in the natural rate over this period.
Second, if the employment shares are increasing over time in the industries with relatively lower idiosyncratic shock variances, then the weighted average dispersion measure will fall. In fact, the services and FIRE
sectors have relatively low volatility, and Stock’s figures show a rising share of employment in these sectors, which would imply a lower natural rate in Rissman’s model.
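A back-of-the-envelope calculation, using purely hypothetical variances and shares, shows how this second channel works: shifting employment toward a low-volatility sector mechanically lowers the weighted dispersion measure, and with it the natural rate in Rissman’s framework.

    # Hypothetical numbers, for illustration only.
    sigma2 = {"manufacturing": 4.0, "services_fire": 1.0}       # idiosyncratic shock variances
    shares_then = {"manufacturing": 0.4, "services_fire": 0.6}
    shares_now = {"manufacturing": 0.2, "services_fire": 0.8}

    for label, shares in [("earlier shares", shares_then), ("later shares", shares_now)]:
        dispersion = sum(shares[k] * sigma2[k] for k in sigma2) ** 0.5
        print(label, round(dispersion, 2))    # 1.48 versus 1.26: dispersion falls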
So, in this model of a time-varying natural rate, growth in information technology and knowledge-based
production could account for some reductions in the natural rate over time. The very rough evidence that
I’ve sketched suggests that smaller idiosyncratic shocks have played a larger role than slowly shifting
employment shares. In this sense, Stock’s weak evidence is compatible with Rissman’s analysis. Although
the results are not definitive, both Stock’s and Rissman’s models give weak support to the hypothesis that
the natural rate has declined due to fundamental changes in the labor market.

3. Gordon paper
Now let me turn to Bob Gordon’s paper. The focus of his paper is the relationship between IT and productivity growth. To the extent that higher productivity can support a lower NAIRU, this is related to Stock’s
analysis. The impression we get from Gordon’s interesting and provocative paper is that investment in IT is
not leading to higher productivity growth.
The first step in Gordon’s analysis is an examination of the evidence on productivity growth. From the perspective of proponents of the “new economy,” the persistent slowdown in measured productivity growth since
the early 1970s appears to present a problem. Gordon’s analysis of productivity is aimed at deflating the case
that the official government data systematically understate the contribution of IT to productivity growth.
Here I think Gordon is quite convincing.
Let us assume that Gordon is correct and that indeed the productivity growth slowdown is genuine and not
a figment of measurement error. What does this say about the prospects for a “new economy”? Gordon’s view
is that the slowdown is in fact a reflection of a fundamental problem with computers, specifically, and IT
more broadly. His thesis is argued in four parts.
First, compared to other eras of great technological change — such as electricity, electric motors, the internal combustion engine, petrochemicals, and plastics — the development of computers and related information technology is not very significant. Second, Gordon argues that there are fundamental limits on what computers can do, so that large sectors of the economy, such as services, cannot take advantage of them. Third, in Gordon’s assessment, the best uses of IT have already been implemented and future uses will have much lower payoffs than previous uses. Finally, Gordon views a large percentage of current uses of IT as wasteful, arguing that, far from contributing to productivity growth, IT in many instances subtracts from it.
Taken together, these points make the case that the observed productivity slowdown is in large part due to overinvestment in IT and that we should not expect any substantial productivity gains from IT, ever. Gordon’s discussion of these issues represents a healthy challenge to conventional wisdom and is therefore welcome. However, since his views are largely based on anecdote and opinion, it is difficult to judge their
validity. If his thesis were the only reasonable interpretation of the productivity slowdown in the face of
seemingly rapid advancements in IT and their widespread implementation, it would be cause for great concern. Fortunately, it is not.
What I have in mind here is the so-called delayed implementation hypothesis advanced by Paul David yesterday and a related set of arguments developed by Jeremy Greenwood of the University of Rochester and
his co-authors that I will discuss.
David and Greenwood look back in history to other episodes in which great technological advances
occurred. Interestingly, these episodes have many things in common with recent experience in the US. Consider Greenwood’s analysis of the Industrial Revolutions in Britain and the US. These episodes were marked
by (i) an initial slowdown in productivity growth, (ii) an initial rise in income inequality and (iii) dramatic drops in the prices of new capital equipment associated with the underlying technological innovations.
These facts are consistent with what we have seen in the US over the last 25 years.
What connects these observations? Greenwood and his co-authors develop a formal economic model with
several key ingredients. First, technological progress is embodied in new equipment whose cost of production falls continuously through time. Second, firms face a learning curve when they adopt a new technology. Third, firms can travel down their learning curves faster only if they hire skilled labor. The basic idea is
that it takes time for firms to work out the best uses of new technology and that many mistakes will be made
in the process. However, eventually the best uses will be found and productivity will rise. Moreover, when
the best uses of the new technology have been found, skilled labor will become less necessary, so we should
expect wage inequality to decline.
One of the interesting results to come out of the work by Greenwood and his co-authors is that there are
very long lags between the initial introduction of the new technology and subsequent productivity gains. In
their model it takes about 20 years before productivity growth surpasses its old level and 40 years for the
level of productivity to cross its old trend line, the path that productivity would have traveled along if it had
continued at its old growth rate. Another interesting result from their analysis is that in the initial stages of
the introduction of the new technology, the stock market is predicted to boom.
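To see how the ingredients above can generate both an initial slowdown and a long delay before productivity recovers, here is a stylized simulation of the mechanism in Python. Every parameter is invented for illustration; this is a sketch of the story, not Greenwood and Yorukoglu’s calibration.

    import numpy as np

    # Stylized sketch: a new technology with higher potential diffuses gradually
    # and is used inefficiently at first, so measured productivity dips below the
    # old trend before eventually overtaking it.  All numbers are invented.
    years = np.arange(0, 61)
    trend = 1.01 ** years                                 # counterfactual old-technology trend
    share_new = 1 - np.exp(-years / 20.0)                 # gradual diffusion of the new technology
    learning = 0.4 + 0.6 * (1 - np.exp(-years / 40.0))    # know-how accumulates slowly
    new_tech = 1.5 * trend * learning                     # high potential, poorly exploited early on
    aggregate = (1 - share_new) * trend + share_new * new_tech

    # With these made-up numbers the aggregate path regains the old trend line after
    # roughly two decades; Greenwood's calibrated model puts the figure nearer 40 years.
    ratio = aggregate / trend
    recover = years[1:][np.argmax(ratio[1:] >= 1.0)]
    print(f"aggregate productivity regains the old trend after about {recover} years")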
I don’t want this discussion to suggest that we should ignore Gordon’s points and adopt the David-Greenwood view. But there are other reasonable explanations for why measured productivity has slowed in the face of tremendous investments in new technology. In the end we will have to wait for many years before one view or another is proved correct (or maybe some other, entirely different explanation will emerge). The David-Greenwood view and the Gordon view are virtually irrefutable without many more years of data.

Ultimately, I am left with the impression that we know very little about what is really going on. We are
uncertain about which economy we are in.

4. Monetary policy under uncertainty
How should monetary policy be conducted when policy makers are uncertain about their model of the economy? This leads me to the final section of Jim Stock’s paper and John Taylor’s paper. The intuition from
Brainard’s analysis of this question is that policy makers should respond cautiously. When Stock and others
have considered this question in a different context, they find that policy makers should respond aggressively in many uncertain situations. (This is a fast-growing field of analysis called robust control theory.) I think the intuition goes something like this. Suppose that inflation is expected to increase over the next year, but the policy maker is uncertain whether a 100 basis point tightening will have a negligible effect or a super contractionary effect. Brainard-style caution would probably suggest tightening by 100 basis points and then waiting to see whether the policy move had a negligible effect or a super contractionary one. The robust control intuition is that policy makers should move more aggressively, say by 300 basis points. If the negligible effect is relevant, this action will move inflation closer to the desired level. If the super contractionary effect is active, that’s okay because the monetary authority now knows it can quickly shift gears back to an easier stance. And because policy is very effective, this will end up being a reasonably controlled situation. There is a sense in which this is really a cautious policy prescription, even though it would not be recognized as such from its effects on the policy instrument.
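A toy numerical example may help separate the two intuitions. It is not the exercise in Stock’s paper, just a one-period illustration with invented numbers: the Brainard policy minimizes the expected squared inflation gap when the policy multiplier is uncertain, while a worst-case (minimax) policy minimizes the largest gap that could occur over the assumed range of multipliers.

    import numpy as np

    # Toy example with invented numbers; next year's inflation gap is pi0 - b * dr,
    # where dr is the size of the tightening and the multiplier b is uncertain.
    pi0 = 1.0                        # inflation expected to run 1 point above target
    b_lo, b_hi = 0.5, 2.0            # assumed range for the uncertain multiplier
    mu = (b_lo + b_hi) / 2.0
    var = (b_hi - b_lo) ** 2 / 12.0  # variance if b is uniform on that range

    # Brainard (1967): minimize the expected squared gap -> an attenuated response.
    dr_brainard = mu * pi0 / (mu ** 2 + var)

    # Worst-case (minimax) policy: minimize the largest squared gap over the range of b.
    grid = np.linspace(0.0, 3.0, 3001)
    worst = np.maximum((pi0 - b_lo * grid) ** 2, (pi0 - b_hi * grid) ** 2)
    dr_minimax = grid[np.argmin(worst)]

    print(f"Brainard tightening: {dr_brainard:.2f} points")   # about 0.71
    print(f"Minimax tightening:  {dr_minimax:.2f} points")    # about 0.80

Even this crude example shows that guarding against the worst case undoes the Brainard attenuation; the stronger results on aggressive responses in the robust control literature come from richer dynamic models.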
I think a potential shortcoming of the robust control analysis here is the simplicity of the economic example. Volatile monetary policy actions can whipsaw financial markets and individual planners. These actions
have potential costs that are not fully accounted for in this analysis. Reversing the policy course as rapidly
as the robust control policy dictates may have additional negative effects on the economy that are not
accounted for in these experiments. I’m not sure this is the case, but I suspect that many policy makers place some credence in this possibility. Still, having read this paper and recognizing my own uncertainty about how the economy changes over time, I’ve learned something: I’m not quite sure what the cautious course of action really is. There is a temptation to equate caution with small movements in the monetary policy instrument. But in fact, equating caution with small movements in the policy goal variables may require accepting larger movements in the policy instrument. This is an interesting idea, and I’m glad more research is being directed toward it.

5. Conclusion
In conclusion, the Stock and Gordon papers are interesting and thought-provoking. They both suggest a
high degree of uncertainty about the true nature of the economy. This uncertainty is unlikely to be resolved
within a time frame that allows monetary policy to ignore its many potential consequences. We are thus left
with the task of formulating monetary policy in an uncertain world — something that we have done in the past and are certain to do in the future.

6. References
Brainard, W. (1967), ‘Uncertainty and the Effectiveness of Policy,’ American Economic Review 57: 411-425.
Greenwood, J. (1996), ‘The Third Industrial Revolution,’ working paper, Rochester Center for Economic
Research, #435 (prepared for the American Enterprise Institute).
Greenwood, J. and Yorukoglu, M. (1996), ‘1974,’ working paper, Rochester Center for Economic Research,
#429.
King, R., Stock, J., and Watson, M. (1995), ‘Temporal Instability of the Unemployment-Inflation
Relationship,’ Economic Perspectives, Federal Reserve Bank of Chicago, May/June, 2-12.
Lilien, D. (1982), ‘Sectoral Shifts and Cyclical Unemployment,’ Journal of Political Economy 90: 777-793.
Rissman, E. (1997), ‘Measuring Labor Market Turbulence,’ Economic Perspectives, Federal Reserve Bank of
Chicago, May/June, 2-14.
Rissman, E. (1998), ‘Estimates of the Natural Rate of Unemployment,’ unpublished manuscript, Federal
Reserve Bank of Chicago.
Staiger, D., Stock, J. and Watson, M. (1996), ‘How Precise are Estimates of the Natural Rate of
Unemployment?’ in C. Romer and D. Romer (eds.), Reducing Inflation: Motivation and Strategy (Chicago: University of Chicago Press for the NBER): 195-242.
