
Preliminary

Econometric Models and the Monetary Policy Process

David L. Reifschneider, David J. Stockton, and David W. Wilcox
Board of Governors of the Federal Reserve System
Washington, D.C. 20551
October 1996

The analysis and conclusions set forth are those of the authors and do not indicate
concurrence by other members of the staff, by the Board of Governors, or by the Federal
Reserve Banks.




1. Introduction
Economic analysis is both a science and an art. Science enters through the use of sophisticated econometrics, rigorous theory, and high-speed computers. Art enters in order to
deal with all the shortcomings of science. This paper is about the ongoing effort of the staff
at the Federal Reserve Board to mix science and art in a sensible fashion in support of the
monetary policymaking process at our institution.
More specifically, the paper addresses the use of econometric models in monetary
policymaking. As we describe in the second section of this paper, there are many
econometric models in use at the Federal Reserve Board. These models run the gamut from
single-equation models describing individual markets to large-scale systems describing the
U.S. economy as a whole. Even within the latter group of models, there are important
distinctions among models: Some models take a structural approach while others are
completely non-structural; some models focus on U.S. markets exclusively while others
embed our economy in the world context.
As we describe in the third section of the paper, these models are used in a wide
variety of activities at the Federal Reserve Board. One such activity is economic forecasting.
A key objective of this section of the paper is to convey a realistic understanding of how the
staff forecast is assembled and the role of models in that process. A central theme that
emerges from this section of the paper is that models are rarely, if ever, used at the Federal
Reserve without at least the potential for intervention based on judgment. Instead, our approach involves a mix of science and art, where “science” is meant to denote a rigid and
unyielding adherence to model-based methods and “art” is meant to denote the selective application of judgment guided by information not available to the model. Models are also
used for other important activities at the Board aside from forecasting, including general
policy analysis and the evaluation of monetary policy rules, and we describe in section 3 the
role of models in these areas as well.
In section 4, we evaluate the strengths and weaknesses of the mixed approach to
forecasting and policy analysis typically used at the Federal Reserve. We discuss the types of
problems that motivate the reliance on the judgmental approach. However, we recognize that
there are hazards inherent in the judgmental approach, and we review these risks and the
manner in which they are addressed.
In section 5, we use the experience of the 1990-91 recession to illustrate some of the
practical difficulties involved in economic forecasting and policy analysis. During that
episode, an overwhelming volume of anecdotal evidence suggested that the economy was
being held back by financial “headwinds,” including an atypical reluctance of banks to
lend—possibly associated with the implementation of new capital standards—and an unusual
reluctance of households and businesses to spend, as they labored to improve the state of their
balance sheets. Moreover, a nascent research literature (contributed to in part by Board authors) suggested that such forces could be understood in the context of rigorous models, and that they might play an empirically important role in shaping the business cycle.
However, standard macroeconometric models of the day—including the Board’s—gave no
role to such forces in the determination of real activity. In response to the anecdotal evidence
and the research, the staff modified its analysis in a way that it never could have done had it




-3-

been locked into a doctrinaire view of the forecasting process Finally, we present some brief
conclusions in section 6.
2. Macroeconomic models at the Federal Reserve Board
There is no one model that represents “the” official staff model of the Federal Reserve Board. Rather, many models are in use, each designed with a different purpose in mind.
These models vary in complexity and scope, in their theoretical and empirical foundations,
even in their formality. In this section, we present a brief overview of the major
macroeconomic models in use at the Board. Our focus is on their general design; discussion
of their roles in providing support to monetary policy is deferred until section 3.1
2.1 The MPS model
From the late 1960s until the beginning of this year, the MPS model was the Board's primary formal model of the U.S. economy.2 It was a large-scale quarterly model, with about
125 stochastic behavioral equations and more than 200 identities. This model had a
neoclassical steady-state: In the long run, all markets cleared and the marginal product of
each factor of production (labor, capital and energy) was equilibrated to its relative price.
The aggregate production technology was characterized as Cobb-Douglas, so the growth of
output in the long run was a function of population growth and the growth of multi-factor
productivity, both of which were assumed to be exogenous. However, the steady-state level

1In this review, we make no claim to provide a comprehensive treatment of empirical
macroeconomic work carried out at the Federal Reserve Board, let alone at the Federal
Reserve Banks.
2See Brayton and Mauskopf (1985) for an overview of the structure and properties of
the MPS model.




of production was influenced by a variety of additional factors, including marginal tax rates,
transfer payments and other government policies, that altered both the after-tax rate of return
on investment and the propensity to save. The latter role for fiscal policy arose because
consumers were assumed to be non-Ricardian, and because the private saving rate depended
on the composition of household income and the ratio of property wealth to income.
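In stylized notation (illustrative only, not the actual MPS parameterization), such a supply side can be summarized as

\[ Y_t = A_t\, K_t^{\alpha}\, E_t^{\beta}\, L_t^{1-\alpha-\beta}, \qquad \frac{\partial Y_t}{\partial K_t} = \alpha \frac{Y_t}{K_t} = \rho_K, \qquad \frac{\partial Y_t}{\partial L_t} = (1-\alpha-\beta)\frac{Y_t}{L_t} = \frac{w_t}{p_t}, \]

where A is multi-factor productivity, ρ_K the real rental price of capital, and w/p the real wage. In the long run, the growth of output is then pinned down by the exogenous growth of L and A.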
By contrast, the short-run structure of the MPS model was Keynesian in spirit: Output
and employment were largely determined by the level of aggregate demand because wages
and prices were assumed to adjust only in a sluggish manner. As a result, both monetary and
fiscal policy had significant effects on real activity in the short run.3 The short-run dynamics
of the model were quite complex, owing not only to the wage-price sector but also to the
equations for household and business spending. These dynamics arose not from tightly
specified theoretical models, but from estimated equations in which the level of an (often
nonstationary) variable was regressed on a set of explanatory variables, selected on the basis
of theory. Coefficients on explanatory variables typically were constrained to lie on loworder polynomial distributed lags; such restrictions were justified as a parsimonious way to
capture a combination of adjustment costs and expectations formed from an autoregressive
forecasting process. However, no attempt was made to identify expectations separately from intrinsic sources of sluggish adjustment.
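To make the polynomial-distributed-lag device concrete, the sketch below constrains thirteen lag coefficients to a quadratic in the lag index, so that only three free parameters are estimated. The data, lag length, and polynomial order are all invented for illustration and are not the MPS specification.

```python
import numpy as np

# Polynomial distributed lag (Almon) illustration: thirteen lag coefficients
# on x are constrained to lie on a quadratic in the lag index,
# b_i = c0 + c1*i + c2*i**2, so only three parameters are estimated.
rng = np.random.default_rng(0)
T, n_lags = 200, 12
x = rng.standard_normal(T)
true_b = 0.5 - 0.003 * np.arange(n_lags + 1) ** 2      # smooth lag profile
y = np.convolve(x, true_b)[:T] + 0.1 * rng.standard_normal(T)

t_idx = np.arange(n_lags, T)
X = np.column_stack([x[t_idx - i] for i in range(n_lags + 1)])  # x_{t-i}
P = np.vander(np.arange(n_lags + 1), 3, increasing=True)        # [1, i, i^2]
c, *_ = np.linalg.lstsq(X @ P, y[n_lags:], rcond=None)          # estimate c
b_hat = P @ c                                                   # implied lag weights
print(np.round(b_hat, 3))
```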

3Strictly speaking, the model was not superneutral owing to the various nonneutralities
in the tax code. As a result, monetary policy had modest long-run effects on the capital-output ratio, and thus on the level of potential output. Otherwise, the MPS aggregate supply
curve was vertical in steady-state.




2.2 The FRB/US model
Owing partly to dissatisfaction with the treatment of expectations in the MPS model, as well as a desire to update the theoretical basis of the model and to improve upon the
econometric techniques used in estimation, staff began to work several years ago on
developing a replacement. The result is the FRB/US model, which became fully operational
earlier this year.4 Like the MPS model, FRB/US is a relatively large quarterly model of the
U.S. economy; it has roughly 30 stochastic behavioral equations and about 300 accounting
and expectational identities.5 In addition, FRB/US has a similar neoclassical steady-state
structure. However, its short-run dynamic structure is considerably different from that of the
older model. In particular, a great deal of effort has been directed toward disentangling
expectations from other sources of dynamic behavior, in the context of an explicit theoretical
structure. As a result, model users are able to examine the sensitivity of forecasts and policy
simulations to different assumptions about the manner in which expectations are formed.
Most of the major behavioral equations in FRB/US are derived from formal
specifications of optimizing behavior of forward-looking households, firms, and investors. In
the case of asset prices, this approach gives rise to conventional arbitrage equations in which,
for example, the yield on a bond of a given maturity equals a weighted average of expected

4For an overview of the FRB/US model, see Brayton and Tinsley (1996).
5However, the effective size of the model is considerably smaller because many
identities could be easily dropped or substituted out: A linearized version of the model that
does just this—in order to economize on space for stochastic simulations under model-consistent expectations—has only 95 identities, of which 35 determine expectational variables.




future short-term interest rates, plus a term premium. Similarly, the value of corporate
equities equals the present value of expected dividends.
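In stylized form (equal weights on expected short rates and an infinite dividend horizon are simplifications), these arbitrage conditions are

\[ R_t^{(n)} = \frac{1}{n} \sum_{i=0}^{n-1} E_t\, r_{t+i} + \phi_t^{(n)}, \qquad V_t = \sum_{s=1}^{\infty} \frac{E_t\, D_{t+s}}{(1+\rho)^{s}}, \]

where R^(n) is the n-period bond yield, r the short rate, φ^(n) a term premium, V the value of equities, D dividends, and ρ a required return.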
In the other sectors of the model, where frictions such as staggered contracts, habit
persistence, labor training, and investment planning and installation costs are significant, the
derivation of behavioral equations is more complicated. Agents are assumed to solve an
explicit cost minimization problem, in which the cost of deviating from a desired trajectory is
balanced against a generalized specification of adjustment costs.6 The resulting decision rules
take the form of tightly parameterized error-correction equations. In these equations, the
change in, say, consumption is a function of three factors: (1) the lagged deviation of actual
spending from its desired, or target, level (as defined by the present value of expected future
income); (2) lagged changes in actual consumption; and (3) a weighted sum of expected
future changes in the path of target consumption, where the weights are determined by the
estimated coefficients on the first two factors.7
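Schematically (with notation and lag orders chosen for illustration), such a decision rule has the error-correction form

\[ \Delta c_t = a\,\bigl(c^{*}_{t-1} - c_{t-1}\bigr) + \sum_{i=1}^{p} b_i\, \Delta c_{t-i} + \sum_{j=0}^{\infty} w_j\, E_t\, \Delta c^{*}_{t+j} + \varepsilon_t, \]

where c is consumption, c* its target level, and the weights w_j are functions of the estimated a and b_i.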
Among the explanatory variables in these dynamic equations are expectations of
future variables. In order to estimate these equations, the staff must develop proxies for these
expectations. In FRB/US, these proxies are computed using small-scale VAR models. The

6Adjustment costs are approximated by a higher-order polynomial, rather than the
conventional quadratic specification. This flexible functional form implies that it is costly to
adjust not only the level of a decision variable, but also its growth rate, its acceleration, and
so forth. By writing the cost minimization problem in this manner, one obtains decision rules
in an error-correction format, where significant coefficients on lagged changes in the decision
variable imply adjustment costs that are of an order higher than quadratic (Tinsley, 1993).
7The dynamic consumption equation is actually somewhat more complicated than
suggested here, in that target spending depends on the composition as well as the level of
permanent income (owing to aggregational and distributional effects), and the dynamic
equation is modified to account for liquidity-constrained households.




result is a theory-based model that fits the historical data reasonably well. The model also provides considerable flexibility in the treatment of expectations. In addition to using the
historical VAR system to generate expectations for policy simulations, expectations can also
be generated in alternative ways, including: (1) VAR systems designed to be fully consistent
with a particular characterization of policy; (2) full model-consistent expectations; and (3)
variants of (1) and (2) in which agents must learn about key features of the model, such as
the inflation goal of the monetary authorities. In the case of forecasting, the practice to date
has been to project expectations using the historical VAR system.
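As a rough illustration of how such VAR-based expectation proxies can be generated (the variable list, lag order, and data below are placeholders, not the FRB/US auxiliary system):

```python
import numpy as np

# Hypothetical three-variable system standing in for the small VAR used to
# proxy expectations: output gap, inflation, federal funds rate.
rng = np.random.default_rng(1)
data = rng.standard_normal((120, 3)).cumsum(axis=0) * 0.1  # placeholder history

def fit_var1(y):
    """OLS estimate of y_t = c + A y_{t-1} + e_t."""
    Y, X = y[1:], np.hstack([np.ones((len(y) - 1, 1)), y[:-1]])
    B, *_ = np.linalg.lstsq(X, Y, rcond=None)
    return B[0], B[1:].T        # intercept c, coefficient matrix A

def expected_path(c, A, y_last, horizon):
    """Iterate the VAR forward to generate expectation proxies E_t y_{t+h}."""
    path, y = [], y_last
    for _ in range(horizon):
        y = c + A @ y
        path.append(y)
    return np.array(path)

c, A = fit_var1(data)
exp_path = expected_path(c, A, data[-1], horizon=40)  # ten years, quarterly
print(exp_path[:4].round(3))
```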
Given the explicit treatment of expectations in the model, stability in simulations is
guaranteed only if agents are assumed to expect both monetary and fiscal policy to be
governed by explicit rules. The authorities can depart from these rules in the short run but
not in the long run. Thus FRB/US incorporates explicit monetary policy reaction functions (typically variants of Taylor's rule), and fiscal rules that target an exogenous debt-to-GDP
ratio by endogenously adjusting tax rates. In the case of the MPS model, such explicit
characterizations of policy were not necessary (except in long-run simulations), owing to the
adaptive treatment of expectations.
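In schematic terms, such rules might look as follows, where the interest-rate coefficients follow Taylor's (1993) illustrative values rather than the FRB/US estimates, and the fiscal adjustment speed κ is arbitrary:

\[ r_t = r^{*} + \pi_t + 0.5\,(\pi_t - \pi^{*}) + 0.5\, y_t, \qquad \tau_t = \tau_{t-1} + \kappa \left( \frac{B_t}{Y_t} - \Bigl(\frac{B}{Y}\Bigr)^{\!*} \right), \]

with r the federal funds rate, π inflation, y the output gap, τ the average tax rate, and B/Y the government debt-to-GDP ratio.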

2.3 The Multi-Country model (MCM)
The MCM is a large-scale dynamic model of the world economy, with nearly 1400 equations. These equations are broken into country and regional blocks, so that there are
separate specifications for each of the G-7 economies and Mexico, as well as ones for
regional aggregates such as the “other” OECD economies, the newly industrialized
economies, and OPEC (excluding Mexico). A typical country block for a G-7 economy has about 35 behavioral equations and 100 identities; the general specification of these equations
is fairly similar across countries.
The MCM has been in use at the Federal Reserve since the late 1970s, but has
recently undergone a redesign (completed this year) similar in conceptual direction to that of
FRB/US.8 Like FRB/US, the new MCM treats expectations explicitly, setting them either in a model-consistent manner, or according to a small-scale VAR forecasting system.9 The use of expectational mechanisms is pervasive: Bond yields equal a weighted average of expected
future short-term interest rates, the exchange rate is determined by an uncovered interest
parity condition, consumption and investment depend on the expected real interest rate, and
the nominal wage rate depends on expectations of both future wage rates and unemployment.
The new MCM is neoclassical in its long-run structure, and it too employs explicit
policy reaction functions that tie down the long-run level (or growth) of the price level and of
the indebtedness of the government. However, the treatment of adjustment costs is somewhat
simpler, although dynamic equations are often written in an error-correction format.
Parameter values are either estimated directly or taken from other econometric work to
replicate the empirical behavior of different aspects of the individual countries.

2.4 The World model
Of all the models at the Federal Reserve Board, the World model probably has the best claim to being “the” official staff model. The World model is a hybrid that combines the

8For an overview of the original MCM model, see Stevens et al. (1984).
9For an introduction to some of the properties of the new MCM, see Levin (1996).





non-U.S. blocks of the MCM with the domestic equations of FRB/US.10 The merger exploits
the strengths of each of its components. (When FRB/US is simulated on its own, foreign
output and inflation are essentially exogenous; when the MCM is simulated on its own, the
treatment of the U.S. economy is less detailed.) At present, the treatment of expectations in
the World model is more limited than in either of its components, in that only VAR-based
expectations are possible. However, work is currently under way to produce a version of the
combined model that can be simulated under full model-consistent expectations.

2.5 Other quarterly macroeconometric models
In addition to the large-scale macro models, the Federal Reserve staff also maintains other models of the aggregate economy, such as vector autoregression models (VARs). The standard VAR is small and includes only three or four variables, such as a measure of real
activity, inflation, and a financial indicator. Given the low cost of exploring alternatives,
many different specifications may be in use at any given time: Real activity can be measured
by the unemployment rate, manufacturing capacity utilization, or the GDP gap; inflation can
be measured using the CPI or the GDP chain-weight price index; financial conditions can be
measured by the federal funds rate, M2 growth, or non-borrowed reserves. Such VARs are viewed as providing an atheoretical check on the dynamics generated by our structural models
in response to various shocks.
In addition to these models, which are maintained by Federal Reserve staff, we also
consult the models of several commercial vendors.

10Prior to the creation of FRB/US, the World model (dubbed MOMBO) consisted of
the linked MPS and old MCM models. The practice of combining the main domestic and
international models goes back several years.




3. The uses of macroeconometric models in the monetary policy process
In this section, we shall describe the principal activities in which macroeconometric models are used in the monetary policymaking process. We shall focus on three distinct, but
related, enterprises in which the models play a critical role: economic forecasting, the
generation of impact multipliers for policy shocks and for changes in other exogenous
variables, and the evaluation of alternative monetary policy rules. We will also provide a
rationale for the approaches we have chosen to employ in these activities.
Before describing how macroeconometric models are used by the staff, we should
point out that models play an often overlooked, but important, role as vehicles for
communication. Because the specification of aggregate demand and aggregate supply
embedded in the FRB/US model is a reasonable representation of the staff's conception of
macroeconomic behavior, the model provides a framework for discussions about the structure
of the economy and about the influence of monetary policy on key macroeconomic variables.
In this role, the model helps enforce clarity and precision on the staff’s macroeconomic
analysis. It also enforces a degree of consistency on the staff over time in our
communications with policymakers.11
To be sure, as is the case with the profession at large, there are differences of opinion
among the policymakers (and among the staff) concerning the validity of the models in use at
the Board as a complete description of the underlying structure of the economy. For some,

11 Unfortunately, econometric models seem to us to play a less prominent role in
facilitating conversation between the staff and outside economists. This no doubt
importantly reflects the fact that academic economists, with only a few exceptions, gave up
the enterprise of structural modelling about 20 years ago.




these differences center on the specification of particular behavioral relationships. But for
others, there are fundamentally different views about the economic paradigm described by the
model. Such differences of opinion explain why the staff has developed alternative models,
and why not all discussion and research activity is carried out in the context of FRB/US.
Nevertheless, the model provides policymakers with a clear view of the baseline framework
employed by the staff in our macroeconomic analysis.
That said, there are limitations that could result from too rigid an adherence to the
framework embodied by the model. We know with certainty that the model is
misspecified—in some areas, probably by a great deal. The difficulty is that misspecification
often is revealed only gradually over time. Indeed, some constructive tension always exists
between the current specification of the model and the staff's evolving understanding of
macroeconomic behavior based on our research efforts and those of others. For example, the
treatment of expectations in the MPS model did not accurately reflect the staff’s view of how
the economy actually operates, especially the financial sector. The resulting divergence
meant, for example, that the MPS gave a different accounting of how a multi-year fiscal
package would affect the term structure of interest rates than the staff would have done. The
bottom line is that while, under most circumstances, the model serves as a useful
communication device, it cannot always be taken as a literal representation of the staff’s
“model.”
In the subsequent discussion, we shall outline three broad approaches to carrying out
macroeconomic analysis for policymaking. The first approach involves a large-scale
macroeconometric model without intervention; the second involves the use of a large-scale




model with intervention; and the third is explicitly judgmental. By the use of a large-scale
model without intervention, we mean to connote a regime in which the model is not altered in
any respect once it has been estimated.12 Use of a large-scale model with intervention implies
to us an activity in which the analyst may shift intercepts or apply other add-factors to
residuals. A judgmental approach may incorporate information from one or more large- or
small-scale models, but the analysis is not framed in terms of adjustments to a single system
of equations. To be sure, the distinctions among these approaches are somewhat murky,
especially for the latter two approaches. In effect, the Board staff employs all three of these
approaches, but the activities in which they are used differ. At the risk of some
oversimplification, the economic forecast is produced using a judgmental approach, the
analysis of shocks to that forecast is conducted using an add-factored econometric model, and
analysis of alternative policy rules usually involves an unadjusted model. We will attempt to provide a rationale for this eclectic approach.

3.1 Forecasting
The Board staff prepares a detailed forecast of the U.S. and foreign economies prior to each of the eight Federal Open Market Committee (FOMC) meetings held each year. The
staff presents this forecast and the accompanying analysis in a policy document called the
Greenbook. The projection interval typically is about two years in length. The forecast is
designed to provide a baseline for Committee discussion and is the view of the staff and not
of the members of the FOMC. It is quite common for Committee members to take issue with

12We recognize that even this definition leaves some role for judgment; mainly, judgment would enter in the specification, and possibly the respecification, of the model.





either the particulars or the broad contours of the staff projection. Some Committee members
may have different views about the economic outlook, even taking as given the policy
assumptions underlying the staff forecast. Others may view the staff projection as plausible
given the policy assumptions, but may regard the macroeconomic outcome as unsatisfactory
given their policy objectives. Finally, there may be disagreements about other conditioning
factors, such as foreign economic policies, fiscal policy, or oil prices. When there are
obvious risks associated with the assumptions underlying the forecast or with key
macroeconomic relationships, simulations of FRB/US, MCM, World, and commercial models
are used to highlight the sensitivity of the forecast to these risks. (These activities are
described in section 3.2.)
The baseline forecast presented in the Greenbook is referred to as a “judgmental”
projection because it does not result from a mechanical run of any large-scale
macroeconometric model, nor is it derived directly by add-factoring any such model. Until
recently, we have not made much use of “pure” model forecasts. Instead, persistent patterns
in equation errors (often regarded as evidence of structural change) were projected into
the future judgmentally. Equation add-factors were also adjusted to reflect current quarter
information. But during the major respecification project of the past several years, the staff
has developed an automated time-series approach to forecasting model residuals.
By design, equation residuals in the new model are supposed to be white noise. But
in practice, this property may not hold out of sample, perhaps owing to structural change. In
addition, for some equations it does not even hold within the estimation period (although the




economic importance of this residual serial correlation is not great).13 To take account of the
predictable element of the errors in the projection, all residuals are analyzed using a simple
time series model. For each equation, the model is fitted to the last 10 years' worth of data
using weighted least-squares, where the weights decline geometrically as one moves back in
time. (Roughly 50 percent of the weight is put on observations in the most recent three
years.) These coefficient estimates are then adjusted using Theil's mixed estimation
procedure, where the best estimate of the “true” error model is defined to be a weighted
average of the least-squares estimator and a Bayesian prior that the errors are white noise.
The weights chosen in this procedure are a function of the variance-covariance matrix of the
least-squares estimates, so that more weight is placed on the Bayesian prior if the time series
model is estimated imprecisely. The final version of the model is then used to project the
equation errors over the forecast period. This new algorithmic approach should allow
practical consideration of a “pure” model forecast, though we have not yet accumulated
enough experience with this technique to assess its usefulness in actual application.
Consequently, we are still some distance away from being able to rely heavily on a pure
model-based forecast.
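The following sketch illustrates the flavor of this residual-projection algorithm under stated assumptions: an AR(1) error model (the actual time-series specification is not detailed here), a geometric decay rate calibrated so that roughly half the weight falls on the most recent three years, and an assumed prior variance for the shrinkage toward white noise.

```python
import numpy as np

# Illustrative residual projection: fit an AR(1) to the last 10 years of
# quarterly equation residuals by weighted least squares with geometrically
# declining weights, then shrink the estimate toward a white-noise prior
# (rho = 0) in proportion to its sampling variance, in the spirit of Theil's
# mixed estimation. The AR(1) form, decay rate, and prior variance are
# assumptions made for this sketch.
rng = np.random.default_rng(2)
e = rng.standard_normal(40)          # 10 years of quarterly equation residuals

lam = 0.95                           # roughly half the weight on the last 3 years
w = lam ** np.arange(len(e) - 2, -1, -1)   # weight 1 on the most recent pair

x, y = e[:-1], e[1:]
rho_wls = np.sum(w * x * y) / np.sum(w * x * x)
sig2 = np.sum(w * (y - rho_wls * x) ** 2) / np.sum(w)
var_wls = sig2 / np.sum(w * x * x)   # sampling variance of the WLS estimate

tau2 = 0.05                          # prior variance around rho = 0 (assumed)
rho_mixed = (rho_wls / var_wls) / (1.0 / var_wls + 1.0 / tau2)

# Project the equation error over an eight-quarter forecast period.
proj = rho_mixed ** np.arange(1, 9) * e[-1]
print(round(rho_wls, 3), round(rho_mixed, 3), proj.round(3))
```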
The staff has also avoided framing the forecast in terms of add factors to our large-scale models. FRB/US is used as an input in the broader judgmental forecasting process, but
it, like the MPS model, is not the principal vehicle for producing the forecast. In part, this

13There are a few exceptions to this statement, most notably the bond rate and stock
market equations. In these particular cases the residuals display a significant degree of serial
correlation, which has been interpreted as a time-varying component of the term and equity
premiums. Work is ongoing to develop economic models for these premiums.




reflects a concern that approaching the forecast as a collection of add factors would create too
narrow and mechanical a focus in both the preparation and communication of the forecast. In
developing the judgmental forecast, the staff considers a wide range of alternative
specifications for key behavioral relationships, including those specifications embodied in
FRB/US. In addition, the staff incorporates other non-model sources of information into its
forecast. We evaluate the pros and cons of this approach in the next section of the paper.
Before describing how FRB/US is employed as a tool in preparing the forecast, we review the
institutional setting in which the forecast is produced, and then outline the judgmental
forecasting process.
The judgmental forecasting process begins with a set of conditioning assumptions for
monetary and fiscal policy and for some key variables that, to a first approximation, are
exogenous to U.S. output, inflation, and interest rates. With respect to monetary policy, the
point of departure for the staff forecast most often is an assumption that the nominal federal
funds rate will remain unchanged over the next six to eight quarters. This procedure might
seem less natural than one in which the staff projection was organized around developing a
path for the federal funds rate that would minimize the losses associated with the “objective
function” of the policymakers. In part, our approach reflects important institutional features
of the Federal Reserve System. The FOMC is composed of the seven members of the Board
of Governors and five of the twelve Reserve Bank presidents.14 The Committee is a

14The President of the Federal Reserve Bank of New York serves on the FOMC every
year. Chicago and Cleveland serve in alternating years, and the other nine presidents serve
one year out of every three. All presidents participate in Committee discussions, regardless
of their current membership status.




deliberative body; therefore, issues surrounding the objectives, strategies, and tactics of
monetary policy are open to discussion. The diversity of the Committee members' views
makes it difficult to define clearly an “objective function” from which an “optimal” federal
funds rate path could be derived. The role of the staff in this process is not to recommend
policies but to provide the forecasts and analysis that will facilitate Committee deliberations
and decisions. Strictly speaking, therefore, the staff forecast is not an unconditional forecast
of the economic outlook. On the contrary, it is conditioned on a particular assumption about
the path of monetary policy which, at the time of the forecast, may not appear to be the path
most likely to be pursued. Of course, there are times when the economic forecast produced
by an assumption of an unchanged federal funds rate is so at odds with the stated objectives
of most policymakers that such a projection would not serve as a useful baseline for
discussions. Under these circumstances, the forecast would be conditioned on an alternative
path for the funds rate. In general, for each forecast, model multipliers on changes in the
federal funds rate are applied to the staff forecast to allow the policymakers to judge how the
projection would differ under alternative assumptions about interest rates.
Fiscal policy also is a key conditioning factor in the staff projection. To be sure, large
portions of the federal budget are sensitive to output, incomes, inflation, and interest rates,
and thus are endogenous to the forecast. The staff preserves this endogeneity by specifying
its assumptions in terms of tax policy and discretionary spending, which are seen as largely
exogenous over the two-year interval of the projection.
Other aspects of the forecast exhibit varying degrees of exogeneity. Foreign economic
activity is sensitive to U.S. economic and financial conditions, but considerably less so than most components of domestic spending. The same is true of the price of oil. For these
inputs into the domestic forecast, feedbacks must be accounted for, though in general the
effects are small. As with the monetary policy assumptions, when especially large risks are
associated with these assumptions, model simulations are used to highlight the associated
sensitivities of the staff forecast.
At the start of each FOMC forecast process, a forecast coordinator provides the
forecast participants with these key conditioning assumptions as well as initial paths for
output and inflation, which often are those of the most recent forecast. Given these
conditioning assumptions and the economic and financial news that has become available
since the previous forecast, economists covering various components of aggregate
demand—consumption, fixed investment, inventories, government spending, and net
exports—and aggregate supply—labor supply, capital services, and wage and price
determination—adjust the projections for their sectors. These forecasts are then assembled by
the coordinator into an overall economic forecast of aggregate output, income, inflation, and
interest rates, along with projections of sectoral detail. This forecast is then relayed back to
the sector analysts, who react to any changes in the endogenous variables of relevance to their
sector and provide the coordinator with a new forecast. If necessary, further iterations occur
until the process converges. “Solution” of this rather cumbersome judgmental system is
tractable largely because the economic outlook generally does not change greatly from one
meeting to the next. As a check on the process, FRB/US is used to ensure that the
implications of incoming news for intermediate-term dynamics of the economy have been
adequately taken into account.
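The iteration just described is, in effect, a Gauss-Seidel solution of a simultaneous system carried out by human analysts. A toy version with two invented "sector" rules conveys the mechanics:

```python
# Toy illustration of the iterative forecast round: each "sector" updates its
# projection taking the others' latest numbers as given, and the loop repeats
# until nothing changes materially. The two equations are invented for
# illustration and bear no relation to actual staff sector models.
def consumption(income):           # hypothetical sector rule of thumb
    return 10.0 + 0.6 * income

def income(cons):                  # hypothetical aggregation back to income
    return 40.0 + 0.5 * cons

c, yinc = 50.0, 100.0              # starting point: previous meeting's forecast
for round_ in range(50):
    c_new = consumption(yinc)
    y_new = income(c_new)
    if abs(c_new - c) < 1e-6 and abs(y_new - yinc) < 1e-6:
        break                      # the "forecast round" has converged
    c, yinc = c_new, y_new
print(round_, round(c, 2), round(yinc, 2))
```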





Considerable effort in preparing the forecast is devoted to filtering high-frequency
data to establish the near-term forecast. Getting the correct starting point is important
because much of the dynamics of spending and production in subsequent quarters depends on
an appropriate assessment of the current cyclical state of the economy. Is underlying income
growth strong? Is current production flowing mainly to final sales or to inventories? Are
increases in final sales adding mainly to demands on domestic or foreign producers?
Because the most recent statistical data are noisy, incomplete, and often contradictory, these
questions are more difficult to answer than might be imagined. Large-scale
macroeconometric models, which rely primarily on quarterly time-series data, are not well
suited to this activity. By contrast, the staff involved in the judgmental forecast exploit a vast
array of econometric and statistical models in filtering and forecasting these high-frequency
data. (See Braun (1990) for an example of a model that pools data from the establishment
and household surveys to produce a projection of current-quarter GDP.)
Board staff are continuously engaged in the interpretation of incoming data. We attempt to operate like a Kalman filter, extracting from the newly available information (say,
a monthly retail sales report or a revised quarterly reading on GDP growth) that portion
which is “news” in the formal sense. Conversations between the staff and the members of the
Board often are conducted in terms of the implications of the incoming information for the
staff forecast. Large-scale models generally play little role in this near-term filtering exercise
because baseline assumptions about the soon-to-be-released indicators are established in the
course of each FOMC forecasting round, and it is a relatively (though not entirely)
mechanical process to compare these assumptions with the just-announced data.
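A minimal scalar Kalman filter conveys the spirit of this news-extraction metaphor; the noise variances and data below are invented:

```python
# Stylized scalar Kalman filter: underlying monthly sales growth g_t is a
# random walk observed with noise, and the "news" in each release is the gap
# between the report and its predicted value, scaled into the estimate by the
# Kalman gain. All numbers are illustrative.
q, r = 0.05, 0.40          # state-innovation and measurement noise variances
g_hat, p = 0.3, 1.0        # prior mean and variance of underlying growth

for report in [0.5, -0.2, 0.4, 0.1]:      # hypothetical retail sales prints
    p = p + q                              # predict: uncertainty drifts up
    gain = p / (p + r)                     # Kalman gain
    news = report - g_hat                  # surprise relative to expectation
    g_hat = g_hat + gain * news            # update belief by the filtered news
    p = (1.0 - gain) * p                   # posterior variance
    print(f"report={report:+.2f}  news={news:+.2f}  estimate={g_hat:+.3f}")
```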




In addition to near-term filtering models, each sectoral economist maintains and
consults a stable of econometric models relevant to his or her sector for guidance in the
preparation of the longer-term forecast. These models typically encompass a range of
plausible specifications—some are minor variations on a standard model, while others can
represent quite different paradigms. For example, the economists covering business fixed
investment maintain accelerator models, neoclassical investment models, putty-clay models,
and models that incorporate various financial constraints on investment spending, such as
corporate cash flow. The sensitivity of the forecasts of these models might also be examined
with respect to the particulars of specification: for example, the forecast sensitivity of
investment equations might be examined with respect to the choice of variables used to
measure sales expectations and the cost of capital, or the lag length assumed to capture
adjustment costs. This approach allows the judgmental forecasters a better understanding of
both the risks associated with the forecast and the sources of those risks; this is information
not easily filtered through the lens of a single large econometric model.
Although not used directly to produce the staff economic projection, the large
macroeconometric model serves an important function in this process. FRB/US is simulated
throughout the forecasting process and is updated for incoming data and changes in any
assumptions about policy or exogenous variables. (The model uses the staff judgmental
assumptions about fiscal policy, short-term interest rates, oil prices, the exchange rates, and
foreign activity.) The results of these model runs are used at both the sectoral and aggregate
level as an input to and a check on the judgmental projection. To flag any unusual
discrepancies, the staff examines the add-factors required on the model equations to produce




the staff judgmental projection. These implicit judgmental add-factors can be compared with
those produced by the algorithmic approach to provide a clear identification of those areas
where tensions exist between the staff’s judgment and that of the model. In addition,
stochastic simulations of the model are used to generate confidence intervals around the
model forecast. The position of the staff forecast relative to a 70 percent confidence interval
around the model forecast provides an additional consistency check on the staff forecast.
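A stand-in for this exercise, with a one-equation "model" and bootstrapped shocks replacing FRB/US and its estimated residuals:

```python
import numpy as np

# Sketch of the stochastic-simulation idea: re-draw shocks (here by
# bootstrapping historical residuals), simulate the model forward many times,
# and read off percentile bands. The 70 percent band mirrors the interval
# mentioned above; everything else is a placeholder.
rng = np.random.default_rng(3)
resid_hist = rng.standard_normal(100) * 0.5   # placeholder equation residuals
rho, y0, horizon, n_sims = 0.8, 1.0, 8, 1000

paths = np.empty((n_sims, horizon))
for s in range(n_sims):
    y = y0
    shocks = rng.choice(resid_hist, size=horizon)   # bootstrap draws
    for h in range(horizon):
        y = rho * y + shocks[h]
        paths[s, h] = y

lo, hi = np.percentile(paths, [15, 85], axis=0)     # 70 percent band
print(np.round(lo, 2), np.round(hi, 2), sep="\n")
```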
FRB/US plays a more significant role in the development of the staff projection when
there have been large changes in conditioning assumptions or in exogenous variables. Under
these circumstances, the judgmental process of incremental adjustment to the forecast could
not be counted on to converge quickly or accurately to a new solution path. As a
consequence, the model can be used to establish a new baseline that can serve as the point of
departure for the judgmental forecast when there have been large changes in policy
assumptions or in the paths of exogenous variables, or extraordinary amounts of “news”
received since the previous meeting. The next section describes how the models are used for
these and other similar exercises.

3.2 Policy Simulations
While macroeconomic models play an important role in the development of the forecast at the Board, they may play an even more important role in addressing “what if”
questions. This role is generally fulfilled through multiple simulations of a model in which
baseline results are compared with those obtained when model structure or exogenous
variables are altered. The range of questions that models are used to address is broad, and
often includes issues well beyond the domain of monetary policy. Thus, while model




simulations are frequently used to gauge the macroeconomic consequences of hypothetical
monetary policy actions, they also provide quantitative answers to such diverse questions as
the dynamic response of the current account to an exogenous appreciation of the dollar, the
effects of an increase in the minimum wage on inflation, and the effects of changes in fiscal
policy on the term structure of interest rates, output, and capital accumulation. Just about any
question of general macroeconomic interest is fair game, and it is only the amount of detail
incorporated into the models themselves that limits their use in this regard.

3.2.1 Changes in conditioning assumptions
The policy simulations carried out at the Board can be divided into several categories.
Simulations in the first category involve changes in the conditioning assumptions that underlie
the staff economic projection. As noted above, the staff judgmental forecast is based on
explicit assumptions for several key macroeconomic variables—the federal funds rate, federal
spending and tax rates, oil prices, and foreign economic policies. A great deal of uncertainty
surrounds these assumptions, and an important issue in any forecast is how the overall
outlook would change if these variables were to evolve in a different manner. To this end,
simulation exercises are routinely carried out to estimate the quantitative influence of
plausible perturbations in these variables on aggregate output and inflation.
The evaluation of alternative monetary policy settings is an important model activity.
With the introduction of the forward-looking models, it has become necessary to be more
specific about the ultimate purpose of the policy action. Specifically, is the shock to the
funds rate a transitory event, or is it the start of a sustained policy to alter the long-run level
of inflation? This question did not arise in the context of the older models because expectations were adaptive. But in the new models, how agents interpret policy actions has
an important effect on the dynamic response of output and inflation. If the change in interest
rates is seen as transitory, the short-run response of the economy will be minimal. But if
agents perceive the action as part of, say, a sustained disinflation policy, then the immediate
response of long-term interest rates, inflation and output will be substantial. Generally, the
practice in these simulations has been to assume that agents view such shocks as transitory.
However, in experiments where the funds rate is raised or lowered for more than a year, this
assumption implies that agents make a long string of forecast errors—a statistically
improbable event.
In exercises involving changes to other variables, monetary policy is typically treated
in one of two ways. One approach treats monetary policy as exogenous, in the sense that the
nominal federal funds rate is held at its baseline path. The rationale for this treatment is that
it isolates the effect of the change in conditioning assumptions. The other approach treats the
funds rate as endogenous, either by targeting nominal income, or by using an explicit Taylor-type reaction function, or by employing ad hoc search procedures to find values for the funds
rate that return inflation or unemployment to their baseline paths. These exercises are
routinely reported in Greenbooks and in other policy documents, and are often used by the
staff to illustrate significant risks to the forecast.

3.2.2 Alternative policy strategies and risk analysis
A second category of model simulation involves the analysis of alternative strategies for monetary policy over a longer period. This type of exercise begins with the creation of a
five-year (or longer) baseline. The baseline is set equal to the staff judgmental forecast as far




out as that forecast is available; beyond that, the baseline is generated using the model, based
on time-series extrapolations of exogenous variables and explicit assumptions for fiscal and
monetary policy. For example, fiscal policy assumptions might be based on Congressional
Budget Office projections, while monetary policy could be chosen to keep inflation on some
specified path. In addition, extended baselines typically incorporate certain “stylized facts”
about the economy—drawn from a large body of econometric work within and outside the
Federal Reserve System—that may or may not be embedded in the model itself. For
example, multi-factor productivity, labor force participation and population might be
constrained to grow at particular trend rates, and the model's price equation residuals could be
adjusted to yield a particular value of the NAIRU.
With the baseline in place, the next task is to simulate the macroeconomic effects of
alternative monetary policies. These alternative monetary policy strategies are discussed in
the Bluebook—a staff document prepared for the FOMC with the goal of facilitating
discussion of policy options and risks. The main focus of the Bluebook discussion is on the
likely path of inflation under different strategies, and on the implications for policy of
different risks to the outlook. In that regard, the old combined MPS-MCM model was used
to compare the implications of alternative medium-term trajectories of inflation for the paths
of output and interest rates.
Our new models allow consideration of the much richer context in which policy
decisions must be made. For example, the question of policy credibility can now be formally
addressed. Disinflation policies that are fully credible presumably have lower costs in terms
of lost output and employment than ones that are discounted by the public, and FRB/US can




now be used to assess the likely empirical importance of such credibility effects. In
particular, the model can be run first under the assumption that monetary policy is fully
credible, and second under the assumption that agents only learn about the FOMC's long-run
inflation target over time. In the second simulation, the cost of disinflation in terms of lost
output will be higher than in the first. This type of analysis provides policymakers with a
sense of the implications of credibility and expectations for the economic consequences of
disinflation.
In a similar vein, changes in policy are likely to engender changes in the expectations
formation process, and thus in the dynamic behavior of the economy, and FRB/US can be
used to assess the importance of these effects as well. Although the alternative policy
strategies that we had presented in the past were not radically different from historical
policies, and thus would not be expected to invalidate the general flavor of simulations
conducted under the assumption of adaptive expectations, the Lucas critique was a source of
discomfort to all involved in the production and consumption of these simulations. In any
event, we believe that the Lucas critique should have less force vis-à-vis FRB/US and the
redesigned version of the MCM.
As noted earlier, model simulation results are also used by the staff to quantify risks
in the economic and policy outlook. For example, we have investigated a hypothetical
situation in which the NAIRU differs from that initially perceived by policymakers at the start
of the simulation. The purpose of this exercise was to gauge how long it might take for such
an error to become apparent, given that such an error almost certainly would be accompanied
by transitory price shocks that would create signal extraction problems. Other examples of




risk analysis carried out by the staff include simulations of alternative policy responses to
increases in the minimum wage, and the effects of changes in the real exchange rate.

3.2.3 Other examples of model-based analysis
Macroeconometric models have been used in many other ways to address important
economic issues. Examples include: decomposing the sources of historical GDP forecast
errors, by analyzing the relative contribution of equation errors made in different sectors
(Stockton 1993); computing historical movements in the equilibrium real interest rate
(Bomfim 1996); and estimating the size and persistence of unobserved shocks to aggregate
demand and supply, based on observed movements in the term structure of interest rates.
Outside of forecasting and the development of monetary policy scenarios, analysis of
alternative fiscal policies is one of the most frequent uses of our large-scale models. Models
have helped to shed light on two different aspects of fiscal policy—steady-state aggregate
supply-side responses, and the dynamic effects of fiscal policy changes on aggregate demand.
With respect to aggregate supply, detailed models of the U.S. economy such as
FRB/US incorporate many channels for fiscal policy to influence the long-run economy:
marginal tax rates and investment subsidies affect the desired capital-output ratio and the
labor force participation rate;15 transfer policies alter the mix of household income and thus
the private saving rate; and debt policies influence the aggregate saving rate because
households are not Ricardian.

15In FRB/US, marginal tax rates have no effect on labor force participation, but they do
in two of the commercial models (DRI and WUMM).




Large-scale macro models are also used to analyze the short-run effects of fiscal
policy changes. Unlike the long-run aspects of fiscal policy, which are of only general
interest to monetary policymakers, the dynamic response of the economy to a fiscal policy
shock can be of immediate practical concern. One reason for this interest has already been
discussed: The stance of fiscal policy, as given by the projected growth of federal spending
and various tax rates, is an important conditioning assumption for the near-term outlook. But
another reason concerns the behavior of bond and other asset prices. Because the level of
government spending, taxation and debt influences the level of the equilibrium real interest
rate, anticipated future changes in fiscal policy affect bond yields and equity prices in the
present. Such expectational effects raise special issues for monetary control, because a
credible program of fiscal austerity, if scheduled for a future date, can potentially be
stimulative in the near term owing to the fall in bond rates that follows the announcement of
the plan (see Taylor 1995). Under these conditions, keeping inflation constant may require
raising short-term interest rates for a time. Such expectational effects were extremely difficult
to account for in the old MPS and MCM models, but FRB/US and the new MCM are well
designed for the analysis of such questions.

3.3 Evaluation of monetary policy rules
Much research undertaken at the Board over the past twenty years has involved the use of macroeconometric models to evaluate various proposed monetary policy rules. This
might seem peculiar given our earlier acknowledgment that the policy process of the Federal
Reserve is not driven by rules in any simple or mechanical sense. However, there is not any
necessary tension here; indeed, Yellen (1996) and Meyer (1996) have laid out some of the




reasons that even a discretionary policymaker might wish to be apprised of the prescriptions
of rules. The staff can play a useful role in this regard by assisting policymakers in using
models to define, develop, and select rules with favorable characteristics.
In this section, we summarize a small but representative sample of work that has used
macroeconometric models to evaluate monetary policy rules. These studies have addressed
several broad issues. Some of these studies have attempted to sort out which variables
policymakers should consider in adjusting the federal funds rate; at various times, the staff
has looked at rules that are dependent on inflation, real output, nominal income, monetary
aggregates, commodity prices, and financial variables such as the yield curve and exchange
rate. This work has sometimes explored the implications of the degree of aggressiveness of
policy in response to changes in these policy indicators. Models have also been used to
examine issues involved in establishing the long-run objectives of monetary policy. Other
work has taken the objectives of monetary policy as given, and has focused more on
alternative strategies for achieving those objectives. Finally, some work has used a variety of
macro models to explore the robustness of rules to alternative assumptions about the structure
of the economy—in the tradition of McCallum (1984, 1988). The results of this work are
transmitted to the Board and the FOMC by way of research papers, memoranda, and special
briefings or seminars.
An example of the first type of study is provided in Brayton and Tinsley (1996), who
use stochastic simulations of the MPS model to compare the effectiveness of four simple
policies in minimizing the variability of prices and output. Each policy responds to
fluctuations in a different indicator. Brayton and Tinsley consider four candidates (nominal




GDP, the GDP deflator, M2, and an index of commodity prices), and conclude that, by a
fairly wide margin, nominal GDP is the best. Perhaps surprisingly, nominal GDP ranks first
in Brayton and Tinsley’s assessment even if the policymaker is assumed to care only about
stabilizing prices. The explanation for this result appears to lie in the fact that demand
shocks feed into prices only with a lengthy lag; as a result, a policy that keys on the GDP
deflator will tend to react too little and too late to demand shocks. In sum, according to
Brayton and Tinsley, “output and price stability are more often compatible, rather than
competing, intermediate-term objectives [p.312].”
Brayton, Levin, Tryon, and Williams (1996) build on the idea that any particular
monetary policy rule will endow inflation with a certain variance around its target level, and
will also endow output with a certain variance around potential. Moreover, any realistic
econometric representation of the economy will imply that there exists an efficient policy
frontier, consisting of policies that deliver the minimum variability of inflation for given
variability of output. Brayton et al. provide an explicit empirical estimate of the location of
this frontier given the structure of FRB/US. They also find that a policy rule estimated using
data covering the period since 1980 appears to have been quite close to that frontier. By
contrast, a policy based on mechanical application of Taylor's rule would have been relatively
far from the frontier, because the responses to inflation deviations and output deviations
would have been insufficiently vigorous.
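The mechanics of tracing such a frontier can be illustrated in a toy two-equation economy (entirely invented, and unrelated to FRB/US): simulate each candidate rule under repeated shocks, record the standard deviations of inflation and output, and retain the rules that are not dominated on both dimensions.

```python
import numpy as np

# Each Taylor-type rule r_t = a*pi_t + b*y_t implies some (sd(pi), sd(y))
# pair in this invented backward-looking economy; the rules that are not
# dominated on both dimensions trace out an efficient policy frontier.
rng = np.random.default_rng(4)

def simulate(a, b, T=2000):
    pi, y = 0.0, 0.0
    pis, ys = [], []
    for _ in range(T):
        r = a * pi + b * y                       # policy rule
        y = 0.8 * y - 0.3 * (r - pi) + rng.standard_normal() * 0.5
        pi = 0.9 * pi + 0.2 * y + rng.standard_normal() * 0.3
        pis.append(pi)
        ys.append(y)
    return np.std(pis), np.std(ys)

results = [(a, b, *simulate(a, b))
           for a in np.linspace(1.1, 3.0, 8) for b in np.linspace(0.1, 2.0, 8)]
# Keep rules not strictly dominated in both inflation and output variability.
frontier = [p for p in results
            if not any(q[2] < p[2] and q[3] < p[3] for q in results)]
for a, b, s_pi, s_y in sorted(frontier, key=lambda p: p[2])[:5]:
    print(f"a={a:.2f} b={b:.2f}  sd(pi)={s_pi:.2f}  sd(y)={s_y:.2f}")
```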
An example of the second type of study is provided by Fuhrer and Moore (1995).
These authors specify and estimate a small-scale rational expectations macroeconometric
model of the U.S. economy. A key component of this model is a policy rule which




determines changes in the short-term nominal interest rate based on deviations of inflation
from target, and deviations of output from potential. Fuhrer and Moore examine
various possible settings for the parameters of this rule, and conclude that the settings that
most closely match the historical behavior of the Federal Reserve strike a good balance
among four competing policy objectives: minimizing the sacrifice ratio, minimizing the
variance of the output gap, minimizing the variance of inflation deviations from assumed
target, and minimizing the variance of the short-term nominal rate.16
Fuhrer and Madigan (1996) provide an example of research that has explored issues
related to the long-run objectives of monetary policy. These authors use the Fuhrer-Moore
model to examine whether the constraint at zero on nominal interest rates has implications for
the optimal setting of the long-run inflation objective. They note that a central bank may find
it desirable, in the course of a recession, to set the short-term nominal interest rate below the
rate of inflation—in other words, to push the rate into negative territory in real terms. If the
inflation rate is very low, the central bank may face a binding constraint on its ability to drive
the real rate down, and thus on its ability to stabilize real activity. Summers (1991) cited this
possibility as a reason for setting a positive rate of inflation as the objective of monetary
policy. Based on simulations of the Fuhrer-Moore model, however, Fuhrer and Madigan
conclude that, for most shocks typical of the post-war U.S. experience, the ability of the
Federal Reserve to conduct countercyclical monetary policy in a low-inflation context would

16The bulk of this paper actually is concerned with considerations related to the
correlation between the output gap and various interest rates; for our purposes, however, the
results described in the text are the more relevant portion of this paper.




not have been seriously impaired by the zero constraint on nominal interest rates. For severe
shocks, however, this ability would indeed have been reduced.
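In compact form, the constraint at issue is

\[ i_t = \max\{0,\; i^{\text{rule}}_t\}, \qquad \rho_t \equiv i_t - E_t\,\pi_{t+1} \;\ge\; -E_t\,\pi_{t+1}, \]

so the ex ante real rate ρ_t can be pushed no lower than minus expected inflation, and a lower long-run inflation objective mechanically shrinks the room for countercyclical easing.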
Orphanides et al. (1996, in process) provide an example of work that has focused on
the strategy of monetary policy. This work aims to compare the performance of an
“opportunistic” policy rule with a conventional rule, using a small-scale (ten equation)
empirical macroeconometric model with rational expectations. Because the opportunistic rule
is nonlinear in the state variables, the major macroeconomic variables of interest (such as the
output gap and the deviation of inflation from its target) are not distributed according to the
normal distribution even if the underlying shocks to the economy are normal. The paper
illustrates some of the tradeoffs this situation makes available to policymakers. For example,
there exist opportunistic policies that produce more diffuse inflation distributions than their
conventional counterparts, but more concentrated output distributions. The ultimate objective
of the paper is to assist policymakers in assessing alternative disinflation strategies.
Finally, Bryant, Hooper, and Mann (1993) offers an example of work intended to
explore the robustness of rules to alternative assumptions about the structure of the economy.
This volume represents the latest installment in a monumental effort to coordinate research
efforts across a large number of large-scale models.17 Board staff have played an important
role in this project since its inception, and this volume was no exception (two of its three
editors and several of its contributing authors being members of the staff). As with many
other studies in this literature, the aim of this volume was to evaluate the success of

17The earlier volumes in this series include Bryant, Henderson, et al. (1988), Bryant,
Holtham, and Hooper (1988), and Bryant, Currie, et al. (1989).




alternative monetary policy rules in achieving certain policymaker objectives. The
distinguishing features of this effort were that it involved extensive participation by eight
different teams using large-scale macroeconomic models, and that each of these models
placed the U.S. economy explicitly in a global context. One of the main findings of the study
was that “either nominal-income targeting or real-GNP-plus-inflation targeting, in contrast to
money targeting or exchange-rate targeting, best stabilizes national economies if the loss
functions of policymakers ... stress a combination of [real variables such as output or
employment] and nominal ultimate-target variables such as the rate of inflation or the price
level [p.30].”

4. An evaluation of the relative merits of model-based and judgmental approaches to policy analysis
In this section, we evaluate the procedures used at the Board, which employ both
judgment and models in developing forecasts and policy analysis. Macroeconometric
models—both large- and small-scale—serve an extremely important function in the
development of analysis for the monetary policy process. However, we do not see these
models as being capable of completely supplanting the use of judgment in the preparation of
this analysis. Nevertheless, there are some drawbacks and potential pitfalls to the approach
we have taken, and we highlight those that we view as being most serious.

4.1 The weaknesses of models and their mechanical application in policy analysis

4.1.1 Identification and the Lucas Critique
A common criticism of large-scale structural models is the lack of adequate
identification. These models typically contain many more variables than equations, and
identification in such systems is obtained by a host of exclusion restrictions, some of which




are difficult to justify on theoretical grounds. Another common criticism of large-scale
models involves the Lucas critique: Because such models are simply reduced-form
representations of the economy, their structure will change whenever there is a significant
shift in policy, and thus, these models are not well suited for analysis of the effects of
changes in policy. These two arguments help explain why structural models fell out of favor
with the profession during the 1970s and early 1980s.
In large measure, these criticisms motivated the development of FRB/US and the new
MCM. For example, FRB/US attempts to deal with the identification issue by basing the
specification of its dynamic equations on theory, and using explicit proxies for expectational
variables in estimation. As for the Lucas critique, the model is often run under modelconsistent expectations, implying that the dynamic properties of the model fully incorporate
any changes in the behavior of agents that would be induced by a shift in policy.18 Even
under VAR-based expectations, agents' expectations in FRB/US cannot be continually biased
because learning mechanisms ensure that expectations are fully in accord with simulated
outcomes in the long run. Still, one can conceive of policy experiments that would make the
use of fixed VAR forecasting mechanisms problematic.19 As a practical matter, we recognize

18This statement assumes that the change in policy is not so great that it would induce
a change in the structural parameters of the model. If one were interested in the effects of a
hyperinflation on the behavior of the U.S. economy, FRB/US would probably not be a good
guide, because in the real world there would presumably be a change in the adjustment costs
embedded in the (fixed) coefficients of the wage and price equations.
19To address this problem, we plan to develop procedures whereby the expectational
system can be made consistent with the behavior of the overall model under any specification
of policy, by re-estimating the VARs using pseudo-data generated by stochastic simulations.




that myriad identification questions remain, and we have directed our research resources
toward resolving those with the greatest possible policy consequences.
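The re-estimation procedure described in footnote 19 can be illustrated with a deliberately simple fixed-point iteration. The toy model below is our own construction, not FRB/US: outcomes depend partly on expectations, expectations are formed with an estimated autoregressive coefficient, and that coefficient is re-estimated on model-generated pseudo-data until it is consistent with simulated outcomes.

    import numpy as np

    # Toy illustration of aligning an estimated expectations rule with
    # model-generated pseudo-data (our construction, not FRB/US code).
    rng = np.random.default_rng(0)
    eps = rng.normal(scale=0.1, size=10_000)   # hold shocks fixed across iterations
    true_persistence = 0.8                     # structural persistence of the toy economy
    expect_coef = 0.0                          # initial (mis-specified) expectations coefficient

    for _ in range(100):
        # Simulate: outcomes depend half on structure, half on expectations.
        y = np.zeros(eps.size)
        for t in range(1, eps.size):
            expected = expect_coef * y[t - 1]
            y[t] = 0.5 * true_persistence * y[t - 1] + 0.5 * expected + eps[t]
        # Re-estimate the expectations rule on the pseudo-data (OLS slope).
        new_coef = (y[1:] @ y[:-1]) / (y[:-1] @ y[:-1])
        if abs(new_coef - expect_coef) < 1e-6:
            break
        expect_coef = new_coef

    print(f"converged expectations coefficient: {expect_coef:.3f}")  # approaches 0.8

At the fixed point, the estimated coefficient reproduces the persistence of the simulated data, so agents' forecasts are no longer systematically biased under the new regime.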

4.1.2 Problems analyzing changes in endogenous variables
Although the introduction of forward-looking expectations has alleviated some
problems associated with the use of structural models, many difficulties still persist. In
particular, the use of models to estimate the macroeconomic effects of changes in
conditioning assumptions can be problematic.

It is straightforward to use model simulations
to estimate the response of real GDP or inflation to a change in a truly exogenous variable, or
even a variable that is weakly exogenous, such as the price of oil. It is more complex to
construct coherent experiments for shocks to variables that have a large endogenous
component, such as the exchange rate or bond yields.
As a practical matter, identifying the source of these shocks, at times, can be
extremely difficult, yet doing so is critical to setting up the appropriate experiment. Shocking
long-term interest rates in the model in response to an observed but unexplained rise in bond yields
will lead the model to signal future weakness in economic activity. However, such a
prediction would be justified only if the rise in long-term interest rates was associated with an
increase in term premiums. If, instead ,the observed rise was associated with expectations of
higher output and inflation in the future (and thus higher short-term rates), the correct
conclusion would be that the change should have the opposite effect. Thus, even though we are
fully aware of the pitfalls involved in interpreting innovations in financial variables, using the
model to estimate the effects of changes in these variables remains fraught with difficulty.
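The identification problem can be stated with a textbook decomposition of the long rate (a standard expression, not a formula taken from the Board's models):

    i^{L}_t = \frac{1}{n} \sum_{k=0}^{n-1} E_t\, i_{t+k} + \phi_t

where $i^{L}_t$ is the yield on an $n$-period bond, $E_t\, i_{t+k}$ are expected future short rates, and $\phi_t$ is the term premium. An observed rise in $i^{L}_t$ restrains activity if it reflects a higher $\phi_t$, but signals expected strength if it reflects higher expected short rates; the observed yield alone does not reveal which, absent further identifying assumptions.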




4.1.3 Structural Change
Another problem that afflicts any large-scale macroeconometric model is structural
change. Although the consequences may not be too serious if the evolution of the economy is
gradual and the model is not used for long-run forecasting, the situation is much less
favorable if structural changes occur abruptly and could not be contemplated by the model
builders. For example, the end of Regulation Q and the innovations in the mortgage market
have altered the dynamics of housing demand in ways that we are still struggling to
understand. In these cases, the structure of the model may immediately become inadequate
for describing behavior in the sector. Under some circumstances, it may be possible to alter
the model’s structure in an appropriate manner, especially if theory provides guidance for how
the effect should be incorporated into the system. But if the quantitative effects of the event
cannot be pinned down from theory, then the model builder may be stymied: The influence
may not be estimable from historical data.

4.1.4 Consideration of Alternative Specifications
As was mentioned in the previous section, we believe that it is useful for the staff to
be able to examine a range of econometric specifications—both structural and reduced-form—in carrying out policy analysis, rather than relying on a single specification enshrined
in the “staff model.” The examination of a range of specifications and behavioral models
affords a more realistic assessment of the uncertainty attached to the analysis. Indeed, it is
difficult to cite any major macroeconomic relationship about which there would be
widespread professional consensus as to the appropriate specification. To be sure,
specification tests are available that, with sufficient data, should enable us to sort through




alternative specifications, or to optimally combine forecasts from alternative models. And, the
research program of the Board staff involves a considerable amount of this type of work
(Oliner, Rudebusch, and Sichel, 1995, 1996; Ando and Brayton, 1995; Stockton and
Glassman, 1987; Stockton and Struckmeyer, 1989; Porter and Feinman, 1992; Edison, 1991;
Edison and Pauls, 1993; Meese and Rogoff, 1983.)
But the uncertainties that exist—and probably will continue in perpetuity—suggest a
need to remain alert to the implications of alternative specifications. The use of a single
model can inadvertently straitjacket thinking and miss important phenomena that are not
incorporated in the model’s specification. Some of these differences can have significant
effects on the outlook. For example, whether one's model included an effect of cash flow on
investment, an influence of stock market wealth on consumption, or a direct channel from
money to price inflation would, at times in the recent past, have had significant consequences
for the macroeconomic outlook. Ultimately, the staff must take a stand on these issues for
purposes of presenting analysis to the policymakers. But, in our view, the emphasis on
alternative specifications that characterizes the staff’s judgmental approach to forecasting
allows a fuller articulation of the risks facing policymakers than might a more narrowly
focused model-based forecast. (Our colleague Peter Tinsley has proposed the development of
a facility in FRB/US that would allow the flexibility to “plug in” alternative specifications of
equations or perhaps blocks of equations. This would allow a more thorough analysis of
specification uncertainty within the context of the FRB/US model.)




4.1.5 Incorporation of Extra-Model Information

Finally, a judgmental approach makes it easier to incorporate “extra-model”
information and anecdotal evidence. Strikes, natural disasters, and other idiosyncratic shocks
to the economy can have important timing effects on real output and inflation, even if they
rarely have persistent effects. A purely model-based approach might misinterpret a strike-induced decline in production as a shock of average persistence, whereas a judgmental
analyst would be able to recognize the shock as shorter lived.
Anecdotal evidence, which plays no role in a pure model forecast, also can be
exploited by a judgmental forecaster. If we were entirely confident of the quality and
timeliness of our official data, the role of anecdotal information likely would be negligible.
But our data systems are flawed and often subject to substantial revision, and unfortunately,
large-scale econometric models are captives of the available data. A vast amount of
information is collected by the staff, much of it on a relatively systematic basis, but some of
it not. For example, about three weeks before each meeting of the FOMC, the staff of the
Reserve Banks conduct an extensive set of interviews with firms of all types in their districts,
including retailers, bankers, manufacturers, service providers, and others. This information is
issued as the so-called Beige Book. The Beige Book is thought to have been useful on occasion in
the past in shedding light on trends in the economy—especially trends too recent to have
shown up in formal statistics. Board staff routinely make contacts with businesses to gather
information on the conditions of their firms and industries. The Board also has various
advisory councils, which can provide assessments of economic developments. The ongoing
nature of these sources of informal evidence allows the staff to develop at least an intuitive





sense of the reliability of the resulting information. Nevertheless, it must be admitted that the
quality of much anecdotal information is poor, the motives of those supplying the anecdotes
often are suspect, and there are strong temptations to hear only those anecdotes that confirm
one’s present views. Consequently, while anecdotal information cannot be neglected, its
contribution to macroeconomic analysis obviously will be quite limited.




4.2 The weaknesses of the judgmental approach to policy analysis
Although the judgmental approach offers significant advantages over a purely model-based method, there are significant potential pitfalls inherent in this less formalized
methodology. First, the behavioral relationships underlying the judgmental forecast are much
less susceptible than those embedded in the purely model-based approach to statistical
validation and diagnostic testing. As noted above, in principle, as data samples grow
infinitely large, a careful researcher should be able to discriminate between any two
competing statistical representations of a given behavioral relationship. However,
conventional statistical methods seem to us to be of dubious applicability to the problem of
validating a judgmental forecast, not least because the formal “structure” generating the staff
forecast is continually evolving over time. Therefore, an assessment, for example, of how the
staff forecast would change in response to an oil price shock might not be the same in 1996
as the reaction embodied in the forecast of 1986. Of course, we monitor the out-of-sample
track record of the staff judgmental forecast and assess performance with a variety of statistical
forecast evaluation techniques. (See Romer and Romer (1996) for an example of such an
effort by two authors outside the Federal Reserve Board.)
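As a concrete example of the kind of out-of-sample evaluation statistics involved, the sketch below computes bias, root mean squared error, and a Theil-type ratio against a no-change benchmark. The data are made-up placeholders, not actual staff forecast errors.

    import numpy as np

    # Illustrative out-of-sample forecast evaluation (made-up numbers).
    actual   = np.array([2.1, 3.0, 1.2, -0.5, 2.7, 3.4])   # realized outcomes
    forecast = np.array([2.5, 2.4, 1.0,  0.8, 2.2, 3.0])   # forecasts made in advance

    e_fcst  = forecast[1:] - actual[1:]        # forecast errors
    e_naive = actual[:-1] - actual[1:]         # "no-change" benchmark errors

    bias = e_fcst.mean()                                 # systematic over/under-prediction
    rmse = np.sqrt((e_fcst ** 2).mean())                 # root mean squared error
    theil_u = rmse / np.sqrt((e_naive ** 2).mean())      # below 1 beats the naive benchmark

    print(f"bias {bias:+.2f}, RMSE {rmse:.2f}, Theil U {theil_u:.2f}")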
A second difficulty with the judgmental approach stems directly from the very
flexibility that is seen as the principal advantage of this approach over the purely algorithmic
approach. This flexibility can result in a loss of discipline and rigor in at least three related
but distinct ways. First, a tendency could exist for judgmental forecasts to be too heavily
influenced by the most recent data observations. For example, in early stages of business
cycle recoveries, it is difficult to disentangle the cyclical rebound in labor productivity from




any change in trend that might be occurring. During these periods, economic observers
frequently cite many examples of industries or firms that are accomplishing significant
productivity improvements. Nevertheless, statistical evidence confirming a broad-based shift
in trend productivity often is slim or nonexistent when viewed over a complete cycle. In a
similar vein, Fischer (1996) has noted that economists often adjust their estimates of the
natural rate of unemployment toward the actual unemployment rate to a greater extent than
might be justified by a straight reading of the statistical evidence. While a purely algorithmic
computer reading of the data would have no difficulty assigning a small weight to the most
recent readings, such detachment can be more difficult for judgmental forecasters.
Loss of discipline can also be manifested in inconsistency over time. For example, in
a judgmental process, there is no guarantee that a given set of extra-model information will
provoke exactly the same reaction on the part of the forecaster at two different points in time.
In part, such inconsistency can reflect nothing more than the fact that different people are
engaged in the assembly of the forecast from one FOMC meeting to the next, and inevitably
different individuals will evaluate the same packet of information differently.20
Operationally, a problem related to consistency is that application of judgment can
obscure the relative roles of changes in conditioning assumptions, shifts in economic
relationships, or changes in views about key behavioral relationships in influencing the
forecast. (Only the latter is inherently “judgmental.”) In practice, the staff attempts to deal
with this problem by focusing heavily on being able to explain the revision in the forecast

20The severity of these problems is minimized by the continuity of the judgmental
forecasting staff and the consistency of econometric and statistical techniques used in
assembling the judgmental forecast.




from the previous round (or from some more distant date). This focus compels the staff
always to attempt an explicit quantification of the role of judgmental factors in influencing the
innovation in the forecast from the previous round.21
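The framework referred to in footnote 21 can be written out explicitly; the notation below is our hedged rendering of the three-way accounting described in the text:

    \Delta \hat{y} = \Delta \hat{y}_{\mathrm{cond}} + \Delta \hat{y}_{\mathrm{data}} + \Delta \hat{y}_{\mathrm{judg}}

where the forecast revision $\Delta \hat{y}$ is split into the effects of revised conditioning assumptions, the effects of incoming data given established economic relationships, and the effects of changed views about key behavioral relationships; only the last term is inherently judgmental. As footnote 21 notes, a purely algorithmic forecast constrains the variance of the third term to zero, while a judgmental forecast allows it to be positive.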
The hazards we describe here would doubtless be even greater were it not for the fact
that the staff develops, maintains, and consults macroeconometric models in the preparation of
our forecasts and policy analysis. This attention to models makes us well aware of when we
are deviating from established econometric relationships, and compels us to examine the
strengths and weaknesses of the arguments for this divergence. Consequently, the problems that
might be associated with an excess sensitivity to incoming data or a lack of consistency over
time in reactions to recurring or idiosyncratic surprises can be minimized; the models provide
an important reference point for ensuring consistency of reaction and analysis over time. We
look forward to the time when a stable and reasonably complete “consensus” model of the
economy exists that could eliminate our reliance on an ample dose of judgment. Until that
time, we view our approach, which relies on large-scale macroeconometric models as inputs
in our analysis but also considers alternative specifications and information from outside the
model, as a necessary compromise.
5. Credit, balance sheets, and macroeconomic performance from 1989 to 1992
The cyclical episode running from 1989 to 1992 highlights both the difficulties and
benefits associated with using a large-scale macroeconometric model for policy analysis. The

21The difference between a judgmental and purely algorithmic forecast can be easily
understood within this framework: A judgmental forecast allows the variance of the third term
(pertaining to reassessment of behavioral relationships) to be non-zero, whereas a purely
algorithmic approach sets that variance equal to zero.




recession of 1990-91 and the subsequent slow recovery were, at the time, attributed to a
number of unusual influences including: the so-called “credit crunch” associated with
changes in capital requirements and an increased reluctance to lend owing to deteriorating
loan performance; and a deterioration in the balance sheets of business and households.
(Collectively, these unusual influences were referred to by Chairman Greenspan as “financial
headwinds.”) The basic challenge that we confronted was that these explanations for the
weak recovery suggested channels of influence that were not present in the MPS model. In
this section, we shall review the basic macroeconomic setting and describe how we interpreted
some of these events in light of the MPS model, accumulating research on the credit channel
of monetary policy, and readings from other sources of information. Our emphasis here will
be on how the MPS model was used in the analysis of these issues, rather than on resolving
the questions surrounding the sources of the weak recovery from the last recession. Even
now, we do not believe that there is a well understood or widely accepted answer as to why
economic activity was as weak as it was during that period.
The recession, which began in the third quarter of 1990 and ended in the second
quarter of 1991, was of about average depth and duration compared with other downturns in
the postwar period. However, as seen in chart 1, the subsequent recovery, measured by either
real GDP or industrial production, was much weaker than the norm. The upturn was
especially delayed in labor markets, with nonfarm payrolls having failed to reach their
previous peak by the end of 1992 and with the unemployment rate rising for nearly one and a
half years after the trough in activity. The contour of monetary policy, as indexed by
movements in the federal funds rate, also was atypical both before and after the recession




(chart 2). The Federal Reserve began to lower the funds rate in the spring of 1989, five
quarters prior to the business cycle peak. By contrast, for the average postwar experience, the
federal funds rate continued to rise until the quarter preceding the peak. In large measure,
this reflected the fact that, in the late 1980s, the Federal Reserve was not confronted with
inflation pressures of the magnitude experienced in past episodes. However, a sharper
difference in behavior occurred after the trough in activity. In the early 1990s, the funds rate
declined for six quarters after the trough; for the average experience, the funds rate turned up
shortly after the trough.
Among the explanations of the unusual weakness in this recovery was the contraction
occurring in the depository sector. Growth of depository credit was exceptionally slow during
this period, even by standards of past recessions, and, for a time, credit actually contracted
(chart 3). During this period, the share of securities holdings in bank portfolios rose to a high
level. There were also abundant anecdotal stories of businesses having difficulties acquiring
credit—in some cases, from banks with which they had long-standing relationships. These
stories received some support from the Senior Loan Officer Survey of 50 to 60 large banks
conducted by the Federal Reserve Board. As seen in chart 4, from early 1990 until mid-1992,
banks included in this survey reported tightening credit standards for loans to firms of all
sizes, and growth in aggregate business loans from banks (chart 5) decelerated sharply.
Survey readings from the National Federation of Independent Business (NFIB) also strongly
suggested that small- and medium-sized businesses were having difficulty obtaining credit.
Meanwhile, a growing body of research—with substantial contributions from Federal
Reserve economists—suggested that banks play a special role in the monetary policy




transmission process (see Kashyap and Stein (1993) for a review of this literature). This work
stresses the function that banks serve in the presence of asymmetric information and
emphasizes the transmission of monetary policy shocks through the asset side of the bank
balance sheet to borrowers who rely heavily on bank lending—particularly small businesses
and households. Empirical tests using firm-level data support this hypothesis, showing that
small firms' investment is more sensitive to changes in cash flow than that of larger firms
(Gertler and Gilchrist, 1994).
The supply of credit was not the only unusual financial feature of this period. The
balance sheets of both businesses and households were suffering from the decline in real
estate values and rising debt burdens. In the corporate sector, gross interest payments
absorbed about 40 percent of cash flow (chart 6), and in the household sector, the debt service
burdens associated with mortgage and consumer credit outstanding were estimated to have
risen above 18 percent of disposable income—both highs for the postwar period. Some
observers expressed concern that cash flow and income were being diverted from spending
and toward repair of balance sheets, thus implying that there was more restraint on spending
for any given real interest rate than might normally be the case.
Both of these arguments—credit constriction and balance sheet pressures—certainly
had some degree of plausibility. But neither of these factors played a direct role in the MPS
model, where the principal transmission mechanisms were interest rates, exchange rates, and
asset prices. (This remains the case in FRB/US.) Moreover, there were other explanations of
the weakness that operated through the conventional channels of influence that were
incorporated in the model: fiscal restraint associated with the downsizing of defense spending,




the collapse of commercial construction, and the sharp slowing in activity abroad.
The MPS model served as a useful point of departure for assessing the relative merits
of these arguments. To this end, the structural errors of the model equations were used to
provide an accounting of model surprises, which could then be examined for their congruence
with these alternative explanations of the restraint on activity (Stockton, 1993). The
contribution of the series of structural errors from each sector was assessed by simulating the
full model with the structural errors for that particular sector (or collection of errors, for
sectors characterized by more than one equation) set equal to zero. Thus, the contribution of
errors to aggregate output was taken to include both the direct effects of those errors on
spending and the indirect multiplier-accelerator effects. The results of this exercise are
displayed in table 1 for four major categories of domestic spending and for the set
of equations determining productivity. On the spending side, the structural errors in the
consumption equations accounted for the largest negative shock to output, taking an average
of 0.5 percentage point per year off of real GDP growth over this period. The errors in the
inventory investment block of equations accounted for another 0.1 percentage point
average reduction in real GDP growth over the period. On the supply side, the productivity
equations also accounted for an important share of the effect of the structural errors on real
GDP growth—amounting to a 0.3 percentage point per year restraint on the growth of
activity; the immediate effect of a negative productivity surprise on the MPS model is to raise
demand as income shifts in favor of labor income, out of which there is a larger propensity to
consume. However, over time, the effects of lower productivity on profitability, equity
values, and household net worth act to depress real activity.
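The accounting exercise just described can be sketched in a few lines. The toy model and numbers below are our own illustrative construction, not the MPS model: each sector's structural errors are zeroed in turn, the model is re-simulated, and the difference from the baseline (which includes multiplier feedback) measures that sector's contribution.

    import numpy as np

    # Toy version of the structural-error accounting in Stockton (1993).
    rng = np.random.default_rng(1)
    T, mult = 16, 0.3                      # quarters and a simple multiplier feedback
    errors = {"consumption": rng.normal(-0.12, 0.1, T),
              "inventories": rng.normal(-0.03, 0.1, T)}

    def simulate(active):
        # Output growth: trend + sector shocks + multiplier on last quarter.
        g = np.zeros(T)
        for t in range(T):
            shock = sum(e[t] for e in active.values())
            g[t] = 0.6 + shock + (mult * g[t - 1] if t > 0 else 0.0)
        return g

    baseline = simulate(errors)
    for sector in errors:
        zeroed = {k: (np.zeros(T) if k == sector else v) for k, v in errors.items()}
        contribution = baseline - simulate(zeroed)   # direct plus multiplier effects
        print(f"{sector}: avg contribution {contribution.mean():+.2f} pp per quarter")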




This pattern of errors did not immediately point to credit constriction or balance sheet
stress as an obvious alternative explanation of the slow growth in activity during this period.
The fact that the equipment spending equations were tracking well during this period
suggested that unmodeled credit constraints and balance sheet problems probably were not
exerting an important depressing influence on outlays for capital equipment. By contrast, the
consumption equation errors were consistent with the hypothesis that credit availability might
have been a problem. But most of the shortfall in spending was in nondurables and services,
rather than in durables, where credit constraints might have been expected to exert
greater restraint. Moreover, the results of our Senior Loan Officer Survey never
detected any reduced willingness to lend to consumers in this period (chart 4). And, when
the errors from the consumption equations were regressed on measures of debt-service
burdens, no clear correlation emerged (table 2). The errors in the inventory investment
equations also provided some evidence in favor of a credit supply channel on spending. This
view received further support from research by Kashyap, Stein, and Wilcox (1993), which....
On balance, the pattern of errors from the model provided a hint that spending may have been
affected by the unusually sharp contraction in bank lending, but the results were far from
conclusive.
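For concreteness, the Table 2 exercise amounts to a regression of the consumption-equation residuals on a debt-service-burden measure. The sketch below uses fabricated data purely to show the mechanics; it does not reproduce the staff's estimates.

    import numpy as np

    # Mechanics of the Table 2 exercise with made-up data.
    rng = np.random.default_rng(2)
    n = 40
    burden = rng.normal(16.0, 1.5, n)     # debt service, percent of disposable income
    resid = rng.normal(0.0, 0.4, n)       # consumption-equation residuals

    X = np.column_stack([np.ones(n), burden])
    beta, ssr, *_ = np.linalg.lstsq(X, resid, rcond=None)
    sigma2 = ssr[0] / (n - 2)                               # residual variance
    se = np.sqrt(sigma2 * np.linalg.inv(X.T @ X)[1, 1])     # std. error of slope
    print(f"burden coefficient {beta[1]:+.3f} (t = {beta[1] / se:.2f})")

A statistically insignificant slope, as in table 2, indicates no clear correlation between the residuals and the burden measure.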
The errors in consumption equations were also noted by other researchers during this
period (Hall (1993); Blanchard (1993)), who argued that there had been an exogenous shift in
consumption spending relationships. It was observed that consumer sentiment was much
weaker than could be explained by its usual correlates, including income growth, the
unemployment rate, and inflation. Indeed, inclusion of the Michigan survey measure of




consumer sentiment in the consumption equations of the MPS model virtually eliminated the
error in the equation for nondurables and services. Thus, the unusual weakness in consumer
spending could have resulted more from heightened insecurity about job and income prospects
than from the supply or demand for credit (Carroll, 1992). As a data footnote to this
exercise, we should note that flow of funds estimates of household net worth were
subsequently revised down by a substantial amount—largely on the basis of data showing
much weaker real estate wealth than indicated by the preliminary figures—further erasing the
consumption puzzle of this period. At the time, we recognized that this was a possibility
(Stockton, 1993), but the magnitude of the revision was, nonetheless, surprising.
Despite the lack of a smoking gun from the examination of the performance of the
MPS model during this period, the staff gave weight to the possibility that credit constraints
and balance sheet problems were holding back aggregate demand. The micro-level research
on the role of bank credit, the anecdotal reports of credit availability difficulties, and survey
evidence gathered from the banks themselves suggested that these influences could not be
dismissed. Certainly, judging from public pronouncements, many Fed policymakers also were
of the view that these influences were exerting a significant drag on activity. And, beyond
public statements, the actions of the FOMC also suggest that there was a view that unusual
forces were weighing on the economy; the federal funds rate was well below the level
consistent with a mechanical reading of the Taylor rule, which has been suggested by some to
describe fairly well the behavior of the FOMC since 1987.
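For reference, the mechanical reading in question applies the rule proposed by Taylor, conventionally written with Taylor's original illustrative coefficients (this is the textbook statement, not a formula from the staff's models):

    i_t = r^* + \pi_t + 0.5\,(\pi_t - \pi^*) + 0.5\, y_t

where $r^*$ and $\pi^*$ (both 2 percent in Taylor's formulation) are the equilibrium real rate and the inflation objective, and $y_t$ is the percentage deviation of output from potential. In the early 1990s, the actual funds rate ran well below this prescription.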
This description of the events and analyses of the 1989 to 1992 period is intended to
highlight some of the difficulties that are encountered in the “real time” use of a large
macroeconometric model. We doubt that any model can avoid encountering periods when
events, theory, or data raise questions about key elements of specification. However, having a
well developed macroeconometric model is essential at least for framing questions about
current economic developments and for testing that model against alternative explanations of
events. But, we must recognize that the current state of our knowledge about empirical
macroeconomics suggests that no one model is likely to provide a fully adequate description
of the many complex phenomena that will confront policymakers. For this reason, we believe
the eclectic approach chosen by the staff for developing macroeconomic analysis, at present,
has no superior alternative.
6. Conclusion
Econometric models play an important role at the Federal Reserve Board. The staff
has long devoted considerable effort to the design, estimation, and implementation of
large-scale macroeconometric models. These models are important inputs into the forecast process
and into much of the staff's macroeconomic and monetary policy analysis. That said, the
economic environment is a forbidding one for models: Appropriate specification and
identification of models is elusive; data are faulty and subject to revision; the economy is in a
constant state of flux; and events not contemplated at the time of model design frequently
buffet the economy. Given the current state of econometric knowledge, these considerations
imply that judgment will continue to be a necessary ingredient in the policy process for the
foreseeable future.




Table 1
Effects of Structural Errors on Real GDP

Columns: average annual effect on Q4/Q4 real GDP growth, 1989-1992, followed by the simulated effect on the level of real GDP in 1989, 1990, 1991, and 1992.
Rows: Consumption; Equipment investment; Housing; Inventory investment; Productivity.
[Most individual cell values could not be recovered from the extraction. As reported in the text, the average annual effects include -0.5 percentage point for consumption, -0.1 for inventory investment, and -0.3 for productivity.]

Table 2
Consumption and Debt-Service Burdens

                                      Coefficients on:
Consumption category            Total           Mortgage        Installment
Nondurables and services        .005 (.83)      .052 (.33)      .066 (.86)
Motor vehicles                  -.012 (.38)     -.004 (.11)     -.034 (.93)
Other durables                  .015 (1.70)     -.004 (.20)     .023 (2.39)

NOTES:
1. Each consumption equation was re-estimated with a debt-service measure. T-statistics are indicated in parentheses. Sample periods end in 1989:Q4.
2. For nondurables and services, the estimated coefficients indicate the effect on the marginal propensity to consume out of disposable income of a one percentage point change in the debt burden ratio. Coefficients in the two durables equations are semi-elasticities, measuring the short-run percentage change in spending that results from a one percentage point or one index point change in the regressor.







Chart 1
Cyclical Comparisons
Four panels plotting the current episode (peak = 1990:Q3) against the average of past cycles (peaks: 53:Q3, 57:Q3, 60:Q2, 69:Q4, 73:Q4, 81:Q3) by number of quarters from the peak; panel titles include Real GDP, Industrial Production, and Nonfarm Payroll Employment (the fourth panel title was not recoverable from the extraction).

Chart 2
The Federal Funds Rate
Two panels: the funds rate indexed around the cyclical peak (peak = 100) and around the cyclical trough (trough = 100), current episode versus the average index of the '57, '60, '69, '73, and '81 cycles.

Chart 3
Debt Growth
Funds raised in U.S. financial markets as a percent of GDP, excluding borrowing related to deposit insurance.

Chart 4
Two panels: the net percentage of banks tightening credit standards, and bank willingness to lend to consumers (weighted response of banks more willing minus banks less willing to lend).

Chart 5
Bank Loans and Securities
Two panels, 1973 to 1993: bank commercial and industrial loans (4-quarter percent change) and bank securities as a percent of bank credit.

Chart 6
Two panels: gross interest payments of nonfinancial corporations as a percent of cash flow (1968 to 1996), and the consumer-and-mortgage debt service burden as a percent of disposable personal income (1965 to 1993).

Bibliography
Ando, A. and F. Brayton (1995) “Prices, Wages, and Employment in the U.S. Economy: A
Traditional Model and Tests of Some Alternatives,” in The Natural Rate of
Unemployment: Reflections on 25 Years of the Hypothesis, ed. R. Cross, Cambridge
University Press.
Blanchard, O. (1993) “Consumption and the Recession of 1990-1991,” American Economic
Review, 83(2), 270-274.
Brayton, F., A. Levin, R. Tryon, and J. Williams (1996) “The Evolution of Macro Models at
the Federal Reserve Board,” prepared for the Carnegie-Rochester Conference Series,
October 1996.
Brayton, F. and P. Tinsley (1996) “Effective Interest Rate Policies for Price Stability,”
Economic Modelling, vol. 13, 289-314.
Brayton, F. and E. Mauskopf (1985) “The Federal Reserve Board MPS Quarterly Econometric
Model of the U.S. Economy,” Economic Modelling (July), 170-212.
Braun, S. (1990) “ ” Journal of Business and Economic Statistics, Vol. 18, No. 2, 293-?
Brayton, F. and P. Tinsley (1996) “A Guide to FRB/US: A Macroeconometric Model of
the United States,” FEDS working paper 96-42.
Carroll, C. (1992) “The Buffer-Stock Theory of Saving: Some Macroeconomic Evidence”
Brookings Papers on Economic Activity (2) 61-135.
Edison, H. (1991) "Forecast Performance of the Exchange Rate Models Revisited,"
Applied Economics, Vol. 23, 187-196
Edison, H. and D. Pauls (1993) "A Re-Assessment of the Relationship Between Real
Exchange Rates and Real Interest Rates: 1974-1990," Journal of Monetary Economics,
vol. 31, 165-187
Fischer, S. (1996) “Why are Central Banks Pursuing Long-run Price Stability?” conference
sponsored by the Federal Reserve Bank of Kansas City, August 1996.
Fuhrer, J. and G. Moore (1995) “Monetary Policy Trade-offs and the Correlation between
Nominal Interest Rates and Real Output,” American Economic Review (March), 219-239.
Fuhrer, J. and B. Madigan (1996) “Monetary Policy when Interest Rates are Bounded at Zero,”
Review of Economics and Statistics, forthcoming.




Gertler, Mark and Simon Gilchrist (1994) “Monetary Policy, Business Cycles, and the
Behavior of Small Manufacturing Firms,” Quarterly Journal of Economics, 309-340.
Hall, R. (1993) “Macro Theory and the Recession of 1990-1991,” American Economic Review
(May), 83(2), 275-279.
Kashyap, A. and J. Stein (1993) “Monetary Policy and Bank Lending,” in N. Gregory Mankiw,
ed., Monetary Policy, University of Chicago Press for the National Bureau of Economic
Research, 221-262.
Kashyap, A., J. Stein, and D. Wilcox (1993) “Monetary Policy and Credit Conditions: Evidence
from the Composition of External Finance,” American Economic Review (March), 78-98.
Levin, A. (1996) “A Comparison of Alternative Monetary Policy Rules in the Federal
Reserve Board's Multicountry Model,” Bank for International Settlements Conference
Papers, vol. 2, 340-366.
McCallum, Bennett (1984) “Monetarist Rules in Light of Recent Experience,” American
Economic Review Proceedings, vol. 74, 388-391.
McCallum, Bennett (1988) “Robustness Properties of a Rule for Monetary Policy,”
Carnegie-Rochester Conference Series on Public Policy, vol. 29, 173-204.
Meese, R. and K. Rogoff (1983) “Empirical Exchange Rate Models of the Seventies: Do They Fit
Out of Sample?” Journal of International Economics, vol. 14, 3-24.
Meyer, L. (1996) “Monetary Policy Objectives and Strategy,” speech before the National
Association of Business Economists, Boston, Massachusetts.
Oliner, S., G. Rudebusch, and D. Sichel (1996) “The Lucas Critique Revisited: Assessing the
Stability of Empirical Euler Equations for Investment,” Journal of Econometrics, vol. 70,
291-316.
Oliner, S., G. Rudebusch, and D. Sichel (1995) “New and Old Models of Business Investment: A
Comparison of Forecasting Performance,” Journal of Money, Credit and Banking,
vol. 27, 806-826.
Orphanides, A., D. Small, V. Wieland, and D. Wilcox (1996) “A Quantitative Exploration of the
Opportunistic Approach to Disinflation,” mimeo in process, Board of Governors of the
Federal Reserve System.
Porter, R. and J. Feinman. “The Continuing Weakness in M2,” FEDS working paper no.
209, Board of Governors of the Federal Reserve System, October 1992
Romer, C. and D. Romer. “Federal Reserve Private Information and the Behavior of Interest
Rates,” NBER working paper no. 5692, July 1996




Stevens, G., R. Berner, P. Clark, E. Hernandez-Cata, H. Howe, and S. Kwack (1984) The
U.S. Economy in an Interdependent World: A Multicountry Model. Board of Governors
of the Federal Reserve System (Washington, D.C.).
Stockton, D. “The Business Cycle in the United States: Evaluating Traditional and
Alternative Paradigms,” The Business Cycle and Financial Policy Coordination, The
Tenth International Symposium of the Economic Planning Agency of the Government
of Japan, March 1993.
Stockton, D. and J. Glassman (1987) “An Evaluation of the Forecast Performance of Alternative
Models of Inflation,” Review of Economics and Statistics, February 1987, 108-117.
Stockton, D. and C. Struckmeyer (1989) “Tests of the Specification and Predictive Accuracy of
Nonnested Models of Inflation,” Review of Economics and Statistics, May 1989, 275-283.
Summers, L. (1991) “How Should Long-term Monetary Policy be Determined?” Journal of
Money, Credit, and Banking, vol. 23, 625-631.
Tinsley, P. (1993) “Fitting Both Data and Theories: Polynomial Adjustment Costs and
Error-Correction Decision Rules,” FEDS working paper 93-21.
Yellen, J. (1996) “Monetary Policy: Goals and Strategies,” speech before the National
Association of Business Economists, Washington D.C.