

ECON FOCUS
FEDERAL RESERVE BANK OF RICHMOND
FIRST QUARTER 2018 | VOLUME 23 NUMBER 1

Are Markets Too Concentrated? Industries are increasingly concentrated in the hands of fewer firms. But is that a bad thing?

Do Entrepreneurs Pay to Be Entrepreneurs?
Private Currency Before Cryptocurrency
Interview with Jesús Fernández-Villaverde

COVER STORY
Are Markets Too Concentrated? Industries are increasingly concentrated in the hands of fewer firms. But is that a bad thing?

FEATURES
Paying for Success/State and local governments are trying a new financing model for social programs
Do Entrepreneurs Pay to Be Entrepreneurs?/Some small-business owners are motivated more by values than financial gain

DEPARTMENTS
President's Message/Taxes and the Fed
Upfront/Regional News at a Glance
At the Richmond Fed/The Great ATM Shakeout (NEW)
Federal Reserve/A Taxing Question for the Fed
Jargon Alert/Human Capital
Research Spotlight/Did Workers Get Worse at Finding Jobs?
Policy Update/Reforming Corporate Taxes
Interview/Jesús Fernández-Villaverde
Economic History/When Banking Was 'Free'
Book Review/Capitalism without Capital: The Rise of the Intangible Economy
District Digest/Land-Use Regulations: A View from the Fifth District
Opinion/TFP, Prosperity, and the FOMC

Econ Focus is the economics magazine of the Federal Reserve Bank of Richmond. It covers economic issues affecting the Fifth Federal Reserve District and the nation and is published on a quarterly basis by the Bank's Research Department. The Fifth District consists of the District of Columbia, Maryland, North Carolina, South Carolina, Virginia, and most of West Virginia.

DIRECTOR OF RESEARCH: Kartik Athreya
EDITORIAL ADVISER: Aaron Steelman
EDITOR: Renee Haltom
SENIOR EDITOR: David A. Price
MANAGING EDITOR/DESIGN LEAD: Kathy Constant
STAFF WRITERS: Helen Fessenden, Jessie Romero, Tim Sablik
EDITORIAL ASSOCIATE: Lisa Kenney
CONTRIBUTORS: Selena Carr, Santiago Pinto, Michael Stanley
DESIGN: Janin/Cliff Design, Inc.

Published quarterly by the Federal Reserve Bank of Richmond, P.O. Box 27622, Richmond, VA 23261. www.richmondfed.org, www.twitter.com/RichFedResearch. Subscriptions and additional copies: Available free of charge through our website at www.richmondfed.org/publications or by calling Research Publications at (800) 322-0565. Reprints: Text may be reprinted with the disclaimer in italics below. Permission from the editor is required before reprinting photos, charts, and tables. Credit Econ Focus and send the editor a copy of the publication in which the reprinted material appears. The views expressed in Econ Focus are those of the contributors and not necessarily those of the Federal Reserve Bank of Richmond or the Federal Reserve System. ISSN 2327-0241 (Print), ISSN 2327-025x (Online).

MESSAGE FROM THE PRESIDENT
Taxes and the Fed

It's a pleasure to meet you, the readers of Econ Focus, for the first time. I've come to the Richmond Fed after a 30-year career in consulting at McKinsey & Company, including serving as chief financial officer, chief risk officer, and the leader of our five offices in the South. While I'm new to Richmond, I'm not totally new to the Federal Reserve System; between 2009 and 2014, I was a member of the Atlanta Fed's board of directors, including two years as the board's chair.
I will be relying on these experiences as I look at how economic policy translates "on the ground" and affects businesses' and consumers' decisions. And I'm still learning. Since I became president of the Richmond Fed in January, I've been spending time with business and community leaders so I can better understand local economic conditions and the economic issues facing the people who live and work in our district.

One question many people are asking is how the Tax Cuts and Jobs Act, which Congress passed at the end of 2017, will affect the economy and thus the appropriate path for monetary policy. Among other changes, the legislation lowered corporate and individual tax rates. Many observers expect lower tax bills to give the economy at least a moderate boost. This is a reasonable expectation; if people are paying less in taxes, then they have more money to spend on other things.

But how people respond to tax cuts depends on a number of variables, as Helen Fessenden discusses in the article "A Taxing Question for the Fed" in this issue. One factor she notes is whether the tax cut is expected or a surprise. In general, research suggests that unanticipated tax cuts have a larger net effect on output and investment than anticipated tax cuts. That's because anticipated tax cuts actually appear to cause some contraction in the short term. She also says it matters if the tax cut is permanent or temporary. Basic economic theory holds that households try to keep their consumption smooth over time, so consumers might be less likely to change their behavior if they know a tax cut is going to expire. The effects of tax cuts could also be muted if people expect they will lead to large deficits, which would eventually need to be addressed via tax increases or spending reductions.

The aggregate effects of corporate tax cuts are especially hard to predict, as companies with different corporate structures may respond in different ways. Corporations also vary in their decisionmaking about whether to channel tax savings into capital purchases or upgrades, employee compensation, or returns to shareholders. Tim Sablik reviews the recent changes to corporate tax policy in this issue's "Policy Update."

Given these many uncertainties, the Federal Open Market Committee has been cautious when assessing the future impacts of the recent tax legislation. Moreover, fiscal policy is just one of many factors that influence the committee's policy decisions. For example, we're also looking closely at labor force participation. The labor force participation rate has been drifting downward since around 2000, as baby boomers reach retirement age and more young people enroll in college, among other factors. During and after the Great Recession, the decline accelerated when some people became discouraged about their job prospects. Now, it's possible that a strong economy and labor market have induced some people to look for work who might otherwise not have, thus pushing participation above its long-term trend. This, among other factors, might help moderate the upward pressure on wages from further declines in the already-low unemployment rate.

Another important factor is productivity growth, which influences the economy's growth potential and thus the appropriate policy rate for monetary policymakers to target. Our director of research, Kartik Athreya, discusses this topic in more detail in his "Opinion" column.

I hope you enjoy this issue, and I look forward to continuing the conversation.
EF

TOM BARKIN
PRESIDENT, FEDERAL RESERVE BANK OF RICHMOND

UPFRONT
Regional News at a Glance
BY LISA KENNEY

MARYLAND — In March, the Maryland State Department of Assessments and Taxation launched Maryland Business Express, a website to help entrepreneurs start, plan, grow, and manage small businesses. The website consolidates information from several state agencies to give users practical tips as well as clear tax and regulatory information. It features "Chatbot," a digital assistant that users can access 24 hours a day, seven days a week; according to the agency, it marks the first time this kind of digital assistant technology has been used by a state government for business purposes. In April, the site was one of the winners of "State IT Innovation of the Year" from StateScoop 50, an annual nationwide awards program that honors achievements in government IT.

NORTH CAROLINA — In April, Corning announced that it will build a new manufacturing facility in Durham that will focus on one of its newest products, Valor Glass, a material for pharmaceutical packaging that is more damage- and contamination-resistant than other packaging. Corning expects to invest $190 million in the plant, which will create 300 jobs with average annual salaries projected to be more than $65,000. Corning was awarded economic development grants by the state and the county, including two tax-reimbursement grants.

SOUTH CAROLINA — Clemson University industrial engineering professor Sara Riggs received a $550,000 grant from the National Science Foundation to fund a five-year research project on adapting workplaces for workers with disabilities. The grant was announced in March, and funding will begin in July. Riggs will bring her research into the real world by collaborating with ClemsonLIFE, a university program for students with intellectual disabilities, and Walgreens, which has a distribution center nearby.

VIRGINIA — In late March, Microsoft announced it will buy 315 megawatts of power from the planned Pleinmont I and II solar facilities in Spotsylvania County; Pleinmont is part of a larger 500-megawatt solar development, which will be the largest in Virginia once it is operational. It marks the largest-ever corporate purchase of solar energy in the United States. Microsoft is the first company to buy energy from the Pleinmont site, which the developers say will help them offer competitive pricing to other potential buyers. Microsoft says the purchase will allow the company's data centers in Virginia to be powered completely by solar energy.

WASHINGTON, D.C. — The arts sector generates 8.4 percent — $10.2 billion — of D.C.'s GDP, a larger share than that of any state, according to a new report from the Bureau of Economic Analysis and the National Endowment for the Arts. The report measures economic activity related to arts and culture between 2001 and 2015. D.C.'s high concentration of federal museums and monuments, plus the Smithsonian Institution, helped push it to the top of the report's GDP percentage list. Nationwide, according to the report, the arts contributed $763.6 billion to the U.S. economy.

WEST VIRGINIA — In late March, three projects were awarded $5 million in business development grants from the Commerce Department's Economic Development Administration. The projects are projected to create or retain 755 jobs and lead to $7.5 million worth of private investments.
The three projects are the rehabilitation of the Wheeling Corrugating Plant in Beech Bottom, the expansion of a business park in the city of Weirton, and the rehabilitation of the Rahall Business and Technology Center in Greenbrier County.

AT THE RICHMOND FED
The Great ATM Shakeout
BY DAVID A. PRICE

Editor's Note: Welcome to the inaugural "At the Richmond Fed" column, a new series profiling economic research activities at the Richmond Fed.

Highlighted Research: "Innovation, Deregulation, and the Life Cycle of a Financial Service Industry." Fumiko Hayashi, Bin Grace Li, and Zhu Wang. Review of Economic Dynamics, October 2017, vol. 26, pp. 180-203.

Soon after banks introduced automated teller machines (ATMs) in the late 1960s and early 1970s, shared networks of the machines — that is, networks in which multiple banks participated — began to emerge. The number of shared networks grew steadily until reaching a peak of around 120 in the mid-1980s. Then a massive shakeout took place: Only half of the networks remained by the mid-1990s, and less than half of those remained by the mid-2000s. Around 35 percent of the exits took place through mergers or acquisitions, while the other networks simply disappeared. What happened?

That's the question asked and answered in recent research by Richmond Fed economist Zhu Wang, together with Fumiko Hayashi of the Kansas City Fed and Bin Grace Li of the International Monetary Fund.

Several suspects were on the scene. One was the general banking deregulation that started taking effect in the mid-1980s, which allowed banks to operate statewide and across state lines and led to consolidation in the banking industry. Another was an ATM-specific deregulation: The U.S. Supreme Court in 1986 let stand an appeals court ruling that national banks were allowed to use ATMs in shared networks across state lines without violating federal branching restrictions. Finally, an important technological innovation occurred in the mid-1980s with the introduction of debit cards that could be used at both retail locations and ATMs, increasing the usefulness of the cards.

Wang, who joined the Richmond Fed in 2011, had done extensive research on payment systems. But before he became interested in the life cycle of the ATM industry, he was mainly involved in another line of research — namely, the life cycle of the television manufacturing industry. In that work, he studied the shakeout of TV manufacturers in the United States and the United Kingdom. The ATM industry seemed to present a related, yet novel, research frontier. "It's popular to look at the shakeout pattern in manufacturing industries from the industry's birth to its peak number of firms and then its consolidation," Wang says. "But for the service sector, there are very few studies of the life cycle. And the shakeout pattern among ATM networks was intriguing — what can explain this?"

Hayashi, a former colleague at the Kansas City Fed, collected detailed historical data on the ATM industry. For the years in their study (1983 to 2005), the data they looked at included the entry and exit of networks, the number of ATM-only and hybrid ATM-debit networks, the number of cards in circulation for each network, and the number of ATM transactions for each network. (Some of the networks were owned by a single bank, others by multiple banks in partnership.) An early version of their article focused on the effect of the debit innovation.
But during the revision process, an editor and an anonymous referee encouraged them to be more ambitious — to create a more complete model that would also capture the regulatory changes. "At that point," Wang recalls, "we added a third author, Bin Grace Li, who made major contributions to the structural modeling and computation."

The results indicated that the introduction of retail debit cards was the most important cause of the shrinking number of networks. "The debit innovation probably accounted for most of the shakeout," Wang says, "but banking and ATM deregulation added quite a bit of welfare gain to the consumers by reducing the cost of providing the service." The researchers concluded that the debit-card innovation drove the shakeout by creating a "technological gap" between networks that adopted it early and those that didn't, a gap that continued to widen — eventually blocking new entrants and causing the laggard networks to exit. With regard to the welfare gain to consumers, the researchers estimated that as of 2008 (roughly a quarter-century after the various shocks that they were studying), 43 percent of the gain was due to deregulation and 57 percent to the debit innovation.

The study also found that large ATM networks had a higher annual adoption rate for the debit-card innovation. The researchers noted that with a large base of cardholding customers, large networks may have had an advantage in persuading retailers to accept their debit cards — which, in turn, may have made it easier for those networks to justify investing in debit technology.

The hardest part of the project for Wang and his co-authors, he says, was working out "how to put things together" — how to act on the editor's request to assemble a broader structural model and match it with the data. But it was worth it, he says, to build up "a coherent framework to understand the evolution of a service industry and the roles played by innovation and deregulation." EF

FEDERAL RESERVE
A Taxing Question for the Fed
The Fed has long emphasized uncertainty in assessing the economic effects of tax cuts. Both history and theory might help explain why
BY HELEN FESSENDEN

On Dec. 13, 2017, the Federal Open Market Committee, or FOMC, concluded its policy meeting by raising the fed funds rate for the fifth time since the Great Recession, citing a strengthening labor market and "solid" economic activity. The day was also the occasion of the final press conference of outgoing Fed Chair Janet Yellen, which immediately followed the meeting. The initial round of questions from reporters, however, focused on fiscal policy — specifically, President Donald Trump's tax bill, which was about to clear Congress. (See "Policy Update," page 21.) By cutting individual and corporate taxes by $1.5 trillion over 10 years, its backers argued, it would encourage greater investment, ultimately spurring productivity and boosting economic growth.

Yet when asked how the legislation would affect the Fed's outlook on output, inflation, and monetary policy, Yellen struck an agnostic tone. "Much uncertainty remains about the macroeconomic effects of the specific measures that ultimately may be implemented," she said, referencing the economic projections that the committee issues on a quarterly basis.
"Changes in tax policy [are] only one of a number of factors, including incoming data that has, to some extent, altered the outlook for growth and inflation."

Other Fed officials issued similar caveats in the following months. The FOMC's January 2018 minutes, for example, noted that "several participants expressed considerable uncertainty about the degree to which changes to corporate taxes would support business investment and capacity expansion." And in his first press conference as Fed chairman, Jerome Powell said that while the bill had "elements that should encourage investment, which should help productivity [and] encourage labor force participation," the committee also didn't "know how big those effects are going to be."

This episode isn't an exception. For long-standing reasons, the Fed has generally been guarded in assessing the degree and timing of tax cut effects. Former Fed Chairman Ben Bernanke explained this approach in a blog post last year. For one, he wrote, the Fed tends to be a cautious institution in the face of uncertainty. In particular, the details of tax changes are often unclear until passage and sometimes well thereafter. The Fed also faces the daunting task of incorporating all relevant variables into its forecasts — including factors that might mitigate the impact of tax policy — while steering clear of making public statements on political matters.

As Bernanke also noted, however, the Fed's caution reflects what economists know about the likely effects of tax changes. Most tax cuts since 1981 have been temporary, phased in, later offset, or passed at times when any stimulus was facing economic headwinds — and theory suggests all these factors affect the magnitude and timing of the resulting stimulus.

The Long and Short of It

Economic theory holds that people seek to smooth out consumption over time based on their expectations of income and wealth for their entire lifetime — what economists call the "permanent income hypothesis." If you get a one-time bonus, for example, theory would imply that your boost in consumption today will be less than the full amount of the bonus because you'll want to save at least some of it for future consumption. On the other hand, if you have an increase in your salary and you believe it's permanent, you'll spend a larger share of the increase.

The ideas behind the permanent income hypothesis are one reason why economists have long argued that if fiscal policymakers truly want to boost investment, consumption, and output, permanent tax cuts are far more effective than temporary ones. A related idea, called "Ricardian equivalence," suggests that to the extent people believe tax cuts today will be financed by tax hikes in the future, they will save an equivalent amount of the tax cut in preparation. This behavior could dampen any intended stimulative effect, but whether this actually happens has been long debated — in part because tax cuts, in practice, tend not to be offset in the long run.

History does, indeed, offer examples of permanent income changes. During the postwar decades, Americans got a series of long-standing boosts. One was permanent tax cuts that Congress passed fairly frequently from the 1940s through the 1970s, in part as a way to adjust tax brackets, which weren't indexed for inflation. (This "bracket creep" meant that without such adjustments, inflation eroded the real value of incomes, leaving taxpayers stuck with de facto higher marginal rates.)
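To put stylized numbers on the consumption-smoothing logic above (an illustration with made-up figures and a zero interest rate, not a calculation from the research discussed here): a household that spreads its resources evenly over a remaining horizon of T years responds to a one-time windfall B and to a permanent raise in income of Δy roughly as

\[
\Delta c_{\text{one-time}} \approx \frac{B}{T}, \qquad \Delta c_{\text{permanent}} \approx \Delta y .
\]

With T = 40, a one-time $1,000 bonus lifts annual consumption by only about $25, while a permanent $1,000-a-year raise lifts it by nearly the full $1,000 a year. This gap is why the permanence of a tax change matters so much for the size of the spending response.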
Another route was the steady increase in Social Security benefits, a positive income shock for recipients. In a 2016 study of the program over decades, University of California, Berkeley economists Christina and David Romer found that household consumption responded much more strongly to permanent benefit increases than it did to temporary ones, lending some support to the permanent income hypothesis.

Wait and See

To be sure, in the real world, people don't have perfect knowledge of their lifetime earnings. But the evidence suggests that households and firms do react in anticipation of future events. In the case of tax cuts, they might respond when a politician or party running on a tax cut platform has secured an electoral win. Tax cuts can also be anticipated before they take effect if policymakers phase them in over multiple years. Research suggests that this delay between when expectations are set and when tax cuts are implemented influences their economic impact.

Economists Karel Mertens of the Dallas Fed and Morten Ravn of University College London tested this idea in a study analyzing all U.S. tax cuts and hikes from 1947 through 2003 to see whether expected and unexpected changes had different outcomes. They found that unexpected tax cuts (implemented within 90 days of passage) did boost hours worked, consumption, investment, and output, taking about two and a half years to peak. But if these changes were expected (beyond the 90-day window), then investment, hours worked, and output often fell before the cuts kicked in — and picked up only after the cuts took effect — while consumption saw a smaller, briefer drop. Expected cuts still had a net stimulus effect, but it was more muted than when tax cuts came as a surprise. More broadly, the authors concluded, these "anticipation effects" could explain one-fifth to one-quarter of business-cycle volatility in this entire period.

So how might delayed implementation of a tax cut on labor income cause a temporary drag on growth? Theory suggests that if you expect a tax cut in the future, you'll feel richer and "buy" more leisure and consumption in the present — what economists call an "income effect." As you take in more leisure, you also work fewer hours and, all else equal, output falls. In standard models, firms respond to this reduced supply of labor by offering higher wages; these higher wages, in turn, draw some people back toward work, potentially dampening or even completely offsetting the income effect. Then, once the tax cut on labor income kicks in, working becomes even more relatively lucrative, so you're inclined to choose even more work over leisure. Under standard assumptions, these "substitution effects" are generally stronger than the income effect once the tax cut is implemented, prompting people to work more on balance, which boosts output.

The effects of an anticipated tax cut that relates to capital can be even more complex since they depend on factors such as depreciation and firms' decisions on investment over time. In fact, given that the returns to capital are accrued over long periods of time, investment may even rise right away, before the tax cut is implemented. In the
broad empirical sample that Mertens and Ravn analyzed, however, investment (like hours worked) fell ahead of anticipated tax cuts but then responded positively once they were in effect; if the cuts were unexpected, investment rose right away, peaking at 10 percentage points for every 1 percentage point drop in tax rates.

In particular, Mertens and Ravn saw the 1981 cuts under President Reagan as a strong test case. They were significant in size — slicing the top marginal rate from 70 to 50 percent, and the lowest from 14 to 11 percent — and were phased in over five stages from 1981 through 1984, while indexing all rates for inflation starting in 1985. In 1981-1982, as consumers and firms waited for most cuts to kick in, Mertens and Ravn found that the drag caused by "anticipation effects" had an even greater recessionary pull than did the Fed's tight monetary policy at the time. Conversely, once the cuts were fully enacted from 1983 on, they helped spur the recovery.

The FOMC, for its part, paid close attention to the phased-in structure of the tax cuts, but its chief concern at the time was taming historically high inflation. While many members feared the combination of higher defense spending and lower taxes would yield higher deficits and, over time, pose greater inflationary risk, the FOMC didn't adjust its strategy — controlling the price level by controlling money-supply growth — in a significant way. Rather, it assessed the tax cuts primarily as a question of how their phase-in would affect money supply. In the committee's June 1982 meeting, for example, then-Chairman Paul Volcker noted the upcoming $30 billion in cuts as one factor that could lead to a seasonal "bulge" in money supply; the Fed could "tolerate" this if needed, he added, "if that makes people happier."

Bust to Boom

The Bush tax cuts of 2001 and 2003 — the most sweeping since 1981 — are another well-studied case. Many of these were also phased in rather than implemented immediately. As such, they yielded similar patterns to the cuts of the 1980s.

The details differed, however. One big change was that the Bush tax cuts were set to expire across the board in January 2011. (The Reagan cuts were permanent, although Congress raised certain types of taxes in the 1980s and 1990s, partially offsetting them.) The reason for the expiration date was that Senate budget rules had changed to require a 60-vote supermajority to pass any permanent legislation that added to the deficit over a decade. Lacking those 60 votes — and bearing a $1.35 trillion price tag over 10 years — the 2001 cuts were required to lapse. Deficit concerns also meant that the tax cuts were phased in (initially, over five years) to lessen the cost.

Two years later, the Bush administration secured its second tax cut victory. It accelerated the 2001 bill's phased-in tax cuts, moving implementation up from 2004-2006 to 2003, and added new provisions cutting capital gains and dividend taxes; these cuts further added to the deficit, so under budget rules, they were also set to lapse. But in contrast to the expected passage of the 1981 and 2001 tax cuts, the 2003 measure was something of a surprise: The Senate vote was so close that Vice President Dick Cheney had to break the tie, and consumers and firms didn't know their future tax cuts would kick in so quickly until the bill passed.

The phased-in component of the Bush tax cuts yielded results comparable to the 1980s, research suggests.
In a 2006 study, University of Michigan economists Christopher House and Matthew Shapiro tested the cuts' "anticipation effects" with a simulated model based on the economy's parameters at the time. From 2001 to 2003, people cut back on hours worked and firms scaled back investment. This was quickly reversed when the 2003 bill sped up the timetable. The full onset of these changes, the authors concluded, contributed to about half of the rebound in economic growth that year, with hours worked and investment suddenly jumping. (They also point to the possibility that the dampening effects on the economy of reduced labor supply in 2001-2003 could have overwhelmed any positive boost that the phased-in cuts for investment might have produced early on.) In their empirical study, Mertens and Ravn make a comparable finding on the pace of the economic pickup in the mid-2000s.

How did the FOMC approach these changes as it deliberated? The published record from 2003 shows that many on the committee remained cautious even as the economy was picking up pace in the summer and fall. The minutes from that time, for example, cited the stimulative role of the accelerated tax cuts in the near term. But some members also noted that some of that stimulus could dissipate in coming years, as most provisions were set to expire due to the sunset feature; much discussion focused on whether the recovery would be strong enough to endure. In a speech in January 2004, for example, then-Governor Bernanke noted the possible dissipation of the effects of the tax cuts as one possible risk in the year ahead that "could adversely affect household spending." (In fact, GDP growth did start slowing down in the summer of 2004, amid dropping consumption.)

Cuts during Crisis

Tax policy doesn't occur in a vacuum. The tax cuts of 2008-2009 — passed in the face of recession, rising unemployment, and high household indebtedness — are a good example. Although those measures were meant to encourage spending, some economists contend the high degree of economic distress shifted households away from consumption and toward saving or paying down debt — which, in turn, lessened the intended boost.

One measure in the 2009 stimulus legislation was the "Making Work Pay" tax cut, a small tax cut for low- and middle-income earners. Rather than a one-off payment, it was implemented over two years through reduced withholding in paychecks, producing a slight bump in take-home pay. It was paired with a 2 percent payroll tax cut for two years (taken from Social Security withholding), which was ultimately extended through 2012.

How well did it work? In a 2012 study, economist Claudia Sahm of the Federal Reserve Board, joined by Matthew Shapiro and Joel Slemrod of the University of Michigan, found that only 13 percent of households surveyed said they would mostly spend the Making Work Pay tax cut. By contrast, in the previous year, about 25 percent of households said they would mostly spend money from another stimulus measure, a one-time rebate check that was enacted early that year. While this finding might lend support to the idea that incremental income boosts are less effective than one-time bumps, the authors also suggested that households in 2009 might have been more reluctant to consume due to broader economic pessimism and even higher indebtedness.
A 2015 Federal Reserve Board study found a similar balance sheet effect with the payroll tax cut: Households saved most of that cut and then actually reduced spending once it expired so that they could continue diverting income to savings.

For its part, the FOMC broadly supported the 2009 stimulus and noted its impact in subsequent meetings as positive for consumer spending. But members also warned about its short-term duration and its small size relative to the fiscal contraction at the state level and the broader housing collapse. More generally, the Fed, as well as many economists and policymakers, worried about what would happen once those measures, combined with the much larger Bush tax cuts, were set to expire in 2010-2011, just as the economy was starting its fragile recovery. A wholesale tax cut expiration, many warned, would severely hurt consumption, especially since monetary policy, constrained by near-zero interest rates, had less scope for stimulating the economy. On these grounds, Congress extended the Bush tax cuts and the payroll tax cut through Jan. 1, 2013.

That extension didn't resolve the political deadlock, however, and it was only through the December 2012 "fiscal cliff" deal that the impasse was resolved. (The compromise made most cuts permanent while ending those for the wealthy.) It was a high-stakes episode for the Fed as well. At the December 2012 FOMC meeting, for example, San Francisco Fed President John Williams pointed to both the economic and confidence risks under such a scenario of high fiscal policy uncertainty. "There's a danger that households and businesses could lose confidence in the ability of our elected leaders to govern," he warned. It also prompted a rare rebuke by Bernanke in February 2013, as he called upon Congress to do more in reviving the economy through consistent, sustainable fiscal policy aimed at healing the labor market rather than leaning on the Fed for stimulus through monetary accommodation. "Monetary policy … cannot carry the entire burden," he told senators.

Today, fiscal policy debates remain as heated as ever. But they often gloss over the fact that tax changes can bring uncertain and complex effects. As research has shown, rational behavior by consumers and firms doesn't necessarily result in the immediate boost that some might expect. How tax cuts are timed and expected, whether the tax applies to labor or firms, and where the broader economy stands are all variables that have made each past tax change a unique experiment unto itself. These realities of fiscal policy can help explain the Fed's preference for caveats and caution when it comes to forecasting fiscal policy's impact on the macroeconomy. EF

Readings

Barro, Robert J. "The Ricardian Approach to Budget Deficits." Journal of Economic Perspectives, Spring 1989, vol. 3, no. 2, pp. 37-54.

Gale, William G., and Peter R. Orszag. "Economic Effects of Making the 2001 and 2003 Tax Cuts Permanent." International Tax and Public Finance, March 2005, vol. 12, no. 2, pp. 193-232.

House, Christopher L., and Matthew D. Shapiro. "Phased-In Tax Cuts and Economic Activity." American Economic Review, December 2006, vol. 96, no. 5, pp. 1835-1849.

Mertens, Karel, and Morten O. Ravn. "Empirical Evidence on the Aggregate Effects of Anticipated and Unanticipated US Tax Policy Shocks." American Economic Journal: Economic Policy, May 2012, vol. 4, no. 2, pp. 145-181.

Romer, Christina D., and David H. Romer.
"Transfer Payments and the Macroeconomy: The Effects of Social Security Benefit Increases, 1952-1991." American Economic Journal: Macroeconomics, October 2016, vol. 8, no. 4, pp. 1-42.

Romer, Christina D., and David H. Romer. "The Macroeconomic Effects of Tax Changes: Estimates Based on a New Measure of Fiscal Shocks." American Economic Review, June 2010, vol. 100, no. 3, pp. 763-801.

Sahm, Claudia R., Matthew D. Shapiro, and Joel Slemrod. "Check in the Mail or More in the Paycheck: Does the Effectiveness of Fiscal Stimulus Depend on How It Is Delivered?" American Economic Journal: Economic Policy, August 2012, vol. 4, no. 3, pp. 216-250.

The Richmond Fed recently hosted a conference on the dynamics of cities. Six leading researchers presented work on a variety of topics in urban and regional economics, including the decline and redevelopment of cities as well as gentrification and other issues facing urban neighborhoods. A compendium, published by the Richmond Fed, includes a summary of the conference as well as interviews with the presenters. Visit: https://www.richmondfed.org/-/media/richmondfedorg/conferences_and_events/research/2017/pdf/cities_in_transition_conf_compendium.pdf

JARGON ALERT
Human Capital
BY DAVID A. PRICE

Theodore Schultz, a University of Chicago economist, gave a talk on a novel subject at the December 1960 annual meeting of the American Economic Association, of which he was president. His subject was "Investment in Human Capital," a young area of economic inquiry at the time. "The mere thought of investment in human beings is offensive to some among us," Schultz felt it necessary to acknowledge. "Our values and beliefs inhibit us from looking upon human beings as capital goods, except in slavery, and this we abhor.… And for man to look upon himself as a capital good, even if it did not impair his freedom, may seem to debase him."

Today, the term "human capital" is far more widely accepted. The concept of human capital — a person's stock of knowledge and skills, including soft skills, that are valued in the labor market — has become central to the thinking of economists and policymakers on education, labor markets, productivity, and economic growth. Economists treat people as forward-looking investors in their own human capital, adding to it through schooling, training, or work experience if their expected rate of return on the additional human capital is sufficient.

Harvard University economist Claudia Goldin has called the 20th century the human capital century — a reference to the widespread increase in schooling during that period. The "high school movement" in the United States early in the century boosted the share of people entering and finishing high school from less than 10 percent in 1910 to around 50 percent in 1940; high school graduation rates reached approximately 70 percent by the end of the century (over 80 percent counting GED recipients). College education has also risen significantly in recent decades. According to a 2016 paper by Camille Ryan and Kurt Bauman of the U.S. Census Bureau, a little more than 15 percent of Americans aged 25 to 29 had completed a four-year college degree in 1970, compared to 36 percent in 2015. Human capital theory holds that this trend has been driven in large part by the expected payoff; as with all investments, people accumulate more human capital when they expect the returns to be higher.
The returns to college are high and growing: Students who complete a four-year undergraduate degree receive, on average, a large wage premium over those who do not, a premium that has been rising since the late 1970s. One analysis has concluded that workers with undergraduate degrees (and who stopped there) received 1.75 times the wages of a high-school-only graduate in 2005, up from 1.4 times in 1980 — a trend that economists believe is a reflection of changes in the demand for skills in a more high-tech-based economy. (Of course, the wage premium depends on the student's field, among other factors.) Moreover, graduate and professional education is of growing importance to earnings: According to a 2012 study by Jonathan James, then of the Cleveland Fed, the college premium is increasingly conditional on the student also completing a graduate or professional degree. All of the growth in the college wage premium since the 2000s, James found, has gone to holders of advanced degrees.

But although higher education is a lucrative human-capital investment for many, it can also be a risky one. Around half of students who enter college end up leaving without a degree — perhaps as a result of inadequate preparation before college or personal difficulties — and the return to attending college without actually earning a degree is generally low. Thus, these students face a depressing combination of debt (or lost savings) and low earnings. Richmond Fed research director Kartik Athreya and co-author Janice Eberly of Northwestern University have argued that such risks have slowed the growth of college-going. (See also the Richmond Fed's 2017 Annual Report essay, "Falling Short: Why Isn't the U.S. Producing More College Graduates?")

Young people who eschew the four-year college route will often still make investments in their human capital — through a two-year associate's degree, on-the-job experience, or formal job-based training programs such as apprenticeships. (See "Learning in the Fast Lane," Econ Focus, Fourth Quarter 2017.)

In Schultz's 1960 remarks on human capital, he noted one of its unusual attributes: Unlike typical investments in physical capital, investments in human capital — especially formal education — are often a consumption good as well, which has the effect of "improving the taste and quality of consumption of students throughout the rest of their lives." This, he said, may increase the true rate of return to education far above its observed financial rate of return. So take heart when your next student-loan payment is debited from your bank account. EF

ILLUSTRATION: TIMOTHY COOK

RESEARCH SPOTLIGHT
Did Workers Get Worse at Finding Jobs?
BY RENEE HALTOM

"Measuring Job-Finding Rates and Matching Efficiency with Heterogeneous Jobseekers." Robert E. Hall and Sam Schulhofer-Wohl. American Economic Journal: Macroeconomics, January 2018, vol. 10, no. 1, pp. 1-32.

Economists view labor markets as one big matchmaking process: job seekers being matched with jobs. The unemployment rate is the outcome of how well this matching process works. A selling point of the matching framework is that it acknowledges that workers are not identical; they have unique skills, abilities, and preferences. This can help explain a number of labor market phenomena, such as why the overall labor market can be very weak while certain types of workers are doing well, or why there can be many job openings with many still unemployed.

This last issue became a focus after the Great Recession, when both openings and unemployment rose dramatically. This could suggest that high unemployment during the recession resulted not only from the downturn, but also from a decline in "matching efficiency" — that is, that the economy had gotten worse at connecting workers with jobs. But to know the extent to which falling matching efficiency caused the elevated unemployment rates, one would need a detailed model of the factors that influence matches.

A recent article by Robert E. Hall of Stanford University and Sam Schulhofer-Wohl of the Chicago Fed has offered just that. The standard matching model implicitly assumes that the unemployed are the only people looking for work. However, people often transition directly from one job to the next, and individuals whom economists consider to be out of the labor force, such as discouraged workers who would like a job but have stopped actively looking, often find jobs as well. The latter group is large and, predictably, tends to find jobs at slower rates, so ignoring them could make the labor market seem rosier than it is. The authors measured matching efficiency across 16 categories of job seekers: one for current workers, two for those out of the labor force, and a full 13 categories of unemployed based on their durations of and reasons for being jobless (down to specifics such as "on furlough for months," "lost permanent job months ago," and "temp job recently ended").

Another innovation of the researchers is looking at job-finding success over a long period of time. People out of work may take jobs more readily even if the position is brief, which could overstate the labor market's true matching success. They measured the probability of employment both near term (between one and three months) and long term (after a full 15 months). For each group and timespan, they held personal characteristics constant.

They estimated job-finding probabilities for each of the 16 groups and compared those probabilities in 2003 and 2013, years when the business cycle was at similar points. The likelihood that a given group of job seekers found a new job between one and three months was lower in most cases. The employment probability for the 12- to 15-month period showed no obvious trend across the groups and was not much higher, which the researchers interpreted as showing the importance of relatively short-duration jobs for certain types of job seekers. Finally, the researchers adjusted for each group's sensitivity to labor market tightness — since tightness should, in principle, boost job finding — to produce a measure of matching efficiency for each category of job finder.

The takeaway from this effort is clear: Matching efficiency for most categories of job finders steadily declined between 2001 and 2013 — but with no special decline from 2007 to 2010 (2010 being roughly when unemployment peaked). In other words, it does not appear that a decline in matching efficiency is the dominant explanation for the large spike in unemployment during the Great Recession.

But aggregate job-finding rates did fall sharply during the recession — so what explains the apparent contradiction? The key is the heterogeneity of workers. Assuming that all job finders locate jobs at the same rate makes matching efficiency look as much as 50 percent worse than it is, the authors calculated. Once one accounts for different job-finding rates among job finders, it becomes clear that it's not that the labor market got particularly worse at matching, but instead that groups with low job-finding rates simply grew in relative size.

These findings are consistent with research by Richmond Fed economist Andreas Hornstein and San Francisco Fed economist Marianna Kudlyak (formerly of the Richmond Fed). In a 2016 study, they found that in a matching framework that differentiates among a broader array of job seekers and factors in their respective likelihoods of finding work, aggregate matching efficiency steadily declined after 2000. (Using a similar idea, with Fabian Lange of McGill University they developed the "Non-Employment Index" as an alternative to the unemployment rate. An additional analysis allows variations in search effort over time across groups.)

The conclusion seems unanimous: Accounting for differences among workers can better help explain episodes of higher unemployment. EF

Are Markets Too Concentrated?
Industries are increasingly concentrated in the hands of fewer firms. But is that a bad thing?
By Tim Sablik

In its heyday in the late 19th and early 20th centuries, Standard Oil Company and Trust controlled as much as 95 percent of the oil refining business in the United States. Domination of markets by large firms like Standard Oil was emblematic of the so-called Gilded Age, and it sparked an antitrust movement. Ultimately, in 1911 the U.S. Supreme Court would order Standard Oil broken up into more than 30 companies.

Today, many sectors of the economy exhibit similar levels of concentration. Google accounts for more than 90 percent of all search traffic. Between them, Google and Apple produce the operating systems that run on nearly 99 percent of all smartphones. Just four companies — Verizon, AT&T, Sprint, and T-Mobile — provide 94 percent of U.S. wireless services. And the five largest banks in America control nearly half of all bank assets in the country.

In response to rising concentration in these and other industries (see chart), commentators and politicians from both sides of the political spectrum have expressed alarm. William Galston and Clara Hendrickson of the Brookings Institution wrote in a January report, "In 1954, the top 60 firms accounted for less than 20 percent of GDP. Now, just the top 20 firms account for more than 20 percent." And a 2017 article in the American Economic Review by David Autor, Christina Patterson, and John Van Reenen of the Massachusetts Institute of Technology; David Dorn of the University of Zurich; and Lawrence Katz of Harvard University reported that concentration increased between 1982 and 2012 in six industries accounting for four-fifths of private sector employment.

If rising market concentration means there is less competition, it could have a variety of economic consequences, from higher prices to lower productivity. As the Fed and other policymakers debate causes of macroeconomic puzzles like the recent productivity slowdown and slow wage growth, some economists have argued that rising concentration levels hold the key to explaining these mysteries.
[CHART: Market Concentration on the Rise — aggregate Herfindahl-Hirschman concentration index (HHI) for all public firms, 1972-2012.]
NOTE: The aggregate Herfindahl-Hirschman Index (HHI) is a weighted average of the HHIs in the United States for all industries. The HHI, in turn, is a common measure of concentration, constructed by summing the squared market shares of the firms in an industry. Shaded region indicates the Great Recession. Data through 2013.
SOURCE: Gustavo Grullon, Yelena Larkin, and Roni Michaely. "Are U.S. Industries Becoming More Concentrated?" Manuscript, April 2017.

Efficiency vs. Market Power

For much of the first half of the 20th century, it was generally assumed that concentration allowed firms to exercise market power. In the 1950s, University of California, Berkeley economist Joe Bain developed models that directly related industry concentration and competition. As markets became more concentrated, Bain reasoned, surviving firms would naturally collude to keep out competitors and increase prices. Courts and agencies during this time took a similar view, ruling against mergers that would increase a firm's market share beyond a certain threshold.

In the 1970s, economists and legal scholars from the University of Chicago began to challenge the idea that concentration should necessarily be viewed with great suspicion. They noted that concentration could rise simply from efficient firms outperforming their rivals and increasing their market shares. In his highly influential 1978 book, The Antitrust Paradox, Robert Bork argued that mergers often benefited society through lower prices and higher productivity, which antitrust policy should take into account. (For more on this history, see "A Matter of Antitrust," Region Focus, Summer 2009.)

Several recent studies have attempted to determine whether the current trend of rising concentration is due to the dominance of more efficient firms or a sign of greater market power. The article by Autor, Dorn, Katz, Patterson, and Van Reenen lends support to the Chicago view, finding that the industries that have become more concentrated since the 1980s have also been the most productive. They argue that the economy has become increasingly concentrated in the hands of "superstar firms," which are more efficient than their rivals.

The tech sector in particular may be prone to concentration driven by efficiency. Platforms for search or social media, for example, become more valuable the more people use them. A social network, like a phone network, with only two people on it is much less valuable than one with millions of users. These network effects and scale economies naturally incentivize firms to cultivate the biggest platforms — one-stop shops, with the winning firm taking all, or most, of the market. Some economists worry these features may limit the ability of new firms to contest the market share of incumbents. (See, for example, "Interview: Jean Tirole," Econ Focus, Fourth Quarter 2017.)

Of course, there are exceptions. Numerous online firms that once seemed unstoppable have since ceded their dominant position to competitors. America Online, eBay, and MySpace have given way to Google, Amazon, Facebook, and Twitter.
“It’s easy to say that because there are scale economies in these businesses there can never be competition,” says Richard Schmalensee, an economist at the Massachusetts Institute of Technology who has written extensively on the industrial organization of platforms. “But there are scale economies in a lot of businesses. They limit the extent of competition, but they don’t wipe it out.”  10 8 6 4 2 0 1978  1982  1986  Firm Entry Rate  1990  1994  1998  2002  2006  2010  2014  Firm Exit Rate  NOTE: Shaded region indicates the Great Recession. Data through 2015. SOURCE: U.S. Census Bureau Business Dynamics Statistics and author’s calculations.  On the other hand, some researchers have argued that this time may be different. Entry rates for new firms have fallen in recent years, perhaps signaling that challengers are finding it increasingly difficult to gain a foothold. (See chart.) This could be the result of anticompetitive behavior on the part of incumbent firms. Last year, European Union antitrust authorities hit Google with a record-setting 2.42 billion euro fine for allegedly manipulating its search engine results to favor its own services over those of competitors. “You don’t want to inhibit firms from taking advantage of economies of scale,” says Schmalensee. “On the other hand, you don’t want those economies to get baked into monopoly positions that are defended by unfair means.” Technology, and the patents on that technology, may be another way incumbents create barriers for challengers. In E CO N F O C U S | F I R ST Q U A RT E R | 2 0 1 8  11  R  ising market concentration may have a negative effect on  innovation and economic dynamism. a 2017 working paper, Gustavo Grullon of Rice University, Yelena Larkin of York University, and Roni Michaely of Cornell University found that since 2000, firms in concentrated markets have had more patents than firms in less concentrated ones. Those patents held by firms in concentrated markets also tended to be the most valuable, representing an expensive hurdle to new firms seeking to enter those markets.  Price and Wage Effect Prices may provide another signal of how much competition exists in concentrated markets. Firms that are able to protect themselves from competitors have more power to raise prices above marginal costs with less fear of being undercut. In a perfectly competitive market, such markups would induce new firms to enter the market and offer lower prices, eventually bringing markups closer to zero. Actually measuring markups is tricky, however. It requires some knowledge of firms’ underlying costs, which are typically not fully available to researchers. Researchers must infer marginal costs from total cost data. Additionally, in order to analyze markups across an entire industry, economists may assume that all firms in that industry face the same marginal cost structure. Depending on how realistic that assumption is, it may skew the results. Given these challenges, it is perhaps unsurprising that economists have found conflicting evidence on markups. A 2018 working paper by Jan De Loecker of Princeton University and Jan Eeckhout of University College London and Universitat Pompeu Fabra Barcelona found that markups have risen substantially since 1980 — from 18 percent above cost to 67 percent above cost today. They argue this increase is the result of rising market power. On the other hand, higher markups could be driven by changing costs. 
In a recent working paper, James Traina of the University of Chicago found that the growth in markups reported by De Loecker and Eeckhout largely disappears after accounting for the increase in marketing costs as a share of firms’ total operational costs during the same period. Thus, it is not entirely clear whether rising market concentration today is allowing firms to exercise market power and charge higher markups. Firms in concentrated industries could also exercise market power over the inputs to their production, such as labor. In highly concentrated markets, firms might collude to reduce competition for workers and thus pay lower wages. In 2010, the Department of Justice investigated claims that Apple, Google, Intel, Intuit, Pixar, and Adobe 12  E CO N F O C U S | F I R ST Q U A RT E R | 2 0 1 8  had entered into agreements not to poach each other’s employees, suppressing competition for tech workers. The firms agreed to end the practice as part of a settlement. Even without collusion, firms with greater market power may be able to pay lower wages. A 2018 National Bureau of Economic Research working paper by Efraim Benmelech of Northwestern University, Nittai Bergman of Tel Aviv University, and Hyunseob Kim of Cornell University found that higher industry concentration is associated with lower wages at the local level, and this link has strengthened since 1981. Efficiency gains could also explain these trends. Autor and his co-authors argue that “superstar” firms in concentrated industries rely on fewer workers due to the firms’ higher productivity. This would reduce the share of economic output that accrues to workers, slowing overall wage growth. (See “Will America Get a Raise?” Econ Focus, First Quarter 2016.) Declining Dynamism Higher prices and lower wages are just two potential costs of rising concentration. Policymakers at the Fed are also interested in the long-term growth potential of the economy, and some economists have argued that rising concentration may have a negative effect on innovation and economic dynamism. Harvard University economist Joseph Schumpeter famously coined the phrase “creative destruction” to describe the process whereby competition from innovative new entrants drives productivity growth. In theory, nimble and inventive startups will outperform and replace stagnant and less efficient incumbent firms, reallocating workers to more productive uses. Research suggests that this process has slowed in recent decades. Young firms, which have historically accounted for a significant share of job creation, are employing a shrinking share of the labor force. On the other hand, some economists have disputed the idea that creative destruction is what drives economic growth. In a 2018 paper, Chang-Tai Hsieh of the University of Chicago and Peter Klenow of Stanford University found that innovation and productivity gains largely come from incumbent firms improving their own processes and products rather than from dynamic startups. Under this view, increased concentration and falling startup rates might not be a concern, as long as incumbents possess the right incentives to continue innovating. The effect of competition on incentives to invest and innovate is an open question, however. “One of the potential issues with innovation is that you pay the cost today, but if you can’t protect your innovation, then you won’t reap the benefits in the future,” says Thomas Philippon of New York University. 
This may be particularly true in industries where initial research and development costs are high but the cost of replication is low, such as the pharmaceutical industry. The United States and other governments award patents — temporary monopolies — to incentivize firms in such industries to innovate. But it is also possible that firms with strong market power will choose to innovate less, preferring instead to reap the rewards from maintaining high prices on their existing products.

The two theories aren't mutually exclusive. Economists have suggested that the relationship between competition and innovation may follow an inverse U-shaped pattern. At low levels of competition, more competition incentivizes firms to innovate. But if competition levels are already high, innovative firms are more likely to be imitated by competitors, diminishing incentives to innovate. The question is, where do firms in concentrated industries today fall on the curve? "For most industries in the United States, it looks like we are on the side of the curve where more competition leads to more innovation, not less," says Philippon.

Firms' investment levels have been low since the early 2000s relative to their profitability, according to recent work by Philippon and Germán Gutiérrez, his colleague at New York University. After accounting for market conditions, such as lingering scars from the Great Recession, they found that firms in more concentrated industries invested less than those in more competitive markets. They argue this is due to lack of competition. "When industry leaders are challenged, they actually invest more, both in physical assets as well as intangibles like intellectual property," says Philippon. "I'm sure you can find examples where competition has discouraged innovation, but I think we are far from that today."

No Easy Solutions

Many signs point to rising industry concentration in recent years. What that means for the economy is less clear. Some evidence suggests that rising concentration levels are tied to weakening competition, which is likely to have negative effects on consumer welfare and economic productivity. Other work suggests that efficiency is driving firm consolidation, which is beneficial for consumers. To complicate matters further, both forces could be happening at the same time depending on the industry, making it difficult to disentangle effects in the aggregate economy.

Context also matters for assessing concentration. Two localities can have similar levels of concentration in an industry sector but very different levels of competition. For example, a 2016 study of payment choices in the Fifth District by Richmond Fed economists Zhu Wang and Alexander Wolman found that having fewer banks in a rural setting corresponded with lower card and higher cash usage by customers, suggesting banking services were expensive and not competitive. But they found the opposite in metropolitan areas: Customers of banks in highly concentrated urban markets had higher card adoption. For rural banks, concentration appeared to be a sign of market power, while for metropolitan banks it reflected consolidation driven by efficiency gains. (A concentration measure by itself cannot tell the two apart, as the sketch below illustrates.)
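Concentration itself is easy to quantify; interpreting it is the hard part. A minimal sketch using the Herfindahl-Hirschman index (HHI), the summary measure antitrust agencies commonly report; the market shares below are hypothetical, not drawn from the studies discussed.

```python
def hhi(shares):
    """Herfindahl-Hirschman index: the sum of squared market shares,
    with shares in percentage points (0-100). It ranges from near 0
    (many tiny firms) to 10,000 (a single monopolist)."""
    assert abs(sum(shares) - 100) < 1e-6, "shares must sum to 100"
    return sum(s ** 2 for s in shares)

# Two hypothetical local banking markets with identical structure:
rural_market = [50, 30, 20]   # three banks
urban_market = [50, 30, 20]   # three banks, same shares

print(hhi(rural_market))      # 3800
print(hhi(urban_market))      # 3800 -- both "highly concentrated"
# under the 2010 DOJ/FTC merger guidelines threshold of 2,500. Yet,
# as the Wang-Wolman findings suggest, one market may reflect market
# power and the other efficiency; the index cannot distinguish them.
```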
Still, many have called for more vigorous antitrust enforcement or new laws to address the rise in industry concentration. Carl Bogus, a professor of law at Roger Williams University, wrote in a 2015 article that antitrust law prior to the rise of the University of Chicago view was concerned not only with the economic consequences of large firms, but also with the political consequences. Bogus argues for using antitrust law to curtail corporate political power, even if doing so may result in some economic inefficiencies.

Others are skeptical that antitrust is the right tool for this job. Carl Shapiro of the University of California, Berkeley, who served in the Antitrust Division of the Department of Justice under President Barack Obama, has written that he supports vigorous antitrust enforcement but that other policies, such as campaign finance reform, are better suited to addressing concerns about corporate political power.

More than a century after the passage of the 1890 Sherman Act, which established American federal antitrust law, it remains a challenge for policymakers to balance concerns about large firms wielding too much market power with a desire not to punish companies that have succeeded on their own merits. "You worry about a firm that has market power, ceases to innovate, and just charges high prices," says Schmalensee. "But competition sometimes has winners, and one of the worst things you can do as a policymaker is pick on the winners." EF

Readings

Autor, David, David Dorn, Lawrence F. Katz, Christina Patterson, and John Van Reenen. "The Fall of the Labor Share and the Rise of Superstar Firms." National Bureau of Economic Research Working Paper No. 23396, May 2017.

De Loecker, Jan, and Jan Eeckhout. "The Rise of Market Power and the Macroeconomic Implications." National Bureau of Economic Research Working Paper No. 23687, August 2017.

Grullon, Gustavo, Yelena Larkin, and Roni Michaely. "Are U.S. Industries Becoming More Concentrated?" Manuscript, August 2017.

Gutiérrez, Germán, and Thomas Philippon. "Declining Competition and Investment in the U.S." National Bureau of Economic Research Working Paper No. 23583, July 2017.

Traina, James. "Is Aggregate Market Power Increasing? Production Trends Using Financial Statements." Chicago Booth Stigler Center for the Study of the Economy and the State New Working Paper Series No. 17, February 2018.

Paying for Success

State and local governments are trying a new financing model for social programs

By Jessie Romero

[Photo: Pay for success is helping fund projects like the green roof at the Fort Reno Reservoir in Washington, D.C.]

Since 1990, the federal government has conducted randomized controlled trials of 11 large social programs, totaling more than $10 billion in spending per year. Ten of those programs were found to have "weak or no positive effects" overall. Many other programs are never evaluated, a state of affairs that has led some critics to deem government spending on social programs "a triumph of hope over evidence."
In recent years, however, governments at all levels have made increasing use of data and rigorous evaluations to assess programs, a practice generally known as "evidence-based policymaking." Cities and states in particular have begun using a new financing model known as "pay for success," or PFS, which has links to evidence-based policymaking. In this model, private investors provide the upfront payments for a social service such as job training or supportive housing, and the government repays them only if the service achieves predefined outcomes. (PFS financing is also referred to as "social impact bonds.") Since the first PFS project in the United States was launched in New York City in 2012, an additional 19 projects have officially gotten underway, including two in the Fifth District. (The first project in the world ran in England between 2010 and 2015.) More than 50 U.S. projects are in some stage of development.

Pay for success has attracted bipartisan support as well as the attention of many community development practitioners, including at the Fed. "This is a new way to apply community finance to chronic social issues," says Jennifer Giovannitti, a regional community development manager at the Richmond Fed. "We can bring in new thinking and new efficiencies." Still, some people in the nonprofit community are concerned that focusing too much on pay for success could siphon resources away from social issues that aren't a good fit for the model.

How Do You Pay for Success?

In a PFS contract, an investor or group of investors gives a nonprofit service provider the money to deliver its service for a set amount of time. Over the course of the project, a third party assesses the program's results, and as predetermined milestones are achieved, a payor (typically but not always a government agency) makes "success payments" to the investors. If all goes well, at the conclusion of the project the investors have been paid back, potentially with some interest, and the service has proven cost-effective enough for the government to continue and possibly expand it.

The deals aren't easy to put together, Giovannitti notes. "It takes a lot of capacity on the ground. A lot of players have to be involved." In addition to the investors, the service provider, and the payor, those players include legal counsel, a third-party project manager — sometimes several project managers — to help structure and oversee the deal, and an independent evaluator to assess the results.

David Hunn is the president and CEO of the SkillSource Group, which administers federal and state funding for workforce development efforts in Northern Virginia. SkillSource just launched a PFS project targeting young adults who have been involved with the foster care or justice systems; success payments will depend on the youths' employment and education outcomes after leaving the program. "Planning the project and assessing its feasibility required a whole new degree of rigor," he says. "Most of us in the local workforce boards don't have that expertise — we needed an outside expert to help us move the ball forward."

SkillSource worked with Third Sector Capital Partners, one of several nonprofits in the United States dedicated to PFS projects. There are also three academic centers: the Government Performance Lab at Harvard University, the Sorensen Impact Center at the University of Utah, and the Pay for Success Lab at the University of Virginia (UVA), where Giovannitti is on the advisory board. (See "Growing the Pipeline of Pay-for-Success Projects," Richmond Fed Community Practice Papers, February 2018.)

"Our job is part education, part research, and part analysis," says Josh Ogburn, the director of the UVA lab.
"People have heard about the concept, but they don't know the ins and outs or how to get started. So we help communities develop a project idea and connect them with other advisers."

Investing in Success

In 2016, Denver launched a five-year, $8.6 million project to provide permanent housing and other support services to 250 chronically homeless individuals. The goal is to reduce the amount of time they spend in jail, detox centers, and emergency rooms — services that typically cost the city $7 million annually, or $28,000 per person. A group of private foundations put up the $8.6 million; at the end of 2017, the city made its first success payment of $188,000 based on initial reductions in jail time. If the program weren't meeting its benchmarks, the city wouldn't have to pay anything, and the program could be discontinued.

That's what happened to the first U.S. PFS project, which was intended to lower recidivism among juvenile offenders at Rikers Island in New York City. Three years after it launched in 2012, it had failed to meet the minimum goal of reducing recidivism by 8.5 percent, and the Goldman Sachs Urban Investment Group exercised its option to cancel the project. The New York City Department of Corrections didn't make any payments to Goldman, which by that point had invested $7.2 million of a pledged $9.6 million. (Goldman didn't actually lose that total amount, as Bloomberg Philanthropies had guaranteed three-quarters of the investment. If the project had succeeded, Goldman would have received the entire return.)

The potential return to investors in a PFS project varies considerably. Goldman estimated it would earn a return of between 11 percent and 22 percent, depending on whether the program met or exceeded its performance goals. The estimated return to the Denver investors, however, assuming the project is successful, is just 3.5 percent. But the investors in a PFS project aren't necessarily looking to make a lot of money; they're what are known in the nonprofit world as "impact investors" rather than "return investors." Goldman's Urban Investment Group, for example, is committed to "double bottom line" investing, which emphasizes both financial and social returns. And about half of PFS investors so far have been philanthropies or foundations rather than banks or investment funds.
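Whatever the project, these deals share a simple contingent payoff structure. A minimal sketch in which the figures are hypothetical, loosely patterned on the Denver and Rikers deals described above rather than taken from their actual term sheets:

```python
def investor_payout(principal, success, annual_return=0.0, years=1,
                    guarantee_share=0.0):
    """Stylized pay-for-success payoff: if the evaluator certifies
    success, the payor repays principal plus a return; if not, the
    investor recovers only what a guarantor (if any) has backstopped,
    and the payor owes nothing."""
    if success:
        return principal * (1 + annual_return) ** years
    return principal * guarantee_share

# Hypothetical success, patterned on Denver: $8.6 million over five
# years at an assumed 3.5 percent annual return if benchmarks are met.
print(investor_payout(8.6e6, True, annual_return=0.035, years=5))
# -> roughly $10.2 million in cumulative repayments

# Hypothetical failure, patterned on Rikers: $7.2 million invested,
# with a guarantor covering three-quarters of the investment.
print(investor_payout(7.2e6, False, guarantee_share=0.75))
# -> $5.4 million recovered by the investor
```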
Enthusiasm for Evidence

Technological changes in recent decades have made it easier and cheaper to collect, link, and analyze data, and have contributed to a bipartisan push for evidence-based policymaking. Both the George W. Bush and Obama administrations increased funding for data collection and tried to incentivize federal agencies to use more of it. In 2016, Congress passed a bipartisan bill creating the Commission on Evidence-Based Policymaking, which was charged with assessing how the government could make better use of data. The commission issued its final report in September 2017.

In line with the greater focus on evidence, the federal government has supported state and city efforts to develop PFS projects. In 2014, the Workforce Innovation and Opportunity Act, which revamped workforce development, allowed local workforce boards to set aside 10 percent of their funding for PFS programs. (The SkillSource project is the first one to use that funding.) Also in 2014, the Social Innovation Fund (SIF), which was established by the Obama administration in 2009, began awarding "technical assistance" grants to help nonprofits and local governments conduct feasibility studies and gather data. The fund gave out nearly $17 million for PFS projects. The SIF was defunded in 2017, but the budget deal that passed in February 2018 included $100 million in competitive grant money to help cities and states develop PFS projects.

Pay for success also has attracted the attention of community development practitioners at the Fed. Several regional Reserve Banks, including the Richmond Fed, have hosted events for community groups and other stakeholders interested in setting up a PFS project; the San Francisco Fed devoted an entire issue of its journal Community Development Investment Review to the topic in 2013. "We're investment oriented," says Giovannitti. "How can investment move the needle on social issues?"

The Fed's community development function grew out of its role assessing banks' compliance with the Community Reinvestment Act (CRA), which encourages banks to meet the credit needs of their local communities, including low- and moderate-income communities. In the future, it's possible banks could earn CRA credit by investing in PFS projects, as one of the San Francisco Fed articles explored, although regulators have not yet given banks any specific signals regarding how or whether PFS financing might satisfy CRA requirements. Still, the potential is there, says Giovannitti. "Pay for success is a natural fit for some of the issues banks are likely to be interested in, such as affordable housing and workforce development."

The Limits of Pay for Success

Not every issue can be addressed with a PFS project; the model requires readily available data, clearly measurable outcomes, and defined cost savings within a reasonable time frame. Recidivism is a good fit, for example, because "the outcome is straightforward," explains Ogburn. "Did the person go back to jail or not? It's easy to verify, and everyone can agree on the definition." Certain objectives in education, however, may be more challenging. "The outcomes are harder to quantify because the social benefits and fiscal value accrue further in the future." (There are PFS projects in development targeted toward early childhood education, with short-term metrics such as kindergarten readiness.)

In addition, the number of service providers with the capacity and expertise necessary for the rigorous data collection required for PFS is relatively small, creating the potential for the same few high-performing providers to receive the majority of PFS funding. Some observers are concerned that "rather than motivating the rest of the pack to 'lift' their game and demonstrate effectiveness, the inability of these other organizations to raise PFS funding could hamper their ability to deliver social services," as V. Kasturi Rangan and Lisa Chase of Harvard University wrote in a 2015 article in the Stanford Social Innovation Review.

Another concern is that the emphasis on clearly measurable — and successful — outcomes will lead governments and service providers to focus on the populations most likely to succeed, to the detriment of those who are harder to serve. In addition, while a major selling point of pay for success is saving the government money, some projects could end up costing more than they would have under a traditional contract. In 2013, for example, Maryland's Department of Legislative Services concluded that the cost of designing a program and negotiating a PFS contract would probably exceed the pilot program's projected benefits.
Even if the long-term savings are potentially large, PFS projects could be a hard sell to the many states facing immediate budget shortfalls.

Detractors of PFS point to the failure of the Rikers Island project as proof that the model doesn't always work — or at the very least is overhyped. But others say that's the wrong conclusion. "PFS is a financing model, not an intervention," wrote Paula Lantz and Samantha Iovan of the University of Michigan in a 2017 article. "The 'does it work' question should be focused on the quality and impact of the interventions selected for a PFS performance-based contract, not the model itself."

Pay for Success in the Fifth District

Several cities and states in the Fifth District are moving ahead with PFS projects. In South Carolina, a group of investors including the Boeing Company, the Duke Endowment, and the BlueCross BlueShield of South Carolina Foundation has put up $17 million to expand the Nurse-Family Partnership, a program that pairs nurses with first-time, low-income mothers. (The investors plan to reinvest their success payments in the program.) Multiple studies of the partnership in other states have found that it reduces preterm births, that children are more likely to be vaccinated and less likely to visit the emergency room, and that the mothers wait longer before having a second child — all of which potentially reduce spending on safety-net programs. The PFS project will enable South Carolina's Nurse-Family Partnership to roughly triple its reach to 4,400 families. A similar program is undergoing a feasibility study in Virginia.

Another program targeting the health needs of children is being developed in Richmond, which consistently ranks as one of the worst cities in the country for asthma sufferers, according to the Asthma and Allergy Foundation of America. Asthma is a leading cause of missed school days, emergency room visits, and hospitalizations for children; low-income children, who typically have less access to health care and more exposure to environmental pollutants, are at greater risk. In May 2017, the Richmond City Health District was awarded a $350,000 grant to determine the feasibility of creating a PFS program. Baltimore is also in the planning stages of a program targeting childhood asthma.

In a very different vein, Washington, D.C.'s Water and Sewer Authority is using pay for success to finance improvements to its stormwater runoff system. Goldman Sachs and the Calvert Foundation purchased a $25 million bond issue, which DC Water will repay only if the new infrastructure reduces runoff by a certain amount.

Other projects being discussed in the Fifth District include expanding the scope of Baltimore's Meals on Wheels program, which delivers meals to older adults, to include safety checks and case management; training emergency personnel in Greenville and Oconee counties in South Carolina to provide primary and preventive health care to people without other access to health care; and, akin to the project in Denver, creating supportive housing in Richmond.

Because the model is in its infancy, the evidence on PFS is minimal. But supporters are optimistic that at the very least, the effort will increase policymakers' reliance on evidence rather than on good intentions. "It's a much more rigorous process than the status quo," says Ogburn. "Entering into one of these projects reorients everyone around an outcome-based mindset." EF

Readings
Federal Reserve Bank of San Francisco. "Pay for Success Financing." Community Development Investment Review, April 2013, vol. 9, no. 1.

Giovannitti, Jen, and Joshua Ogburn. "Growing the Pipeline of Pay-for-Success Projects." Federal Reserve Bank of Richmond Community Practice Papers, February 2018.

Lantz, Paula, and Samantha Iovan. "When Does Pay-for-Success Make Sense?" Stanford Social Innovation Review, Dec. 12, 2017.

Liebman, Jeffrey B. "Using Data to More Rapidly Address Difficult U.S. Social Problems." Annals of the American Academy of Political and Social Science, January 2018, vol. 675, no. 1, pp. 166-181.

Do Entrepreneurs Pay to Be Entrepreneurs?

Some small-business owners are motivated more by values than financial gain

By Tim Sablik

[Photo: Richlands Dairy Farm in Blackstone, Va., has been in the Jones family for generations. Entrepreneurs like the Joneses face many challenges in running their own businesses.]

The Jones family has been farming in Blackstone, Va., longer than the United States has been a country. From the mid-18th century to the mid-20th century, they grew mostly tobacco. In 1954, "Grandpa" Jones returned to the farm with a degree in agronomy from Virginia Tech and decided to try his hand at dairy farming. His decision proved to be prescient. Milk prices rose steadily over the next several decades, while tobacco lost its luster in the wake of growing health concerns over its use.

Coley Jones Drinkwater, her brother Thomas "T.R." Jones, and her sister-in-law Brittany Willing Jones are the third generation of dairy farmers at Richlands Dairy Farm. But despite the long history of their family's business, their parents never pressured them to follow in their footsteps. "They wanted a better life for us," says Drinkwater. "It's a good life, but it's a very hard life."

It has become even harder in recent years. Milk prices, which normally move in three-year cycles, have been in a slump for the last three years. Most farmers don't expect a rebound anytime soon. The weather has been unusually dry in Virginia over the last decade, affecting how much corn the Joneses can grow to feed their cows and requiring them to rely more on feed from outside suppliers. And the Trump administration's recently announced tariffs on steel and aluminum have introduced some uncertainty about the costs of maintaining their aging equipment. "The most challenging thing about dairy farming is that there are so many variables over which you have no control," says Tracey Jones, Drinkwater's mother. "It makes it hard to plan."

That uncertainty is something dairy farmers have in common with most entrepreneurs. One in five new businesses fails in its first year, and only half survive to their fifth anniversary, according to the Bureau of Labor Statistics. In theory, those greater risks should come with the chance for great rewards. The typical image of an entrepreneur is someone like Henry Ford or Bill Gates — an innovator who starts with an idea and a small company and eventually grows the business into a cornerstone of the economy.

While it's true that most businesses start small, most also stay that way. According to the Census Bureau, nearly 90 percent of firms in America employ fewer than 20 people. Most of these entrepreneurs never experience the windfall profits and success of a Ford or Gates. In fact, owning a business can often seem like a losing proposition.
A 2000 study by Barton Hamilton of Washington University in St. Louis found that the median entrepreneur earned 35 percent less over 10 years than they would have if they had been traditionally employed.

Given the risks and costs of running a business, what motivates entrepreneurs to keep going? And what role do they play in the overall economy?

Being the Boss

It's unlikely anyone would choose to be a farmer if they didn't enjoy it. The work is hard and dangerous, and it can be a lonely, all-consuming way of life. "Forty years ago, you had grange organizations, and everyone went to church on Sundays," says Jones. "Now, all of these organizations are losing members. What used to be the social life of a farmer is disappearing."

"I have friends who live 15 minutes away who I hardly see because I'm always here working," adds Drinkwater.

The monetary rewards for all of this work also don't look too appealing on paper. In a 2017 working paper, John Bailey Jones of the Richmond Fed and Sangeeta Pratap of Hunter College and the Graduate Center, City University of New York, studied entrepreneurial behavior using data from dairy farmers in New York. They found that some farmers earned significantly less than what they might have earned in an alternative occupation. It is possible this gap could be overstated, as some research has found that entrepreneurs underreport their income. On the other hand, the wage gap could be even greater than the data suggest. Many employers provide fringe benefits such as health insurance or retirement contributions that are not formally counted as part of their workers' salaries. Business owners must provide these things for themselves, further reducing their effective take-home pay.

Why then undertake all the hardships of farming? "You get to be your own boss," Drinkwater says. It's a sentiment echoed by many entrepreneurs. In a 2011 paper, Erik Hurst of the University of Chicago and Benjamin Pugsley of the New York Fed found that over half of small-business owners surveyed in the Panel Study of Entrepreneurial Dynamics said that nonmonetary rewards such as being their own boss or setting their own schedule were key motivations for starting their own businesses. Accounting for these nonmonetary benefits may explain why some small-business owners are willing to work for less than they could potentially earn as employees at another firm.

Drinkwater remembers her family having numerous discussions about selling the farm as they struggled through year after year of low milk prices. It would mean giving up not only doing what they love, but possibly also being together as a family. Drinkwater and her brother would likely have to leave home in search of new work. She remembers her father asking her what she would do if they sold the farm. "I never had an answer for him," she says. "I would be lost for a while before I found something else. Dairy farming is what I feel called to do."

The relative attractiveness of outside work options can also explain how many risks entrepreneurs are willing to take with their businesses, according to a 2017 paper by Joonkyu Choi, a recent economics Ph.D. graduate from the University of Maryland. Choi found that entrepreneurs with better outside options were willing to take more risks in the hope that they might strike it big and become the next Bill Gates or Jeff Bezos. If they failed, returning to the traditional labor market was still an attractive option.
Entrepreneurs who had fewer outside labor options, or who placed a lot of value on nonmonetary benefits like being their own boss, were more cautious and unwilling to take risks that might jeopardize the future of their businesses.

Growing Pains

Ultimately, the Joneses decided not to sell their farm. But to stay in business, they would need a plan to make money. They first looked at going bigger. Like many modern dairy farms, Richlands uses machines to milk its cows (a setup referred to as a milking parlor). Larger farms with over a thousand cows can run these machines nearly around the clock. With only 250 cows, Richlands cannot take full advantage of its equipment. "Our milking parlor is built to run about 18 hours a day, but we only run it for eight. So it's not running as efficiently as it could," says Drinkwater.

Quadrupling the size of their herd would require more land than they had available, though. Drinkwater proposed that they instead build a creamery. That would allow them to process and sell dairy products, like milk and ice cream, on site. Currently, all of their milk is sold wholesale to processors before winding up on grocery store shelves. This leaves Richlands at the mercy of price fluctuations in the national milk market. If the price of milk suddenly declines, Richlands may find itself earning less than expected for the milk it produced. By processing and selling milk to consumers themselves, the Joneses will have more power to set prices for their milk products, which in turn gives the farm more certainty over the price it will receive for its raw milk. "It will be two separate businesses. The creamery will buy milk from the farm at a set price, which gives the farm the ability to budget for the first time ever," says Drinkwater.

Richlands already has an eager customer base. The farm began offering tours and hosting agritourist events such as its fall festivals four years ago, and it has averaged hundreds of visitors each weekend. The only stumbling block was securing the more than $1 million needed to finance construction of the creamery.

Obtaining the credit they need to grow and thrive can often be a stumbling block for small businesses young and old. Entrepreneurs starting out may have little to offer lenders in terms of collateral. Additionally, lenders may not understand enough about the business to assess its risk or may simply be unwilling to take a chance on any startup given that a large share of them fail. The New York Fed regularly publishes a Small Business Credit Survey to assess startups' access to financing. According to a report published in 2017, nearly 70 percent of startups that applied for loans said they received less than they asked for.

In their study, Jones and Pratap found that borrowing constraints reduced the profitability of dairy farming. Farms that want to undertake a project to boost their productivity may simply be unable to. "In agriculture, financing is very important," says Eric Paulson, executive secretary and treasurer of the Virginia State Dairymen's Association. "But if you go into a local bank, most won't understand a lot about how a dairy farm operates. Fortunately, we do have a few good lending institutions that have the specialized knowledge to work with farmers." One of those institutions, Farm Credit, provided a loan for Richlands' creamery project.
Policymakers have long had an interest in supporting entrepreneurs by facilitating access to credit and through government programs intended to mitigate some entrepreneurial risk. For example, farm price support programs and crop insurance have attempted to reduce the price variance faced by farmers. The rationale for this support is that what's good for entrepreneurs is good for the overall economy. "New firms contribute disproportionately to job creation," says Ryan Decker of the Federal Reserve Board of Governors. In a 2014 paper with John Haltiwanger of the University of Maryland and Ron Jarmin and Javier Miranda of the U.S. Census Bureau, he found that the fastest-growing businesses are disproportionately young and small (at least to start) and account for half of overall job growth. "We also see an important role for young businesses in aggregate productivity growth," he says.

But when it comes to driving economic growth, not all entrepreneurs are the same.

Engines of Growth

In their 2011 paper, Hurst and Pugsley showed that while some entrepreneurs desire to grow and innovate, most simply enjoy the nonmonetary benefits of running their own business and being their own boss. They express little desire to expand or innovate significantly. "Economists who study entrepreneurship often use a distinction between subsistence or lifestyle entrepreneurs and transformational entrepreneurs," says Decker. It is the smaller, latter group that accounts for the outsized role startups play in employment and productivity growth. Transformational entrepreneurs express more of a desire to grow their business and tend to be risk-takers, even exhibiting a greater propensity to engage in illicit activity when young. Lifestyle entrepreneurs are more likely to run businesses that are similar to many others, such as restaurants, auto repair shops, or law offices. Some economists have argued that policymakers interested in fostering employment and productivity growth in the economy would be better served investing in transformational entrepreneurs rather than in small businesses as a whole.

In fact, transformational entrepreneurs may need the help now more than ever. Decker's research with Haltiwanger, Jarmin, and Miranda shows that the entry of high-growth startups has slowed since 2000. The reasons for this are unclear, though one possibility is that growing market concentration across sectors has given incumbents more market power to block or absorb would-be competitors. (See "Are Markets Too Concentrated?" p. 10.)

It would be a challenge to determine up front which new businesses aspire to grow and which do not, however. Economists have identified various differences between lifestyle and transformational entrepreneurs, but many of these characteristics are not easily observable. Ross Levine of the University of California, Berkeley, and Yona Rubinstein of the London School of Economics and Political Science suggest one novel way of distinguishing between the two groups for research purposes. In a 2017 article in the Quarterly Journal of Economics, they compared owners of incorporated and unincorporated businesses. Incorporated businesses enjoy some protections from legal and financial risk at the cost of fees and increased regulatory requirements. Levine and Rubinstein reasoned that only entrepreneurs interested in growing their business and taking large risks will incur the costs of incorporation.
Some economists, like Hurst and Pugsley, have suggested that broadly promoting small businesses when most do not express a desire to grow or innovate may not substantially increase economic growth and could even have distortionary effects. But to the extent that it is difficult to distinguish between the two types, it may be worthwhile for society to support all entrepreneurs because some of them will have large positive effects on the economy. Decker and his co-authors found that transformational startups play an important role in keeping the economy innovative and nimble in the face of supply and demand shocks.

Societies might also value the role that entrepreneurs play in local communities as leaders or role models. Nonmonetary considerations such as these could motivate support for entrepreneurs broadly, even those who do not go on to innovate or expand their businesses. "My dad started a CPA firm that provided for our family for decades," says Decker. "He became an important fixture in the local business community. And while he worked a lot of hours, he still had the flexibility to come to my basketball games. So we might value these kinds of businesses for a lot of reasons, though this may not translate into targeted policy support."

At Richlands Dairy Farm, the agritourism events and creamery expansion that grew out of the Joneses' desire to save their family farm have given them an opportunity to educate visitors who are, in many cases, several generations removed from life on the farm.

"It's always satisfying to see people more comfortable with where their food comes from," Drinkwater says. "We had one woman on a tour who said she had switched to soy milk because she thought that cows in the dairy industry were mistreated. But after our tour, she said she was going to switch back to regular milk. What better compliment could you get than that?" EF

Readings

Choi, Joonkyu. "Entrepreneurial Risk-Taking, Young Firm Dynamics, and Aggregate Implications." November 2017.

Decker, Ryan, John Haltiwanger, Ron Jarmin, and Javier Miranda. "The Role of Entrepreneurship in US Job Creation and Economic Dynamism." Journal of Economic Perspectives, Summer 2014, vol. 28, no. 3, pp. 3-24.

Hurst, Erik, and Benjamin Wild Pugsley. "What Do Small Businesses Do?" Brookings Papers on Economic Activity, Fall 2011, no. 2, pp. 73-118.

Jones, John Bailey, and Sangeeta Pratap. "An Estimated Structural Model of Entrepreneurial Behavior." Federal Reserve Bank of Richmond Working Paper No. 17-07, May 2017.

Levine, Ross, and Yona Rubinstein. "Smart and Illicit: Who Becomes an Entrepreneur and Do They Earn More?" Quarterly Journal of Economics, May 2017, vol. 132, no. 2, pp. 963-1018.

Hot Topics

Annual Report essays address issues that are important to the Richmond Fed, the Fifth District, and the United States.

Educational Attainment: The 2017 essay, "Falling Short: Why Isn't the U.S. Producing More College Graduates?," asks why the United States is not producing more college graduates in response to a large and persistent wage gap between workers who graduate from college and those who do not. The article considers several factors that may help answer this question, including inadequate preparation during students' K-12 years. The essay discusses how K-12 preparation varies with socioeconomic status and how "school-choice" initiatives are intended to give more children access to high-quality schools.
Urban Decline: The 2016 essay, "Understanding Urban Decline," provides a framework for understanding and responding to urban decline. The article discusses the economic advantages of cities, patterns of development, cycles of development and redevelopment, and guidance for policy responses to urban decline.

Long-Term Economic Growth: The 2015 essay, "A 'New Normal'? The Prospects for Long-Term Growth in the United States," examines the argument that the U.S. economy faces headwinds that will make robust growth difficult for the foreseeable future. The article suggests that innovation and thoughtful public policy may yield meaningful improvements in economic performance.

The Annual Report is available on the Bank's website at www.richmondfed.org/publications/research/annual_report/

POLICY UPDATE

Reforming Corporate Taxes

By Selena Carr and Tim Sablik

On Dec. 22, 2017, President Donald Trump signed into law the Tax Cuts and Jobs Act, one of the most sweeping changes to the nation's tax code in over 30 years. In addition to reducing income tax rates for most individuals through 2025, the law makes a number of changes to the U.S. corporate tax system in an effort to encourage American firms to invest more domestically instead of shifting profits and production to lower-tax jurisdictions overseas.

The United States has long had one of the highest corporate tax rates among developed countries at 35 percent (effectively around 39 percent once average state taxes are included). The 2017 act reduces the federal rate to 21 percent. The law also lowers tax rates that apply to S corporations and limited liability companies, or so-called "pass-through" entities. These companies do not pay corporate taxes; instead, owners are taxed on the firms' income at the individual level. Under the new law, these individuals can deduct 20 percent of eligible business income that is received via a pass-through entity from their total taxable income.

For firms with global operations, the law substantially changes how income from their foreign subsidiaries is treated. Previously, income earned by foreign subsidiaries of American companies was subject to the U.S. corporate tax, an arrangement known as a worldwide tax. Firms could defer paying the U.S. tax by keeping those earnings outside the United States and investing them in their foreign subsidiaries. Under the new law, the United States will not tax foreign-source income stemming from certain tangible investments, such as plants and equipment. This approach, known as a territorial tax, is used by many other developed countries.

The new law still maintains some elements of the worldwide tax, however. Foreign-source income from intangible assets, such as patents, is subject to a minimum U.S. tax. The law also establishes a minimum tax on deductible payments made by U.S. firms to foreign subsidiaries in an effort to discourage income shifting to low-tax countries. Finally, the law requires U.S. multinational firms to pay taxes on foreign income currently held overseas over a period of up to eight years. That income is subject to a reduced tax rate of 15.5 percent for liquid assets (such as cash) and 8 percent for other assets.
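The headline provisions reduce to straightforward arithmetic. A minimal sketch: it ignores state taxes, credits, income limits, and phase-ins, and the dollar amounts are hypothetical.

```python
# Corporate rate cut: federal tax on $100 of domestic profit.
profit = 100.0
print(profit * 0.35)  # old federal rate: 35.0
print(profit * 0.21)  # new federal rate: 21.0

# Pass-through deduction: an owner with $100,000 of eligible business
# income facing a hypothetical 32 percent individual rate.
income, individual_rate = 100_000, 0.32
taxable = income * (1 - 0.20)            # 20 percent deduction
print(taxable * individual_rate)         # 25,600.0 vs. 32,000 before

# One-time tax on foreign earnings held overseas (holdings hypothetical).
liquid, illiquid = 60e6, 40e6
print(liquid * 0.155 + illiquid * 0.08)  # 12,500,000.0, payable over
                                         # a period of up to eight years
```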
With these changes, policymakers sought to encourage firms to shift more money back into the United States as well as to spur them to make more domestic capital investments. Firms' channeling of profits and investments overseas to avoid U.S. taxes has been a long-standing concern of policymakers in both political parties. It is estimated that nonfinancial U.S. companies hold around $1 trillion in cash reserves overseas. (See "Taxing the Behemoths," Econ Focus, Third Quarter 2013.) To further encourage investment, the law also allows businesses to deduct 100 percent of expenses for certain fixed assets over the next five years.

In theory, reducing firms' corporate tax burden may encourage them to invest in new projects because owners of the firms retain more of the profits from those investments. Reducing the overall corporate rate and changing how foreign earnings are taxed make the United States more competitive with the rest of the developed world and may therefore encourage U.S.-based firms to invest more at home. Those investments may drive up demand for workers and therefore push up wages.

While several economic studies have found that higher corporate taxes have a negative effect on investment, evidence on the effect of corporate tax cuts is less clear. One study found that a 2005 tax cut for domestic manufacturers led to increased investment, while another study of a 2003 dividend tax cut found no evidence of increased investment. A 2015 paper by Alexander Ljungqvist of New York University and Michael Smolyansky of the Fed Board of Governors used variations in state corporate tax rates to compare the effects of tax hikes and cuts in bordering counties. They found that while tax increases reduced employment and income, tax cuts generally had no stimulative effect unless implemented during a recession.

Still, economists acknowledge that it is difficult to study federal corporate tax cuts empirically because they have been rare — the 2017 act marks only the third time federal corporate tax rates have fallen in nearly four decades. And while state corporate tax changes have been more numerous, Ljungqvist and Smolyansky noted that they are also on a much smaller scale, which could explain why tax cuts did not appear to have much effect on employment or wages in their study.

Ultimately, it will take some time before all of the changes in the Tax Cuts and Jobs Act are implemented and the full impact of the law is known. One unknown is how much the tax cuts will cost. The Joint Committee on Taxation estimated that the corporate changes alone may add roughly $1 trillion to the federal debt over the next 10 years before accounting for any economic growth generated by the reform. Another outstanding question is how other countries will respond to the changes. The new headline U.S. corporate tax rate is more competitive with the rest of the developed world, but that advantage may prove temporary if other countries respond by also lowering rates. EF

INTERVIEW

Jesús Fernández-Villaverde

Editor's Note: This is an abbreviated version of EF's conversation with Jesús Fernández-Villaverde. For additional content, go to our website: www.richmondfed.org/publications

Jesús Fernández-Villaverde of the University of Pennsylvania has broader interests than most economists. The work for which he is perhaps most well-known resides at the forefront of formal macroeconomics: theoretical modeling, methods for taking models to the data, and techniques for solving models with computers.
But Fernández-Villaverde also has a passion for the gamut of historical, cultural, and economic forces that shape policy. In recent years, he has studied how politics determine macroeconomic outcomes, the rise of Nazi Germany, the enduring significance of the Magna Carta, and even how contraceptive technologies influence the way societies socialize children about sex. On top of all this is what he calls "a second life" of writing prolifically about economics and policy in Spanish. "It's like golfers who play both the U.S. tour and the European tour," he says with characteristic humor.

With a keen interest in the future of macroeconomics, and as the director of graduate studies at the University of Pennsylvania's economics department, he helps shape the next generation of economists by advising them on how to best invest in their training. He is a research associate at the National Bureau of Economic Research, a research affiliate at the Centre for Economic Policy Research, and has books in progress on macroeconomics and economic history.

Renee Haltom interviewed Fernández-Villaverde in his office at Penn in February 2018.

EF: You've been active in the debate over the state of macroeconomics as a discipline. There are prominent economists who say much of what is studied is nonsense, while others argue that macro is thriving if you understand what it is designed to do. What is your view?

Fernández-Villaverde: I'm much more sanguine about the state of macro. Just to give a little bit of background: After World War II, there had been a generation of large macro Keynesian models, as people called them at the time. Larry Klein, who was a professor here at Penn, was a leading proponent and got the Nobel Prize because of that work. In those models, you'd have an equation for consumption and an equation for investment and an equation for exports and an equation for imports, and then you'd go and estimate them.

Then in the 1970s, the generation of Bob Lucas and Tom Sargent and Neil Wallace said we want to build models where the economy is a system, where rational agents interact in a purposeful way. In the late 1990s and early 2000s, we learned how to econometrically estimate those models. That was, in my opinion, the first and most important advance in macroeconomics in the last 30 or 40 years. In the mid-1990s, we learned as a profession how to build models that are dynamic, that take the randomness of the economy seriously, and that incorporate price and wage stickiness. That class of models started being called DSGE, which is the terribly unsexy Dynamic Stochastic General Equilibrium acronym. I think these models really clarify a lot of aspects of, for instance, how monetary policy interacts with aggregate activity, and we learn a lot from them.

The second big leap, which we have had over the last 10 years, is a big revival in models with heterogeneity. In the standard basic model that we teach first-year graduate students, there is one household. But, of course, we know this is not a description of reality; we have people who are older versus younger, college-educated versus not college-educated, unemployed versus employed, high-income versus low-income.
Similarly, there “I really envision a whole new until around 10 years ago, not that are incentives to write papers generation of models that will take very that will go to the American many people wanted to use them. seriously everything we know about the Economic Review; then I can go This led to criticisms of represenmicroeconomy to build a much more tative agent models with only one to my dean and say, “Dear dean, type of agent, but we didn’t have increase my wage 10 percent.” coherent view of the macroeconomy.” that many alternatives. But there are not a lot of But over the last 10 years incentives for your average econthere has been a tremendous jump in our computational omist to write a textbook that is a little more advanced capabilities. This iPhone on my desk is computationally and that may impact in the long run the way the profession more powerful than the best supercomputer on the planet thinks about the world. For instance, I am writing a textin 1982. That means we can do a lot of things that even 10 book with Dirk Krueger, one of my colleagues here. Even years ago we couldn’t. if we are successful doing it, which remains to be seen, it is not very clear to me what we are going to get out of it EF: What explains the divergence of views on those beyond self-satisfaction and perhaps the recognition of our developments? colleagues. I think that that is a little bit of a problem that we suffer in this profession, but also many other fields. Fernández-Villaverde: The problem is that a lot of this At the end of the day, I think a positive case for macro exciting, backbreaking research has not transpired outside can be made, and it is a pity that sometimes there are not of the relatively small group of people working on the very good incentives for those who can make it. At the frontier. This is due, I would say, to three reasons. same time, there are strong incentives for those with more First, the people who are doing this are quite busy. negative views to be very vocal about them and try to make When you are in your mid-30s or early 40s, you are trying a splash. to establish yourself as a senior member of the profession. You don’t really have a lot of time to do interviews or write EF: Where do you think macro has performed best blogs or go to purely policy-oriented conferences. versus not so well? Second, many times it takes a generation of students to distill the lessons of frontier research and express them Fernández-Villaverde: Where I think we have done well in ways that other researchers, let alone policymakers and is the well-understood result among macroeconomics that the public, can understand. This happens all the time in quantitative easing was going to be nearly irrelevant. By the history of mathematics and other fields. Until that “quantitative easing,” I mean what sometimes is called happens, it’s difficult for people to really appreciate how QE3, not QE1 and QE2. The latter two were, “Oh my god, important the tools are. the world is about to end,” and then the Fed came and said, If you take the best 20 macroeconomists of my gen“Don’t worry, if you have some paper, we will buy it to show eration, of course they don’t agree on everything, but the that the world is not ending.” Whereas QE3 was buying a things they talk about are very different from the type of lot of long-run bonds and issuing reserves against it. things you will see on Twitter or the blogosphere. 
The There is a classic paper by Neil Wallace, who is of conversation sometimes looks like two very different the generation of Bob Lucas, Tom Sargent, Ed Prescott, worlds. Sometimes I see criticisms about the state of and Chris Sims — Wallace is the only one that unfortumacro saying, “Macroeconomists should do X,” and I’m nately has not got the Nobel Prize yet but is someone I thinking, “Well, we have been doing X for 15 years.” admire very deeply. He proved in 1981 that these types Third, sometimes you get a biased view of where the of operations were going to be irrelevant. And later Mike state of a field is just because of who has incentives to talk Woodford proved that the result holds even more so when to the general public. Many of the people who are currently you are at the zero lower bound. very critical of macro are in another generation, and some So when the Fed announced QE3, most people in monof them may not be fully aware of where the frontier of etary economics said the most likely effects were going to research is right now. They also have plenty of free time, so be very small. I actually wrote something in Spanish saying it’s much easier for them to write 20 pages of some type of that, and you should have seen the amount of hate mail exposé, if they want to use that word, on the state of macro. that I got – most people thought either that QE3 would This raises a more general issue of whether academia cure all illnesses of the day or that we’d get hyperinflation. in general and the economics profession in particular have I think that the evidence is in and nearly everyone has conthe right incentives to transmit some of these learnings cluded that QE3 had very small effects (the only discussion from the frontier to the general public. Unfortunately, seems to be whether the effects were very small or really sometimes those incentives do not exist. If you write a very small). So that was a clear prediction that has been successful introductory textbook and it gets adopted in supported nicely by the data. E CO N F O C U S | F I R ST Q U A RT E R | 2 0 1 8  23  A place where macro may not have done so well is the consequences of the zero lower bound. The zero lower bound is when nominal interest rates get to zero, and then it’s difficult for monetary authorities to lower it below zero. In the standard New Keynesian model, at the zero lower bound the economy is going to suffer deflation and a very severe contraction. We didn’t have that. I’m not saying that 2012 to 2016 were great years, but inflation was around 1 percent and there was a moderate expansion. We also don’t understand inflation dynamics very well. We understand that if you are Zimbabwe or Venezuela and you start printing money like crazy, you are going to have a great inflation, but do we really understand why inflation is 1 percent and not 3 percent? In fact, one of the puzzles we have had during the recovery over the last two or three years, even more so in Europe than in the United States, is why inflation has been so subdued. I had many people in central banks asking me why, and I said that I wish I knew because I would be writing a paper about it. EF: What are you most excited about in macro?  
Jesús Fernández-Villaverde

➤ Present Positions: Professor of Economics, University of Pennsylvania; Research Associate, National Bureau of Economic Research; Research Affiliate, Centre for Economic Policy Research

➤ Selected Past Positions: National Fellow, Hoover Institution at Stanford University (2014-2015); Kenen Fellow in International Economics, Princeton University (2013-2014); Director, Penn Institute for Economic Research (2011-2012)

➤ Education: Ph.D. (2001), University of Minnesota; B.Sc. in Economics and Management (1996), ICADE, Spain; LL.B. (1995), ICADE, Spain

➤ Selected Publications: "Estimating Macroeconomic Models: A Likelihood Approach," Review of Economic Studies, 2007 (with Juan F. Rubio-Ramírez); "Political Credit Cycles: The Case of the Eurozone," Journal of Economic Perspectives, 2013 (with Luis Garicano and Tano Santos); "Fiscal Volatility Shocks and Economic Activity," American Economic Review, 2015 (with Pablo Guerrón-Quintana, Keith Kuester, and Juan F. Rubio-Ramírez); "Solution and Estimation Methods for DSGE Models," Handbook of Macroeconomics, Volume 2A (with Rubio-Ramírez and Frank Schorfheide); and numerous other articles and papers

Fernández-Villaverde: Where I really believe the next generation of students can make big contributions is the integration of micro data with macro data. The amount of information that we have about economic activity at a very, very granular level is absolutely incredible. I'm working on a project for the Philadelphia Fed involving electricity consumption to better understand the dynamics of the business cycle in their district. We actually have information about how much electricity is consumed in the district second by second. As we get better at putting all those numbers together, we are going to have a much better view of what's happening in the economy.

With respect to inflation, it may be the case that, say, butter is a good indicator of how inflation is going to move over the next three months. If I were the president of one of the regional Feds, and I had very detailed information about how the price of butter is evolving in all the supermarkets in my district, I may have an early warning system for inflation.

A more concrete example is labor construction accidents. The first time I understood that the real estate bubble was getting out of control was when a friend of mine who is involved in labor administration in Spain told me that construction-related injuries were going up — when the market is hot, you push your workers hard and they start doing awful things to their hands with nails.

Another example would be models where we really understand in detail how people make decisions about coming in and out of the labor force. The reason I think this area of research is going to be enormously important is because we are going to be able to combine those immensely rich data at individual levels with powerful computers that are able to handle them. I really envision a whole new generation of models that will take very seriously everything we know about the microeconomy to build a much more coherent view of the macroeconomy.

That's what I tell my students they should spend a lot of time trying to think about — the investment in methods, but also the learning about the economics of these types of problems. That's what I am the most excited about.

EF: Let's focus on the zero lower bound for a second.
What do we understand reasonably well about the zero lower bound and what do we not?

Fernández-Villaverde: Well, we understand well why it's a problem. People who want to save for the future take resources from today and move them into the future, and people who want to invest borrow today — say, to build a factory — and pay it back tomorrow. The saving-investment market clears by a price, and that price is the interest rate. The zero lower bound implicitly introduces a price control in that market. Since that market determines how much we save and invest, things don't work out right intertemporally, and the economy ends up operating at a lower level of activity than you could get in normal times.

But in the standard model, the negative consequences of the zero lower bound are much more acute than what we have actually seen in the real world, which suggests that there are issues we don't fully understand. For example, I described before a very simple model with one type of investment and one type of saving. In the real world, there is a whole set of investment opportunities and a whole set of saving opportunities. What some people have argued — and I'm trying to write papers on it now — is that the real constraint right now is not so much a general savings and a general investment market, but markets for safe assets. This research was started by Ricardo Caballero at MIT and by Emmanuel Farhi at Harvard. The idea is you have a lot of aging Chinese, Japanese, and Germans who want to invest in very safe assets, and there are just not enough of those assets. And that pushes the price of the assets high, which is the same thing as pushing the interest rate down. So maybe it's not so much that all the investments and savings clear at this zero interest rate, just the market for safe assets, and that's why things have not been quite as bad as the basic New Keynesian model forecasted.

A lot of great economists have been doing fantastic work on this. I mentioned Caballero and Farhi, but also Pierre-Olivier Gourinchas and Ben Bernanke himself. What I'm trying to think a little bit about right now, in new research with Robert Barro at Harvard University and Oren Levintal here at Penn, is how economies generate these safe assets and what determines the total amount of safe assets.

EF: Another common criticism of the profession is that economists routinely use models that are so complex they can't even understand them. How would you respond to that?

Fernández-Villaverde: Did you come here by plane?

EF: Yes.

Fernández-Villaverde: Are you aware that aerospace engineers do not fully understand the turbulences that keep that plane in the air?

EF: I don't want to think about that until after I've flown home.

Fernández-Villaverde: OK, well, my father is an aerospace engineer, so I know that firsthand. (Laughs.) We have a very limited understanding of what makes planes fly. But we have very good computational methods that allow us to simulate how the plane is going to work, and so we are more than happy to get inside a carbon fiber and aluminum tube and go 35,000 feet above the ground at almost 600 miles per hour.

I'm not going to deny that having clean, intuitive models that help us understand the mechanisms at work is important. For instance, they play a tremendously important role in undergraduate education; the book I'm trying to write with Dirk Krueger tries to not use a computer at all so the student can understand really what is going on. But once you want to go to the next step, you need a computer.

Consider the following scenario. There is now a lot of talk about whether the Dodd-Frank law is the best way to handle financial regulation. The president of the Minneapolis Fed has come out with a simpler system where you just require financial institutions to hold large equity. That's fine; I can write a simple model on my whiteboard to understand that argument. But when you get to the concrete question of whether we need 15 percent, 16 percent, or 25 percent equity, you cannot get the answer without a quantitative model. It's the same way that an aerospace engineer will tell you, "We kind of understand Bernoulli's principle, but there are a couple of things here and there that we are not very sure about."

To say a model is difficult to understand is, to me, a little bit of a nihilistic view. A much more sensible approach is to understand the things for which we want simple models and the things for which we want complex models. Now, can you offer me examples of where people use very complex models to do silly things? Yeah, but people buy bananas to do silly things, and we are not going to prohibit selling bananas.

EF: Presumably not many bananas get published in top journals.
Fernández-Villaverde: You know, I take a little bit of a different view on that. First of all, I'm an editor of a journal and an associate editor of other journals. As an editor, you need to understand you are never going to get all your calls right. If you only accept papers you are 100 percent sure are right, you will end up not publishing any papers. I can't even say that my own papers are 100 percent right and that I agree with everything in them.

Whether a paper makes a big advance or not isn't that consequential. Pick any American Economic Review from 1990 and randomly select a paper, and you will see that many of them have been sleeping for eternity and no one cares. Then there are papers that are very important, that people are going to look at again and again to learn from their strengths and weaknesses. Recently I was writing a report about a very famous paper, and I thought at the end of the day that the main result hadn't held water after 10 years of empirical investigation. But the paper opened such an important door for people to think about the problem, and for that the paper has become a classic.

So are there mistakes in publishing? Of course there are. But the process of science is much more dialectical than is sometimes expressed. This notion of the perfect paper getting published and then we learn something is an idealized view of the way science works.

EF: One research agenda of yours that did, in fact, have enduring success was on the particle filter. How did that idea come about?

Fernández-Villaverde: I once made a joke at a conference that the particle filter pays for my mortgage. Now a lot of people ask, "How is your mortgage going?" and I say, "Nearly done."

Let me give you an example of what the particle filter does. In early 2018 we entered a time of high volatility in the stock market. The problem with volatility is that it is not directly observed: I can go to the back pages of the Financial Times and find a value in the table for a stock's price, but there is no number to express its volatility. What you need is a statistical model that will let you learn about volatility from things you can actually observe — in this case, the variations of the stock market from one day to the next.
This is called filtering — learning about things that you haven't seen from things you can see. The original filters were developed for the space program. The idea is you are the guy in Houston with a joystick, and you see the satellite but can't get its exact position because you are measuring with radar and there is noise. What you are trying to figure out is how much to push the joystick to the left or right given what the radar is telling you.

For the longest time the most important filter was the Kalman filter. It requires two assumptions: that the world is linear, and that noise comes from a normal distribution, or is "well behaved." Those assumptions prevent it from handling many, many questions in macroeconomics. The best example is volatility, because it can only be positive: You can have a lot of volatility or very little, but you cannot have negative volatility.

So when I was a graduate student, I was very interested in coming up with methods that could extend filtering to these types of environments. I spent a lot of hours browsing through math journals, and I heard about this new generation of methods called sequential Monte Carlo, which is a complex name for something quite simple. A classic question in a basic probability class is: If you throw two dice, what is the probability that their sum is five? You have to calculate the probability that the first is a one and the second is a four, and so on, and when you do that homework you always make a mistake because you forget one combination. Alternatively, you could throw the dice one million times. Of course, in real life you can't do that, but computers can do it for you. In the 1990s, some people came up with the idea of applying Monte Carlo methods recursively to filtering problems.

I learned about these new methods, and I thought, gee, this can be done in economics as well. So I came back to my office, got my dear friend and co-author Juan Rubio, and explained to him, "This can work," and he said, "Yeah." I said, "Well, let's write a paper." So we wrote the paper — my most-cited paper, probably — and it still pays for my mortgage.
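Both ideas he describes, the brute-force dice experiment and filtering an unobserved volatility from observed returns, can be sketched in a few lines of code. The following is a minimal illustration only, not the estimation machinery of the Fernández-Villaverde and Rubio-Ramírez paper; the stochastic-volatility model and its parameter values are made up for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# The dice example: estimate P(sum of two dice = 5) by simulation.
rolls = rng.integers(1, 7, size=(1_000_000, 2))
print("Monte Carlo:", rolls.sum(axis=1).mean() if False else np.mean(rolls.sum(axis=1) == 5), "exact:", 4 / 36)

# A bootstrap particle filter for a toy stochastic-volatility model:
#   log-volatility:  h_t = mu + rho * (h_{t-1} - mu) + sigma * eta_t
#   observed return: y_t = exp(h_t / 2) * eps_t,  with eta_t, eps_t ~ N(0, 1)
mu, rho, sigma = -1.0, 0.95, 0.2   # assumed, not estimated, parameters
T, N = 250, 5_000                  # sample length, number of particles

# Simulate "true" data so the filter's output can be checked against it.
h_true = np.empty(T)
h_true[0] = mu
for t in range(1, T):
    h_true[t] = mu + rho * (h_true[t - 1] - mu) + sigma * rng.normal()
y = np.exp(h_true / 2) * rng.normal(size=T)

# Filtering: push particles through the state equation, weight each one by
# how well it explains the observed return, then resample.
particles = mu + sigma / np.sqrt(1 - rho**2) * rng.normal(size=N)
h_filtered = np.empty(T)
for t in range(T):
    particles = mu + rho * (particles - mu) + sigma * rng.normal(size=N)
    sd = np.exp(particles / 2)
    weights = np.exp(-0.5 * (y[t] / sd) ** 2) / sd   # N(0, sd) likelihood of y_t
    weights /= weights.sum()
    h_filtered[t] = weights @ particles              # posterior mean of h_t
    particles = rng.choice(particles, size=N, p=weights)  # resample

print("correlation with true log-volatility:",
      np.corrcoef(h_true, h_filtered)[0, 1].round(2))
```

Note how the particle filter never needs the linearity or normality the Kalman filter requires: the observation density can be anything you can evaluate, which is exactly why it handles volatility, a variable that must stay positive.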
EF: The eurozone crisis is still in the news. There is little agreement among economists on the fundamental causes of the eurozone's economic troubles. Depending on whom you ask, the crisis is about forcing fundamentally different countries to share a common currency, lack of competitiveness in the periphery, or weak and improperly designed institutions. What is your view?

Fernández-Villaverde: My view is a mix of poor institutions and not being an optimal currency area. On the latter, the case for creating the euro was mainly political and not economic. In Europe by the mid-1990s, a lot of the gains from integration had already been accomplished. It would have been more important to continue eliminating administrative barriers to a unified market, for instance, than to adopt a single currency. But the political process decided for a combination of reasons that a common currency needed to be introduced, so it happened. The problem is this currency has very asymmetric effects in different countries depending on their institutional framework.

Interestingly enough, a lot of economists were aware at the time that the euro's design had fundamental flaws and that those flaws would eventually have nefarious consequences. For instance, Franco Modigliani, who was a Nobel Prize winner, argued that the introduction of the euro would force countries such as Italy and Spain to undertake the right institutional changes. He argued that once you have the discipline of a monetary union with Germany — he called it an iron straitjacket — you will not have an alternative to reforms.

The euro lowered the interest rates at which peripheral countries could borrow. The reaction of the political system in 2000-2001 was not, "In 10 years we may have a crisis; we need to reform now." How politics works is, "Hey, now I can borrow at 3 percent where before I was able to borrow at 10 percent — let's have a party!" What I argue with my co-authors in a paper is that the political system tends to expel those who want to impose tough decisions in moments when there is a lot of money. That's exactly what happened in most European countries. This was not conservative versus socialist or left versus right; it happened even within the same party.

So there were two countries, Greece and Portugal, that had a public debt party. The governments basically engaged in fiscal expenditures that were not sustainable in the long run. Ireland and Spain went for private debt; they said, "Fantastic, this is a great moment to build houses, to borrow from the rest of Europe, and to have a gigantic boom that lasts for six or seven years." Houses in Spain, for instance, pay the value-added tax, which means the government was getting extraordinary income. In 2005-2007, Spain had a government surplus — not because our fiscal position was healthy in the long run (as often mistakenly argued by U.S. economists who do not understand our budgetary structure but only look at headline numbers), but because we were building so many houses.

The second problem that we highlight is that the big boom lets bad managers get away with it. For instance, we had what were called cajas, which are roughly equivalent to savings and loans. The board of directors was elected by the regional politicians. If I'm the leader of a political party in 2002 and I want to get rid of you, I make you CEO of the local caja. Now you are making $3 million a year, so you happily ride off into the sunset. You may have never run a banking business in your life, but when the economy is growing at 6 percent a year, it's nearly impossible to lose money. It's even worse than that: We document that the worst managers are the ones making the most money, because they are taking the really crazy bets that pay a lot in the short run but then collapse the bank when the euro crisis comes.

So the Achilles heel of Europe, at least in the peripheral countries, was these changing incentives within the context of a bad institutional setup.

EF: What does this imply about the way forward for Europe?

Fernández-Villaverde: Using old-fashioned terminology, the eurozone has an original sin, which is that it is not an optimal currency area. At the same time, if you ask me, "Should I marry my friend X?" I may tell you, "No, I don't think you are compatible; you are going to end up divorced." But that's a very different question from, "Should I get a divorce now that we are married and have a mortgage, three kids in school, two cars, and a dog?" Like it or not, we got married to the Germans, and the Germans got married to the Spaniards. We need to make this work, because breaking up now would be way too costly. What we need is a reform of the euro.
In terms of incentives, you need to tell countries that they will not face economic crises alone, that there is going to be money from the European Union that will help the Netherlands get through a rough patch in the same way that federal taxes and transfers will help if California suffers a bad period. That would imply, for instance, moving toward a bigger European Union budget and creating some European bond system. There is a lot of discussion among European economists about how to design such a thing.

But there also need to be constraints. For this to be sustainable, fiscal discipline and cleaning house really need to happen. There has to be a grand bargain between those who point out the need for making financial and economic crises easier to get through and those who emphasize that, in the long run, rules are very important. That's the big question mark: Is the political process within Europe going to be able to deliver that solution?

EF: Which economists have influenced you the most?

Fernández-Villaverde: Let me start with an economist I have only read about: Milton Friedman. The reason I became an economist is that I read Free to Choose when I was in high school. It's not that I was convinced by all his arguments; it was his enormous ability to show that economics could help you think about many problems — that economics was not the stock market. That book was really eye-opening in the sense of truly appreciating the power of economics as a general field of inquiry.

With respect to people I have met: Tom Sargent. I actually told him when we became co-authors that part of the reason I wanted to write textbooks is because of his. He wrote some very influential textbooks about macro in the late 1970s, and we used the first chapters of one of them, translated into Spanish, in my undergrad macro class. That showed me how beautiful macro research could be and that I should go to the United States. I have always admired Tom because of his ability to combine data with theory. He also has a couple of great books in economic history that by themselves would probably make him a top professor at a university, even forgetting about everything he wrote in macro.

The third person I would say influenced me the most is Ed Prescott, who was the chair of my dissertation committee in Minnesota. What is amazing about Ed is his incredible ability to say what standard economics can explain and what it cannot. One of his most-cited papers is the one about the equity premium puzzle, where he asked a seemingly trivial question: Can a standard model account for the equity premium? No. And that generated 30 years of finance literature.

EF: What are you working on next?

Fernández-Villaverde: I mentioned before the work on safe assets. The second thing I am working on is machine learning, which perhaps is the new hype. But I use it in a different way than other people do — not to understand how people behave or to make predictions about the world, but as a way to solve a model. Agents within the model act as machine learners, and that helps you solve the model in situations that otherwise you would not be able to solve. This makes sense because in real life none of us accomplish perfect computations. Rather, we use algorithms to solve our problems in ways that resemble machine learning. A paper that I am presenting these days is a model where agents use a machine learning algorithm to keep track of the distribution of assets and equity in the model. This is relatively easy to incorporate into standard macro, and then you can solve many, many more models.
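The flavor of "agents as machine learners" can be conveyed with a toy example. The sketch below is not the model from his paper: it uses a bare-bones Solow-style economy with invented parameters, and a stylized agent who learns a linear forecasting rule for aggregate capital by recursive least squares, a simple online learning algorithm, while the economy evolves around it.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy economy: K' = s * A * K**alpha + (1 - delta) * K, log A follows an AR(1).
s, alpha, delta, rho = 0.2, 0.36, 0.1, 0.9

# Agent's perceived law of motion: log K' = b0 + b1 * log K + b2 * log A.
# Recursive least squares updates the coefficients each period as data
# arrive, the way an online machine learner would.
b = np.zeros(3)          # coefficient guesses
P = np.eye(3) * 10.0     # (scaled) inverse moment matrix

K, logA = 1.0, 0.0
for t in range(5_000):
    x = np.array([1.0, np.log(K), logA])     # regressors seen this period
    K_next = s * np.exp(logA) * K**alpha + (1 - delta) * K
    y = np.log(K_next)                       # outcome the agent later observes
    # RLS update (the learning gain shrinks as data accumulate).
    Px = P @ x
    P -= np.outer(Px, Px) / (1.0 + x @ Px)
    b += P @ x * (y - x @ b)
    K, logA = K_next, rho * logA + 0.02 * rng.normal()

print("learned forecasting rule:", b.round(3))
# How far off is the learned rule at K = 2, A = 1?
truth = np.log(s * 2**alpha + (1 - delta) * 2)
print("forecast error:", abs(b @ np.array([1.0, np.log(2), 0.0]) - truth))
```

In a richer model, the learned rule would stand in for the distribution of assets that agents cannot track exactly; Fernández-Villaverde's actual work uses far more flexible approximators than the linear rule assumed here.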
The third thing I'm working on is trying to wrap up the textbook with Dirk and another one that I have on economic history. I have written roughly 700 pages. I need to write another 100, and that's about 50 percent of the total. The problem I'm having is that every time I reach a new chapter, I think I need to read all these other books. Then it takes me a month to read all the books! EF

ECONOMIC HISTORY

When Banking Was 'Free'

From 1837 until the Civil War, currency issuance and banking were left to the states. Can this era offer lessons for today's cryptocurrency boom?

BY HELEN FESSENDEN

Few assets were hotter in 2017 than cryptocurrencies, including bitcoin. The surge was dramatic enough that then-New York Fed President William Dudley disclosed in November that the Fed was "starting to think about" offering a digital currency — although he quickly downplayed the chance of this materializing soon.

What's behind this boom? A central feature of cryptocurrencies is that they rely on "blockchain" technology, which, advocates claim, enables them to take on the functions of money and ultimately compete with conventional currency. Thanks to blockchain's open-source nature, anyone can design his or her own version of cryptocurrency and cater to market demand through "initial coin offerings" (IPOs for cryptocurrencies); today, there are more than 1,600 cryptocurrencies available. Based on a decentralized global network of computers, blockchain enables speedy, transparent, and cheap financial transactions that anyone, anywhere, can access with an Internet connection, without going through banks. It also allows its users complete anonymity — which means it has become a favored conduit for illegal transactions.

The black market stigma is one reason why this market has cooled a bit in 2018; bitcoin's trading price is now around $9,000, after spiking to $20,000 last year, amid rising regulatory pressure in Asia and elsewhere. Other concerns have emerged as well, including vulnerability to hackers and heightened scrutiny of coin offerings with regard to violations of investor-protection laws.

But many skeptics cite volatility as a chief hurdle preventing cryptocurrencies from fulfilling the functions of money — specifically, as a store of value, unit of account, and medium of exchange. What does this mean in practical terms? Investors can make or lose money on cryptocurrencies as a speculative asset, but this also means cryptocurrencies serve poorly as a common and stable measure of the value of goods and services. Money's function as legal tender — being liquid enough to be accepted widely — is also difficult for them to fulfill. Cryptocurrency issuance is finite in that it's determined by how many computers and programmers are mining it rather than by the macroeconomic goals of a central bank's monetary policy, and payments are accepted by only a fraction of vendors.

The idea of an "unregulated" currency, however, isn't new. Before the Civil War, the United States ran a vast natural experiment by leaving "free banking" to the states, even while other major economies were adopting central banking.
From the demise of the Second Bank of the United States in 1836 until the passage of the National Banking Acts of 1863 and 1864, the United States lacked a federal authority to issue and redeem banknotes, act as a fiscal agent for the federal government, or keep banknote issuance in check. Instead, banking was run by the states, and "free banks" could issue their own banknotes. But just how much did this amount to the kind of free-entry, highly decentralized currency competition that some cryptocurrency backers advocate today?

Back to the Future

Under the traditional narrative of this era, free banking had a poor reputation. The absence of national regulation was seen as one reason for the extreme booms and busts of the pre-Civil War years, as well as the high frequency of bank failures. Free banking is also often conflated with the term "wildcat" banking, which refers to short-lived (and sometimes fraudulent) banks in more remote regions where banknotes couldn't easily be redeemed. More recent scholarship, however, has suggested that true wildcat banking was in fact quite rare and that there were often multiple drivers behind banking and economic turmoil.

Moreover, free banking wasn't one uniform model; rather, it was established in only 18 out of 32 states, with considerable variation. In general, free banking was less developed in the South, and in some cases, states formally adopted free banking but saw very few such banks established.

Notably, free banking didn't mean a complete absence of regulation. Instead, regulation was conducted at the state level and was often idiosyncratic to each state's jurisdiction. And the design of regulation — which included requirements that banknotes be backed by particular assets — was one factor that helped determine a currency's stability. But perhaps an even more important and interrelated factor was liquidity. In states where participating banks ensured deep market liquidity in banknotes, such as New York, currency values were far steadier than elsewhere. It was in these cases that the banknotes came closest to fulfilling the basic functions of money, through their stable value and wide acceptance.

How did free banking work? "Free" meant "free entry": Anyone who could put up the required amount of capital could start a bank, which, once established, could issue its own notes. (This stood in contrast to state-chartered banks, which needed the approval of a state's legislature to be established.) The bank had to deposit with a state authority a set amount of approved bonds that backed those banknotes. The bank would then earn interest on those bonds as long as their value matched the nominal value of the notes; in most cases, the bank also had to hold fractional reserves in gold and silver to honor note redemptions. If the value of the deposited bonds fell below the notes' value, a bank had two options: either add more bonds to the deposit to make up the difference, or take the equivalent amount of notes out of circulation. If it didn't do that by a set time, it had to close and sell off the bonds to repay its note holders.

A major challenge was interstate redemption.
If people held out-of-state notes and wanted to cash them in for gold or silver coin, or "specie," without traveling to the issuing bank's state, they typically would sell those notes to a local "note broker" at a discount. The discount rates reflected the broker's cost of redemption, consisting of the default risk of the issuing bank (related to its financial strength and the bonds that backed its notes in their home state), as well as other factors like travel costs and local competition. As such, these rates varied widely. Some of this risk was made public through "banknote reporters," publications that provided data on banks' health as well as discount rates across states. Still, people who traveled across state lines often found that specie was easier to deal with; the Rutgers University economist Hugh Rockoff found, for example, that the amount of gold and silver in circulation rose considerably before the Civil War.
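As a rough illustration of how such a discount could arise (using hypothetical numbers, not historical data), a broker's bid for a note weighs the issuer's default risk against the costs of presenting the note for redemption:

```python
# Hypothetical pricing of a $10 out-of-state banknote by a note broker.
face = 10.00        # face value in specie, payable at the issuing bank
p_default = 0.03    # broker's guess of the issuer's failure risk
recovery = 0.60     # cents on the dollar recovered if the bank fails
cost = 0.10         # travel/agent cost of presenting the note, in dollars
margin = 0.05       # broker's required profit, in dollars

expected_value = (1 - p_default) * face + p_default * recovery * face
bid = expected_value - cost - margin
discount = 1 - bid / face
print(f"broker pays ${bid:.2f}, a discount of {discount:.1%}")
# -> broker pays $9.73, a discount of 2.7%
```

On this logic, notes of well-collateralized banks in liquid markets (low default risk, low redemption cost) traded near par, while notes of distant or shaky issuers traded at deep discounts, which is the pattern the banknote reporters documented.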
Success Stories

As one of the first states to establish free banking, New York was the model that other states often followed. In 1837-1838, New York's state-chartered banks began to acquire the stigma of political favoritism by the government, akin to President Andrew Jackson's "pet banks." To create an alternative, the state passed a free-banking law in 1838 that required participating banks to use state government bonds or relatively secure mortgages as collateral (they weren't required to redeem notes for specie until later). After some initial turbulence, this new sector stabilized, and by the end of that decade most New York banks had converted to free banking.

Research by economists such as Clemson University's Gerald Dwyer Jr., formerly of the Atlanta Fed, and the University of Minnesota's Arthur Rolnick, formerly of the Minneapolis Fed, has pointed to these changes as an important reason why New York free banks tended to survive longer than banks in other states — around eight years on average. And when they did fail, the losses borne by noteholders were often smaller than elsewhere, in some cases as little as 3 percent, thanks to the banks' relatively secure asset holdings. When taken out of state, New York banknotes also held their value, usually around 99 percent — far higher than in other states — reflecting in part the stability of state bond prices and New York City's strengthening financial clout. By the Civil War, the success of New York free banking was one reason why New York City pulled ahead of Philadelphia in attracting bank business.

IMAGE: A $100 bank note issued around 1854 by the Quassaick Bank of Newburgh, N.Y., a free-banking state. (Gift of Richard S. Schlein, Numismatics Collection, NMAH, Smithsonian Institution)

New England is another notable test case, even though free banking wasn't as widespread there. Rather, the region was home to an innovation known as the Suffolk Bank System (SBS), which presaged in some ways the structure of the Federal Reserve System. Established in the 1820s, the SBS was a private clearing consortium for banks that was managed by the Suffolk Bank of Boston. To join the consortium, member banks had to fulfill a collateral requirement with the Suffolk Bank by keeping a deposit amounting to 2 percent of their capital. In turn, every day, Suffolk accepted and net cleared at par all banknotes deposited by member banks. (Under net clearing, all debits and credits are tallied at once, which makes it easier for a bank to manage liquidity.) SBS banks were also required to redeem notes in specie. These notes circulated widely in New England (and occasionally even beyond), and bank failure rates were low, even during panics. For example, when Philadelphia banks suspended specie redemption from 1839 to 1842, notes of SBS banks were so popular they traded at a premium rather than a discount.

"In the SBS, banks could deposit other banks' notes at par in a central account that looked very much like the Fed," says Warren Weber, a former economist at the Minneapolis Fed. "Suffolk basically acted like a ledger and charged for that service — and even sometimes was willing to act as a lender of last resort."

The SBS is often considered a separate case from free banking, because it allowed any type of bank to join as long as it met the collateral requirement. Massachusetts, in fact, was home to a mix of state-chartered and wholly private banks and didn't have any free banks until 1859. But just as the New York law over time brought banks into the system that were, by selection, strong enough to meet the asset requirement, the SBS had a self-selection effect through its capital contribution requirement, as well as strong supervision. This group of relatively healthy banks, in turn, saw a lower bank failure rate than banks in other states. And as this consortium grew, it produced deep market liquidity in banknotes, providing a degree of currency stability and interstate redemption that most other states failed to achieve.

Mishaps in the Midwest

At the other end of the spectrum was Michigan, where many of the colorful tales of "wildcat banking" emerged. Like New York, Michigan was an early adopter of free banking (1837), but it took a different path in key respects. For one, it allowed a broader range of bonds as collateral, including those backed by private-issue mortgages of dubious value. When these loans defaulted, many banks couldn't make up their collateral after liquidating their assets. The state also temporarily suspended specie payments early on, making it easy for banks to issue worthless notes. In turn, noteholders found they couldn't redeem their currency in full; Michigan notes typically lost 30 to 60 percent of their value in those early years. After a rash of bank failures, the state had only a handful of banks by the 1840s and remained widely underbanked.

In other cases, free-banking states saw the value of their notes decline due to factors beyond their borders. In Wisconsin and Illinois, for example, banks were allowed to use bonds from border and Southern states as collateral. When the Civil War began, those bond prices plummeted, as did the value of those banknotes. Another example was Indiana, where banks were hit in 1854 when Ohio passed a law banning all out-of-state banknotes, including those from Indiana; the measure was intended to make (higher-taxed) Ohio banking more attractive. Demand for Indiana bonds and banknotes fell sharply as a result, wiping out much of their value.

There were also broader problems across states that affected all banks, including the issuance of uneven denominations (which could make notes hard to use or break down out of state) and widespread counterfeiting. In the 1840s and 1850s, more generally, bank failure rates were high, often spiking during downturns and panics — although scholars still debate how much free banking played a direct role. One study of New York, Wisconsin, Indiana, and Minnesota found that about half of all banks in those states closed during the free-banking era, but that many of those failed banks still redeemed their notes at par — suggesting that banking instability tended to have multiple causes.
Overall, the performance of free banks improved over time. As Dwyer has noted, free banking was not perfect, but it also "was not the disaster portrayed by some." When Congress passed the National Banking Acts of 1863 and 1864, it took a page from free-banking laws by keeping the guarantee of bond backing — in this case, with federal government bonds backing notes issued by national banks. But Congress also ended free banking decisively by taxing notes issued by state and local banks out of existence. In short, just as the Second Bank's demise was a political decision at the hands of the Jackson administration, the end of free banking reflected a policy choice of the day rather than a failure of the free-banking model.

Back to the Present

What are the lessons from this era? Some banknotes in New York and New England did indeed come closest to fulfilling the functions of money — but under a regulatory regime, enforced by the government or the private sector. Given that the attraction of cryptocurrencies today lies in the fact that their issuance is not determined by government fiat and that they are not publicly regulated, this historical record might give pause to those who see them as a potential substitute for money.

The free-banking era also offers numerous examples of failures, especially in the Midwest, due to idiosyncratic regulation. This history suggests that effective regulation should involve a way to ensure that a new currency enjoys stable liquidity. This was a clear challenge for some states before the Civil War, and it is a challenge for cryptocurrencies today.

Policymakers have recently pointed to some of these features as constraints on cryptocurrencies' utility in the long run. Fed Vice Chairman for Supervision Randal Quarles noted in a speech last November that among the dangers posed by cryptocurrencies is that during crises, "the demand for liquidity can increase significantly, including the demand for the central asset used in settling payments."

"Even private-sector banks and certainly nonbanks can have a hard time meeting large-scale demands for extra liquidity," he added. "Without the backing of a central bank asset and institutional support, it is not clear how a private digital currency at the center of a large-scale payment system would behave … in times of stress."

In a speech last March, Bank of England Gov. Mark Carney also underscored this point in a broader critique of cryptocurrencies, charging that they are "failing" as money for now. He warned that the inherently "fixed supply rules" of these currencies would run the risk of repeating another, less successful, historical experiment. "[R]ecreating a virtual global gold standard," he said, "would be a criminal act of monetary amnesia." EF

Readings

Dwyer Jr., Gerald P. "Wildcat Banking, Banking Panics, and Free Banking in the United States." Federal Reserve Bank of Atlanta Economic Review, December 1996, vol. 81, no. 3, pp. 1-20.

Rockoff, Hugh. "The Free Banking Era: A Reexamination." Journal of Money, Credit and Banking, May 1974, vol. 6, no. 2, pp. 141-167.

Rolnick, Arthur J., and Warren E. Weber. "New Evidence on the Free Banking Era." American Economic Review, December 1983, vol. 73, no. 5, pp. 1080-1091.

Weber, Warren E. "The Efficiency of Private E-Money-Like Systems: The U.S. Experience with State Bank Notes." Bank of Canada Working Paper No. 2014-15, April 2014.
BOOK REVIEW

From Industrial to Intangible

CAPITALISM WITHOUT CAPITAL: THE RISE OF THE INTANGIBLE ECONOMY
BY JONATHAN HASKEL AND STIAN WESTLAKE, PRINCETON, N.J.: PRINCETON UNIVERSITY PRESS, 2017, 278 PAGES

REVIEWED BY AARON STEELMAN

In the late 1990s, there was much talk of a "New Economy" centered around firms introducing and exploiting Internet technologies. The received wisdom suggests that the New Economy died when the dot-com bubble popped in 2001. But that takes an unduly narrow view of what might be thought of as the New Economy, argue Jonathan Haskel of Imperial College London and Stian Westlake, a policy adviser to the British government.

According to Haskel and Westlake, the economies of the world's developed countries have undergone a profound shift, but the type of technology that people had in mind in the late 1990s is just one of many components of a larger change. In particular, it is just one of the "intangible" investments that have become increasingly important over the last 40 years. Others include ideas, market research, training, and new business processes. As the importance of intangibles has increased, the importance of investment in "tangibles" — manufacturing facilities and the machinery that occupies them, for instance — has declined. By at least one measure, intangible investment began to outpace tangible investment in the United States by the mid-1990s.

Intangibles have four defining properties, according to Haskel and Westlake, which they call "the four S's": scalability, sunkenness, spillovers, and synergies. Intangibles are scalable because they typically can be used over and over in multiple places at one time. "Once you've written the Starbucks operating manual in Chinese — an investment in organizational development — you can use it in each of the country's 1,200-plus stores," they state. Employing a somewhat unconventional use of the term "sunk," Haskel and Westlake argue that intangible investments, such as branding campaigns, are sunk because they may have value for specific firms but not others, making them difficult to sell. This is in contrast to many tangible assets, such as buildings, which often can be sold to a wide range of firms. Intangibles often create spillovers: In the absence of well-considered intellectual property laws, firms can copy other firms' ideas, taking advantage of investments they didn't make themselves. And intangibles frequently produce synergies. As Haskel and Westlake put it, "Ideas and other ideas go well together."

The authors argue that the move to a more intangible-based economy — characterized by the four S's — has, among other things, contributed to "secular stagnation," inequality, challenges relating to the financing of business investment, and new requirements for infrastructure, broadly conceived, such as the norms and standards that govern collaboration and interaction among firms.

The connection between the growth in intangibles and secular stagnation — a term that has no common definition among economists, but that Haskel and Westlake define, roughly, as tepid investment contributing to sluggish productivity and economic growth — is probably the most ambitious of their claims. It is also the one for which their argument is least clear. Scalability, they say, is allowing very large firms to emerge: firms that are better placed to appropriate the spillovers from other firms' intangibles, reducing the incentives for smaller, lagging firms to invest.
From here, the path to increasing inequality is pretty clear in their narrative. Those large firms have not only become bigger, they have also become more profitable and are able to increase the pay of their employees, while the smaller, less profitable firms are strapped. This has increased income inequality. In addition, rising housing prices in places where intangibles are particularly important have increased wealth inequality. And, they argue, economies highly reliant on intangibles have produced inequality of esteem, as many who are skeptical of change and fearful of financial instability feel left behind.

The authors claim that, for a variety of reasons, procuring financing for intangible investments is inherently difficult. They argue that venture capital can help alleviate this problem but that considerably greater public investment will be necessary. Finally, trust among people and firms is important because without it, they are less likely to interact in a way that creates synergies between different intangibles. For instance, trust can give rise to agreements regarding how one firm will combine an idea with another firm's idea to produce goods that are greater than the sum of their parts — rather than attempting to poach that idea to capture all the gains.

Some of Haskel and Westlake's arguments seem rather speculative. For instance, their explanation of secular stagnation needs greater elucidation to be a viable story. But even so, it would be just one of many that have been offered. And how some of their proposals would be achieved — such as producing a consensus to increase public funding of intangible investment and ensuring that funding will be directed effectively — is not immediately clear. But, on the whole, the book provides many well-argued, richly sourced insights that are relevant to some of the biggest issues facing advanced economies. EF

DISTRICT DIGEST

Economic Trends Across the Region

Land-Use Regulations: A View from the Fifth District

BY SANTIAGO PINTO

Land and housing can be costly in a city or region for a number of possible reasons. Places with recreational or cultural attractions or other amenities draw population, so the demand for housing and, consequently, land is high in those areas. Prices could also be high at some locations if the supply of land is constrained by the geography. In some areas, however, the price of land is high as the result of heavy land-use regulations (LURs), which restrict the availability of housing.

LURs are often justified on the basis that they are intended to correct for market imperfections. Their cost-effectiveness has been questioned by many researchers, however. Regardless of their merits, the use of LURs by local governments has become widespread, and their intensity has been steadily increasing. Understanding the impact of LURs is extremely important, but at the same time challenging. To the extent that LURs reduce housing availability and increase housing prices at certain locations, they may discourage productive labor migration from taking place. Moreover, since LURs tend to affect different interest groups in conflicting ways, some researchers simply view LURs as the outcome of a local political process. Due to the complexity of the large number of local rules in place, their consequences are still not completely understood. Within the Fifth District, the importance and role of LURs vary widely.
While LURs are notably constraining in places like Washington, D.C., and some parts of Maryland and Virginia, they are less important elsewhere. This article examines the determinants of LURs, reviews some of their consequences, and looks at their prevalence in the Fifth District.

What Are LURs and How Are They Quantified?

Urban life and the concentration of people and activities in a region have a number of advantages. High densities, at the same time, generate nuisances; zoning and other LURs are among the policy alternatives frequently adopted by localities to address the negative external effects associated with density. But the proliferation of LURs in the United States, a process that gained strength in the 1960s, has imposed substantial pressure on land costs, constrained the expansion of housing supply, and generated excessively high housing prices in some cities.

Cities regulate the use of land in different ways. The term land-use regulations generally encompasses all the rules and policies that set the standards for the development of land and housing construction. These regulations include zoning ordinances that determine how the land may be used (commercial, multifamily, or single-family use) and the type of structures that can be built. They also include rules that establish how structures should interact with the surrounding area, such as minimum lot sizes, maximum building heights, maximum numbers of units that can be placed on a lot, minimum setbacks of a building from its neighbors, and off-street parking requirements. Other frequently observed regulations are demands that developers pay for infrastructure (roads, sewers, schools) and historic preservation policies. Together they constitute a fairly complex set of rules, not only because they cover many different dimensions, but also because they generally involve the participation and intervention of several enforcement and control authorities. Making sure a particular development complies with all the regulations may result in a lengthy approval process for the construction of housing, raising the overall cost of the development.

Due to the complexity of land-use policies, it is difficult to precisely quantify their stringency. One of the most recent and comprehensive measures of the intensity of LURs in the United States is the Wharton Residential Land Use Regulation Index (WRI). This index, developed by Joseph Gyourko and Anita Summers of the University of Pennsylvania and Albert Saiz of the MIT Center for Real Estate, is based in part on the results of a national survey of local LURs conducted across a large number of municipalities. The main purpose of the index is to characterize the regulatory environment in a community.

The questions asked in the survey cover three different areas related to land-use policies. The first set of questions attempts to identify the authorities involved in the regulatory process. The second set asks about the types of regulations most commonly observed in the area (limits on new construction, minimum lot requirements, affordable housing requirements, open space requirements, or requirements to pay for infrastructure). The final set of questions focuses on the outcomes of the regulations, asking, among other things, whether the cost of housing development has increased and whether projects are delayed or take longer to be completed.
The WRI combines this survey information with other data sources, including local environment- and open-space-related ballot initiatives and data on legal, legislative, and executive actions involving land-use policies at the state level. In this way, the index captures the overall intensity of LURs in a specific local area. The WRI is one of the most frequently used indicators of regulatory stringency in the academic literature; some examples will be discussed below.

Another approach is to look at the evolution of the main cost components of housing: land and structures. In a 2003 paper, Edward Glaeser of Harvard University and Gyourko suggest that the stringency of the regulatory environment in a community could be assessed by comparing the local home price with the cost of housing construction (that is, the cost of the structures built on the land) per square foot. The idea is that LURs impose an additional cost on housing development, so the difference between housing prices and material costs would in part capture the cost of the regulations. Empirical evidence shows that in the United States, the gap between the two has been steadily increasing since the 1980s, concurrent with the rise in adoption of LURs. The increasing gap is mostly driven by home prices rising more rapidly than material costs throughout the period. This seems to suggest that housing availability may be constrained by the high development costs imposed by local barriers to land development rather than by changes in the cost of the structural component of homes.
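As a back-of-the-envelope illustration of this measure (with hypothetical numbers, not Glaeser and Gyourko's data), the implied gap is simply the part of the price per square foot that construction costs cannot account for:

```python
# Hypothetical metro-level figures, in dollars per square foot.
home_price = 300.0          # market price of housing
construction_cost = 120.0   # cost of the structure itself

gap = home_price - construction_cost
print(f"gap: ${gap:.0f}/sq ft ({gap / home_price:.0%} of the home price)")
# -> gap: $180/sq ft (60% of the home price)

# Glaeser and Gyourko read a large, persistent gap (beyond what raw land
# scarcity can explain) as evidence of binding land-use regulation.
```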
Developable Land and Local Housing Supply

The supply of land, and therefore its price, can be affected by a locality's geographic conditions. In a 2010 article in the Quarterly Journal of Economics, Saiz estimates the percentage of undevelopable land in 95 U.S. metropolitan statistical areas (MSAs with populations larger than 500,000). His approach incorporates topography and relies heavily on data from satellite images. It consists basically of first calculating the area within a 50-kilometer radius of the geometric center (or centroid) of each MSA and then removing the area lost to oceans, internal water bodies, and wetlands, as well as the proportion of land with a slope in excess of 15 degrees. He then compares the percentage of developable land with the level of, and changes in, housing values across the different MSAs and finds that they are positively associated. This corroborates the intuition that housing prices would be higher in certain areas simply because of geography.

Undevelopable Land in Top Fifth District Metros

Rank  MSA                                          Undevelopable area (%)   WRI
12    Charleston-North Charleston, SC                      60.45           -0.81
13    Norfolk-Virginia Beach-Newport News, VA-NC           59.77            0.12
47    Baltimore, MD                                        21.87            1.60
54    Columbia, SC                                         15.23           -0.76
58    Washington, DC-MD-VA-WV                              13.95            0.31
62    Greenville-Spartanburg-Anderson, SC                  12.87           -0.94
75    Richmond-Petersburg, VA                               8.81           -0.38
77    Raleigh-Durham-Chapel Hill, NC                        8.11            0.64
83    Charlotte-Gastonia-Rock Hill, NC-SC                   4.69           -0.53
88    Greensboro-Winston-Salem-High Point, NC               3.12           -0.29

NOTE: Ranks are among the 95 largest U.S. metro areas. For more on the Wharton Residential Land Use Regulation Index (WRI), see text. Higher WRI values correspond to greater regulatory intensity.
SOURCE: Saiz, A. "The Geographic Determinants of Housing Supply." Quarterly Journal of Economics, 2010, vol. 125, no. 3, pp. 1253-1296.

According to Saiz's study, among the 95 largest metro areas in the United States, Fifth District MSAs such as Charleston-North Charleston, S.C., and Norfolk-Virginia Beach-Newport News, Va.-N.C., are relatively heavily land-constrained. (See table.) The percentage of undevelopable land is approximately 60 percent in those areas. According to the WRI, however, regulatory stringency in the two MSAs is relatively low. Regulatory stringency is, in contrast, very high in Baltimore, Md., which has a WRI of 1.60.

Determinants of LURs

In principle, the availability of buildable land should not restrict housing supply if housing could be constructed more densely. But in many cases, LURs implemented at the local level prevent such practices. Thus, geographic restrictions and legal restrictions may combine to keep housing availability from responding adequately to demand. In his article, Saiz finds that the response of housing supply to price increases is also low in geographically constrained areas, a phenomenon he attributes to LURs. In fact, Saiz shows that regulatory restrictiveness, as measured by the WRI, tends to be higher in locations that face important geographic constraints on land development.

One possible explanation is the "homevoter hypothesis," originally developed by William Fischel of Dartmouth College in his 2001 book of the same name. In the book, he states that homeowners tend to support and promote local policies that protect the values of their homes. In this case, homeowners ultimately decide the intensity of LURs, and their decisions depend, among other things, on the initial price of their investment. Specifically, homeowners in locations where land prices are initially high would promote the adoption of stringent local regulations, which would eventually lead to even higher home prices. Homeowners in those areas presumably have stronger incentives to protect their investment compared to homeowners in areas with initially lower land prices. The latter include regions where development occurs at low densities, home prices are close to their replacement costs, and investment in housing is possibly less risky. In sum, according to this explanation, less developable land entails higher land and housing prices; higher housing prices, in turn, lead to stricter regulations, which ultimately push home prices even higher.

In light of the conflicting effects LURs have on different economic agents, understanding the impact of LURs is critical. But it is also challenging. One issue is reverse causation: As noted above, while LURs influence housing prices, housing prices may also influence LURs. In other words, LURs may be partly endogenous — the outcome of a political process that involves the participation of different interest groups.

Regulatory Intensity in the Fifth District

State          U.S. Rank    WRI
MD                 6        0.79
DC (MSA)          16        0.33
U.S. average               -0.02
VA                27       -0.20
NC                30       -0.35
SC                41       -0.76
WV                44       -0.92

NOTE: For more on the Wharton Residential Land Use Regulation Index (WRI), see text. Higher WRI values correspond to greater regulatory intensity.
SOURCE: Gyourko, J., Saiz, A., and Summers, A. "A New Measure of the Local Regulatory Environment for Housing Markets: The Wharton Residential Land Use Regulatory Index." Urban Studies, 2008, vol. 45, no. 3, pp. 693-729.
Disentangling the causal effects of LURs in this context is complicated: Regulations in a community may induce households to sort by income and other demographic characteristics, and the latter may determine the types and intensity of regulations that are chosen in a specific community.

A recent study by Matt Turner of Brown University and Andrew Haughwout and Wilbert van der Klaauw of the New York Fed performs a thorough economic analysis of LURs that controls for the endogenous determination of LURs. They distinguish the differential impacts of LURs on different economic agents. For instance, to the extent that LURs effectively prevent the development of undesirable projects, property values may increase. But LURs would have the opposite effect on property values if they discouraged beneficial developments, such as a sought-after grocery store. Finally, while LURs may protect the interests of existing property owners, they deter the entry of new residents.

LURs and the Regional Distribution of Labor

Shifts in population from less-productive areas to more-productive ones are desirable since they increase overall well-being in a country. LURs make it difficult for local housing markets to respond to growing demand, however, and thus affect the migration of workers. In the presence of LURs, it becomes more costly for workers to change locations and benefit from cities that are more productive. Local wages need to be higher, under these circumstances, to attract workers.

In a 2017 study, Chang-Tai Hsieh of the University of Chicago Booth School of Business and Enrico Moretti of the University of California, Berkeley study this possible consequence of LURs. According to Hsieh and Moretti, to the extent that artificial barriers, such as zoning laws or minimum lot sizes, explain high local housing prices, they contribute to making the process of moving to thriving regions more difficult, beyond the normal costs of changing residential locations. Moreover, when households face these additional hurdles to moving, they may end up trapped in less-productive areas. By introducing additional frictions, LURs induce an inadequate spatial distribution of workers across regions, and such a mismatch entails lower aggregate production and welfare.

In their work, the researchers claim that LURs in exceptionally productive cities — namely New York City, San Francisco, and San Jose — are particularly responsible for curtailing aggregate economic growth in the United States. By blocking workers' access to high-productivity areas, the proliferation of LURs generates a growing dispersion of wages across regions. Stringent local regulations combined with local productivity increases translate into excessively high housing prices and nominal wages, rather than more workers and more production. Alleviating the intensity of these regulations, specifically in productive cities, would generate a positive external effect on the entire economy.

Importance of LURs in the Fifth District

The work by Hsieh and Moretti also quantifies the costs of local LURs by measuring how much they affect aggregate economic growth. Their analysis indicates that the stringency of LURs (as measured by the WRI), particularly in locations with high productivity growth, decreased U.S. growth from 1964 to 2009 by approximately 50 percent.
The researchers also perform a counterfactual exercise that attempts to determine the impact on other cities of reducing housing supply restrictions in high-productivity cities, such as New York, San Francisco, and San Jose, to the level of regulation observed in the median city in their sample — which happens to be Richmond, Va. They find, among other things, that employment growth in Richmond would be much lower, since workers would tend to move toward the high-productivity cities. Another way of looking at this result is that cities like Richmond benefit from excessive LURs in high-productivity locations.

Within the Fifth District, there is a wide range in the intensity of LURs at the local level. (See table.) Maryland, D.C., and Virginia show the highest regulatory intensity levels. They are followed, in decreasing order, by North Carolina, South Carolina, and West Virginia. In fact, the last two are among the states with the lowest WRI values — that is, the least restrictive LURs.

The approach suggested by Glaeser and Gyourko to assess the impact of LURs tells a similar story. The figures show the evolution of home prices and residential land prices for Maryland, D.C., and Virginia, the three cases with the highest regulatory intensity in the Fifth District. (See chart.) The indices of real home prices and residential land prices are constructed by Morris Davis of Rutgers University and Jonathan Heathcote of the Minneapolis Fed. The data, reported by the Lincoln Institute of Land Policy, indicate that changes in home prices are largely driven by changes in the price of land for the three cases presently examined. The cost of land as a proportion of the value of the home is also the highest in those places: The share of land costs is 78 percent in Washington, D.C., 48 percent in Maryland, and 38 percent in Virginia. While a number of locations in D.C., Maryland, and Virginia are moderately constrained by the amount of land that could be developed, which could explain part of the price behavior, the WRI seems to indicate that LURs play a much more important role than geography in restricting housing availability in those jurisdictions.

CHART: Real Home and Residential Land Prices — Maryland, District of Columbia, and Virginia. Price index (1980=100) of land prices and home prices for MD, DC, and VA, 1980-2016.
SOURCE: Author's calculations using data from Davis, Morris A., and Jonathan Heathcote. "The Price and Quantity of Residential Land in the United States." Journal of Monetary Economics, 2007, vol. 54, no. 8, pp. 2595-2620; data located at Land and Property Values in the U.S., Lincoln Institute of Land Policy, http://www.lincolninst.edu/resources/

Conditional Zoning in Virginia

One type of LUR largely used by local governments in Virginia is conditional zoning, or proffers. State legislation in Virginia allows a landowner proposing rezoning to perform an act or donate money, land, or services to a locality to compensate for the effects generated by such rezoning, such as the need for new infrastructure.
Conditional Zoning in Virginia

One type of LUR widely used by local governments in Virginia is conditional zoning, or proffers. State legislation in Virginia allows a landowner proposing a rezoning to perform an act or donate money, land, or services to a locality to compensate for the effects generated by the rezoning, such as the need for new infrastructure. When a local authority accepts cash proffers, the locality has to begin work on the agreed construction or improvement within 12 years of receiving full payment. Even though state legislation entitles all jurisdictions to adopt some kind of conditional zoning, not every locality is eligible to accept cash proffers.

Cash proffers are given for various purposes; in fiscal year 2016-2017, the most important were road and other transportation improvements (43 percent), schools (26 percent), and fire, rescue, and public safety (13 percent). From 2000 until the beginning of the financial crisis, the use of cash proffers increased along with the number of localities involved. (See chart.) The collection of cash proffers and the average amount collected per locality have increased significantly since 2011.

[Chart: Cash Proffers in Virginia. Collections and expenditures in $millions (left axis) and total localities (right axis), 2000-2016. Source: "Report on Proffered Cash Payments and Expenditures By Virginia's Counties, Cities and Towns 2016-2017," Commission on Local Government, Commonwealth of Virginia, November 2017.]

Even though the use of cash proffers was originally intended to serve a specific purpose, namely to address the potential negative external effects of rezoning an area, they have become, de facto, a very powerful growth management tool. Shannon McKay, research manager in the Community Development department at the Richmond Fed, has extensively studied the role of cash proffers in Virginia, focusing on how they have affected the growth of localities. In her work, she specifically examines the extent to which granting local governments the ability to collect cash proffers restricts local housing availability. She conducted a regression analysis that evaluates how cash proffer activity in a given year affects housing supply in a subsequent year. The analysis exploits the fact that over the years there has been some variation in the number of localities eligible to accept cash proffers in Virginia. The study's main conclusion is that past cash proffer revenue actually reduces housing development in subsequent periods.
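To make the design concrete, here is a minimal sketch of what such a lagged panel regression can look like in Python. The file and column names are hypothetical, and the specification is illustrative rather than a reproduction of McKay's data, controls, or estimation choices:

```python
# Minimal sketch of a lagged panel regression in the spirit of the study
# described above. The file and column names ('va_proffers.csv', 'locality',
# 'year', 'permits', 'proffer_revenue') are hypothetical.

import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("va_proffers.csv").sort_values(["locality", "year"])

# Housing supply this year is regressed on LAST year's proffer revenue.
df["proffer_rev_lag"] = df.groupby("locality")["proffer_revenue"].shift(1)
df = df.dropna(subset=["proffer_rev_lag"])

# Locality fixed effects absorb time-invariant local traits; year fixed
# effects absorb statewide shocks. Identification leans on variation over
# time in which localities are eligible to accept (and do accept) proffers.
model = smf.ols("permits ~ proffer_rev_lag + C(locality) + C(year)", data=df)
result = model.fit(cov_type="cluster", cov_kwds={"groups": df["locality"]})

print(result.params["proffer_rev_lag"])  # negative => less building later
```

A negative coefficient on the lagged proffer-revenue term is the pattern consistent with the study's conclusion that past proffer collections are followed by less housing development.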
While rules and standards are necessary to generate the best possible urban life, there is always the risk of shifting toward an excessively regulated environment in which the cost of the regulations overshadows their intended objectives. The challenge is, of course, to determine what kind of minimal regulation would be necessary to ensure a pleasant and, at the same time, productive environment without imposing unwarranted costs on both the local and the aggregate economy.
EF

State Data, Q3:17

                                            DC       MD       NC       SC       VA      WV
Nonfarm Employment (000s)                789.9  2,728.1  4,423.1  2,089.8  3,955.8   745.0
  Q/Q Percent Change                       0.0      0.2      0.4      0.2      0.1     0.2
  Y/Y Percent Change                       0.9      1.0      1.6      1.3      0.9     0.0
Manufacturing Employment (000s)            1.3    106.6    467.9    241.2    233.9    46.6
  Q/Q Percent Change                       0.0     -0.1      0.1      0.3     -0.1    -0.2
  Y/Y Percent Change                       8.3      0.8      0.7      2.2      0.6    -0.6
Professional/Business Services
  Employment (000s)                      166.6    444.0    616.1    276.9    731.6    66.7
  Q/Q Percent Change                       0.2     -0.1      0.2      0.0      0.4     0.1
  Y/Y Percent Change                       1.1      0.5      1.5      2.3      2.0     1.3
Government Employment (000s)             239.9    503.8    737.6    366.9    717.7   153.7
  Q/Q Percent Change                      -0.5     -0.3      0.8      0.6      0.1    -0.1
  Y/Y Percent Change                      -0.3      0.0      0.9      0.8      0.3    -0.9
Civilian Labor Force (000s)              401.3  3,224.8  4,956.3  2,315.8  4,318.5   779.2
  Q/Q Percent Change                       0.0      0.1      0.5      0.3      0.3     0.4
  Y/Y Percent Change                       1.7      1.3      1.9      0.9      1.6    -0.4
Unemployment Rate (%)                      6.1      4.0      4.4      4.2      3.7     5.2
  Q2:17                                    6.2      4.1      4.5      4.2      3.8     5.0
  Q3:16                                    6.0      4.4      5.0      4.8      4.2     6.0
Real Personal Income ($Bil)               47.6    320.4    395.3    180.2    408.7    61.4
  Q/Q Percent Change                       0.9      0.3      0.1      0.0      0.5     1.2
  Y/Y Percent Change                       1.3      1.1      2.0      1.5      1.3     1.2
New Housing Units                        1,221    4,814   18,040    8,987    8,161     706
  Q/Q Percent Change                      13.0      0.3     23.5      3.6     -2.6    -7.6
  Y/Y Percent Change                     -24.1     47.0      9.9      4.3      1.6     0.6
House Price Index (1980=100)             856.3    465.7    364.5    371.1    451.8   232.9
  Q/Q Percent Change                       1.5      0.9      1.0      1.7      0.4     0.1
  Y/Y Percent Change                       8.2      3.6      6.1      5.9      3.7     1.3

NOTES:
1) FRB-Richmond survey indexes are diffusion indexes representing the percentage of responding firms reporting increase minus the percentage reporting decrease. The manufacturing composite index is a weighted average of the shipments, new orders, and employment indexes.
2) Building permits and house prices are not seasonally adjusted; all other series are seasonally adjusted.
3) Manufacturing employment for DC is not seasonally adjusted.

SOURCES:
Real Personal Income: Bureau of Economic Analysis/Haver Analytics
Unemployment Rate: LAUS Program, Bureau of Labor Statistics, U.S. Department of Labor/Haver Analytics
Employment: CES Survey, Bureau of Labor Statistics, U.S. Department of Labor/Haver Analytics
Building Permits: U.S. Census Bureau/Haver Analytics
House Prices: Federal Housing Finance Agency/Haver Analytics
[Charts: Nonfarm Employment (change from prior year), Unemployment Rate, and Real Personal Income (change from prior year), Fifth District and United States, Third Quarter 2006 - Third Quarter 2017.]

[Charts: Nonfarm Employment and Unemployment Rate for major metro areas (Charlotte, Baltimore, Washington), and New Housing Units (change from prior year), Third Quarter 2006 - Third Quarter 2017.]

[Charts: FRB-Richmond Services Revenues Index, FRB-Richmond Manufacturing Composite Index, and House Prices (change from prior year, Fifth District and United States), Third Quarter 2006 - Third Quarter 2017.]

Metropolitan Area Data, Q3:17

                             Washington, DC   Baltimore, MD   Hagerstown-Martinsburg, MD-WV
Nonfarm Employment (000s)         2,683.1         1,403.2          105.0
  Q/Q Percent Change                 -0.3             0.1           -0.2
  Y/Y Percent Change                  1.4             1.1            0.1
Unemployment Rate (%)                 3.7             4.0            4.1
  Q2:17                               3.7             4.2            3.8
  Q3:16                               3.8             4.3            4.5
New Housing Units                   6,566           2,293            358
  Q/Q Percent Change                 -1.8            26.5           17.0
  Y/Y Percent Change                  3.8            77.8           50.4

                             Asheville, NC    Charlotte, NC    Durham, NC
Nonfarm Employment (000s)           190.0         1,178.9          310.2
  Q/Q Percent Change                 -0.3            -0.1           -0.4
  Y/Y Percent Change                  1.5             2.8            1.4
Unemployment Rate (%)                 3.5             3.9            3.7
  Q2:17                               3.5             4.2            4.0
  Q3:16                               4.1             4.7            4.4
New Housing Units                     826           6,651          1,134
  Q/Q Percent Change                  5.9            57.6           -4.8
  Y/Y Percent Change                 67.2             2.4            4.4

                             Greensboro-High Point, NC   Raleigh, NC   Wilmington, NC
Nonfarm Employment (000s)           356.2           618.9          126.5
  Q/Q Percent Change                 -1.1             0.7           -0.2
  Y/Y Percent Change                  0.1             2.9            1.7
Unemployment Rate (%)                 4.4             3.6            3.9
  Q2:17                               4.7             3.9            4.2
  Q3:16                               5.2             4.3            4.8
New Housing Units                     746           3,308            477
  Q/Q Percent Change                 -8.1            -9.8           -6.8
  Y/Y Percent Change                 -1.6           -16.6           49.1
                             Winston-Salem, NC   Charleston, SC   Columbia, SC
Nonfarm Employment (000s)           261.8           354.5          394.6
  Q/Q Percent Change                 -0.6            -0.1           -1.1
  Y/Y Percent Change                  0.6             2.1           -0.2
Unemployment Rate (%)                 4.0             3.3            3.8
  Q2:17                               4.3             3.4            3.8
  Q3:16                               4.8             4.0            4.4
New Housing Units                   1,264           1,644          1,246
  Q/Q Percent Change                106.9            -0.8          -16.7
  Y/Y Percent Change                239.8           -11.8            5.2

                             Greenville, SC   Richmond, VA   Roanoke, VA
Nonfarm Employment (000s)           414.3           673.5          159.6
  Q/Q Percent Change                 -0.2             0.0           -0.5
  Y/Y Percent Change                  1.2             1.6           -1.0
Unemployment Rate (%)                 3.6             3.8            3.8
  Q2:17                               3.6             3.9            3.8
  Q3:16                               4.3             4.2            4.1
New Housing Units                   1,611           1,875            N/A
  Q/Q Percent Change                 22.0            21.3            N/A
  Y/Y Percent Change                 -1.8            48.2            N/A

                             Virginia Beach-Norfolk, VA   Charleston, WV   Huntington, WV
Nonfarm Employment (000s)           786.1           117.0          138.0
  Q/Q Percent Change                  0.1            -0.3           -0.6
  Y/Y Percent Change                  1.0            -1.1            0.5
Unemployment Rate (%)                 4.2             5.2            5.6
  Q2:17                               4.3             4.7            5.5
  Q3:16                               4.7             5.8            6.2
New Housing Units                   1,242              47             32
  Q/Q Percent Change                -25.4             0.0            0.0
  Y/Y Percent Change                -43.9             0.0            0.0

NOTE: Nonfarm employment and new housing units are not seasonally adjusted. Unemployment rates are seasonally adjusted.

For more information, contact Michael Stanley at (804) 697-8437 or e-mail michael.stanley@rich.frb.org

OPINION

TFP, Prosperity, and the FOMC
BY KARTIK ATHREYA

We have seen positive news over the past year or so for a variety of economic variables, including employment, consumption, investment, and overall GDP growth. But one lesser-known economic number of great significance has remained stubbornly weak: total factor productivity, or TFP. It represents, in effect, the efficiency with which we're able to convert inputs — capital and labor — into outputs. Thus, it reflects the state of the art of our technology and the capabilities of our labor force at a given moment in time. TFP growth rarely makes headlines, but its slowdown since before the Great Recession is worthy of attention — both for what it means for our future standard of living and what it may mean for monetary policy.

Productivity growth in this country has gone through distinct high and low periods during the post-World War II era. According to research from the San Francisco Fed, TFP grew an average of 2.0 percent annually from 1960 to 1973. Then its growth declined to a quarter of that, averaging 0.5 percent from 1974 to 1994. A partial rebound in productivity growth to 1.6 percent followed from 1995 to 2004, probably driven by information technology (including the Web) and the movement of factory work overseas. But since then, productivity growth has been back in the doldrums, again averaging just 0.5 percent through 2017.

One reason this trend is troubling is that economists very broadly agree that improvements in TFP are the only boosters of long-run economic growth. To put it differently, sustained (or long-run) growth in per capita economic output, the usual measure of our material well-being, is exclusively determined by TFP growth. These improvements may come, of course, from advances in technology, ranging from the steam engine to the moving assembly line to semiconductors.

It's important to distinguish TFP from its similar-sounding relative, labor productivity. Labor productivity is defined as the value of output produced per hour of work. It depends on all the forces that affect a worker's ability to produce output. Thus, growth in labor productivity can come from increases in the equipment that workers have or from TFP growth. In contrast, TFP growth means a worker with the same level of equipment as before can produce more than before. Thus, while individual firms have relatively straightforward ways to influence labor productivity by changing their use of inputs, increases in TFP reflect more fundamental changes in the economy's ability to turn inputs into output.
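In the textbook growth-accounting framework, TFP is measured as a residual: the part of output growth not explained by input growth. A standard Cobb-Douglas formulation (a generic textbook version, not the specific methodology behind the San Francisco Fed series cited above) makes this concrete:

```latex
% Textbook Cobb-Douglas growth accounting: output Y, TFP A, capital K,
% labor L, capital share alpha. TFP growth is the residual.
\[
Y = A\,K^{\alpha}L^{1-\alpha}
\qquad\Longrightarrow\qquad
\frac{\Delta A}{A} \approx \frac{\Delta Y}{Y}
  - \alpha\,\frac{\Delta K}{K}
  - (1-\alpha)\,\frac{\Delta L}{L}.
\]
% Hypothetical example: with alpha = 0.35, output growth of 2.5 percent,
% capital growth of 3 percent, and labor growth of 1 percent, TFP growth
% is roughly 2.5 - 0.35(3) - 0.65(1) = 0.8 percent.
```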
Whatever its underlying origins, the new normal of lowered TFP growth has implications for monetary policy that are less than obvious. It turns out that in the long run, "real" interest rates — that is, inflation-adjusted interest rates — are greatly influenced by TFP growth. To understand why, remember what real interest rates represent: the strength of our preference for spending today versus spending later. The more we prefer spending today, the more we're willing to pay (through a higher interest rate) to borrow for today; conversely, it is the reward we require for postponing spending until a year from now, or whatever the term of the bond or savings account we're considering.

Here's where TFP comes into the picture: One influence on our desire for present spending is our belief about our future standard of living. We tend to want to maintain a stable lifestyle over time. So if we believe we're going to be richer in the future, we'll commonly opt for a little more spending today (via borrowing) and leave it to our well-to-do future selves to pay it back. When TFP growth is high, we are indeed going to be richer in the future — on average, of course. But if we all try to spend more now in anticipation of this rosy future, the price of current spending — the interest rate — will rise. Today, we're in the opposite situation: Low TFP growth implies low real interest rates, all other things equal.

TFP growth matters directly for monetary policy because the Fed aims to track the underlying real interest rate in the economy. Given a determination on the part of the Fed to target inflation at a long-term average of 2 percent — the target that the Federal Open Market Committee announced in January 2012 — low real rates in turn imply low policy rates, such as those we've seen for some years. (Keep in mind that the policy rate is a nominal rate and is set to track the underlying real rate plus the targeted rate of inflation.)
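Two standard textbook relationships summarize this chain from TFP growth to the policy rate. They are stylized, not a literal description of the FOMC's procedure; rho (impatience), sigma (the desire to smooth consumption over time), and g (per capita consumption growth, pinned down in the long run by TFP growth) are model parameters:

```latex
% Stylized long-run relationships (illustrative, not the FOMC's rule).
% Steady-state Euler equation: the real rate r reflects impatience rho
% plus sigma times expected per capita consumption growth g:
\[
r \approx \rho + \sigma g
\]
% Fisher relation: the nominal policy rate i tracks the real rate plus
% targeted inflation pi*:
\[
i = r + \pi^{*}
\]
% Hypothetical example: if low TFP growth pulls g down so that r is
% 0.5 percent, a 2 percent inflation target implies a nominal policy rate
% near 2.5 percent, leaving little room above the zero lower bound.
```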
The possibility that real rates will remain low in the future by virtue of low TFP growth increases the chance that the short-term nominal interest rates we set will hover near the "zero lower bound" on interest-rate policy — and hence that the Fed will once again need to rely on unconventional monetary policy, such as quantitative easing, to respond to a future downturn in the economy. Additionally, prolonged periods of low interest rates are a potential cause for concern given their elevation of asset prices and the incentives they create for financial institutions to reach for returns for their owners and clients. These risks are another example of how developments in the real economy shape the policy choices available to central bankers.
EF

Kartik Athreya is executive vice president and director of research at the Federal Reserve Bank of Richmond.

NEXT ISSUE

Cash: Still King?
Despite the proliferation of electronic payment options, the amount of physical dollars in circulation has continued to grow year after year. Some economists predict that eventually most economies will go cashless, but does cash still offer benefits that digital alternatives lack? The experiences of some countries may offer clues about the trade-offs of going cashless.

Inflation
When producers' costs rise, they charge consumers higher prices, which leads to higher inflation, right? Not necessarily. The relationship between the Producer Price Index (PPI) and the Consumer Price Index (CPI) is complicated, with implications for policymakers.

Trade and the Fifth District
Recent policy discussions have placed international trade in the spotlight. Join our exploration of what the Fifth District trades with the rest of the world and how changes in trade policy might affect businesses across the district.

Federal Reserve
Meet Ferbus, the Fed's main computer model of the economy, more formally known as "FRB/US." Computer models of the economy are indispensable to central bankers and to macroeconomists in general. What is the role of Ferbus and other computer models in Federal Open Market Committee policymaking?

Economic History
The role of information frictions is seen as increasingly important in understanding the economy. Even before the Internet era, technological innovation played a key role in reducing these frictions. The introduction in 1866 of transatlantic telegraph service is a case in point, bringing especially pronounced changes to the U.S. cotton industry.

Interview
Chad Syverson of the University of Chicago on whether the productivity slowdown is real or due to mismeasurement, why some firms are much more efficient than others within the same industries, and how learning by doing works in practice.

Visit us online: www.richmondfed.org
• To view each issue's articles and Web-exclusive content
• To view related Web links of additional readings and references
• To subscribe to our magazine
• To request an email alert of our online issue postings

Federal Reserve Bank of Richmond
P.O. Box 27622
Richmond, VA 23261
Change Service Requested

To subscribe or make subscription changes, please email us at research.publications@rich.frb.org or call 800-322-0565.

Econ Focus in 2017

Fourth Quarter 2017 (Vol. 22, No. 4)
Cover story: Medicine Markup. Americans pay a lot for prescription drugs. Does that mean we pay too much?
Interview: Jean Tirole, Toulouse School of Economics

Second Quarter 2017 (Vol. 22, No. 2)
Cover story: Can "Sin Taxes" Be Good for Your Health and the Economy? Soda taxes – the latest example – are gaining favor.
Interview: Jesse Shapiro, Brown University

Third Quarter 2017 (Vol. 22, No. 3)
Cover story: Cyberattacks and the Digital Dilemma. Recent high-profile hacks have renewed calls for improved security, but competing incentives pose a challenge.
Interview: Douglas Irwin, Dartmouth College

First Quarter 2017 (Vol. 22, No. 1)
Cover story: The Missing Boomerang Buyers. Does it matter whether those who lost their homes during the crisis come back to the housing market?
Interview: Janet Currie, Princeton University

Visit: https://www.richmondfed.org/publications/research/econ_focus