FIRST QUARTER 2011 THE FEDERAL RESERVE BANK OF RICHMOND VOLUME 15 NUMBER 1 FIRST QUARTER 2011 COVER STORY 12 What Drives Changes in Economic Thought? Why economists study what they do — and how the crisis might change it Were economists caught by surprise by the financial crisis because the profession’s models and dominant schools of thought are misguided? If so, how did it come to be that way? Will the financial crisis change things? Rather than a fundamental shift in economic thought, many economists seem to hope for a humbler view of what economics can teach us about the world. Our mission is to provide authoritative information and analysis about the Fifth Federal Reserve District economy and the Federal Reserve System. The Fifth District consists of the District of Columbia, Maryland, North Carolina, South Carolina, Virginia, and most of West Virginia. The material appearing in Region Focus is collected and developed by the Research Department of the Federal Reserve Bank of Richmond. DIRECTOR OF RESEARCH FEATURES John A. Weinberg 18 EDITOR Tobacco Stimulus Aaron Steelman Virginia and North Carolina are among the states using money from their 1998 settlement with tobacco companies to spur economic development. MANAGING EDITOR Kathy Constant STA F F W R I T E R S 21 Flexible Workforce: The role of temporary employment in recession and recovery Many economists see temp employment as a buffer during recessions and a harbinger of direct hiring during recoveries. How strong is the current preference for temp workers — and does that preference have long-term implications? Renee Courtois Haltom Betty Joyce Nash David A. Price Jessie Romero CONTRIBUTORS Charles Gerena Becky Johnsen Ann Macheras Sonya Ravindranath Waddell DESIGN 24 Benefits and Burdens of Expanded Military Bases Many military bases in the Fifth District are expanding as a result of the 2005 Base Realignment and Closure Plan, scheduled for completion by mid-September. Economic benefits may loom in the long run, but the influx may cause short-term pain as communities cope with crowded schools and congested roads. DEPARTMENTS 1 President’s Message/Don’t I Buy Food or Gasoline? 2 Upfront/Regional News at a Glance 6 Federal Reserve/Stigma and the Discount Window 9 Policy Update/A Pinch of Basel? 10 Jargon Alert/Market Failure 1 1 Research Spotlight/The Price of Avoiding Minor Financial Loss 28 Interview/Joel Slemrod 34 Economic History/Virginia and the Final Frontier 37 Around the Fed/The Decision to Export 39 Book Review/Bourgeois Dignity 40 District Digest/Economic Trends Across the Region 48 Opinion/The Financial Crisis and the Practice of Economics PHOTO/ILLUSTRATION: GETTY IMAGES BIG (Beatley Gravitt, Inc.) Published quarterly by the Federal Reserve Bank of Richmond P.O. Box 27622 Richmond, VA 23261 www.richmondfed.org Subscriptions and additional copies: Available free of charge through our Web site at www.richmondfed.org/publications or by calling Research Publications at (800) 322-0565. Reprints: Text may be reprinted with the disclaimer in italics below. Permission from the editor is required before reprinting photos, charts, and tables. Credit Region Focus and send the editor a copy of the publication in which the reprinted material appears. The views expressed in Region Focus are those of the contributors and not necessarily those of the Federal Reserve Bank of Richmond or the Federal Reserve System. ISSN 1093-1767 PRESIDENT’S MESSAGE Don’t I Buy Food or Gasoline? 
ecent increases in commodity and energy prices mean that Americans are paying more at the grocery store and at the gas pump — at the same time that many families are still trying to regain their footing after the recession. In this environment, it is understandable to ask why many economists, including those within the Federal Reserve System, frequently seem to focus on “core” inflation, which excludes often volatile food and energy prices, rather than on “headline” inflation, which does include those items that are so important to households. In fact, from time to time I get asked, don’t I buy food or gasoline? The answer is yes, indeed, I do. And I, as well as others within the Federal Reserve System, pay attention to the prices of those goods — both when we purchase them and also when we consider monetary policy. In fact, the Fed’s mandate is to ensure the long-term stability of the overall price level, which means that headline inflation is our ultimate concern. But in the short term, we must be careful that the tools of monetary policy — which have a lagged effect on the economy — are not applied in reaction to temporary price changes. The level of core inflation is not a goal in and of itself; rather, it is a means to the end of determining the most appropriate policies for the long run. That said, core inflation is not a perfect predictor of underlying inflation trends, and its correlation with overall inflation may depend on how it is measured. Moreover, recent research suggests that looking only at core inflation may understate the decline in purchasing power actually experienced by households during previous periods. The two most common measures of inflation are the consumer price index (CPI) and the personal consumption expenditures (PCE) price index. I tend to prefer the PCE measure, which is based on data from the CPI but includes different weighting methods that make it more consistent over time. Since 2000, the PCE has also been the preferred measure used by the Federal Open Market Committee (FOMC). Because inflation numbers often run hot or cold for several months at a time, the Richmond Fed looks closely at year-over-year headline PCE inflation to evaluate economic conditions in the Fifth District and the nation. Headline inflation is also an important component of the longer-term economic forecast each Reserve Bank president prepares prior to FOMC meetings. That does not mean, however, that core projections aren’t useful too. All members of the Committee evaluate those measures as well, especially when looking at more near-term conditions. Currently, inflation forecasts are consistent with our R price stability mandate, and the market’s expectations reflect that commitment. But as I noted in my last column in Region Focus, signs that the recovery is strengthening may mean that we will need to exit from our current accommodative monetary policy in the near future. And upward pressure on energy and commodity prices must be monitored carefully. Monetary policymakers cannot influence the relative price of oil, of course, but we can and must keep a close eye on whether distributors begin passing along higher input prices to consumers. If headline changes prove to be more persistent than previously expected, we must be vigilant that they not become embedded in expectations. 
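As a concrete illustration of the measurement distinction discussed above, the short Python sketch below computes year-over-year inflation rates from price index levels for both a headline and a core (excluding food and energy) index. The index values are hypothetical placeholders, not actual CPI or PCE figures.

```python
# Illustrative sketch only: the index levels below are made-up placeholders,
# not actual CPI or PCE data.

def yoy_inflation(index_now, index_year_ago):
    """Percent change in a price index over the past 12 months."""
    return (index_now / index_year_ago - 1.0) * 100.0

# Hypothetical index levels for the same month, one year apart (base period = 100).
headline_year_ago, headline_now = 110.0, 112.9   # includes food and energy
core_year_ago, core_now = 108.0, 109.6           # excludes food and energy

print(f"Headline inflation: {yoy_inflation(headline_now, headline_year_ago):.1f}% year over year")
print(f"Core inflation:     {yoy_inflation(core_now, core_year_ago):.1f}% year over year")
# A persistent gap between the two usually reflects swings in food and energy prices,
# which is why policymakers watch both measures.
```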
The lesson to be drawn from this discussion is that no single measure of inflation is “wrong.” The Fed’s mandate means that we should choose the best tools available to determine the appropriate policies to achieve long-term stability of the overall price level. In short, we are committed to fostering an economic environment in which households and businesses can make the investment and savings decisions that will promote their well-being and the well-being of the nation’s economy as a whole. That means taking into account the prices of all goods — including food and energy — when considering the likely path of inflation and which policies we should pursue as a result of that evaluation. RF JEFFREY M. LACKER PRESIDENT FEDERAL RESERVE BANK OF RICHMOND Region Focus | First Quarter | 2011 1 UPFRONT Regional News at a Glance The Hipp 1914 Theater Comes to Life in Richmond The Hippodrome Theater will open this spring in Jackson Ward, a historic neighborhood in Richmond. The event caps a decade’s worth of planning and investment and construction. In a city that already boasts several downtown live-music venues, the Hippodrome stands apart. “The Hipp” is located in a National Historic Landmark. Jackson Ward has been a predominantly black community since the early 20th century, and became nationally recognized for its economic and cultural vibrance. In particular, 2nd Street boasted a number of theaters, earning the community the nickname “Harlem of the South.” In 1914, the Hippodrome Theater opened to the community, presenting both vaudeville acts and films. The theater was part of the “Chitlin’ Circuit,” a group of venues in the eastern and southern United States where The Hippodrome’s new marquee brightens 2nd Street on the evening of Mayor Jones’ State of the City address. black entertainers could perform when racial segregation was prevalent. After its heyday in the 1930s and 1940s, the Hippodrome showcased popular acts such as Ray Charles, Ella Fitzgerald, Ethel Waters, and James Brown. In 1945, the building caught fire, and it was rebuilt as a movie theater. In 1970, the Hippodrome became a church, and was rarely used after 1982. The historic venue was not completely forgotten, however. Richmond developer Ronald Stallings inherited the property from his father. His dream has been to reopen the Hipp as an elegant nightclub, a venue for rhythm and blues, jazz, and soul artists. Stallings is the president of Walker Row Partnership, which has renovated or redeveloped dozens of properties in Jackson Ward. Its mission is to provide Jackson Ward with places to “work, shop, live, and play,” all in keeping with “New Urbanism,” an urban design movement that promotes walkable mixed-use communities. After years of planning, Stallings secured financing for the ambitious project in October 2009. Construction took a year and was completed in February. The complex includes a theater, two restaurants, retail space, and 29 apartments. The project cost about $12 million, with $8 million of that going toward the theater renovation. The project received $600,000 from the city and federal funds. In early February, Richmond Mayor Dwight Jones delivered his second State of the City address in the newly transformed theater. For Jones, whose address promoted “creating healthy and sustainable communities,” it was an ideal setting. —BECKY JOHNSEN Snow Job PHOTOGRAPHY: COURTESY OF AMY AND RON STALLINGS Winter Can Cost, but Also Benefit 2 veryone loves a snow day, except the people paying for the cleanup: taxpayers. 
Through the end of February, North Carolina had spent $55 million clearing snow and ice from state roads. The annual budget for snow removal is $30 million, more than enough in a typical year. But when big storms hit, as they have the last two winters, the state has to pull money from projects E Region Focus | First Quarter | 2011 such as pothole repair or tree trimming, according to Julia Merchant, spokesperson for the state’s Department of Transportation. Those choices are tough, especially when money is already tight. The South Carolina Legislature passed a special bill this year that exempts school districts from making up all of the snow days; the districts can’t afford them. States pay for snowplows, salt for the roads, and overtime pay for the plow drivers. Virginia stocked up on 48,000 tons of sand and 281,000 tons of salt this winter. But the budgets don’t come close to the total cost of a major storm. The economic impact of a one-day shutdown includes lost wages, slow retail sales, and less tax revenue. The total can be in the hundreds of millions of dollars, according to a study of 16 states by IHS Global Insight for the American Highway Users Alliance. In Maryland and Virginia, IHS estimates the impact at $183.5 million and $260 million, respectively. The federal government loses $71 million per day when it closes, as it did for an unprecedented four straight days during last year’s “snow-pocalypse.” And those estimates don’t include property damage, crop loss, car crashes, repairing power lines, or flight delays, all of which add billions to the cost of winter storms. When government offices are closed, people can’t file for unemployment benefits. A major storm that hit Alabama, Georgia, North Carolina, and South Carolina in January contributed to the biggest drop in unemployment filings in nearly one year, but when offices reopened the following week claims increased by 51,000. Bad weather may not affect the unemployment rate, since workers who were paid during any part of the survey period are A snowplow clears the Cherohala Skyway in western North Carolina. counted as employed. But it does affect the number of hours people work, particularly in the construction industry. Not everyone loses money when it snows. “When weather is in the news, it is very good for business,” says Scott Bernhardt, Chief Operating Office of Planalytics, a company that models the economic impact of weather for retailers, manufacturers, and utilities, among others. Snowplow manufacturers and salt suppliers do well during a snowy winter, but some retail stores also benefit. “People grocery shop like crazy. People order pizzas. Convenience stores sell out of just about everything,” Bernhardt says. Still, most of the lost sales due to a storm don’t get made up, particularly in regions where snow is uncommon, according to Bernhardt: “When snow falls in the South, it’s a net loss.” —JESSIE ROMERO Urban Growth D.C. Population Reverses 60-Year Decline n the 20th century, New Deal policies and World War II added to the federal payroll and propelled the District of Columbia’s population to its peak of 802,000 people in 1950. After that, D.C.’s population declined for decades — until the most recent census, when D.C. grew by nearly 30,000 people, or roughly 5 percent. “The overall numbers have a story to tell us about exactly how much the population has increased and how rapidly,” says Harriet Tregoning, the director of the D.C. Office of Planning. 
“We haven’t seen growth like this since World War II and this hasn’t been a time of remarkable mobility.” The population in 2010 reached 601,723. Fully PHOTOGRAPHY: NORTH CAROLINA DEPARTMENT OF TRANSPORTATION I 12,000 of those people came in the past two years, 10,000 in 2009 alone. Though the data explaining the growth have yet to be published, the rebound likely reflects multiple factors: government growth following the terrorist attacks of Sept. 11, 2001, as well as a general trend of young people and empty nesters moving into cities. From 2007 through 2009, some 4,685 people aged 25 to 34 moved into Washington along with 5,798 people aged 60 to 64, according to American Community Survey (ACS) data released late last year. “This follows the anecdotal information we’re getting,” Tregoning says. “There are particular areas where we’re getting growth, particularly Capital Region Focus | First Quarter | 2011 3 THOUSANDS D.C. Population Rebound 900 800 700 600 500 400 300 200 100 0 1910 1920 1930 1940 1950 1960 1970 1980 1990 2000 2010 SOURCE: U.S. Census Bureau Riverfront and Columbia Heights.” The riverfront neighborhood is in the southeast on the Anacostia River, south of Capitol Hill, and Columbia Heights lies in the northwest quadrant of the city. Across from the Columbia Heights Metro station, Target and Best Buy anchor a retail complex. Transportation and vibrant neighborhoods are what Tregoning calls the “pull” factors, elements that now attract residents. “We have so many neighborhoods that are now interesting to people and good transit; they’re increasingly served by neighborhood retail,” she says. The city’s demographic composition has changed as well. ACS data indicate that the white population was estimated at nearly 39 percent in 2009, up from 34.5 percent in 2006, while the black population fell from 55 percent to 53.2 percent during the same period. (In 1980, black residents comprised roughly 70 percent of the city’s population.) The Hispanic population remained about the same — rising from 8.2 percent in 2006 to 8.8 percent in 2009. D.C. also grew the natural way — more births. Births have increased annually from about 7,600 in 2000 to more than 9,000 in 2009, with a natural increase (births minus deaths) of about 27,000 since 2000. D.C. also attracted more than 24,000 new residents from outside the United States since 2000. Possibly because of the city’s continued job growth, it also attracted domestic migration. Between 2008 and 2009, about 4,450 more people moved to the area from other states than moved away, according to the D.C. Planning Department. The job mix in D.C. remains almost one-third federal — 29 percent of its 728,300 jobs — and that may have attracted people to the area. The private sector comprises two-thirds of jobs, and D.C. government and transportation-related jobs make up the rest. While the District of Columbia’s population has increased, the U.S. population as a whole grew more slowly than during any decade since the Depression, 9.7 percent. The recession slowed immigration and birth rates, demographers say, and that may continue for the —BETTY JOYCE NASH short term. Art in the Office Businesses Learn Creativity hen times are tough, business owners need to get creative. At least three programs in the Fifth District boost problem-solving abilities through art-based activities: the Innovation Institute in Charlotte and Fun Days and Creative Meetings in Richmond. 
At Art Works Studios and Galleries in Richmond, Fun Days classes serve to reinvigorate organizations. Co-owner Glenda Kotchish says the sessions provide a creative environment where teams can interact. Kotchish, a former computer system consultant and lifelong artist, understands the power of the combined analytical and creative sides of the brain. The Fun Days host small groups, tailored to each organization; past programs include crafting album covers while listening to music, painting a group mural, and assembling handmade books. The Visual Arts Center of Richmond offers Creative Meetings. According to director of education Aimee W 4 Region Focus | First Quarter | 2011 Joyaux, the program was created in response to recent buzz about creativity in the office. “Richmond has a thriving art community and a thriving business community, but there has not been a lot of overlap between the two,” she says. A former student is Jana McQuaid, director of graduate studies at Virginia Commonwealth University’s School of Business. McQuaid participated alongside members of groups from both nonprofit and for-profit sectors, and she says the class encouraged her to reorganize her professional life to encourage creative thinking. “Our team was working on a new procedure for processing applications,” says McQuaid. “Previously I might have created a draft template and asked for everyone to submit individual written comments. Instead, this time we all met as a group with a large board and Post-It notes. We posted everyone’s ideas and we were able to really feed off of one another’s comments, coming up with a process that is much better than submitting feedback in isolation.” The Innovation Institute in Charlotte puts executives from multiple industries under the instruction of painters, sculptors, photographers, or printmakers. Barbara Spradling, the Institute’s Director and a retired bank executive, says that artists take risks. That means they fail, sometimes regularly, while corporations traditionally do not support failure. Consequently, artists have much to offer the business world, and in turn the business world pays artists to host creative classes, making this a “very symbiotic relationship.” Spradling reports that the recession increased enroll- ment in these courses as many business leaders increasingly encourage innovation. Art Works and the Visual Arts Center indicate similar success. Each has attracted wellknown clients: Members of the Virginia Senate have had their own “Fun Days;” Creative Meeting has hosted Capitol One and Wells Fargo; and the Innovation Institute sent representatives to California to teach classes to Motorola employees. The term “starving artist” was coined for those who depend upon their creativity as a livelihood. As more companies enlist artists to help stimulate innovation, creativity could become more lucrative. —BECKY JOHNSEN Coins for the Commonwealth Alternative Currency Proposal Fails PHOTOGRAPHY: FRIENDS OF BLUEMONT (VA.), COURTESY OF JUDY AND BUD ANDERSON A resolution to study the possibility of an alternative currency for Virginia may have stalled out, but it’s an idea that’s also being introduced in other state legislatures. Virginia delegate Robert Marshall, a Republican who represents parts of Prince William and Loudoun counties, recommended to the General Assembly that the Commonwealth adopt its own currency of gold and silver. 
The resolution went to a subcommittee, which voted to take no action, meaning that it will not come before the full legislature for a vote. In his resolution, Marshall argued that the Federal Reserve's monetary policies could lead to economic instability, including runaway prices. But it's unlikely that the United States would face the kind of drastic hyperinflation that would imperil the U.S. economy, says Randall Parker, an economist at East Carolina University whose work has specialized in the causes and consequences of the Great Depression. "I don't think that hyperinflation on the order of magnitude that would cease to have the dollar as a functioning medium of exchange is very likely at all." For that to happen, Parker says, inflation would have to reach Zimbabwe- or Serbia-like levels. Inflation hit 500 billion percent before Zimbabwe abandoned its currency. The Federal Reserve and professional forecasting groups project inflation in the United States between 1 percent and 2 percent for the next several years.

Marshall joins legislators in about nine other states who have made similar proposals. Bobby Franklin in Georgia introduced an act that would require the use of pre-1965 silver coins and silver and gold "American Eagle" coins to pay all debts to and for the state. John Dougall in Utah says he will propose allowing residents to mint gold and silver coins in their homes that would be accepted as part of his proposed new currency.

The last time states issued their own currencies was during the Civil War, when Virginia and other Confederate states, including North and South Carolina, needed to fund wartime operations. Several Indian Territory nations, allied with the South, also had their own currencies. The Confederacy printed money to pay for the war because it was cut off from foreign trading partners and bullion deposits in the North. It had no gold or silver. By the midpoint of the war, Southern currency was a mix of state-issued notes, Confederate notes, and private bank issues (not to mention a healthy supply of counterfeits) — all of which were worthless when the war ended. But today, a Civil War-era 100-dollar Virginia note can be worth as much as $5,000 to collectors.

[Photo caption: These one- and 10-dollar notes were issued in 1862. Virginia issued a total of $5.3 million in currency during the Civil War.]

Collectors may be interested in another of Marshall's proposals. His bill authorizing the State Treasurer to mint gold, silver, and platinum commemorative coins passed both chambers and is on its way to the governor. —JESSIE ROMERO

FEDERAL RESERVE
Stigma and the Discount Window
BY RENEE COURTOIS HALTOM

One of the primary ways central banks can stabilize the financial system in times of distress is by acting as the "lender of last resort" to financial institutions when funding dries up. Banks that face a liquidity shortage may be unable to provide depositors with the funds they wish to withdraw. At an extreme, the bank could fail. Banks facing shortages can go to the Fed's discount window, and that can help avoid unnecessary failures.

Yet banks aren't always willing to take the Fed up on this offer. During the recent financial crisis, for example, the Fed did everything it could to encourage bank borrowing, from easing lending terms to publicly urging banks to take loans if needed. But borrowing remained low in late 2007 despite severe liquidity shortages in the financial system.

A common explanation for the reluctance of banks to borrow from the Fed is a "stigma" attached to the discount window. This stigma is based on the notion that only a bank in financial trouble would go to the Fed over other, cheaper sources of funds. Banks are believed to fear that regulators, investors, or other banks will assume the worst if the bank is discovered to have borrowed from the Fed. There can be perfectly benign reasons for accessing the discount window: a bank that receives a large withdrawal too late in the day to locate a private lender, for example. The problem is that such stigma, if present, may hamper the Fed's ability to provide liquidity in a crisis.

Other institutions will not necessarily know when a bank borrows from the discount window. Banks may sometimes be able to figure out the identities of specific borrowers, but only the total amount of borrowing from each regional Federal Reserve district is made public; those data are published weekly in the Fed's H.4.1 release. That's about to change. The Dodd-Frank financial reform legislation passed in the summer of 2010 requires the Fed to publicize the names of all banks that borrow from the discount window and the total amount borrowed, two years after that borrowing takes place. It is too soon to tell whether the certainty that discount window loans will be made public will further dissuade banks from borrowing in times of need.

[Photo caption: In its early days, the discount window was a physical window located within each regional Federal Reserve Bank, as shown in this 1960s photo of the window inside the New York Fed. Photography: Curating Section, Federal Reserve Bank of New York.]

Borrowing From Dad
Three types of loans are offered through the discount window. Stigma is usually discussed in the context of primary credit, which is available to healthy financial institutions. This is the Fed's principal means of adding liquidity to the banking system. At the height of the financial crisis in October 2008, the Fed granted a weekly average of $111 billion in primary credit, a record (the previous record was about $12 billion for the week of Sept. 11, 2001).

It is difficult to prove that stigma exists. Stigma would manifest itself through banks not borrowing from the Fed. However, it would be difficult to distinguish that from the fact that financial institutions usually have viable funding alternatives that are cheaper. Banks rely most heavily on other banks for short-term funding through the federal funds market.

Banks have to keep a certain amount of cash, known as reserves, on hand according to the Fed's reserve requirements, equal to 10 percent of total deposits in most cases. But since a bank's depositors withdraw their funds at will, the amount of reserves on hand fluctuates from day to day. There's an opportunity cost for holding "excess" reserves — banks could lend those funds out and earn interest — so banks generally try to minimize the amount they hold. (The Fed started paying interest on reserves in late 2008, which lessens that opportunity cost some.)

That's where the fed funds market comes in. Banks that have an excess supply of funds become lenders, and banks that need to fill a sudden shortfall become borrowers. Banks have existing legal agreements in place, and simply call each other up when they want to trade funds.
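A rough sketch of the arithmetic just described: the snippet below applies the 10 percent reserve ratio cited above to two hypothetical banks and shows which would likely lend and which would likely borrow in the fed funds market. The bank names and balance sheet figures are invented for illustration.

```python
# Illustrative sketch of reserve requirements and the fed funds market.
# All figures are hypothetical; 10% is the "most cases" ratio cited above.

RESERVE_RATIO = 0.10  # required reserves as a share of total deposits

banks = {
    # name: (total deposits, reserves actually on hand), in $ millions
    "Bank A": (500.0, 62.0),   # holding more than required
    "Bank B": (800.0, 71.0),   # caught short after large withdrawals
}

for name, (deposits, reserves_on_hand) in banks.items():
    required = RESERVE_RATIO * deposits
    gap = reserves_on_hand - required
    if gap >= 0:
        print(f"{name}: needs ${required:.0f}M, holds ${reserves_on_hand:.0f}M; "
              f"${gap:.0f}M in excess reserves it could lend in the fed funds market")
    else:
        print(f"{name}: needs ${required:.0f}M, holds ${reserves_on_hand:.0f}M; "
              f"must borrow ${-gap:.0f}M, for example in the fed funds market")
```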
That's why it wouldn't be difficult for other financial institutions to identify discount window borrowing, says Becky Snider, who oversees the Richmond Fed's discount window. "If an institution suddenly disappears from the fed funds market, other banks might assume, particularly if Richmond posts a large borrowing in that period, that they went to the Fed."

The Fed would much prefer that banks obtain funds from this private source. But banks are rational, and one would expect them to go to whichever funding source is cheapest. Prior to 2003, that was the discount window. The Fed kept the discount rate below the target fed funds rate and limited arbitrage by scrutinizing the banks that borrowed. The Fed required all discount window borrowers to show that they had sought loans on the fed funds market first, and banks had to provide information on what business activities the loans would be funding. Perhaps as a result of the Fed's scrutiny, going to the discount window became associated with an inability to obtain funds from other banks.

"In my banking days, I always described it as being like borrowing from my father," said Fed Governor Elizabeth Duke in an early 2010 speech. Duke had a long career as a banker before being appointed to the Board of Governors in 2008. "I was always sure that at some point I would have to answer uncomfortable questions."

Evidence of Stigma
Since stigma is latent during times of more or less normal market functioning, when there are plenty of funding alternatives to the discount window, economic research that attempts to measure the quantitative impact of stigma has focused on unique events in financial markets. One example was around the turn of the millennium, when businesses of all types were worried that the date turnover would trigger a glitch in computer systems. As a preventative measure, many banks chose to hold extra reserves. The Fed met the added demand for liquidity by creating the Y2K Special Lending Facility. The SLF was specifically designed to sidestep stigma: Borrowers did not need to approach the fed funds market first, they weren't restricted in how the funds were used, and they didn't have to pay the funds back right away. The Fed encouraged banks to use the SLF without fear that it would trigger concerns about insolvency or intensified oversight on the part of the Fed.

Nonetheless, lending patterns through the SLF provided strong evidence of stigma, economist Craig Furfine of Northwestern University found in 2001. He applied an algorithm to confidential fed funds data to identify the total volume of fed funds loans compared to those through the SLF. The results were striking: During one particular week in late 1999, SLF borrowing was $236 million, while borrowing through the fed funds market at rates higher than the SLF rate exceeded $1.5 billion, more than 6.5 times larger, he found. Banks were willing to pay a sometimes hefty premium for the ability to obtain funds from anyone but the Fed.

Stigma hadn't been given theoretical treatment until a 2010 model developed by Richmond Fed economists Huberto Ennis and John Weinberg. They show that it can be rational for banks to borrow elsewhere at higher rates if costly repercussions result from going to the discount window. In their model, a bank's ability to repay an overnight fed funds loan depends in part on its ability to resell the assets in its portfolio to investors.
One reason a bank may go to the discount window is if other banks, perceiving those assets are distressed, refuse to lend at a reasonably low rate. Meanwhile, a bank's potential investors are unable to distinguish the reason for borrowing, but if they observe discount window borrowing, they can infer with a reasonably high probability that the cause was poor asset quality. Thus, borrowing from the discount window conveys a signal of financial distress to investors, and associated banks are able to resell their assets only at a discount.

Changes to Reduce Stigma
In 2003, the Fed made dramatic changes to its discount window practice in part to mitigate stigma. The discount rate was changed to a constant one-percentage-point spread above the target fed funds rate, which removed the arbitrage opportunity between the discount and fed funds rates. This allowed the Fed to ease up on the regulatory scrutiny that accompanied discount window borrowing. Nowadays, provided a bank is in good financial condition and can post adequate collateral, discount window funds are lent on a "no questions asked" basis. To mark the change, the Fed publicly urged banks and other regulators to view occasional discount window borrowing as appropriate and unworrisome.

Nevertheless, evidence of stigma persists. Furfine revisited the issue after the switch. Though the Fed sets a target level for the fed funds rate and is generally able to achieve it through open market operations, the actual fed funds rate can fluctuate. In principle, fed funds transactions can trade at any rate, including above the discount rate. During the first three months of 2003, Furfine found, an average of more than 57 times more activity occurred in the fed funds market at rates equal to or higher than the discount rate.

Since discount window borrowing is rare in normal times, markets are likely to view a bank's sudden willingness to borrow as a sign of weakness when the broader market is experiencing distress, argued Governor Duke in her February 2010 speech. "When uncertainty about the health of individual institutions or the industry as a whole increases, stigma intensifies as the market tries to identify the weaker players. The dilemma facing the Fed is that when discount window borrowing is most needed to keep credit flowing, it is most stigmatized."

Indeed, borrowing remained low as the financial crisis unfolded in the second half of 2007. The Fed reduced the spread between the discount and fed funds rates to one half of a percentage point in August 2007, and discount loan terms were extended from overnight to 30 days (eventually the rates and terms were loosened further). But few banks responded. Four of the nation's largest banks borrowed a combined total of more than $2 billion, but they stated publicly that it was a symbolic move meant to encourage small institutions facing liquidity shortages to borrow.

The Fed responded in December 2007 by creating an entirely new lending facility to get liquidity to the financial system. Like the SLF, the Term Auction Facility (TAF) was designed specifically to get around the stigma problem. The key difference was that TAF funds were administered through auction. It worked like this: The Fed announced that it would lend a fixed amount of funds, and an unlimited number of banks could bid for up to 10 percent of that amount (a cap set to ensure the funds were evenly distributed). Funds were given to the highest bidders at the "stop out" rate — the rate whose associated bid exhausted the funds. Thus, the TAF guaranteed multiple borrowers, which reduced the likelihood that any one borrower would be identified and penalized by the market.
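The allocation rule just described, serving the highest bids first and charging every winner the single rate at which the offered funds run out, can be sketched roughly as follows. This is a simplified illustration with invented bids; the actual TAF had additional features, such as minimum bid rates and pro rata allocation at the marginal rate, that are not modeled here.

```python
# Simplified sketch of a single-price ("stop-out") auction in the spirit of the TAF.
# Bids and amounts are hypothetical; real TAF rules are more detailed.

def run_auction(offered, bids):
    """offered: total funds on offer; bids: list of (bank, bid rate in %, amount requested)."""
    cap = 0.10 * offered      # no bank may bid for more than 10% of the offered amount
    remaining = offered
    awards, stop_out_rate = [], None
    # Serve the highest bid rates first until the offered funds are exhausted.
    for bank, rate, amount in sorted(bids, key=lambda b: b[1], reverse=True):
        if remaining <= 0:
            break
        award = min(amount, cap, remaining)
        awards.append((bank, award))
        remaining -= award
        stop_out_rate = rate  # rate of the last bid needed to exhaust the funds
    # Every winner pays the same stop-out rate, not its own bid.
    return stop_out_rate, awards

# 14 hypothetical bidders, each requesting $1 billion at a different rate.
bids = [(f"Bank {i:02d}", 2.00 + 0.05 * i, 1.0) for i in range(1, 15)]
rate, awards = run_auction(offered=10.0, bids=bids)   # amounts in $ billions

print(f"Stop-out rate: {rate:.2f}%")
for bank, amount in awards:
    print(f"  {bank} receives ${amount:.1f}B at {rate:.2f}%")
```

Because every winner pays the common stop-out rate, no individual bank's own bid is revealed by the rate it pays, which is part of why an auction format can blunt the identification problem discussed above.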
Despite the fact that discount window and TAF funds were in principle identical — even precisely the same institutions were eligible — TAF lending quickly dwarfed that of the discount window (see figure). During the 28 months that TAF was operational, more than 4,200 individual loans were granted through 60 auctions, providing more than $3.8 trillion in funds. The amount offered at each auction varied from $20 billion when the program was first launched to more than $150 billion during the worst days of the crisis.

[Figure: Fed Lending During the Financial Crisis. Weekly average lending through the Term Auction Facility and the discount window, in $ millions, Aug. 2007 to Apr. 2011. Banks proved much more willing to borrow from the Term Auction Facility, which was specifically designed to sidestep the stigma problem that is believed to afflict the discount window. Source: Federal Reserve Board of Governors H.4.1 Release (Factors Affecting Reserve Balances).]

There is no evidence that the market initially attached any stigma to TAF when it was launched, writes a group of researchers in a 2011 Federal Reserve Bank of New York Staff Report. They document that financial institutions were willing to pay an average premium of 37 basis points — and 150 basis points after the failure of investment bank Lehman Brothers — to borrow from TAF rather than the discount window. The fact that banks were repeatedly willing to pay more for TAF funds is interpreted by the authors as strong, quantitative evidence of stigma. If one interprets all of that premium as being the result of stigma, then stigma cost banks an average of $5.5 million in interest per TAF auction during the summer of 2008, a period when the TAF rate was consistently above the discount rate — and $75 million in interest for the auction immediately following the failure of Lehman. It appears stigma has the potential to be quite costly for banks in times of greater liquidity needs.

What Will Happen After Dodd-Frank?
An outstanding question is how the financial crisis will affect stigma. Will it worsen it by forever associating Fed loans with financial turbulence? Or lessen it by making central bank loans more common, as is the case in other countries? Adding a twist of uncertainty is the requirement under the Dodd-Frank law that discount window loans be published with a two-year lag.

It is possible that the effects of publication will be different for stigma during "normal" times versus stigma during times of financial distress. There appears to be a clear difference between the two. As Governor Duke and the New York Fed's TAF study each suggest, financial turmoil seems to worsen stigma as uncertainty rises and banks struggle to identify weaker counterparties. Will loans being made public two years after the fact affect banks' willingness to borrow in a crisis? We may not know until the next financial panic.

Dodd-Frank was not the only recent mandate for the Fed to reveal discount window borrowing.
Lawsuits filed under the Freedom of Information Act by the news organizations Bloomberg and Fox News sought to require the Fed to disclose discount window borrowing that occurred during the financial crisis — during April and May 2008 (around the failure of Bear Stearns), and from August 2007 through November 2008, respectively. The Fed denied the requests under the argument that disclosure would dissuade banks from accessing the discount window. The U.S. Court of Appeals for the Second Circuit ruled against the Fed in the Bloomberg case, and in March of 2011, the Supreme Court denied the Fed's petition to hear that case. The Fed subsequently released the data. The lawsuits are the first of their kind leading to publication of discount window data, according to Allan Meltzer, a Fed historian at Carnegie Mellon University.

As for more normal times, whether stigma deters borrowing can come down to the culture of an individual bank, says Snider of the Richmond Fed. "There are two different groups: For some banks, no matter what, they will never come to the discount window. Then you've got a group to which it appears to make sense, especially late in the day and when conditions are advantageous. To them, if we're going to publish the data in two years, that doesn't seem like a big deal." Still, she says, the Richmond Fed made efforts to make sure that each of the hundreds of depository institutions in the Fifth District that potentially have access to the discount window was aware of the coming change.

Ultimately, Snider says, "what banks should remember is that if a bank borrows from the discount window, that means it met the Fed's criteria for primary credit: It is a fundamentally healthy institution." RF

READINGS
Armantier, Olivier, Eric Ghysels, Asani Sarkar, and Jeffrey Shrader. "Stigma in Financial Markets: Evidence from Liquidity Auctions and Discount Window Borrowing During the Crisis." Federal Reserve Bank of New York Staff Report No. 483, January 2011.
Ennis, Huberto, and John Weinberg. "Over-the-Counter Loans, Adverse Selection, and Stigma in the Interbank Market." Richmond Fed Working Paper No. 10-07, April 2010.
Furfine, Craig. "Standing Facilities and Interbank Borrowing: Evidence from the Fed's New Discount Window." International Finance, November 2003, vol. 6, no. 3, pp. 329-347.

POLICY UPDATE
A Pinch of Basel?
BY DAVID A. PRICE

On Dec. 16, 2010, the Basel Committee on Banking Supervision — a group of senior officials of central banks and bank supervisory agencies from 26 countries and the Hong Kong Special Administrative Region — released a final draft of its framework of banking regulatory reforms. That framework, known as Basel III, is a response to the 2007-2008 financial crisis, and is expected to be adopted by bank regulators worldwide, including in the United States.

The reforms set out by Basel III are wide-ranging, but at the center are increased capital requirements. Capital requirements are intended to act as a buffer, ensuring that financial institutions are able to withstand some level of losses; they are typically expressed as a ratio of capital to assets, or to some risk-adjusted measure of assets. Of course, banks also maintain capital reserves for self-interested reasons — for instance, to maintain the confidence of investors.
It is widely believed, however, that many of the risks created by low capital levels are felt not by the bank, but by the financial system as a whole (including government programs such as deposit insurance), thus warranting regulation of minimum capital levels.

Today, U.S. regulators generally require a 4 percent capital ratio for "Tier 1" capital (mainly shareholders' equity and retained profits net of accumulated losses) and an 8 percent capital ratio overall. Under Basel III, these requirements will become more stringent. The framework calls for the minimum Tier 1 ratio to go up to 4.5 percent in 2013, continuing to increase to 5.5 percent in 2014 and 6 percent in 2015. The minimum total capital requirement will remain at 8 percent, but it will be supplemented under Basel III by a "capital conservation buffer" of common shareholder equity that will kick in at 0.625 percent in 2016 and rise to 2.5 percent in 2019. Banks that do not meet the buffer requirement would be restricted in their ability to pay dividends and bonuses and to buy back their shares. Thus, roughly speaking, the total Tier 1 ratio will effectively double from 4 percent to 8.5 percent (6 percent plus 2.5 percent), and the minimum total capital ratio will increase from 8 percent to 10.5 percent.

Additionally, the framework contemplates a "countercyclical capital buffer," also based on common equity. The buffer would vary from 0 to 2.5 percent at the discretion of national regulators. The concept is that regulators would require this additional buffer during an expansion, and would reduce it during a downturn to maintain the availability of credit.

The framework states that systemically important banks — banks of such a size that they may be considered too big to fail — "should have loss-absorbing capacity beyond the minimum standards." Basel III does not specify what additional standards should apply to systemically important banks, instead indicating only that "the work on this issue is ongoing." (Systemic risks to the financial system are also newly addressed within U.S. law by the Dodd-Frank Wall Street Reform and Consumer Protection Act, which provides for regulation of systemically important bank holding companies and nonbank financial institutions.)

No congressional action is needed for the Basel III framework to take effect in the United States. To meet the implementation date of Jan. 1, 2013, the Federal Reserve System and the other bank regulatory agencies are expected to issue a notice of proposed rulemaking this year for rules incorporating the Basel III capital requirements into U.S. regulations. It is likely that regulators will invite comments from the public on the proposed rules before they become final. It's also worth noting that the Basel III reforms are not binding: Countries have discretion to disregard or not fully implement certain provisions.

What will be the macroeconomic effects of the increased capital requirements? That is the trillion-dollar question. A 2007 literature review by David VanHoose of Baylor University, published in the Atlantic Economic Journal, concludes that heightened capital requirements are likely to lead to reductions in bank lending. In the short run, banks facing increased capital requirements may be reluctant to issue new equity to bring their capital ratio into line with those requirements, given the costliness of raising equity capital. Thus, they will tend to respond on the asset side, slowing the growth of their assets, rather than on the capital side.
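To see what those ratios imply for a balance sheet, the sketch below applies the pre-Basel III U.S. minimums (4 percent Tier 1, 8 percent total) and the fully phased-in Basel III ratios described above (8.5 percent Tier 1 and 10.5 percent total, including the conservation buffer) to a hypothetical bank. The risk-weighted asset figure is an assumption for illustration only.

```python
# Illustrative capital arithmetic for a hypothetical bank.
# The ratios are the ones quoted in the article; the balance sheet is made up.

risk_weighted_assets = 200.0  # $ billions, hypothetical

requirements = {
    # regime: (Tier 1 ratio, total capital ratio), as decimals
    "Pre-Basel III U.S. baseline":      (0.040, 0.080),
    "Basel III fully phased in (2019)": (0.085, 0.105),  # 6% + 2.5% buffer; 8% + 2.5% buffer
}

for regime, (tier1_ratio, total_ratio) in requirements.items():
    tier1_needed = tier1_ratio * risk_weighted_assets
    total_needed = total_ratio * risk_weighted_assets
    print(f"{regime}: Tier 1 >= ${tier1_needed:.1f}B, total capital >= ${total_needed:.1f}B")
# For the same assets, the Tier 1 minimum roughly doubles, from $8.0B to $17.0B.
```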
Another potential unintended consequence of higher capital requirements is the risk of regulatory arbitrage. If capital requirements create a modest cost disadvantage for banks, then an increasing share of lending activity could move to vehicles outside the bank regulatory system, such as private equity funds or securitization. No one yet knows whether the increased capital requirements of Basel III are sufficiently high to produce appreciable macroeconomic effects or regulatory arbitrage, or whether the minimum capital ratios could be set even higher without such effects. Given the lengthy phasing-in of the requirements, regulators will have the opportunity to assess the extent of those effects in coming years. “There’s a lot of uncertainty about what the right amount of capital is,” says Richmond Fed economist Huberto Ennis. “It depends on the costs and benefits of having additional capital. Incrementally, how costly would it be to ask for 15 percent capital? We don’t know. As far as I can tell, it’s people making calls based on soft information and hunches. We know capital is good, we know it may be costly. The question is, what’s the right amount?” RF Region Focus | First Quarter | 2011 9 JARGONALERT Market Failure basic economic principle is that free markets produce outcomes in which resources are generally allocated efficiently. By “efficient,” economists mean that all the mutually beneficial trades which are possible have been exhausted. Free markets accomplish this feat by coordinating willing buyers and sellers through prices. Sound too good to be true? It can be. There are special circumstances that economists call “market failures” in which a freely functioning market is unable to produce an efficient outcome. When there are market failures, government intervention may be justified to correct the failure and, ideally, drive the outcome closer to efficiency. Economic theory has identified a limited number of market flaws that can lead to market failure. One is when a good’s consumption or production comes with externalities. Consider a factory that produces smog with each unit of output. The smog harms nearby households and businesses. But if producers can pollute for free, the production costs faced by the producer are lower than the true costs to society. The price of the good will be artificially low, and too much of the good, and its associated pollution, will be produced. A “public good” can also be an example of a market failure. Sometimes it is not possible to exclude nonpayers from consumption of a good or service. A local fireworks display is a common example. It would be hard to exclude anyone in the surrounding area that chooses not to pay, so few people have incentive to fork over the entrance fee. As a result, a private party will be less likely to put on the show at all, even though many people would derive value from it. In the case of public goods, the government may step in to provide the service, inducing everyone to pay through taxation. This is why one often sees city governments at the helm of local Fourth of July celebrations. Though market failures may at first blush appear to be about fairness — the smog producer harms its neighbors, or people free ride on the fireworks display — this occasional feature is not what concerns economists. The primary cost associated with market failure is that an inefficiently high or low amount of the good in question is produced. That causes resources to be directed to places other than where they are most highly valued. 
Society as a whole is richer when resources go to their highest-value uses. In fact, plenty of market outcomes that reasonable people may view as undesirable or unfair are not market failures at all. Take, for example, a market-oriented economy that produces income inequality. If a person becomes very rich by inventing a product that a lot of people value highly, that may be a perfectly efficient outcome even if no poorer person benefits in the slightest. A society that spends an exorbitant amount of money on gambling or unhealthy foods reflects that people place different values on how to spend their time and money. Distasteful to some people, perhaps — but not a market failure.

One must also be careful about alleging market failure — especially if that allegation is used to justify government intervention — in instances where markets are not truly free. The recent financial crisis is an example. Financial markets are heavily regulated, which necessarily alters the incentives that market participants would face in a truly free market. Most financial markets are far from being truly "free" markets. It is important to separate the effects of market failure, if any, from the unintended side effects of the regulations.

As for correcting market failures, a good rule of thumb is that successful methods replicate market behavior as closely as possible. For example, in the case of the smog-producing factory, the government could simply place a ban on smog production, but that would deprive consumers of the benefit of the good that was being produced. A more efficient arrangement would be for the government to assign property rights to the surrounding air. This would force the producer to "internalize" the externality by compensating its neighbors for the right to pollute their air, raising production costs.

Private producers have often found ways to correct market flaws in order to produce efficient outcomes. When there are externalities, private parties have sometimes been able to divvy up property rights with no government intervention whatsoever. Private entrepreneurs have also found ways to exclude nonpayers to profitably produce roads (through tolls) and radio signals (through scrambled signals) even though both are commonly cited examples of public goods. In addition, government interventions can themselves reduce efficiency through unintended consequences, distortionary taxes, special interests, or simply errors in judgment. That's why not all market failures warrant policy action. When considering policy intervention to correct a market failure, the relevant question is whether the costs associated with government action are likely to be greater than those of the initial market failure. RF
BY RENEE COURTOIS HALTOM
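To put rough numbers on the smog example discussed above, the sketch below compares the output a market would choose when the harm to neighbors goes unpriced with the output that maximizes total surplus once that harm is counted, and shows that a per-unit charge equal to the external cost (one way of "internalizing" it) closes the gap. The demand, cost, and damage figures are invented for illustration.

```python
# Illustrative numbers for an unpriced externality (e.g., smog per unit of output).
# All curves are invented: buyers value the q-th unit at 100 - q, the producer's
# own cost of the q-th unit is 20 + q, and each unit imposes $30 of harm on neighbors.

def marginal_benefit(q):      return 100 - q   # value to buyers of the q-th unit
def private_marginal_cost(q): return 20 + q    # cost the producer actually pays
EXTERNAL_COST = 30                             # harm per unit borne by neighbors

def chosen_output(charge_per_unit=0):
    """Units produced when the market weighs only the costs the producer faces."""
    q = 0
    while marginal_benefit(q + 1) > private_marginal_cost(q + 1) + charge_per_unit:
        q += 1
    return q

def social_surplus(q):
    """Buyers' value minus all costs, including the harm to neighbors."""
    return sum(marginal_benefit(i) - private_marginal_cost(i) - EXTERNAL_COST
               for i in range(1, q + 1))

free_pollution = chosen_output()                             # externality ignored
with_charge = chosen_output(charge_per_unit=EXTERNAL_COST)   # externality internalized
best = max(range(0, 101), key=social_surplus)                # true social optimum

print(f"Output when pollution is free:       {free_pollution} units")
print(f"Socially optimal output:             {best} units")
print(f"Output with a ${EXTERNAL_COST}/unit charge:       {with_charge} units")
print(f"Surplus gain from the charge:        ${social_surplus(with_charge) - social_surplus(free_pollution)}")
```

With these made-up numbers, the unpriced market overproduces (39 units versus 24), and charging the producer for the harm it creates moves output to the surplus-maximizing level.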
RESEARCH SPOTLIGHT
The Price of Avoiding Minor Financial Loss
BY BECKY JOHNSEN

"(Over)insuring Modest Risks." Justin Sydnor. American Economic Journal: Applied Economics, October 2010, vol. 2, issue 4, pp. 177-199.

Much economic research has focused on how people avoid or embrace financial risk. Justin Sydnor, a microeconomist at Case Western Reserve University, focuses in a recent paper on the phenomenon that, as he puts it, many consumers "appear to pay a large amount to insure against very modest financial losses." He cites the demand for extended warranties, mobile phone insurance, and low insurance deductibles as evidence that many consumers are highly averse to risk. In particular, Sydnor finds that there is "a surprising level of risk aversion over modest stakes" in the market for homeowners' insurance.

Sydnor analyzes data from a random sample of 50,000 standard homeowners' policies issued by a large insurance provider. The policies were all issued in a single unnamed western state in the past decade. In choosing coverage, customers had a choice of four available deductibles that they would pay in the case of loss or damage: $1,000, $500, $250, $100. As usual in insurance markets, choosing lower deductibles meant higher premiums. Sydnor finds that a plurality of customers, 48 percent, chose the $500 deductible; 35 percent chose $250; 17 percent chose $1,000; and less than 1 percent chose $100. He then quantifies the average difference between the cost and value of the different deductibles by comparing the additional costs and claim rates of the lower deductibles. He calculates that "those who held lower deductibles paid [five times] more than the expected value for that extra insurance."
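As a back-of-the-envelope version of the comparison Sydnor makes, the sketch below weighs the extra premium paid for a lower deductible against the expected value of the extra coverage, that is, the probability of a claim times the reduction in the deductible. The claim probability and premium figures are illustrative assumptions, not numbers from Sydnor's data.

```python
# Back-of-the-envelope comparison of a low vs. high deductible.
# The claim probability and premiums are illustrative assumptions,
# not figures from Sydnor's paper.

claim_probability = 0.05      # assumed chance of filing a claim in a year
high_deductible = 1000.0      # cheaper policy
low_deductible = 500.0        # pricier policy
extra_premium = 100.0         # assumed additional annual premium for the lower deductible

# If a claim occurs, the lower deductible saves the homeowner the difference.
expected_value_of_extra_coverage = claim_probability * (high_deductible - low_deductible)

print(f"Expected value of the extra coverage: ${expected_value_of_extra_coverage:.2f} per year")
print(f"Extra premium paid for it:            ${extra_premium:.2f} per year")
print(f"Premium / expected value:             {extra_premium / expected_value_of_extra_coverage:.1f}x")
# Under these assumptions the customer pays roughly four times the actuarial value
# of the extra protection, the same flavor of gap Sydnor documents in his sample.
```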
Sydnor also analyzes the decisionmaking processes behind these purchases. In studying the participants, he finds that customers who have held policies for longer "were actually more likely to hold one of the lower deductibles." He speculates that such decisions by this segment of homeowners may be due to "consumer inertia," where despite rising insurance costs, individuals fail to adjust their initial choices. "It is likely that the observed choice of lower deductibles partially reflects inertia and not solely an active choice reflecting risk preferences." He predicts that this phenomenon would apply to the U.S. homeowners' insurance market as a whole and calculates that by switching to the highest available deductible, U.S. homeowners could save around $4.8 billion annually.

Sydnor concludes that those in his sample "overinsure[d] modest risks when making home insurance purchases" and proposes six potential explanations for this phenomenon. The first, and the most obvious, according to Sydnor, is that consumers may simply misjudge the level of risk to their homes. Second, consumers may prefer to smooth costs over the long run rather than risk suddenly having to pay a significant amount. Third, some individuals could have significant borrowing constraints, making them unable to cover sudden financial loss. Fourth, homeowners could be "influenced or pressured to take the more expensive lower deductible contracts by the company's sales agents, who earn partial commissions." Fifth, the selection of deductibles offered by the provider may influence the consumer. "People have a tendency to avoid picking the extreme options from a menu and may be reluctant to pick the highest or lowest deductible available," says Sydnor. Finally, consumers may have other personal reasons to avoid risk. For example, previous research has concluded that consumers want to avoid the psychological pain of sudden financial loss.

Sydnor also looks at the extent to which consumers' choices of deductibles affected profitability. In view of his findings that consumers pay more for low-deductible policies than the expected value of the additional coverage, one might assume the low-deductible policies are more profitable to the insurance provider, but this is not the case. The insurance provider earned roughly similar profits from both groups. Sydnor argues that this is partly because low-deductible customers have higher claim rates than those with high deductibles. In short, it appears that the prices charged by insurance companies may be consistent with a competitive equilibrium. Thus, while some individual consumers could save money by switching to higher deductible plans, if all consumers changed their behavior, the insurance company would have a difficult time distinguishing more and less risky customers. This might force the provider to raise insurance costs or to create a new higher deductible.

As a result, Sydnor does not recommend government policy changes to "correct" consumer behavior. "In particular, a given individual might benefit from [altering the biases] that caused him to avoid purchasing [a more appropriate deductible], but a policy aimed at changing all consumers' behavior is unlikely (at least in home insurance) to improve the equilibrium for the consumer." And while Sydnor looked at only homeowners' insurance policies, he concludes that similar research could usefully be applied to other insurance markets as well. "Doing so may open up new insights about behaviors," he suggests, "and may generate policy prescriptions in areas such as health insurance and annuities." RF

For those on the inside, it is hard to ignore the sense that the economics profession is in a state of intellectual crisis. The sense is not unanimous, mind you, or probably even the majority view. But there is the uncomfortable fact that the profession was largely surprised by the largest economic event in several generations. Some have taken it as a sign that economists are, decidedly, studying the wrong things. The specific complaints are varied: Economists were so generally can't conduct controlled experiments. An exception is the experiments conducted by the likes of Nobel Prize-winning economist Vernon Smith, but those typically deal with how people interact in different market settings and often do not have broad applicable policy implications. As a practical matter, one can't raise taxes on one segment of Why economists study what they do — and how the crisis might change it focused on unrealistic, highly mathematical models that they missed the problems developing before their very eyes. They were so complacent with the idea that markets usually get things right that they ignored a housing bubble and securitization mess in the process. Overall, critics say, policymakers shouldn't listen to a profession so lacking in consensus and out of touch with reality. Did the research and beliefs of economists leave them ill-equipped to foresee the possibility of a major financial crisis? And if so, what drove the profession to such a myopic position? Is Economics a Science? The recent criticisms are interesting in historical perspective because states of intellectual crisis tend to spur new theories in the sciences. One might expect the evolution of thought to be slow and steady, but physicist Thomas Kuhn paints a more dramatic picture in his 1962 book The Structure of Scientific Revolutions: It is a "series of peaceful interludes punctuated by intellectually violent revolutions." What triggers a revolution, he writes, is that researchers identify something at odds with the dominant theory.
One of three outcomes occurs: The dominant paradigm explains it satisfactorily; the field determines it is something we are unable to study with existing tools, putting it off for future generations; or a new candidate paradigm emerges. But after a revolution, only one end is possible: The worldviews cannot coexist; one replaces the other.

Evolution is a bit choppier in economics than in the natural sciences. It is much harder to disprove theories of human behavior than the mechanistic functionings of, say, physics or chemistry. People are subject to change, and economists generally can't conduct controlled experiments. An exception is the experiments conducted by the likes of Nobel Prize-winning economist Vernon Smith, but those typically deal with how people interact in different market settings and often do not have broadly applicable policy implications. As a practical matter, one can't raise taxes on one segment of the population to analyze effects on the taxed and untaxed — not even in the name of science. Instead, economics experiments take place in models, or systems of equations designed to simulate real-world behavior.

Science-like tools such as statistics and equations weren't always a part of economics. Most people recognize Scottish philosopher Adam Smith as the "father" of economics, who delivered some of the most basic economic principles. But economics as a technique was invented by economist David Ricardo, writes economic historian Mark Blaug in his book Economic Theory in Retrospect, one of the leading texts on the history of economic thought. Ricardo is most famous for showing in 1817 that two nations can benefit mutually from trade even if one is better than the other at producing every single good. This idea of comparative advantage is perhaps one of the least intuitive concepts in all of economics, which Ricardo made profoundly clear through a simple story problem involving two countries that produce the same two goods. His approach was deductive reasoning: drawing specific conclusions from a much more general, simplified example, the benefit of which is that it can be solved through logic.

Not everyone agreed with that approach. Twentieth century economist Joseph Schumpeter coined the phrase "Ricardian vice" to describe the faulty practice of using overly simplistic assumptions to guide real-world policy. Nonetheless, the value of objective reasoning caught on and distinguished economics from other "moral" sciences like sociology and philosophy.
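For readers who want to see that logic rather than take it on faith, here is a minimal sketch of Ricardo's story problem. The labor requirements are the stylized England-Portugal numbers familiar from textbook treatments, used purely for illustration.

# A stylized version of Ricardo's two-country, two-good story problem, using
# the textbook England-Portugal labor requirements purely for illustration.

# Hours of labor needed to produce one unit of each good.
hours = {
    "Portugal": {"cloth": 90, "wine": 80},    # better at both goods in absolute terms
    "England":  {"cloth": 100, "wine": 120},
}

# Give each country just enough labor to make one unit of each good on its own.
labor = {country: h["cloth"] + h["wine"] for country, h in hours.items()}

# Opportunity cost of a unit of wine, measured in cloth forgone.
for country, h in hours.items():
    print(f"{country} gives up {h['wine'] / h['cloth']:.2f} cloth per unit of wine")
# Portugal: 0.89 cloth per wine; England: 1.20. Portugal has the comparative
# advantage in wine even though it is absolutely better at both goods.

# Without trade, each country produces one unit of each good: 2 cloth, 2 wine.
# With specialization, Portugal makes only wine and England only cloth.
wine_with_trade = labor["Portugal"] / hours["Portugal"]["wine"]    # 170/80  = 2.125
cloth_with_trade = labor["England"] / hours["England"]["cloth"]    # 220/100 = 2.2
print(f"World output with trade: {cloth_with_trade} cloth, {wine_with_trade} wine")

Total output of both goods rises once each country specializes along its comparative advantage, even though one country is better at producing everything.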
Math became a big part of economics after the "marginal revolution" of the late 1800s. Previously, the "classical" economists — as they are now generally dubbed — spent their time studying the aggregate economy. But a desire to understand market-level issues, like why an egg costs more than a cup of tea, narrowed their attention to the margin: the last unit bought, sold, traded, or produced. The new nucleus of study was incentives, opportunity costs, and marginal choices. The marginal revolution was the birth of neoclassical economics, which looks a lot like the microeconomics of today. Calculus was required to study incremental decisionmaking at the margin, and math became a permanent mainstay in economics.

The use of math in this context fit well with Ricardo's logical technique for analyzing policy problems. The combination enabled economists to make objective policy prescriptions free of moral or ideological axes to grind. Anyone could plainly see the assumptions behind an economic model; one could accept or reject the policy implications on their merit. The notion that economics could, and should, be objective was how it became thought of as a science. Marginalism, for its part, refocused the discipline's purpose: Economics was about how people behave as they allocate scarce resources to make their lives better.

With its new branding as an independent science, economics also became more academic and less accessible to laypeople. Gone were the days of amateur economists like Smith and the brilliant but virtually untrained Ricardo. It is no surprise that many of today's leading economics journals were established around the turn of the 20th century.

Upheaval in Economics
A recurring theme in economics since the marginal revolution has been that major economic events — especially when they are painful — are the primary catalysts for new modes of thought. No economic event was larger, or more painful, than the Great Depression. As revolutions go, marginalism was adopted at a glacial pace — five or six decades from first inklings to the formal codification of supply and demand in 1890 — compared to the upheaval that followed when John Maynard Keynes published the General Theory of Employment, Interest and Money in 1936.

In the history of economics, the Keynesian revolution comes closest to being one of Kuhn's dramatic paradigm shifts in thought. His theory made its way into economics textbooks after a decade, and visibly into policy within five or six years. The reign of his ideas lasted decades.

No existing theory could explain the Depression. Neoclassicals could explain unemployment: If real wages were held above market-clearing levels by any number of situations — monopolies, tariffs, or price rigidity in general — unemployment would result. But they could not explain protracted unemployment. They predicted that prices and wages would eventually adjust to bring supply and demand back in balance. More than a decade of severe unemployment seemed to prove that prediction false.

They were looking in the wrong place, Keynes said. The Depression was a shortfall in demand at the aggregate level. While the neoclassicals might have been more concerned with the specific source of the shock through the lens of individual decisionmaking, to Keynes that wasn't the point — in fact, people can randomly and without cause shift demand due to "animal spirits," he said. The policy implication was simple: The government can avoid recessions by stepping in to consume when flesh-and-blood consumers refuse.

For decades, circumstance seemed to confirm Keynes's theory, and that kept it in favor. Unemployment virtually disappeared after military spending was ramped up for World War II. (Those who are rightfully concerned about 9 percent or 10 percent unemployment over the past few years will understand how monumental it seemed when unemployment that topped 15 percent or 20 percent for the better part of a decade simply evaporated and never returned.) The 1950s were mild and the 1960s boomed. By the late 1960s, many economists believed the government could "fine tune" the economy to keep it at full employment. Booms and busts were a thing of the past.

Keynesianism seemed to work, but other factors also kept it dominant, says UCLA economist Lee Ohanian. Keynesianism was a fertile area of study, so that's what received the attention of economists. The General Theory was nontechnical, even a tad rambling. Economists got much better at econometrics in the 1940s and 1950s, and that created the opportunity to develop Keynes's broad ideas into a full-fledged Keynesian toolkit for the economy, Ohanian says. Economists at universities, the Federal Reserve, and other agencies spent much of the 1950s and 1960s constructing enormous models of the economy that included hundreds of equations, each representing supply and demand behavior in some specific sector. Chances are, Ohanian says, if you ask economists who received their Ph.D. in the 1960s the topic of their dissertation, it was one of those equations. With the new models economists could tweak a policy variable and calculate the precise expected results on employment, consumption, wages, and a variety of other variables of interest. "These econometric developments elevated Keynesianism to a quantitative enterprise."

A Model Breaks Down
Between an economy that conformed and conveniently timed econometric advances, "Keynes' idea was in the right place at the right time," Ohanian says. He explores in a recent paper with UCLA colleague Matthew Luzzetti the reasons behind the rise of Keynesianism — and its eventual decline. "What's interesting is that after 1970 the economy evolved in ways that were as pathological to the Keynesian model as the Great Depression was to the market clearing models of the 1920s," he says. "It was really a case of what goes around comes around. And I think a lot of science proceeds that way."

Right Place, Right Time
The Phillips curve was a truly revolutionary idea: It appears in some form in virtually all macroeconomic research. It was also possibly in the right place at the right time, according to University of Chicago economist Harald Uhlig. If Phillips had plotted the relationship between inflation and unemployment today instead of in 1958 he'd have seen little of obvious interest — or, potentially, publishable certainty.
[Charts: Inflation (percent) plotted against percent unemployment, "Phillips Curve (1948-1959)" and "Phillips Curve (1948-2010)."]
SOURCE: Both inflation and unemployment data from the Bureau of Labor Statistics. Charts originally appear in "Economics and Reality," Harald Uhlig, NBER working paper no. 16416, September 2010.
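A reader who wants to see Uhlig's point firsthand could redraw the two scatter plots from public data. The sketch below assumes annual CPI inflation and unemployment rates have already been pulled from the BLS into a local CSV; the file name and column names are placeholders, not an actual BLS file format.

# A sketch of how one might redraw the two scatter plots Uhlig describes.
# Assumes a local CSV with columns "year", "inflation", and "unemployment"
# (hypothetical names) built from BLS data.
import pandas as pd
import matplotlib.pyplot as plt

data = pd.read_csv("bls_inflation_unemployment.csv")  # hypothetical local file

fig, axes = plt.subplots(1, 2, figsize=(10, 4), sharey=True)
for ax, (start, end) in zip(axes, [(1948, 1959), (1948, 2010)]):
    sample = data[(data["year"] >= start) & (data["year"] <= end)]
    ax.scatter(sample["unemployment"], sample["inflation"])
    ax.set_title(f"Phillips Curve ({start}-{end})")
    ax.set_xlabel("Percent unemployment")
axes[0].set_ylabel("Inflation (percent)")
plt.tight_layout()
plt.show()
# The short 1948-1959 sample traces out the familiar downward slope;
# the full 1948-2010 sample looks far less like a stable trade-off.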
The first problem was the breakdown in the Phillips curve. The Phillips curve is the famous inverse relationship between inflation and unemployment identified in 1958 and quickly enveloped in the Keynesian rubric: In practice, it was believed, policymakers could orchestrate lower unemployment by producing inflation, which would cause producers to think that demand for their goods had increased, causing them to hire more workers. When inflation later flared up, they could employ tighter monetary or fiscal policy, which would slow the economy a bit but would stabilize prices. In short, government officials could effectively fine-tune the economy as needed.

It worked well for about a decade. But in the late 1960s some countries experienced rising inflation with little or no improvement in unemployment. Through much of the 1970s, both inflation and unemployment rose at once — stagflation. The Phillips curve's menu of choices suddenly seemed unavailable.

Economists jumped at the chance to explain the puzzle, much like Keynes had done 40 years earlier. Robert Lucas offered the famous "Lucas critique" in the 1970s: People simply caught on to the government's strategy.
In general, 14 Region Focus | First Quarter | 2011 Lucas said, people’s past behavior is a poor guide for future policy because that strategy will, in fact, change the very behavior it is based on, neutralizing systematic attempts to manipulate the economy. This was a decidedly antiKeynesian proposition. (The Phillips curve may be another example of an idea that was in the right place at the right time. See chart.) Lucas and others helped launch the “rational expectations” movement, which provided a formal model for how people’s expectations affect macroeconomic outcomes. No longer satisfied with the notion of mysterious animal spirits, economists took a closer look at the causes of shifts in demand. Further econometric advances allowed them to develop specific theories — called “microfoundations” — about what caused consumption and other aggregates to change at the individual and firm level. There were other gradual shifts that led economists away from Keynes. The Lucas critique also applied to the complex systems of equations economists had spent two decades developing. The equations were based on past behavior; there was no reason to expect them to be stable. Sure enough, a series of studies starting in the early 1970s showed that simple statistical models which included no theory whatsoever were often better at forecasting the economy than the complex models the profession had spent two decades producing. But it was that real-world events blatantly conflicted with the theory that really caused the profession to move on from Keynes. “[T]he inflation and the stagflation of the 1970s did more to persuade economists that there was something wrong with Keynesian economics — that you needed supply-side policies and all that — than all the empirical evidence on the econometric studies against Keynesian economics,” Blaug said in a 1998 interview with Challenge magazine. “Sometimes you have to be hit over the head with a hammer before you give up a beloved theory.” Math as Not Just a Servant, But a Master? As MIT economist Olivier Blanchard describes it, if Keynesianism was a revolution, macroeconomics since rational expectations has been a drawn-out battle with gradual movement toward peace. Two, not one, replacement paradigms emerged, both emphasizing microfoundations. First, the new classicals, whose models included fewer market imperfections, were able to incorporate a “general equilibrium” view that demonstrated how separate markets affect each other as they might in the actual macroeconomy. Their brand of macroeconomics looked more like microeconomics, with a focus on the power of markets to allocate resources most efficiently. Second were the new Keynesians, who wanted to tweak, not replace, the Keynesian models by using microfoundations to explain aggregate imperfections that the government might be able to fix via judicious monetary and fiscal policy. The models generally were unable to study more than one market at a time, a “partial equilibrium” perspective. This is the famous “freshwater” and “saltwater” divide that has caused much controversy (and occasional name calling) within the economics profession. The monikers describe the geographic locations where the economists in those camps have tended to be located. 
Saltwater economists (centered at universities around the two coasts) have been accused of assuming policymaker omniscience and ignoring the bad incentives that government intervention can create, while freshwater economists (centered at universities around the Great Lakes) have been accused of operating with blind faith in the unfailing power of markets to self-regulate. Those caricatures still exist, but for the most part the camps have converged over time to create a hybrid of general equilibrium, microfoundational models that include imperfections and a potential role for government intervention. “The new tools developed by the new classicals came to dominate. The facts emphasized by the new Keynesians forced imperfections back into the benchmark model,” wrote Blanchard in 2008. The Economist magazine described the convergence in vision as “brackish” macroeconomics. Even more striking has been the convergence in methodology. Because of advancements in econometrics and computer power, economists today can combine the strengths of various theories better than before. What models today have in common is not so much any one school of thought, but the type of mathematical tools that are used — so, in a way, mathematics is the new reigning paradigm of economics. The quintessential example are DSGE models, which stands for “dynamic stochastic general equilibrium” (a fancy way of saying they include decisions made over time and under uncertainty, and that the decisions made by policymakers, consumers, and firms affect each other). DSGE models are the dominant workhorse in macroeconomics today, especially at policy institutions like the Fed. They consist of a small handful of equations that tell economists how much households are likely to consume and work, and how much firms are likely to produce and invest, as the result of some policy or shock the economist imposes. Such models are used to ask specific hypothetical questions, the results of which are interpreted with a good amount of judgment (see page 17). But they’re solved using math so complex that one of the biggest constraints on their size is sheer computer memory. One result of increasing reliance on math — critics pejoratively refer to it as “formalism,” or math for math’s sake — is that the profession has become very specialized. Economic historian Robert Whaples of Wake Forest University puts this is in characteristically economic terms: “Fixed costs of switching fields of study may be higher in economics than in other sciences, especially social sciences, because you do have to learn a lot of rigorous techniques.” Increasing specialization may be one reason the crisis caught economists by surprise. To truly see the complex web of securitizations — that, when unwound, was the crisis — one would have needed knowledge about multiple fields like financial and real estate economics, argues University of Even the most inclusive models cannot be used as a “theory of everything,” to borrow a phrase from physics, that merges a large number of fields and sounds an alarm when events like crises are imminent. Chicago economist Raghuram Rajan in a recent blog posting. “[Y]ou had to know something about each of these areas, just like it takes a good general practitioner to recognize an exotic disease. Because the profession rewards only careful, well-supported, but necessarily narrow analysis, few economists try to span sub-fields,” he writes. 
Fed Chairman Ben Bernanke argues that even economists who warned of instability saw only very limited portions compared to what actually transpired. “[T]hose few who issued early warnings generally identified only isolated weaknesses in the system, not anything approaching the full set of complex linkages and mechanisms that amplified the initial shocks and ultimately resulted in a devastating global crisis and recession,” Bernanke said in a September 2010 speech on implications of the crisis for the field of economics. There are limits to how much that interdisciplinary perspective can be modeled quantitatively. Even the most inclusive models cannot be used as a “theory of everything,” to borrow a phrase from physics, that merges a large number of fields and sounds an alarm when events like crises are imminent. Even if computers and math could handle such a feat, the result would risk taking historical relationships for granted, much like the Keynesian equations of the 1950s and 1960s. “We’ve got hundreds of millions of people interacting,” says Whaples. “They’re real people, and they’re complex in their behaviors, their motivations, and their interactions. And there are some really smart ones out there who have seen how the system works and that there’s a little way they can make it work to their advantage,” he says. “You will never be able to model the economy the way my physicist friends want you to.” Even if such a model existed, forecasting crises is, to most economists, a nonstarter. Markets tend to uncover information on crises and turning points before economists can forecast them with models. But the professional rewards for taking a qualitative interdisciplinary perspective are also lower. Using prose instead of math is less likely to get an economist published and is harder to garner attention. Rajan would know; he warned of instabilities with surprising accuracy in a 2005 Federal Reserve conference, when he was chief economist at the International Monetary Fund, and even with his stature he was largely dismissed by the economics profession, including his own colleagues, the IMF said in a recent report. But even with the benefit of hindsight, it is hard to see how it could be any other way: Those who spend their careers predicting unlikely events like large crises and crashes are destined to be wrong a lot of the time. Many economists Region Focus | First Quarter | 2011 15 quite rationally stick to chipping away at outstanding research questions. Arguably, it is regulators who should have been aware of the hidden risks, but the financial system appears to have innovated out of their view. Almost no one appreciated its susceptibility to bank-like runs, Bernanke argued. He described the crisis as a flaw in the administration of economic knowledge — for example, the design of regulations in the public sector, and the design of risk-management systems in the private sector — not so much in the science or theory of economics. The flurry of regulatory overhauls since the crisis — which, economists argue, should emphasize information gathering and better, not simply more, regulation — are intended to fix that problem. In fact, Ohanian predicts the crisis will change regulatory economics more than theories of macroeconomic issues like business cycles and growth. 
“I think what will come out of the last couple of years is a focus, ironically, more on the ‘micro’ side of macroeconomics, meaning how should you pursue regulation of financial institutions, how do we deal with the problem of too-big-to-fail, what type of accounting standards should be used, how should we move things off the balance sheet back onto the balance sheet,” he says. “I think we learned a lot that bad policies are ones that create bad incentives.” A Caveat, Not a Revolution For better or for worse, most economists don’t seem to predict wholesale changes in what economists study. In addition to forecasting, theory is used to understand how the world works. While there are features that models will inevitably be made to include in order to better study the crisis, like a stronger role for the financial intermediation sector that was the epicenter, many aspects of the crisis were already well-represented in models. Economists had been formally modeling financial market characteristics like runs, illiquidity, risk, and leverage for years, Bernanke said in his speech. It’s that they — and regulators and indeed many market participants — weren’t aware of the specific corners where some of those potential problems existed. Once those corners were revealed, Bernanke said, existing models proved exceedingly helpful in determining how to treat them. The methodological consensus that Blanchard described may have been damaged somewhat by the crisis. Many people viewed dominant methods like DSGE modeling as increasingly useful tools and believed that efforts should be devoted to refining them. But those models couldn’t fully make “sense out of the 2008 financial crisis” says Harald Uhlig, chair of the department of economics at the University of Chicago. Many “Ph.D. students and researchers alike these days want to contribute to the new debates that have emerged rather than fixing these models. That could be a good thing, if alternative, quantitatively useful models eventually emerge,” he says, but it would be a shame if the value of existing models is forgotten in the process. 16 Region Focus | First Quarter | 2011 Instead of a revolution, many see the profession, policymakers, and the public adopting a much humbler view of what economics can tell us about the world. For one thing, the economy is dynamic — what economists believe to be true can change. Economists just got complacent, argues John Quiggin of the University of Queensland in Australia. The “Great Moderation” of the last 30 years, in which recessions were generally mild, made economists and policymakers a bit too comfortable with their apparent past success, he says. We also saw this in the 1920s and late 1960s, he says, in the events that ushered Keynesianism into and out of fashion. “You heard that we had it under control, or as controllable as it might be. Those claims have been proven false, so I guess we have to accept that our knowledge about the macroeconomy is fairly provisional.” Quiggin organized a session at the 2011 annual meeting of the American Economics Association, the professional organization of economists, titled, “What’s wrong (and right) with economics.” Part of his impression from the session was that with economic recovery apparently under way, complacency is probably on its way back. 
While he concedes that “we may just be too close to the action to see what new ideas are emerging,” he argues that it appears “there was a lot more soul-searching on the part of the Keynesian establishment and a lot more creative stuff happening than there is this time.” It may partially be the quantitative nature of modern economics that causes it to be mistaken for the certainty and precision that natural sciences can offer, George Mason University economist Russ Roberts wrote recently on his blog, Café Hayek. He argues macroeconomics should be viewed more like biology than physics. “We do not expect a biologist to forecast how many squirrels will be alive in 10 years if we increase the number of trees in the United States by 20 percent. A biologist would laugh at you. But that is what people ask of economists all the time.” But the economics profession is relied upon to provide clear policy guidance. The ability to provide it can affect how much attention a theory gets, Ohanian says. “During any kind of crisis or recession there are always many calls from many quarters for government to ‘do something.’ It’s really inconceivable that policymakers might say, ‘You know what, we don’t see that there’s anything we can do that we’re convinced is going to make things considerably better, so we’re going to sit in the sidelines,’” Ohanian says. That means there are rewards of influence to those willing to overstate the certainty of their predictions, says David Colander of Middlebury College, a long-time critic of excess formalism in economics. “A lot of economists don’t do that, but unfortunately they’re not the ones who get reported in the newspaper and whose views get discussed,” he says. “My complaint about economics is that too often some groups of economists let other people think that we fully understand things that I don’t think we do. The honest economic scientist should be willing to say, ‘Scientifically we continued on page 38 Does Math Make Fed Policy? One doesn’t often hear the complaint that economics is too heavy on math and theory coming from within policymaking bodies such as the Fed. A cynic could argue that’s a matter of self-preservation. But those on the inside say it’s because math and theory have an important, though measured, role in policy. Economic models are used to run hypothetical policy experiments — analogous to what physicists do in a lab — the results of which provide some input on the likely path of important indicators like GDP, consumption, or employment in response to the question being asked. How can models that rely on often unrealistic, simplified assumptions possibly be useful for real-world policy? With a fair amount of judgment. Briefings for Fed policy meetings entail many hours of discussion; if they were just about the output of models, they would be over quickly. But judgment doesn’t enter in where one might expect. The math that solves a model isn’t much of a topic for debate since the tools are common and economists are generally confident in the algebraic abilities of their colleagues. The real action is in the base assumptions: the nature of constraints and trading opportunities facing consumers, producers, and policymakers in the simulated economy being represented by a model. Economists ask whether the assumptions behind the model pass the “smell” test, and where a bit of homegrown judgment can fill in holes. 
That judgment is enlightened by real-world data; anecdotal insights from the Fed’s business contacts (including labor and production decisions they face); bank examiners; discussions with other policymakers; and still other models, including types that rely more on mere data and less on theory and math. Those alternative sources help form opinions about which assumptions, and thus which models, are most likely to reflect how the economy currently functions in the unique situation being considered. When multiple models with believable assumptions start to produce comparable quantitative results, then economists become more confident in their predictions. Why the need for simplifying assumptions at all? The U.S. economy consists of hundreds of millions of people, each with unique circumstances and motivations. Economists will never be able to capture that complexity in a single model. Yet if we thought there were no commonalities between people or across businesses, or that they made decisions randomly, there would not be much to study or explain. So right away economists have to assume some basic rules to get behind the “whys” of human behavior. That’s why many models — particularly ones that include the “microfoundations” of human behavior that are required to evaluate the effects of policy on individuals — get more mathematical as they include more real-world complexity. Math is the only language that is unambiguous. It is the only way to be clear to one’s colleagues what complex behaviors are being assumed, which is how they understand what drives the model’s results and decide whether that is believable. In 2001, economist Robert Lucas described working with Edward C. Prescott in the early 1970s on research that applied the rational expectations concept that would eventually win him a Nobel Prize. (Prescott would also become a Nobel Prize winner for related ideas.) The two economists were struggling to crack how labor markets are likely to respond to monetary policy. Lucas said: Some days, perhaps weeks, later I arrived at the office around 9 and found a note from Ed in my mailbox. The full text was as follows: ÒBob, This is the way labor markets work: v(s,y,) = max {_,R(s,y) + min[_,§v(s«,y,_)Ä (s«,s)ds«]}. EdÓ The normal response to such note, I suppose, would have been to go upstairs to EdÕs office and ask for some kind of explanation. But theoretical economists are not normal, and we do not ask for words that ÒexplainÓ what equations mean. We ask for equations that explain what words mean. From there, any difference of opinion between Lucas and Prescott could only lie in what either believed reasonable to assume about labor markets — not what “might” be true through the lens of ideology or bias. If they could agree on the structure of labor markets — agreement made possible by the clarity math provides — they could agree on the output of the model. Because the public does not converse in this way, the nature of the often subtle debates between research economists rarely translates well to the public. For instance, the public discussion of many policies of the last few years, from quantitative easing to fiscal stimulus, was littered with estimates from economists of various stripes about the likely impacts on jobs and GDP, but with relatively little discussion of what each was assuming en route to their conclusions. It’s no wonder it can appear to outsiders as if economists could be laid end to end and still never reach a conclusion, as George Bernard Shaw allegedly quipped. 
Economists disagree on many policies, but would agree much more than laypeople might assume if they could first agree on starting assumptions, which, admittedly, is very difficult to do. In that light, it is easy to see why new techniques are readily added to policymakers' broad toolkit, but new theories take much longer to be embraced. Policymakers tend to wait until an idea is well-established before using it as the basis for policy. In many ways, economist David Colander of Middlebury College says, actual policy today reflects innovations of a generation ago. "The younger people are pushing … new models. But the policy that is used really reflects … some Friedman, some Keynes, a whole variety of ideas," he says. "There's judgment." — RENEE COURTOIS HALTOM

Virginia and North Carolina are among the states using money from their 1998 settlement with tobacco companies to spur economic development
BY CHARLES GERENA

Well before the federal government doled out billions of dollars to push the economy out of the 2007-2009 recession, states created their own stimulus programs using money from an unlikely source: cigarette manufacturers. In 1998, the attorneys general of 46 states and the District of Columbia signed a master settlement agreement (MSA) to resolve lawsuits against America's four largest tobacco companies. As of April 2010, the firms have paid out more than $74 billion as part of that agreement, with the Fifth District receiving about $7 billion. The money was intended to compensate for costs associated with smoking-related illnesses and to fund programs that improve public health and reduce tobacco use. But the MSA didn't dictate how the payments should be used. As a result, many states have used the windfall for a variety of purposes, from buying laptops for classrooms to plugging budget holes. Those with communities that relied on tobacco farming and production to generate economic activity have used the money to offset job losses that have resulted from declines in smoking, some of which may be attributed to the MSA's restrictions on cigarette marketing. In the Fifth District, Virginia and North Carolina lawmakers devoted a significant portion of their tobacco settlement payments to stimulating development and job growth, especially in rural communities with a history of agriculture and relatively high unemployment.

As with the federal stimulus program, however, it's difficult to separate the effects of the payments from other things happening in the economy or to know what would have happened in these communities without the influx of outside funds. Since 2000, there have been two recessions and a significant expansion in between, not to mention the repercussions of globalization and technology-driven increases in productivity. In addition, any jobs generated from investing tobacco settlement payments in economic development projects must be weighed against the costs. Will a project require the expansion of local public goods like roads and police? Will it result in the displacement or substitution of existing businesses? An important step in such a cost-benefit analysis is tracking where the money goes. "Good public policy requires that the details of incentive packages be disclosed and that the effectiveness of incentives be measured," noted Daniel Gorin, an economist at the Federal Reserve Board of Governors, in a 2008 overview of economic development incentives.
"Policymakers can then be held accountable for their decisions on the basis of evidence rather than politics."

The following charts summarize tobacco settlement payments to the Fifth District and the spending of those funds in North Carolina and Virginia, both by county and by category. A spreadsheet with more detailed information can be downloaded from the Richmond Fed's public website: www.richmondfed.org/publications

Tobacco Master Settlement Payments to Fifth District, 1999-2010
[Chart: annual MSA payments in millions of dollars, 1999-2010, by state (MD, NC, SC, VA, WV, DC). SOURCE: Tobacco Project website, National Association of Attorneys General]

Maryland and North Carolina have been the largest beneficiaries of MSA payments in the Fifth District, each receiving about $1.7 billion between 1999 and 2010. Each state has prioritized its spending quite differently, the former supporting health care and the latter emphasizing economic development.

Nearly all of Maryland's payments support statewide and local efforts to reduce cancer mortality and tobacco use, as well as fund substance abuse programs and the state's Medicaid program. The rest are devoted to a program that helps southern counties transition out of their 300-year-old tradition of tobacco farming. As part of that program, Maryland farmers were offered an annual payment of $1 for every pound of tobacco they grew in 1998 for 10 years, in return for growing something else. More than 90 percent of farmers accepted the voluntary buyout. Other parts of the program continue to fund regional agricultural development and land preservation.

In contrast, North Carolina sends half of its tobacco settlement payments to the Golden LEAF Foundation, a nonprofit organization dedicated to fostering development in economically distressed and tobacco-dependent communities. Another quarter of the payments assists tobacco producers and related businesses, while the remaining quarter supports tobacco use prevention and cessation, obesity prevention, prescription drug assistance, and other health-related programs.

Virginia also splits its MSA payments three ways. Fifty percent of the payments are devoted to assisting tobacco farmers and fostering economic development, 41.5 percent support the state's Medicaid program and various health care initiatives, and 8.5 percent are spent by a foundation that combats youth smoking and childhood obesity.

Many states have securitized the stream of income from their future MSA payments so they could have the budgetary certainty of a lump-sum payment. Tobacco-control advocates, however, have criticized the practice. States have frequently used the one-time infusion of money to plug budget holes or finance major capital investments in the short term, not to fund health care programs over the long term as critics say the MSA originally intended.

The Fifth District's experience with securitization of MSA payments has been a mixed bag in this regard. South Carolina was among the first states to take this route in 2001, issuing bonds that raised $934 million. About three-quarters of the bond proceeds went into a special trust fund to finance a variety of health care initiatives, including a prescription drug benefit for seniors and antismoking programs. The remainder compensated individuals for losses in tobacco production as well as funding water and wastewater system projects, car tax relief, and grants to local governments.
The same year, the District of Columbia sold $525 million in bonds against a portion of its future payments to reduce its debt and fund capital projects. A second bond issue of $245 million in 2006 financed other capital projects. West Virginia securitized all of its future tobacco settlement payments in 2007, raising $807 million to help balance the books of the state's teacher retirement system. Initially, it had directed half of its payments into a trust fund to cover the future health-related costs of tobacco use and half for the state's health and human resources department, primarily to replace a portion of state funding for hospitals.

Expenditures of Tobacco Master Settlement Agreement Funds by County, 2000-2010
Virginia has favored the southern and southwestern counties where farmers produce most of the state's tobacco, while North Carolina has spread its investments to economically challenged rural counties in general, not just tobacco-growing centers.
[Map: county-level MSA expenditures in Virginia and North Carolina. SOURCES: Virginia Tobacco Indemnification and Community Revitalization Commission, Virginia Foundation for Healthy Youth, Golden LEAF Foundation, North Carolina Tobacco Trust Fund Commission, and North Carolina Health and Wellness Trust Fund Commission]

Lenoir County in eastern North Carolina is known as the home of the state's first governor and attracts visitors every year to its Civil War battlefields and drag strip. So why did $106 million in tobacco settlement payments flow into this rural county during the last 10 years? A single project accounts for 95 percent of this money: the North Carolina Global TransPark, a 2,500-acre site that the state envisions as an air cargo airport surrounded by just-in-time manufacturing facilities. The airport has been developed and one anchor tenant, Spirit AeroSystems, was secured, thanks to a $100 million grant from the Golden LEAF Foundation to finance the construction of its manufacturing facility.

A single project also accounts for Pittsylvania County receiving the largest amount of payments — $92 million — in Virginia. About $26 million has been awarded to the county and the city of Danville to develop the Institute for Advanced Learning and Research into a high-tech hub for southern and Southwest Virginia. The institute partners with Virginia Tech to conduct research and development in horticulture and forestry, motorsports, and other areas, as well as commercialize technologies in those fields.

Aside from these outliers, Virginia and North Carolina have invested their tobacco settlement payments over a wide geographic area.
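For readers who download the spreadsheet mentioned above, a few lines of code are enough to reproduce this kind of county and category breakdown. The column names below are assumptions about the file's layout, not the Richmond Fed's actual field names.

# A sketch of how the grant-level spreadsheet could be summarized by state,
# county, and spending category. The file name and the columns "state",
# "county", "category", and "amount" are assumed, not the actual field names.
import pandas as pd

grants = pd.read_csv("msa_grants_2000_2010.csv")  # hypothetical local copy

# Total grant dollars by state and category, in millions.
by_category = (grants.groupby(["state", "category"])["amount"].sum()
                     .div(1e6).round(1).unstack(fill_value=0))
print(by_category)

# The ten counties receiving the most settlement-funded grants.
top_counties = (grants.groupby(["state", "county"])["amount"].sum()
                      .nlargest(10).div(1e6).round(1))
print(top_counties)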
Expenditures of Tobacco Master Settlement Agreement Funds by Category of Spending, 2000-2010
[Charts: spending in millions of dollars for North Carolina and Virginia. North Carolina: economic development spending (total = $494.6 million) and noneconomic development spending (total = $455.2 million). Virginia: economic development spending (total = $621.8 million) and noneconomic development spending (total = $642.8 million). Economic development categories include business development and expansion, public construction projects, higher education and workforce development, K-12 education, and agriculture; noneconomic development categories include health-related expenditures, tobacco farmers and producers, the state general fund, smoking prevention, and other.]
NOTES: Data represent grants awarded and program funds committed by a state-designated organization during the specified fiscal year. They do not include nongrant or nonprogram expenditures, such as an organization's administrative and marketing expenses, or direct compensation to tobacco farmers. Actual disbursements may occur in subsequent years, and may end up being smaller than the original grants due to grantees discontinuing the funded program or the organization cutting off funding. Data for Virginia include the 40 percent of the state's MSA payment set aside for the general fund from 2000 to 2004 and for the Virginia Health Care Fund from 2005 to 2010.
SOURCES: Virginia Tobacco Indemnification and Community Revitalization Commission, Virginia Foundation for Healthy Youth, Golden LEAF Foundation, North Carolina Tobacco Trust Fund Commission, and North Carolina Health and Wellness Trust Fund Commission

When tobacco settlement payments are used to develop certain sectors of a regional economy or support specific businesses, states are placing bets on economic winners. Lawmakers, along with the assortment of commissions they have created to spend the money, may try to reinvigorate declining industries or jump-start new ones. The problem is that empirical research hasn't been able to conclusively prove whether such "industrial policy" in general pays off significantly. "The standard justifications given … by state and local officials, politicians, and many academics are, at best, poorly supported by the evidence," noted Alan Peters and Peter Fisher, both professors of urban and regional planning at the University of Iowa, in a 2004 journal article. Consequently, Peters, Fisher, and other researchers of economic development incentives advocate improving education, simplifying the taxation and regulation of businesses, and doing other things to encourage economic growth rather than trying to choose winners.

Virginia and North Carolina have tried both approaches with the tobacco settlement funds they have devoted to economic development. From 2000 to 2010, the states awarded $35 million in grants for K-12 education, funding after-school programs for at-risk youth and purchases of laptops, iPod Touches, and other technology for classrooms. A larger chunk of money — $247 million — went into higher education and workforce development. In many cases, the money paid for new equipment at community colleges so that students can be trained to use the latest technology.

The biggest portion of Virginia and North Carolina's economic development dollars — about $561 million — funded business development and expansion on the local, regional, and statewide level. Often, the money helped existing businesses to acquire capital equipment or financed improvements to business parks and municipal infrastructure to attract new businesses. Targeted industries included biotechnology, manufacturing, and tourism.

Budget pressures, however, have competed with efforts to spur economic development.
More than $233 million in tobacco settlement payments have been directed by North Carolina lawmakers into the state’s general fund during the last 10 years. About $273 million of Virginia’s payments went into the state’s general fund, largely because 40 percent of payments were automatically counted as general revenue from 2000 to 2004. RF The role of temporary employment in recession and recovery BY B E T T Y J OYC E N A S H A N D J E S S I E RO M E RO andra Youngblood’s temporary staffing service took its first hit in 2008, when her clients started laying off temps. “So as not to affect their permanent staff,” she says. She runs Youngblood Staffing with her husband; the home office is in Wilmington, N.C., with branches in Whiteville, Lumberton, and Fayetteville. The staffing company was lucky, she says, because they were able to cut expenses to the bone and save their own employees’ jobs. “We did not have to close offices.” Things got better. By the middle of April 2010, “The skies opened and we tripled our business in a month and we have not slowed down.” This pattern of boom and bust has become typical of the temporary help services industry. Starting with the 19901991 recession, the drop-offs and subsequent spikes in temp employment have intensified with each downturn. Temp employment turns negative months before total nonfarm employment — 12 months before, in the case of the last recession — and then starts increasing before an employment uptick. As a result, economists see temp employment as a buffer during recessions and a harbinger of direct hiring during recoveries. Temporary jobs accounted for 26 percent of the new private-sector jobs created in 2010, compared to 7.1 percent in the same period following the 2001 recession. The industry has added an average of 25,000 jobs each month for the last year, the most of any sector. With unemployment still around 9 percent more than 18 months after the recession officially ended, some observers are wondering if, not when, companies will start hiring new employees directly. But although temporary employment is increasing more rapidly than employment overall, it remains a small share of the total. Even at the industry’s peak in 2000, it accounted for just 2 percent of total employment, and today it is 1.7 percent. But just how temporary is the current preference for temp workers — and does that preference have long-term implications? S The Macro Level The temp industry grew from 1 million to 2.7 million workers in the 1990s, and its growth is cited as a factor in the decade’s historically low unemployment and inflation rates. At least in the short term, some economists view the relationship between unemployment and inflation as a trade-off: Low unemployment and strong economic growth may lead to upward pressure on wages and “overheating” in the economy generally, which lead in turn to higher prices. But the wide availability of temp workers may have reduced the wage pressures that typically accompany a tight labor market, as suggested by Lawrence Katz of Harvard University and Alan Krueger of Princeton University in a 1999 paper. Looking at state-level data, they found that wages rose more slowly in states with a higher share of temporary employment: An 0.25 percent increase in temporary employment was associated with 0.2 percent slower wage growth. At any given time, there is a certain rate of “natural” or “frictional” unemployment caused by the fact that it takes time to match workers with open jobs. 
Katz and Krueger note that temporary employment may smooth this friction by making matching more efficient. A worker can sign on with a temp agency instead of spending time searching for an open position, and a company can contract with a temp agency instead of spending time recruiting workers. And although a worker may be in a temp job involuntarily, preferring to find a permanent job, at least that worker is no longer unemployed. For these reasons, the rise in temporary employment between 1979 and 1993 may have lowered the natural unemployment rate by as much as 0.25 percent, according to a 1999 paper by Maria Ward Otoo of the Federal Reserve Board of Governors. During a downturn, temps lose their jobs first. During the 2001 recession, temporary workers accounted for 26 percent of net job losses, although they made up only 2 percent of the workforce prior to the recession. Between December 2007 and December 2008, temp employment dropped by more than 484,000 jobs, about 19 percent, while total employment fell by 2.3 percent, according to the Bureau of Labor Statistics (BLS). The trough was in June 2009, when temp employment fell to its lowest rate since 1995, but since then it has increased every month, except for a regular seasonal downturn in January. Temping’s Appeal Such flexibility is what the temp industry was designed for: Companies use temps to respond to changes in demand without incurring the costs of hiring and firing. It is expensive to recruit, train, and provide benefits for new employees. If demand falls, the “adjustment costs” of laying off workers, such as mandatory advance notice of layoffs or severance packages, can equal as much as a full year of payroll. Firms may also use temporary workers to screen potential new employees or to fill in for sick employees. Unemployment insurance taxes are another major expense. Companies are “experience rated” according to the number of workers they lay off who then claim benefits; more layoffs bring higher taxes. But temp workers are employed by the staffing agency, which means the agency, Region Focus | First Quarter | 2011 21 Temporary Employment, GDP, Nonfarm Employment, and Unemployment 70 60 PERCENT CHANGE 50 40 30 20 10 0 -10 -20 1990 1991 1992 1993 1994 1995 1996 1997 1998 1999 2000 2001 2002 2003 2004 2005 2006 2007 2008 2009 -30 Nonfarm Employment Temporary Employment GDP Unemployment Rate NOTES: Data are seasonally adjusted. GDP in real 2005 dollars. Shaded areas represent recessions. SOURCES: Bureau of Economic Analysis, Bureau of Labor Statistics, Haver Analytics not the company where the work was performed, gets the rating. There are some disadvantages to using temporary workers. They can be less productive, perhaps because they are not motivated by the prospect of future raises or promotions, according to Chicago Fed economist Yukako Ono. Companies also pay a fee to the staffing firm, which varies depending on the number of workers, skill level, and contract length, among other factors. The total “bill rate” covers the worker’s wage plus a markup for expenses such as workers’ compensation and payroll taxes, operating expenses, and a profit margin. That markup may not translate into higher wages for the employee. Temp workers in high-skill, high-demand fields, such as nursing, may earn more per hour than a similar permanent employee. 
But the majority of temporary positions are in low-skill fields such as clerical or light industrial work, where average hourly pay ranges from 75 percent to 85 percent of the national average wage for the same position, according to the BLS. Benefits also tend to be worse for temps than for regular employees, except in high-end fields. Although many temp firms offer health insurance, the policies are often bare bones and the worker is responsible for most of the cost, so the share of workers opting for coverage is low, according to Bryan Pena of Staffing Industry Analysts, an industry consulting group. The uncertainty of temp work may make workers more likely to be depressed or anxious, according to researchers at McGill University. Anecdotally, temp workers report feeling like “second-class citizens” in the workplace, and miss feeling connected to an employer and their coworkers. Others prefer the flexibility, however, and view temp work as an opportunity to quickly learn new skills. But the majority of temp workers accept temporary jobs for economic reasons, either because it was the only job they could find or because 22 Region Focus | First Quarter | 2011 they hope the placement will translate into a permanent position. In the last few years, the industry also has emphasized its role as a bridge to traditional employment. Surveys conducted by the American Staffing Association (ASA) report that about half of temporary workers are eventually hired directly by the company where they were placed as a temp or with another firm. Results are mixed for the less-skilled and low-wage workers who make up the bulk of the industry. Temporary work employs a significant portion of participants in government employment and training programs; after the 1996 welfare reform in the United States, 15 percent to 40 percent of former welfare recipients found work in temporary jobs. A 2009 study followed welfare-to-work clients who were randomly placed in either temporary or directhire jobs. The workers placed in direct-hire jobs were more likely to be employed and had substantially higher earnings over a two-year period, but placement in a temporary job had no long-term positive effects on the probability of remaining employed or on earnings. “Placing them in a temporary job was about equivalent to no placement at all,” says David Autor of MIT, who conducted the study with Susan Houseman of the Upjohn Institute for Employment Research. Still, when unemployment is high, a temporary job is better than no job. “You make more as a temp worker than you do when you’re unemployed,” Autor says. “Being employed is a positive thing.” The number of people looking for jobs right now means that competition is stiff even for temp jobs. Just ask Zachary Basham, 23, of Nellysford, Va. As a recent college graduate, he secured a temp job at a law firm reviewing claims. “It was my first job out of college and I was simply looking for a place to start.” But he was let go after a week, along with several others, with no explanation or warning. He returned to doing maintenance on a golf course, but is currently unemployed except for his volunteer work in an alumni office at a private school. He’s registered with several temp agencies but has had no calls. Harbinger of Hiring? Temporary employment is broadly viewed by economists and policymakers as a leading indicator of permanent job creation — a sign that companies are trying out new workers and positions in anticipation of increased demand. 
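One simple way to examine that leading-indicator claim is to correlate growth in temporary-help employment with growth in total employment several months later. The sketch below assumes the two monthly BLS series have already been saved to a local CSV; the file name and column names are placeholders, not a BLS file format.

# A sketch of one way to check the "leading indicator" claim: correlate
# year-over-year growth in temporary-help employment with growth in total
# nonfarm employment shifted forward by various numbers of months.
import pandas as pd

emp = pd.read_csv("bls_employment.csv", index_col="date", parse_dates=True)
growth = emp.pct_change(12).dropna()  # year-over-year growth rates

for lead in range(0, 13, 3):
    # Correlate temp growth today with total-employment growth `lead` months later.
    corr = growth["temp_help"].corr(growth["total_nonfarm"].shift(-lead))
    print(f"temp growth vs. total employment {lead:2d} months ahead: r = {corr:.2f}")
# If temp hiring really leads the broader labor market, the correlation should
# peak at a positive lead of several months rather than at zero.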
But persistently high unemployment combined with the rapid growth of temp jobs has some observers worried that this recession is different. Data proving the relationship between temporary and permanent employment are hard to come by. An analysis of BLS data by the American Staffing Association shows that, in the past, temp employment has led overall employment by about six months during normal economic times, and by three months when the economy is coming out of a recession. The ASA study doesn’t include data from the current recession, which was both deeper and longer than the ones before it. Economists and other researchers don’t think that a “perma-temp” workforce is likely. “I don’t see what would make us think this is some kind of brand-new paradigm. It seems a perfectly plausible response to a period of sustained uncertainty,” Autor says. A large and permanent increase in temporary employment would mean that something dramatic had changed in the economy to change the costs of direct versus temporary hiring, he explains. “In general that would be an indicator of some type of deeper ailment. I don’t think that’s going to happen.” Instead, the magnitude of the initial downturn may mean that there is a longer lag between temporary and permanent hiring on the upturn. “It’s been a long deep recession and faltering recovery. It’s natural that employers would not be hiring like gangbusters,” says Autor. The “absorptive capacity” of the temp industry has increased with each of the last three recessions, which could be a factor in the jobless recoveries that have followed. Erica Groshen, an economist at the New York Fed, suggests that perhaps one-third of the current job loss is due to cyclical change, resulting from decreased demand, and two-thirds to structural change, meaning that the jobs that have been lost aren’t coming back in the same industries or locations. Temp work may facilitate structural change by enabling companies to use a downturn as an opportunity to reorganize production processes and trim payrolls. It may also make companies more likely to shed jobs via permanent layoffs, rather than laying off workers temporarily and then recalling them if things pick up, as Groshen explains in a 2003 article with fellow New York Fed economist Simon Potter. Currently, consumers are spending more freely, business investment is picking up, and GDP is generally projected to grow 4 percent in 2011. And employers are adding to payrolls, albeit slowly. Hiring is likely to come around once employers trust the recovery, says Michael Doyle, vice president and general manager of the Southeast division of Manpower, one of the nation’s largest temporary staffing firms. He is also seeing an increase in the number of people moving from temporary to direct-hire jobs. “I think [firms] cut beyond, maybe 2 percent to 3 percent more than they should have, and they’re now hiring temps back rather than full-time employees.” That said, for some companies, a leaner staff that uses the flexibility of temps may become their continued on page 38 How Temping Grew In 1956, about 20,000 employees worked in temporary help services, mostly in factories and offices. Though temp use has spread across job sectors in the last 20 years, about 80 percent of temp labor is still low-skill, low-paying light industrial and clerical work. The remaining 20 percent includes professional jobs such as engineering, information technology, and health care specialties. 
Many factors played a role in the rapid growth of the temporary sector, which expanded by 11 percent annually between 1979 and 1995. About 20 percent of the increase can be explained by the decline of employment at will, according to MIT economist David Autor. Employment at will traditionally held that employers and employees could unilaterally end the relationship when and why they chose, unless otherwise stated by contract. The doctrine was recognized throughout the United States in the middle of the 20th century. Between 1973 and 1995, however, the courts of 46 states found exceptions that limited employers' discretion to let workers go. The direct and indirect costs to employers of legal action in the wake of those decisions are hard to quantify. But Autor found rapid growth of temp employment — about 10 percent in the year of the ruling — after a state's courts adopted an exception to employment at will. The paper notes that temporary help continued to expand after 1992, several years after adoption of the most recent exception, suggesting other factors are involved as well.

As advances such as just-in-time delivery caught on, firms incorporated the idea of just-in-time labor into employment practices. Temp agencies also got good at matching people to jobs, using technology and an expanded footprint to reach across geographical areas. The agencies added client services such as training and consulting, which has contributed to growth in the sector. Michael Doyle of Manpower observes that as the concept of the variable workforce has taken hold, it has become an integral part of company strategy, especially for large corporations.

Today, the $70 billion temp employment industry consists of about 23,000 agencies. Almost half are small, with fewer than 100 full-time employees. At the other end of the spectrum are companies like Manpower, which operates in 82 countries and has 4 percent market share in the United States. During and after the recession, buyers have been negotiating lower fees, and the industry has consolidated. Weaker firms have been acquired by larger firms that, in some cases, cut rates to knock out the competition, according to Bryan Pena of Staffing Industry Analysts.

Temp agencies face many of the same pressures as their client firms — particularly unemployment insurance taxes. Sandra Youngblood, of Youngblood Staffing, headquartered in Wilmington, N.C., says she's careful to serve only workers she can place elsewhere, in the event of layoffs. She recently turned down a chance to place 60 seasonal employees. "Back in the day, we would have jumped all over that," she says. "But knowing that in July those people are going to be gone, I don't see where I could place them." She, too, worries about rising unemployment insurance tax rates. The jobs recovery may have started first in the temp agencies, but even they still face tough decisions when it comes to hiring.
— BETTY JOYCE NASH

BY BETTY JOYCE NASH

[Photo: Liaison Officer Ted Kientz (at left) shows Army Chief of Staff General George William Casey around the newly built U.S. Army Forces and Reserve Commands combined headquarters at Fort Bragg, N.C. Photography: D. Myles Cullen/Courtesy of U.S. Army]

Though communities may benefit long term, an influx may mean short-term pain in the form of crowded schools and congested roads. The region near Fort Bragg will need to educate the expected 6,000 new students who may crowd classrooms and overwork teachers.
Moreover, Fort Bragg's expected 41,000 new people will crowd area roads.

The stars will soon align over Fort Bragg, N.C., and when they do, only the Pentagon will have more generals. That'll be sometime this year when two Army commands relocate to brand-new Fort Bragg headquarters, part of a total of about $1.3 billion in construction. Likewise, Fort Lee's 6,000 acres in central Virginia are humming with $1.2 billion worth of new and expanded training and logistics schools and facilities for all military branches. Communications and intelligence operations are expanding big-time at Fort Meade and the Aberdeen Proving Ground (APG), among others of Maryland's 17 military sites. These expansions result from the 2005 Base Realignment and Closure plan, or BRAC, which reshuffles 33 major sites and closes 22 others. (See the map below.) It's a $35 billion nationwide effort, due for completion by mid-September.

Economic studies project mostly positive economic effects from the added commerce the expansions will bring. Base closures typically bring the opposite: job loss, the severity of which varies with the strength and diversity of a community's nonmilitary economy. Military installations can benefit communities, especially posts that import the equivalent of a corporate headquarters, with highly paid jobs — engineers, scientists, professionals, and high-level managers. Those locales see more of a boost than ones with bases that only process and train troops. Already, 40 new defense contractors have set up shop in Harford County, Md., in anticipation of APG's economic boon.

[Map: BRAC 2005 closures, realignments, and job-gaining installations across the Fifth District. District of Columbia: Bolling Air Force Base; Naval District Washington; Potomac Annex; Walter Reed Army Medical Center. Northern Virginia: DFAS Arlington; Marine Corps Advanced Amphibious Assault; Fort Belvoir; Marine Corps Base Quantico; Naval Surface Warfare Center Dahlgren; Arlington Service Center; Headquarters Battalion Headquarters Marine Corps Henderson Hall; various leased space. West Virginia: Fairmont U.S. Army National Guard Reserve Center; Bias U.S. Army Reserve Center Huntington. Maryland: Fort Meade; Naval Station Annapolis; Army Research Laboratory Adelphi; Naval Surface Warfare Center Indian Head; Andrews Air Force Base; Naval Air Facility Washington; Martin State Airport Air Guard Station; Fort Detrick; Navy Reserve Center Adelphi; National Naval Medical Center Bethesda; Naval Surface Weapons Station Carderock; Aberdeen Proving Ground; Defense Finance and Accounting Service Patuxent River; Naval Air Station Patuxent River; PFC Flair U.S. Army Reserve Center Frederick. Central and Southeastern Virginia: Richmond International Airport Air Guard Station; Defense Supply Center Richmond; Naval Weapons Station Yorktown; Fort Lee; Fort Eustis; Langley Air Force Base; Fort Monroe; Naval Amphibious Base Little Creek; Naval Support Activity Norfolk; Naval Medical Center Portsmouth; Naval Shipyard Norfolk; Naval Station Norfolk; Naval Air Station Oceana. North Carolina: Navy Reserve Center Asheville; Charlotte/Douglas International Airport; Pope Air Force Base; Seymour Johnson Air Force Base; Fort Bragg; Marine Corps Base Camp Lejeune; Niven U.S. Army Reserve Center Albemarle; Marine Corps Air Station Cherry Point. South Carolina: Defense Finance and Accounting Service Charleston; Naval Weapons Station Charleston; South Naval Facilities Engineering Command; Marine Corps Air Station Beaufort; Fort Jackson; Shaw Air Force Base; McEntire Air Guard Station. Source: 2005 Defense Base Realignment and Closure Commission Final Report]

Bragg, Aberdeen, and Lee: Gateways to Growth
Growth inside military gates can mean growth outside the gates. Jobs may expand in construction, retail, health care, and hospitality. But the economic effects are likely to be larger in locations where there's already a healthy mix of professional positions in scientific research and development and engineering. Nowhere is that truer than in Maryland, a state where the pre-BRAC military generated $16 billion in direct spending in 2008, according to Richard Clinch, an economist at the University of Baltimore who has studied the economic role of Maryland's military installations. "Maryland is lucky in that it has been able to attract, because of its proximity to Washington, high value-added services for the U.S. Department of Defense," Clinch says.

The 73,000-acre APG will gain between about 8,000 and 9,000 positions, 5 percent of which are military. People are starting to move in. Many of those are transfers from Fort Monmouth in New Jersey, which is slated to close. Commercial and residential real estate are selling in Harford County, home of Aberdeen Proving Ground. "About 60 percent of the Fort Monmouth workforce has relocated due to this BRAC," says Denise Carnaggio, deputy director of the county's office of economic development. She also reports strong demand for Class A office space. The positions at APG will average $80,000 annually and will include jobs in engineering, electronics, systems, computers, and budget, among others. Incoming organizations include communications, electronics management and research, vehicle technology research, and medical and chemical defense R&D. APG's expansion is its biggest since World War II, with a dozen missions arriving from eight states, including some from elsewhere in Maryland and Virginia.

Military spending stimulates personal income growth in states with higher manufacturing and retail shares, and in those that already receive a large share of military prime contracts, according to a March 2009 paper by Michael Owyang of the St. Louis Fed and Sarah Zubairy of Duke University. The benefits of military spending are unlikely to be as great in isolated areas, where there may be only troop-related activities. In those cases, the military bases may be more self-contained. George Mason University economist Stephen Fuller notes that in such areas, "It's easier to keep everything on post. The spillover is only some retail spending. There isn't a whole lot that goes back into the local economy." Fuller studied the effects of BRAC on jurisdictions in Northern Virginia concerned about a possible influx of new residents, especially in less-populated areas. Forts Bragg and Lee will be home to high-level commands, and though they are located in less-populated locales, the regions have already benefited.
For example, the initial large-scale construction may have buffered the Fort Lee area near Petersburg, Va., from the recession. The construction remains under way as the BRAC effort goes full throttle toward its deadline. "All that construction worked to our advantage," says Dennis Morris of the Crater Planning District Commission, an organization of 11 jurisdictions. A procurement organization helped small businesses identify subcontractors who then met with the prime contractors. About 65 percent of the prime contracts were awarded to Virginia firms, and subcontractors in the region received 851 contracts. "The bottom line is we fared well in the region on getting our share of those prime contracts or subcontracts," he says. "Our rural areas did very well as a supplier for, let's say, brick for the new barracks."

Fort Lee will double in size, to approximately 44,500 people, divided roughly evenly between employees (military, civilian, and contract) and family members. Its biggest impact may be in total wages and salaries. The Virginia Employment Commission has estimated that, beginning in 2008, those could average $1.2 billion per year through 2011, though the projections could be off because many workers who live within a couple of hours' drive, in Tidewater or Northern Virginia, are opting to commute for now.

Fort Lee, long the source of logistics training and supply — "the right stuff at the right place at the right time" — will now handle more training, including schools of transportation, ordnance, and culinary arts, among others, for all military branches. There will also be about 700 to 900 people employed by the Defense Contract Management Agency, the procurement headquarters for all the branches. To house students, there's a 1,000-room hotel under construction. In short, Fort Lee will train every branch of service in jobs and missions that support soldiers. The fort trains people in "everything from a young man's personal hygiene needs, that is, how to take a field shower and do their laundry, to explosives and ordnance, including repairing a weapon," says Scott Brown, chief of Fort Lee's BRAC synchronization office. Many operations at Fort Lee involve high-level management — the Quartermaster, Ordnance, and Transportation schools for the Army, the Air Force Transportation Management School, and the Defense Commissary Agency headquarters. In many ways, these resemble corporate headquarters. And those managerial jobs pack a bigger economic wallop.

Fort Bragg's new $300 million, 700,000-square-foot headquarters will do likewise. The complex will house the U.S. Army Forces Command, FORSCOM, and the U.S. Army Reserve Command, USARC. Overall, Bragg is expected to grow to 58,336 military personnel, from pre-BRAC levels of 49,247. The two commands alone are likely to bring nearly 3,000 active-duty, civilian, and contractor jobs, with higher than average salaries for the Sandhills region. Average military and civilian salaries for FORSCOM positions are $75,000. At USARC, the average is $93,000 for military salaries and $78,000 for civilian. Roughly a third of the civilian employees in this group are expected to relocate to Bragg from other bases; the rest will be hired locally or move for a job. An estimated 1,000 military contractors are expected to set up shop in the area to be close to key decisionmakers. Military-related population growth includes active-duty soldiers, civilians employed by the Army, private contractor employees, and Army dependents.
The number also includes people who may move to the region to get a job off the base. Fayetteville issued $300 million in new permits — everything from luxury apartments to four-star hotel projects — in 2010, according to Fayetteville City Manager Dale Iman. While this flurry of activity is welcome, the costs to state and local governments can create fiscal challenges in the short term, economist Clinch says. "The problem with any introduction of a huge economic activity is that the capital costs have to be paid for but the revenue comes later."

Education Station
That's happening with roads and schools in the 11-county Sandhills Region of North Carolina and in Aberdeen's Harford County, and probably any locale where military posts are growing. Take schools near Fort Bragg, for instance. "Our counties are struggling to figure that out," says Greg Taylor, executive director of the task force. Schools in the counties likely to be affected — Cumberland, Harnett, and Hoke — are estimated to need nearly $220 million in capital construction, $68.4 million of which is related to military growth. The county has used bonds to pay for four new schools in the past couple of years. But it's still not clear whether funding from the state will keep up with Cumberland County's growth, according to Theresa Perry, assistant superintendent.

Districts are entitled to impact aid from the federal government because military installations pay no taxes, but the program isn't fully funded. And if the school district lies outside of the county where the base is located, the per-student aid is halved. Business growth will occur in the wake of Fort Bragg's expansion, Taylor says, "but that money comes after the fact. You can't tell the kids, wait five years and we'll build you a school; the funding to fix the problem comes later." In counties where the tax base includes a healthy mix of commercial and residential properties, there's less to worry about. But in largely residential, or bedroom, counties, there may be problems.

Schools in Prince George County aren't hurting, though — at least, not yet. "Projections of student population were much higher, but a majority of individuals planning on moving didn't move," says Superintendent Bobby Browder. For example, the Transportation Command from Fort Eustis in Newport News will move, but the majority of employees have not. "From what we can discern, they are commuting," he says, along with transferred personnel from locations in Northern Virginia. "Because they couldn't sell their homes, they've bought hybrids and they commute." Browder's happy that growth hasn't materialized all at once because Prince George is only now receiving its impact-aid funds for the 2007-2008 school year. With the recession, school funding everywhere has been cut, so the impact aid becomes even more important. To date, the district has received $3.5 million in impact aid.

The Military Road
Transportation remains a hot issue. The National Academy of Sciences studied funding of traffic improvements following the BRAC report. Short-term strategies include quick fixes — toll lanes or lane widening. The report cited fundamental flaws in the BRAC decisions concerning the ability of local infrastructure to handle added traffic. The report also cited the Defense Department's inability to fund road improvements, and poor communication between installations and local transportation authorities.
To ease projected congestion on Interstate 95, a partnership between Amtrak and Fort Lee puts troops on trains, right on base, for weekly training exercises at Fort A.P. Hill, in Northern Virginia. Workers broke ground on a spur, a portion of Interstate 295 that will link Fort Bragg to Interstate 95. Other improvements to ease ingress and egress from the base may take years and lots of money. Case in point: State and federal funding sources can't cover the tab for appropriate projects — $344 million — to widen roads and provide direct interstate access to Bragg, the biggest post in the nation. In 2008, more than 400 military convoys with troops and heavy equipment traversed Fayetteville, N.C.'s city streets on their way to Bragg.

Although military spending doesn't always offset costs for communities, the bases are never unwelcome, says Randy Parker, an economist at East Carolina University. "The people in the town in which these bases are centered are not unhappy campers," he says. "The people are happy to accept any type of growth that comes their way." He points to the large number of military retirees in the 11-county region near Fort Bragg. "I don't know that this will bring factory jobs and so forth, but it fosters some economic growth in restaurant jobs, primarily in service and retail jobs."

But a possible military drawdown presents future risks. Some regions are just now recovering from previous base withdrawals, and, of course, APG's economic gain is New Jersey's loss, at least in the short term. "When you link your fortunes to military growth, then you also link your fortunes to military shrinkage," Parker says. "It is a somewhat risky strategy, since you're putting your eggs in one basket." When the stars are rising, that's great, but when spending shrinks, local military economies may falter. Northrop Grumman, the defense contractor, has attributed its hundreds of recent layoffs in the district to a slowdown in defense spending. Still, national security is likely to remain a major budget item for the foreseeable future even if peace breaks out all over. RF

Military Spending and GDP
Defense accounted for more than a third of GDP in World War II, and by 1947, it had fallen to about 7 percent. The Korean War, the Cold War, and later, the Vietnam War drove defense's share of GDP into double digits. That was in the 1950s and 1960s. By 1979, military spending fell to less than 6 percent of GDP. It rose above 7 percent in the mid-1980s, but by 2000, the so-called "peace dividend" took spending below 4 percent of GDP. Bases were closed or realigned in 1988, 1991, 1993, and 1995, as spending declined. Since 2001, defense spending has grown to almost 5 percent of GDP.
— From "How is the Rise in Defense Spending Affecting the Tenth District Economy?" by Chad Wilkerson and Megan Williams, Federal Reserve Bank of Kansas City Economic Review, Second Quarter 2008

READINGS
Gonzales, Oscar. "Economic Development Assistance for Communities Affected by Employment Changes Due to Military Base Closures (BRAC)." Congressional Research Service Report for Congress, June 2009.
Owyang, Michael, and Sarah Zubairy. "Who Benefits from Increased Federal Spending? A State Level Analysis." Federal Reserve Bank of St. Louis Working Paper, June 2010.

INTERVIEW
Joel Slemrod

Editor's Note: This is an abbreviated version of RF's conversation with Joel Slemrod.
For the full interview, go to our website: www.richmondfed.org/publications

The recent recession and relatively sluggish recovery have prompted much discussion about what policymakers can — and cannot — do to stimulate economic activity and foster a healthy financial system. Much of that discussion has focused on monetary and regulatory policy, of course. But fiscal policy can play an important role as well in such periods. Joel Slemrod, an economist at the University of Michigan, has spent his career working in the field of public finance. His research has spanned a number of areas, including how sensitive businesses and individuals are to tax rates within and across countries, and how that sensitivity affects their location decisions; the degree to which households, especially high-income households, alter their behavior due to tax policies; conditions that may affect people's savings behavior; and the level of noncompliance with tax laws in the United States and abroad. Slemrod, who directs the Office of Tax Policy Research at Michigan, also has held numerous appointments in Washington, D.C., with such institutions as the U.S. Department of the Treasury and the President's Council of Economic Advisers. Aaron Steelman interviewed Slemrod in the fall of 2010.

RF: In a 2005 paper, you described your "Beautiful Tax Reform." Could you briefly discuss the system that you laid out in that paper? Relatedly, do you think it is possible to divorce normative concerns from positive concerns when thinking about tax policy, or will equity issues always arise?

Slemrod: Let me take the second question first. Although this paper does lay out a framework for my preferred tax system, it also argues that one's preferred tax policy is inevitably a mixture of what one thinks about how the economy works — for example, behavioral responses to tax rates — and also value judgments, which aren't subject to economic analysis and probably are very hard to persuade others to adopt. So I thought that I could make a bigger contribution to this edited volume (which included papers from many people saying what tax system they would prefer), if I stated explicitly how my own preferred tax reform depends on both my views on how the economy works and on what my values are. So my answer to the second question is no — what tax policy is best will always depend on both positive and normative judgments.

Let's now come back to the first question. In the paper, I made the point that the simplest tax system isn't necessarily the best, in part because of the trade-off between what one might call efficiency and equity. Consider that the simplest tax system is probably what one might call a lump-sum tax, where everyone pays basically a fixed amount. It wouldn't be trivial to enforce, but it certainly would be a lot simpler than what we've got now. However, I think most people, maybe not everybody but most people, would find that objectionable because they think the tax burden ought to be related to a household's well-being, whether that be assessed by income or wealth or consumption or some other measure. So that's why I wouldn't favor replacing our graduated income tax system with a lump-sum tax or a value-added tax — because the distribution of the tax burden is not progressive enough for me, and that, I emphasize, is "for me." If your values were such that you were happy with an approximately proportional tax burden, where the tax burden is approximately proportional to lifetime income, there's a lot to be said for just relying on a value-added tax. But I'm not willing to do that, so I'm sticking with a tax system that relies heavily, maybe not entirely, but heavily on a graduated income tax.

The rest of the paper talks about fairly standard ways to clean up the income tax, because I think a lot of the exceptions to a straightforward tax levied on income — a lot of the credits and deductions, for instance — are examples of the government favoring particular people or particular activities that result in inefficiency. So, for example, I would get rid of the itemized deduction for state and local property taxes; I would get rid of the preferential tax treatment of employer-provided health insurance; and I would clean up the implicit subsidy to owner-occupied housing, and although I don't have a clean solution for how to do that, we know how to move things in the right direction. Then in the paper I talk a little bit about how the process of filing taxes could be simplified, so that a large fraction of Americans wouldn't need to file a tax return at all. Other countries do this, including 15 OECD countries, and we could do it, too, regardless of how hard the Internal Revenue Service (IRS) now says it would be. And, finally, I talk a little bit about cleaning up how corporate income is taxed in the United States. Almost no economist thinks we have a very rational system. The most obvious problem is that income is taxed first at the corporate level and then at the personal level, but the underlying issues are that the rate of tax on corporate income is different than on other income and the tax base is capricious, depending on things like the financial policy of the corporation; I discuss some ideas about how to clean that up.

RF: What does our recent experience tell us about the effectiveness of tax rebates in stimulating economic activity in a recessionary period?

Slemrod: Actually, we have had three tax rebate policies enacted in the last decade — 2001, 2008, and then again in 2009. I have done a fair amount of research on this topic with Matthew Shapiro, my colleague here at Michigan. The research methodology is based on posing the following questions to a sample of people: What did the tax rebates lead you to do? Did they lead you mostly to increase spending, mostly to increase saving, or mostly to pay down debt? When we focused on 2008 and 2009, in both cases we found that only a small fraction of people said it led them to mostly increase their spending. In 2008, less than a quarter of people said that and in 2009, only about 13 percent said that. So we concluded that the stimulus to spending that works through the marginal propensity to spend would actually be quite modest as a fraction of the total tax cut. Because these tax cuts were pretty large, the dollar stimulus was not trivial, but certainly relative to the tax cut, the stimulus probably was fairly modest. Our surveys tell us two other interesting things: One, contrary to conventional wisdom, we found no evidence that low-income people would be more likely to spend the money they received from the tax cuts. Second, we found that, with the 2009 tax cuts, which were delivered in the form of reduced employer withholding, people actually had a lower marginal propensity to spend, which was contrary to what a lot of economists had opined when the delivery mechanism for the tax cuts — rebate payments versus reduced withholding — was being discussed in early 2009.

RF: You mentioned the way we currently tax employer-provided health insurance. What do you think are the benefits as well as deficiencies with that policy?

Slemrod: Well, it certainly reduces the after-tax price of health insurance for people. The problem is that it reduces the price below the true social cost, so that people acting in their own family's interest are, at the margin, buying insurance where the value to them is actually less than the true cost. In a word, we are subsidizing high-deductible, low co-pay insurance policies and, given the upward trend we are seeing in the fraction of our gross national product that goes to health care, I think we ought to be moving toward reducing or eliminating such subsidies. Not only that, it's a very unattractive sort of subsidy, because the subsidy rate is dependent on the household's marginal tax rate, so the subsidy rate is highest for the highest-income people. And I just don't think that even people who would argue for a subsidy would favor such regressivity if they were designing a subsidy scheme from scratch. The reason to be wary about abandoning the subsidy is that it supports the system of employer-provided health insurance, which spreads risks across employees and offsets the problem of adverse selection that can plague health insurance markets; before we eliminate the subsidy entirely, we need to have other policies in place to prevent a collapse of efficient markets for health insurance.

RF: We often hear the claim that taxpayers vote with their feet, leaving relatively high-tax states for relatively low-tax states. What does your work on the estate tax tell us about that claim? And what do you think of it more generally?

Slemrod: I think there's substantial evidence that people and businesses, when they consider where to locate, think about the financial implications of where they're going, including the kind of taxes and the tax rates they would face. There's also a lot of evidence that they think about the other side of the government budget too, that is, what government provides. For example, there is evidence that, other things equal, some people will migrate to where welfare benefits are higher. It's also very clear that people don't migrate simply to where the taxes are the lowest, because if you look at the United States, the states with the lowest taxes (and, consequently, relatively low levels of public services) are not attracting masses of people. I have one article, now several years old, which looks at one aspect of this issue: whether differences in the estate tax rates of American states affect migration. It uses a fairly simple research design that analyzes data about the state of residence at death of people whose wealth is high enough that they would be subject to estate taxes. We looked to see whether, over time, when state estate tax rates change, deaths of high-wealth people go up when estate tax rates go down, and vice versa.
And we do, in fact, find that to be true. That's consistent with the fact that, other things equal, at the margin, some rich people will move when they're getting on in years to places where the estate tax is lower. Another possibility is that they just manage their affairs so that their legal residence is in a state with a relatively low tax rate. One or the other, or some combination of the two, appears to happen, although the magnitude is not very large.

RF: This is a broad question: But does Atlas, in fact, shrug?

Slemrod: You're referring to the book I edited titled Does Atlas Shrug? The Economic Consequences of Taxing the Rich. It seems like this issue never goes away in policy debates. The book is a collection of articles by different economists and lawyers who don't all come to the same conclusion. My own view, based in part on the research discussed in this book, is that certainly high-income people notice taxes, and they react to taxes in ways that lower their exposure to taxes. The evidence for taxes substantially affecting what one might call "real" behavior, such as labor supply or savings, is not as strong as the evidence regarding another class of behaviors we might label "avoidance." There are a lot of examples of high-income people taking avoidance steps to reduce their exposure to taxes. For instance, when tax rates are known in advance to change from one year to the next, we see high-income people shifting their taxable income into the lower tax rate year. If the relative tax on corporate income versus individual income diverges a lot, there's evidence that high-income people will change the form of their business, from a corporation subject to the corporate income tax to a business form not subject to it. Thus, my overall conclusion is that Atlas does shrug, but not in the way that some might think.

Now, that being said, the public economics field has moved toward the view that if you're trying to measure the efficiency cost of income taxation, the best summary measure of that is not the elasticity of labor supply — it's the elasticity of taxable income. Taxable income is certainly affected by labor supply, but also by all the other things people might do to lower their taxable incomes, such as avoidance, evasion, increasing tax-deductible activities, and so on. So I don't mean to say that these other sorts of "avoidance" responses are not relevant for policy. They absolutely are.

RF: How large of a problem is tax evasion in the United States? That is, what is the magnitude of tax evasion and what could be done to decrease that number in a way that is not socially harmful?

Slemrod: The most comprehensive attempts to assess the magnitude and nature of tax evasion have been done in the United States by the IRS. For obvious reasons this is not an easy question to answer, even with a careful, comprehensive study. So, with some margin of error, the IRS thinks that for the income tax and other taxes that the IRS oversees, the rate of noncompliance is about 13 or 14 percent — about 13 or 14 percent of what should be paid is not paid. What should be done about it? First, note that just because there's a 13 or 14 percent noncompliance rate does not mean that we have vastly too little enforcement. The optimal noncompliance rate is certainly not zero, just the way the optimal burglary rate is not zero — it would just require too many resources to completely eradicate either of these things. What would I do?
The most effective way to reduce noncompliance is to have third-party reporting. In the United States, for most wages and salaries, your employer sends a report to the IRS stating how much you have been paid, and now their computers are good enough that if you don't report those wages and salaries, there's a very high likelihood that you are going to get a computer notice from the IRS asking you why. Thus, the chance of getting away with understating your wage and salary taxable income is very low and, consequently, the IRS has estimated that the rate of noncompliance for wages and salaries is 1 percent, while the rate of noncompliance for self-employment income is 57 percent. The former is subject to withholding and information reporting, and the latter is subject to neither. So one thing we should consider doing is extending information reporting further. Most other countries have it for interest and dividends; many countries have withholding for those kinds of payments, as well. We should pursue, as the IRS has been doing recently, information-exchange agreements with other countries, because it's become quite clear that a lot of the noncompliance of high-income people involves offshore accounts or transactions, and transparent information exchange among countries reduces the attractiveness of noncompliance.

I and a co-author just recently completed a study using these data from the IRS about the distribution of noncompliance by income class, which suggests that the rate of noncompliance goes up with income class, except at the very highest levels of income. No one had a good sense of the distributional pattern of noncompliance until this analysis. One commonly hears that "the poor evade but the rich avoid," the idea being that high-income people don't need to do illegal things because they have plenty of legal ways to reduce their taxes. But our analysis suggests that this is not true — the rate of noncompliance does generally go up with income class, except, again, at the very highest levels. Now, all of these studies are fraught with problems. It might be, for example, that the kind of evasion that really high-income people engage in is very difficult for the IRS, even with a very intensive audit, to discover. The IRS is certainly aware that even an intensive audit isn't going to uncover all noncompliance, and it's going to uncover some kinds more than others. They try to make up for this by estimating multiplicative factors that adjust for the fraction of noncompliance they think they have missed.

Joel Slemrod
➤ Present Position: Paul W. McCracken Collegiate Professor of Business Economics and Public Policy, Professor of Economics, and Director of the Office of Tax Policy Research, University of Michigan
➤ Previous Faculty Appointment: University of Minnesota (1979-1987)
➤ Other Positions and Professional Activities: Member of the Advisory Board, Tax Policy Center of the Urban Institute and Brookings Institution (2002-Present); Member of the Congressional Budget Office Panel of Economic Advisers (1996-2004); Contractor, Office of Tax Analysis, U.S. Department of the Treasury (1983-1984 and 1986-1988); Senior Staff Economist, Council of Economic Advisers (1984-1985)
➤ Education: A.B. (1973), Princeton University; Ph.D. (1980), Harvard University
➤ Selected Publications: Co-author (with Jon Bakija) of Taxing Ourselves: A Citizen's Guide to the Great Debate Over Tax Reform (4th edition, 2008); editor or co-editor of numerous books including Does Atlas Shrug? The Economic Consequences of Taxing the Rich (2000) and Why People Pay Taxes: Tax Compliance and Enforcement (1992); and author of articles in such journals as the American Economic Review, Quarterly Journal of Economics, Journal of Political Economy, Journal of Public Economics, and National Tax Journal

RF: Many developing countries have much higher rates of tax evasion. Is this simply because their collection systems are less efficient? Or might there be cultural reasons such as the populace may have less trust in their government and feel less obliged to support it?

Slemrod: Well, I think the first aspect — variation in tax enforcement effectiveness — is certainly a big part of it. In countries where the tax administration is severely constrained for resources and the enforcement is very weak, the return to evasion is high. People don't want to be perceived as suckers in countries like that, where they see everybody else getting away with it. Whether part of the story is that people evade more when they don't trust their government, including regarding spending their money wisely, is an interesting question, and a lot of social scientists argue that it is important. My own view is that we don't yet have a lot of hard evidence on the question. There is a positive cross-country correlation between the fraction of people who say they don't trust their government and measures of tax evasion, but that doesn't compellingly tell us that the lack of trust causes the higher rates of tax evasion. What causes what is tough to nail down. For example, you can't really do a field experiment, where you go into one part of a country and change how people feel about the government, and don't do that in another part, and then compare changes in tax evasion rates. Trust in government could be very important, but it is just very hard for social scientists to pin down its behavioral implications.

RF: Are there certain goods for which consumption seems relatively unaffected by higher taxes? For instance, certain "sin goods" such as cigarettes? If so, what is a policymaker who otherwise would prefer to use that instrument to reduce consumption in an effort to improve public health to do?

Slemrod: Goods vary quite a bit in their price elasticity, that is, their responsiveness to tax-inclusive prices. The evidence suggests that the consumption of cigarettes is relatively price inelastic. So while you could potentially see an alliance between people who care about public health issues and people in the government who care about raising revenue, those two constituencies differ in their "preferred" elasticity. If cigarette purchases were inelastic to a tax-induced price increase, this would disappoint people who want to reduce smoking, but it is going to raise more revenue than if demand were highly price elastic.

My own work has addressed how the possibility of tax avoidance affects the impact of raising state cigarette taxes. Consider what happens when there are ways to avoid a state's cigarette taxes without actually smoking less — for example, traveling across the border to buy cigarettes in a state which has much lower taxes, or in the modern version, going on the Internet and buying apparently tax-free cigarettes. Then the state faces a tricky dilemma. If it tries to raise rates, it's not going to get as much revenue as it otherwise would. And for a lot of people the effective price has not gone up, because it just drives them to the Internet. My research on cigarettes suggests that the elasticity of taxed sales in a given state has gone up over recent years, as these tax-free alternatives, for example through the Internet, have become more widely available. And I say "apparently tax-free" because, although it is quite easy to buy untaxed cigarettes over the Internet, technically, if I did that, I am supposed to remit tax liability to the state where I live. The tax applies depending on where you smoke them, not where you buy them. But everyone knows that almost nobody actually remits these taxes.

On this topic, there is a wonderful piece of work by an economist from the University of Illinois at Chicago named David Merriman who had his students collect discarded cigarette packs all over the Chicago area. You can tell from the stamp on the pack where it was purchased. And sure enough, there are a lot of packs apparently consumed in Chicago that were purchased in Indiana, where the taxes are lower, and the fraction increases as you move closer to the Indiana border.

RF: What does your research tell us about the effects of tax policy on foreign direct investment (FDI)?

Slemrod: My own research and research done by others suggest that a host country's tax policy does have a significant effect on the amount and the type of FDI it attracts. My own research has tried to differentiate two aspects of why a low tax rate may make a country more attractive for FDI. One is that it just lowers the effective tax on income. The other aspect is that, with a low tax rate, once there is activity in the country, most multinationals have the incentive to shift their taxable income into your country. They have many ways of doing this — such as establishing subsidiaries in low-tax countries. From a policy point of view, I feel quite differently about these two aspects. I have no problem with a country levying a low effective tax rate on income to try to attract real investment. I have a bigger problem with a country inviting, even encouraging, multinationals to shift income from higher-tax countries into their country, because to me this is parasitic on the treasuries of these other countries and isn't productive at all from a global point of view. In fact, I think this is welfare reducing because the higher-tax countries expend resources to try to keep the revenue from leaving and the companies expend resources to camouflage the income shifting.

In the news recently is a country that has been quite successful at both of these aspects. For a long time Ireland has had a 12.5 percent corporate tax rate. This means there is a relatively low tax rate on income from investment in Ireland. But I think the bigger issue is that this provides a tremendous incentive for, say, U.S. car companies to build maybe only a single plant in Ireland, and then shift the taxable income earned in their high-tax locations into Ireland and thus lower their worldwide tax burden. That's a "beggar-thy-neighbor" policy of Ireland.

RF: The savings rate is affected by many things but one possible factor that many people may not have considered is the threat of a catastrophic war. What does your research tell us about this question?
Slemrod: Congratulations for waiting about an hour to ask me about one of my quirky papers! I have studied a few issues people seem fascinated by, and this is one of them. I have three articles that try to estimate whether, when people seriously think there's a chance of a nuclear conflagration, this belief affects their saving behavior. In short, do people believe we ought "to eat, drink, and be merry, for tomorrow we die"? To test this hypothesis I looked at aggregate saving over time in the United States, across countries, and at micro data within the United States, and in all three cases found that when people think, or profess to think, there's a chance of a nuclear war, their saving rate goes down, just as economic theory would predict.

RF: You mentioned you have published other supposedly quirky papers that have garnered a lot of attention.

Slemrod: Yes, my co-author, Wojciech Kopczuk of Columbia University, and I actually won an "Ig Nobel" prize from a publication called the Annals of Improbable Research. We won it for a serious economics paper that was eventually published in the Review of Economics and Statistics entitled "Dying to Save Taxes: Evidence from Estate Tax Returns on the Death Elasticity." We looked at estate tax return data from the history of the U.S. estate tax and found that when the estate tax was going to change — go up or down — in an anticipated way, then the distribution of deaths around that date was not symmetric. When the tax rate was going to increase, more people died before the rate rose, and when the tax rate was going to be lowered, people held on and more people died after the decrease. Since we wrote the paper, the general "death elasticity" finding has been replicated using data from episodes in Australia and Sweden when they ended their estate taxes. Those studies found evidence that people delayed their death to save their heirs' money, in some cases, millions and millions of dollars.

We wrote this paper before the 2001 U.S. tax changes, which phased down the estate tax over the subsequent decade and eliminated it completely for 2010, only for it to be reinstated in 2011. So there now are two recent episodes to further investigate our hypothesis. Between 2009 and 2010, some people should have been hanging on to "get" the zero estate tax rate. And now, right now (November 2010), the morbid part of the hypothesis applies, because someone who is going to leave a huge estate — well, they've got four weeks to get on with it estate-tax-free.

RF: Are there papers you have been working on recently that you would like to discuss?

Slemrod: I am working on a paper about the effect of public disclosure of income tax returns. The issue is, what would be the impact if there was public disclosure of income tax liability and taxable income, as there is in several countries today and there was in the United States in the 1920s and again in the 1930s? In Norway, for instance, you can go online and see anybody's taxable income, income tax liability, taxable wealth, and wealth tax liability. The people who think this is a good idea argue that it dampens noncompliance, because if your neighbor sees that you have reported $10,000 of income and has reason to think it should be $100,000, he might provide that information to the tax authority. The ongoing research is, as far as I know, the first empirical study on the impact of disclosure.
It uses data from Japan, which had disclosure from 1949 until 2004 for both individuals and corporations. We examine what happened when disclosure ended, and take advantage of the fact that disclosure was required only for people and corporations with taxable income and tax liability over some threshold. So we look at the distribution of taxable income reports and observe that it tightly fits a Pareto distribution until you get very near the threshold, where there are noticeably fewer reports than would be expected under a Pareto distribution. This is completely consistent with the notion that both individuals and corporations near the threshold are understating their income in order to avoid disclosure. Also, in Japan — this is well-known among accountants, apparently — there were so-called "39 companies," referring to the fact that the threshold for disclosure was 40 million yen. These "39 companies" arranged their affairs so they wouldn't have to publicly disclose their income. That's the first half of the paper: that a nontrivial number of individuals and corporations apparently take actions to avoid disclosure.

In the second part of the paper, we look at whether we can see a disruption in corporations' reporting of taxable income in their financial statements when disclosure ended. Remember, in 2004, everybody could see what your taxable income was, but in 2005 nobody could see, other than Japan's version of the IRS. Did we suddenly see taxable income and tax payments go down, because they didn't feel this pressure of public disclosure? And the answer seems to be that no, we didn't see that. This might be so because in Japan there is a very high degree of conformity between the tax return measure of income and the financial statement measure of income, which means there's already quite a lot of information in the public domain about taxable income for big public companies, so disclosure was not that big a deal — when it ended, there wasn't a big response. But for smaller companies whose income was near the disclosure threshold, which were mostly private companies, the public tax disclosure was the only information out there, so it mattered more for them.

Something else that I have been thinking about lately is what I call "policy notches" — where a very small change in behavior can lead to a large change in tax liability. A good example is fuel economy policy. The "Gas Guzzler" tax is notched. So when the fuel economy of a car (but not a truck or SUV) changes from 16.4 to 16.5 miles per gallon, there's a several-hundred-dollar reduction in the tax, and similar jumps occur throughout the whole range of the tax. The same is true in the Canadian system. Do automobile manufacturers respond to those notches? Do they make sure that the fuel economy measure on which the tax is based is just over the notch to get the lower tax? A former graduate student of mine, Jim Sallee, who's now at the Harris School of Public Policy Studies at Chicago, and I have written a paper called "Car Notches" that reports pretty convincing evidence that car manufacturers are well aware of these notches and are re-engineering their cars to take advantage of them. Unfortunately, a notched policy like this is inefficient because it induces auto manufacturers to spend a lot of effort and resources re-engineering some of the cars that are near a notch to get just barely over it, and it provides no incentive at all to do the same on cars that aren't near a notch. That just isn't an efficient way to encourage fuel economy.
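To make the notch idea concrete, here is a minimal sketch of a stepped, notch-style tax schedule. The bracket cutoffs, dollar amounts, and the notched_tax function are hypothetical and chosen only for illustration; they are not the actual Gas Guzzler schedule discussed in the interview.

```python
# Hypothetical notched fuel-economy tax for illustration only;
# bracket floors (mpg) and tax amounts are made up, not statutory values.
BRACKETS = [  # (minimum mpg needed to qualify for this bracket, tax owed)
    (22.5, 0),
    (21.5, 1000),
    (19.5, 1300),
    (17.5, 1700),
    (16.5, 2200),
    (15.5, 2600),
    (0.0, 3000),
]

def notched_tax(mpg: float) -> int:
    """Return the tax for the first (highest) bracket whose mpg floor the car meets."""
    for floor, tax in BRACKETS:
        if mpg >= floor:
            return tax
    return BRACKETS[-1][1]

# Crossing the 16.5 mpg notch: a 0.1 mpg improvement cuts the tax by $400 here,
# while the same 0.1 mpg gain away from a notch changes nothing.
print(notched_tax(16.4) - notched_tax(16.5))  # 400
print(notched_tax(17.0) - notched_tax(17.1))  # 0
```

Under a schedule like this, the incentive to re-engineer a car is concentrated entirely on models sitting just below a bracket floor, which is the inefficiency the interview describes.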
RF: Which economists have been most influential in shaping your research agenda and your thinking about economic policy issues?

Slemrod: I was incredibly lucky to go to graduate school at Harvard at a time when some of the really great contributors to my field were on the faculty. My advisor was Marty Feldstein, who was teaching one half of the two-semester public finance sequence. Richard Musgrave, who had written probably the most influential book on public finance while he was teaching at Michigan in the late 1950s, about 15 years before I got to graduate school, taught the other half. Marty was a leader in the new public finance, taking seriously rigorous normative models of optimal taxation and applying frontier empirical methods to estimating the impact of taxation on behavior, and he had a tremendous influence on how I think about research. But I was tremendously influenced by Dick Musgrave and his views about the importance of the normative issues that inform economic models. The most stimulating economic intellectual experience I have ever had was the weekly public finance seminar at Harvard, when — metaphorically speaking — Marty would sit in one corner of the room, Dick Musgrave would sit in the other corner, and we graduate students, who at the time included tremendously smart people like Larry Summers and Alan Auerbach, would be in the middle. Marty and Dick would — generally very respectfully, but always forcefully — give their often contrary perspectives on the issues of the day. That experience was very important and formative for me.

My first job as an assistant professor was at the University of Minnesota. I never became a rational expectations guy like many others there, but one thing I respected tremendously about the faculty there was that they took economics very seriously. It wasn't a game. It was important, and it needed to be rigorously based, whether they were talking about theory or empirical work; I hope I picked up some of that seriousness. I also have had long-time colleagues and collaborators who have been incredibly important to me. One is Shlomo Yitzhaki, an Israeli economist who I worked with continually for 20 years on one project or another, and who convinced me that aspects of taxation that were then at the periphery of the standard models, such as avoidance, evasion, and enforcement, were actually central to the economics of taxation. Another big influence on my research career was Roger Gordon, who was my colleague at Michigan and was the reason I came here. The serious and eclectic intellectual environment in the economics department at the University of Michigan, and the great graduate students here, keep me stimulated and motivated. RF

ECONOMIC HISTORY
Virginia and the Final Frontier
Federal space centers and proximity to customers have helped to attract private space firms
BY DAVID A. PRICE

[Photo: Apollo astronauts practiced for lunar missions at NASA Langley's Lunar Landing Research Facility. This 1969 photo of the facility shows its model Lunar Excursion Module, or LEM. The LEM was suspended from cables to enable the astronauts to simulate piloting on the moon. A Langley worker is standing beneath the craft. Photography: NASA Langley Research Center]

On a Saturday morning a little more than a half-century ago, Oct. 24, 1959, some 20,000 visitors swarmed Langley Research Center in Hampton, Va.
The occasion was an open house to give the public a look inside the National Aeronautics and Space Administration, or NASA, which had been hurriedly created by Congress and President Eisenhower in mid-1958 in response to the orbiting of the Russian Sputnik satellite. America's early efforts at chasing Sputnik had not gone well. An attempt by the U.S. Navy to launch a satellite in late 1957, two months after Sputnik, failed when the rocket exploded spectacularly on the launch pad. Now Americans looked expectantly to NASA to put the United States in front of the space race.

At the center of attention inside Langley's open house were spacecraft and rockets: Among them were a model of a German V-2 rocket engine; a rocket known as "Little Joe" that would soon be used to launch a seven-pound rhesus monkey into space and back again; and a mock-up of the Mercury space capsule, which was to carry an American into orbit. At an exhibit on Project Mercury, an engineer told attendees, "The possibility of venturing into space has shifted quite recently from the fantasy of science fiction into the realm of actuality. Today, space flight is considered well within the range of man's capabilities."

The visitors were led to understand that they were in one of the birthplaces of the American manned space program — and they were. Two and a half years later, the Langley-run Mercury program would send John Glenn around the Earth. Langley managed the project, trained Glenn and the program's six other astronauts, carried out aerodynamic and structural tests, and created the ground-tracking system, among other things. Soon afterward, Langley would make crucial contributions to the Apollo program: Its Lunar Landing Research Facility trained the Apollo astronauts in piloting spacecraft near the moon's surface and in moon walking. In the overall design of the Apollo missions, a Langley engineer, John Houbolt, argued as a voice in the wilderness for a plan of lunar-orbit rendezvous — that is, the docking of two spacecraft in lunar orbit after one of them had landed on the surface — and ultimately convinced other NASA centers of its superiority. Langley then built a rendezvous and docking simulator in an airplane hangar to train the astronauts in the technique.

Today, the 788-acre center employs some 3,800 workers in the Hampton Roads area — divided roughly 50-50 between civil service employees and contract employees — who work on projects ranging from wind tunnel tests to next-generation escape systems for saving astronauts in case of launch failure. In recent years, policymakers in Virginia have been seeking to build on the presence of Langley and other federal and private space centers with the aim of expanding the state's space industry — and, with it, the high-paying skilled jobs that the industry brings. To that end, the Virginia General Assembly has enacted legislation to promote the industry's development: the Spaceport Liability and Immunity Act of 2007, which protects space transportation companies from liability for the injury or death of a spacebound passenger if the company has given warning of the risks, and the Zero G Zero Tax Act of 2008, which exempts human space launches and some cargo space launches from state income taxes. While such measures may make the state more attractive for investors, the
history of Langley and other players in the Virginia space sector highlights the powerful role of nonpolicy factors — and sometimes sheer serendipity — in bringing space activities to the state. The stories of a cross-section of Virginia’s space organizations, public and private, suggest that the state does have a number of advantages that have seeded development in the past and that will likely continue doing so, including its proximity to the nation’s capital and its strong public university system.

NASA Lands in Virginia
The months between Sputnik and the creation of NASA saw a melee among defense agencies for control of the space program. (An Air Force publicist invented the term “aerospace” to reinforce the idea that aeronautics and space were inseparable.) Eisenhower opted for a civilian program that would exist in tandem with, and somewhat overlap, Pentagon missile programs. NASA’s facilities would be plucked here and there from the military: the Army’s missile program in Huntsville, Ala., headed by Wernher von Braun, the German missile designer and future director of the Saturn V moon-rocket program; the Army’s Jet Propulsion Laboratory near Pasadena, Calif.; the Navy’s Vanguard rocket program; and part of an Air Force test range at Cape Canaveral, Fla.

NASA also inherited five centers from its predecessor, an aircraft research agency known as the National Advisory Committee for Aeronautics (NACA). As it happened, two of those centers were in Virginia: One was Langley, and the other was Wallops Flight Facility, located on Virginia’s Delmarva Peninsula and nearby Wallops Island. (Congress established another NASA center in the Fifth District, Goddard Space Flight Center in Beltsville, Md., a year after NASA’s founding.)

Langley had been founded in 1920 as the first civil aeronautical research agency in the United States. Its site in Hampton, originally farmland, was chosen on the basis that it was a reasonable distance from Washington and yet was isolated enough for safe and secure flight testing. “Since then, we’ve been involved in advancing the science of flight,” Langley director Lesa Roe says. “Literally every aircraft today contains some technology that we developed at Langley.”

Roe hopes that Langley’s capabilities will find customers in the private space industry. “We have met with the SpaceX folks [the launch vehicle and spacecraft company run by PayPal co-founder Elon Musk] and others,” she says. “If they need to use our facilities, we can put agreements in place so they can come and test in our wind tunnels or have access to our expertise in materials or aero sciences or systems analysis. We’re eager and willing to do that.”

Wallops Flight Facility was built later than Langley, in 1944, as the Pilotless Aircraft Research Station; NACA staffed it with Langley employees to conduct research for the war effort. Rocketry was part of its research from the outset — to aid its aircraft work. “In the early years, under NACA, it made sense to try out a bunch of different aerodynamic shapes,” says Wallops director William Wrobel. “They didn’t have wind tunnels. To get the high speeds, they put these shapes on rockets.” Wallops today consists of an airfield for aircraft-related research on the peninsula and six launch pads and related facilities on the island. The center launches 15 to 20 rockets per year, ranging from targets for the Navy to suborbital science projects and satellites, and has 1,100 full-time employees.
Located alongside the NASA launch facility is the MidAtlantic Regional Spaceport, or MARS, operated by Virginia and Maryland in an effort to bring space business to the region. On land leased from NASA, it offers two launch pads to commercial customers. It is one of seven nonfederal spaceports that the Federal Aviation Administration has licensed. Its biggest deal so far is an agreement with Orbital Sciences Corp., based in Dulles, Va., for eight launches of Orbital’s Taurus II launch vehicle between 2011 and 2015 to carry supplies to the International Space Station. A Private Space Sector Thrives One of the biggest private-sector players in Virginia’s space industry started as an Arizona company. Motorola believed in the 1990s that there was a mass market for premiumpriced satellite phone service with worldwide coverage. It started Iridium to serve this market and, with other investors, put $5 billion into it. Iridium orbited 66 communications satellites (plus in-orbit spares) to service the hundreds of thousands of customers that it assumed would beat a path to its door. Iridium opened for business in late 1998 — and filed for bankruptcy less than a year later. Its satellite system worked splendidly, but it never had more than a small fraction of the number of subscribers it needed to break even. In late 2000, as Iridium was preparing to shut down and de-orbit its satellites, a group of private investors scooped up Iridium for a comparative pittance, $25 million, with the intention of offering the service to specialized markets. They had encouragement from the U.S. government, which, for strategic reasons, didn’t want to see Iridium fail. The new owners quickly moved the headquarters to Arlington, Va. (since moved to Tysons Corner) to be closer to Iridium’s most important customer, Uncle Sam. In addition, the company had always maintained a significant part of its operations in Leesburg, namely, its Satellite Network Operations Center, operated under contract by Boeing to control the satellites. In the past decade, Iridium’s customer base has shifted more to private customers. “Although [the Department of Defense] is still our single largest customer, representing 23 percent of revenues, commercial is now the larger part of our business and is growing at a faster rate,” says Don Thoma, Iridium’s executive vice president of marketing. One thing, however, has been a constant in Iridium’s strategy: It still seeks niches, not the mass market. “Iridium fills a need for customers who operate in locations or Region Focus | First Quarter | 2011 35 situations where there really are no reliable communications alternatives,” Thoma says. “Iridium complements cell phones by providing voice and data communications to the rest of the globe, where cell towers can’t reach. We offer a reliable option, for example, for ships in the middle of the ocean, planes flying over the North Pole, and first responders in the middle of a natural disaster.” Iridium, now a profitable public company, derives a competitive advantage from its large satellite fleet (or “constellation,” to use the industry’s term). The approach of Iridium’s main competitors is to provide coverage of the Earth with just a few satellites placed in high orbit, around 22,000 miles up — known as geostationary orbit, because each satellite stays in a fixed position in relation to points on the ground. Iridium, in contrast, keeps its satellites in low Earth orbit, roughly 483 miles up, thus requiring the large constellation. 
Iridium’s way is more costly, but the shorter distance eliminates the transmission delay that comes with geostationary satellites and hinders phone communications. The shorter distance also means Iridium’s phones can have smaller, less bulky antennas. In satellite phones, as in other electronics, customers like things small. Another space-industry company that values Virginia’s proximity to the federal government is Dulles-based GeoEye, which owns and operates three Earth-imaging satellites, and sells high-resolution images to the National Geospatial-Intelligence Agency (NGA), as well as to private customers. NGA, in turn, disseminates the imagery to U.S. intelligence agencies and military services. “It’s nice that we can drive to the agencies, to Capitol Hill, to the Pentagon,” says Uyen Dinh, senior director for government affairs at GeoEye. The company is in Dulles because it started as a division of Orbital Sciences there before being spun off in 1997. Now the company has more than 230 employees in Virginia and $270 million in 2009 revenue. Two-thirds of its revenues come from the U.S. government. Another customer is Google, which uses images from GeoEye, as well as other providers, for display in Google Earth and Google Maps. The government allows GeoEye to sell imagery to private customers such as Google at halfmeter resolution, while the images that GeoEye delivers to the government are much more detailed. The company is positioning itself to sell sophisticated analysis of Earth images in addition to the images themselves. Part of that strategy is GeoEye’s acquisition in December of McLean-based SPADAC, which uses human analysts and software to perform “predictive analytics” of geographic images for terrorist attacks or other threats. “As a market begins to mature, strategically you want to move further up the value chain and offer more comprehensive services to your clients,” says Chris Tully, the firm’s senior vice president of sales. “We don’t want to be simply a pixel provider.” For affluent customers who prefer to look at Earth from space for themselves, there is Vienna, Va.-based Space Adventures, Ltd., which charges $50 million per passenger for orbital missions of 10 to 12 days in a Soyuz spacecraft and the International Space Station. Since 2001, seven clients have flown on eight orbital missions (one client, exMicrosoft executive Charles Simonyi, has gone up twice). The company’s latest offering is a trip around the moon on a Soyuz spacecraft. One of the two available seats has already been reserved. Ticket price: $150 million. Space Adventures is based in Virginia because its founder, a Coloradoan, Eric Anderson, had studied aerospace engineering as an undergraduate at the University of Virginia and decided to stay in the state. He had dreamed of being an astronaut until he learned during a summer internship at NASA that his eyesight would disqualify him from the agency’s astronaut corps. So he shifted his dream to starting a business that would put private citizens in space. Two years after graduation, Anderson founded Space Adventures in his Arlington townhouse in 1998. He had no way to put people into space, so he offered terrestrial adventures in astronaut training — zero-gravity flights, high-gravity training in a centrifuge — and flights to the edge of space in a Russian MiG-25 fighter jet. 
By 2001, he had managed to engineer an agreement with the Russian Federal Space Agency and Rocket and Space Corporation Energia, a Russian manufacturer of spacecraft components, for the orbital flights. Of course, the market for the Soyuz missions is limited because the ticket price is so high. “We estimate that for orbital space flights, there’s probably less than 10,000 people in the world who could afford them,” says Space Adventures president Tom Shelley. “But within that very narrow market, there’s a pretty strong interest. We have a large number of people in our pipeline who’ve stated they want to do this. The biggest limiter for them is time because it takes a good deal of commitment of time.” (Founder Anderson is now chairman of the company.) The exploitation of space in Virginia has come far since the Saturday-morning preview at Langley in 1959. Economic incentives will play a role in the industry’s development. Yet if history is any indication, an even greater role will be played by proximity to sophisticated customers and a pool of highly skilled engineers to act as employees and entrepreneurs. RF READINGS Hansen, James R. Spaceflight Revolution: NASA Langley Research Center From Sputnik to Apollo. Washington, D.C.: National Aeronautics and Space Administration, 1995. 36 Region Focus | First Quarter | 2011 McDougall, Walter A. … The Heavens and the Earth: A Political History of the Space Age. New York: Basic Books, 1985. AROUNDTHEFED The Decision to Export BY C H A R L E S G E R E N A “Understanding Exports From the Plant Up.” George Alessandria and Horag Choi, Federal Reserve Bank of Philadelphia Business Review, Fourth Quarter 2010, pp. 1-11. elling to foreign markets isn’t easy, but it often can boost local job growth along with an exporter’s sales figures. That’s why states send their governors on trade missions and offer various incentives to encourage exporting. But does exporting itself beget success, or are successful companies in a better position to become exporters? George Alessandria at the Philadelphia Fed and Horag Choi at Monash University address this question in a recent paper. Using data from manufacturers in the United States, Canada, and Chile, the economists determined the distinguishing features of exporters — they are bigger, more productive, and more profitable than nonexporters. Then they modeled the decision of a plant to export or sell domestically to explain these characteristics. Their conclusion was that the process of exporting does not necessarily transform less productive firms into superstars. “Our simple model shows that causation may run from superstar to exporting,” notes Alessandria and Choi. “Indeed, future exporters tend to be more productive and to grow faster even before they enter export markets.” S “The Effect of Falling Home Prices on Small Business Borrowing.” Mark E. Schweitzer and Scott A. Shane, Federal Reserve Bank of Cleveland Economic Commentary 2010-18, December 2010. t has been widely reported that the housing market’s downturn has sharply affected residential construction. A recent analysis by the Cleveland Fed suggests that reduced lending to small businesses should not be overlooked in the process — and that, in fact, the two issues are in some ways connected. One reason for reduced lending is that many business owners have seen the equity in their homes dry up. 
At focus groups convened by the Federal Reserve last summer, businesses owners reported that the reduced value of their homes made it difficult to provide collateral for loans. “Other participants said that the reduced value of homes has made home equity borrowing as a source of business capital more difficult to come by, also contributing to the difficulty many small businesses face in obtaining sufficient capital to finance their operations,” notes Mark Schweitzer, the Cleveland Fed’s director of research, and Scott Shane, a professor at Case Western Reserve University. To support this anecdotal evidence, Schweitzer and Shane analyzed survey data. They found that in 2007 I between one-fifth and one-quarter of business owners had obtained a loan against the equity in their homes or used their primary residences as collateral for business purposes. They also found that the use of home equity lines rapidly expanded during the last decade as home prices increased. And when prices fell, home equity borrowing declined sharply. However, it isn’t clear how much of these changes were due to other factors such as changes in the availability of home equity lines. Finally, as the authors point out, not all small businesses owners are equally affected by declines in home prices. Those more likely to leverage their residences include “companies in the real estate and construction industries, those located in the states with the largest increases in home prices during the boom, younger and smaller businesses, companies with lesser financial prospects, and those not planning to borrow from banks.” Still, there is enough of a correlation between home values and small businesses’ access to capital to inform policymakers. “Returning small business owners to prerecession levels of credit access will require an increase in home prices or a weaning of small business owners from the use of home equity as a source of financing,” writes the report’s authors. “How Do Sudden Large Losses in Wealth Affect Labor Force Participation?” Eric French and David Benson, Federal Reserve Bank of Chicago, Chicago Fed Letter 282, January 2011. nother frequently reported effect of the housing market downturn is that workers are delaying retirement to replace large and sudden losses in household wealth. Chicago Fed economists Eric French and David Benson decided to find out whether declines in home values (as well as drops in some stock prices) have significantly affected the U.S. labor market. “On the surface, labor force participation statistics for older individuals seem consistent with anecdotes about delayed retirements,” note French and Benson. While labor force participation for most age groups has been falling, it has been rising for those aged 55 to 64. But this upward trend dates back to the early 1990s. So, other factors, such as longer life spans and changes in pensions and Social Security rules, may have encouraged delayed retirements as well. French and Benson estimated the wealth losses of older workers approaching retirement and plugged those data into an equation that relates changes in wealth to changes in the labor supply. The result: The labor force participation rate for workers aged 51 to 65 would be 2.9 percentage points lower if asset prices hadn’t declined between 2006 and 2010. 
RF A Region Focus | First Quarter | 2011 37 ECONOMIC THOUGHT • continued from page 16 don’t know, we’re dealing with unfamiliar territory here.’ ” Unfortunately, he says, it would take a discipline-wide commitment to turn that around. The AEA recently considered adopting a code of ethics to induce economists to disclose any paid consultancies that could potentially sway their research conclusions. A better move, Colander says, would be for economists to have a culture that discourages people from purporting undue certainty in their predictions and explanations. If there’s a bottom line to recent criticisms of what economists study, Whaples says, it is that the fundamental dispute dates back at least a century. The consensus vacillates between those who say markets don’t work well and that we need to put regulations on them, and those who point out the unintended side effects of government intervention and the fact that smart people will exploit regulations. “That basic argument goes back and forth, around in a circle, forever,” Whaples says. “When we haven’t had any crises for a while, the ‘markets work’ group will get stronger. And when we have a crisis the ‘markets don’t work so well’ group will get stronger.” Nobody can say which is right, he says; there are valid points to be made on both sides. “But there’s always going to be that middle ground. The problem is, it’s kind of wide.” The crisis may have helped narrow the question some: In what situations do markets work, and how does policy affect how markets function? Economics is about the journey, not the destination; economists will never be “done” understanding the economy and human behavior. But the constant drive toward better understanding can only be a good thing for future economic thought. RF READINGS Bernanke, Ben S. “Implications of the Financial Crisis for Economics.” Speech at the Conference co-sponsored by the Center for Economic Policy Studies and the Bendheim Center for Finance, Princeton University, Sept. 24, 2010. Blanchard, Olivier. “The State of Macro.” NBER Working Paper No. 14259, August 2008. Blaug, Mark. Economic Theory in Retrospect, Fifth Edition. Cambridge: Cambridge University Press, 1997. Luzzetti, Matthew, and Lee Ohanian. “The General Theory of Employment, Interest, and Money after 75 Years: The Importance WO R K F O RC E • of Being in the Right Place at the Right Time.” NBER Working Paper No. 16631, December 2010. Putterman, Louis. Dollars & Change: Economics in Context. New Haven: Yale University Press, 2001. Rajan, Raghurham. “Why Did Economists Not Spot the Crisis?” Feb. 5, 2011 post on the University of Chicago Booth School of Business’ “Fault Lines” blog. Uhlig, Harald. “Economics and Reality.” NBER Working Paper No. 16416, September 2010. continued from page 23 new normal. “I don’t know — I’m waiting to see,” he says. Temp work is an important part of the flexibility that is one of the U.S. economy’s great strengths. “In the long run, this flexibility helps make our country more competitive, it increases living standards, it lowers prices for goods,” Groshen says. “But in the short run, there can be high costs to the workers involved — the costs are very concentrated, while the benefits are diffuse.” RF READINGS Autor, David, and Susan Houseman. “Do Temporary-Help Jobs Improve Labor Market Outcomes for Low-Skilled Workers? Evidence from ‘Work First.’ ” American Economic Journal: Applied Economics, July 2010, vol. 2, issue 3, pp. 96-128. Peck, Jamie, and Nik Theodore. 
“Flexible Recession: The Temporary Staffing Industry and Mediated Work in the United States.” Cambridge Journal of Economics, March 2007, vol. 31, issue 2, pp. 171-192. Groshen, Erica, and Simon Potter. “Has Structural Change Contributed to a Jobless Recovery?” Federal Reserve Bank of New York Current Issues in Economics and Finance, August 2003, vol. 9, no. 8. Otoo, Maria Ward. “Temporary Employment and the Natural Rateof Unemployment.” July 1999, Board of Governors of the Federal Reserve Working Paper, no. 99-66. Katz, Lawrence, and Alan Krueger. “The High-Pressure U.S. Labor Market of the 1990s.” Brookings Papers on Economic Activity, 1999, vol. 30, no. 1, pp. 1-88. 38 Region Focus | First Quarter | 2011 BOOKREVIEW Riches From Respect BOURGEOIS DIGNITY: WHY ECONOMICS CAN’T EXPLAIN THE MODERN WORLD CHICAGO: UNIVERSITY OF CHICAGO PRESS, 2010, 450 PAGES BY DEIRDRE N. McCLOSKEY REVIEWED BY DAVID A. PRICE wo centuries ago the world’s economy stood at the present level of Bangladesh,” observes Deirdre McCloskey at the outset of Bourgeois Dignity. McCloskey, an economist at the University of Illinois at Chicago, who holds appointments in the university’s history, English, and communication departments, seeks in her latest book to explain the unprecedented worldwide, long-term economic ascent that began in Holland in the 1600s and in Britain in the 1700s, bringing — by her estimate — at least a sixteenfold increase in real income per person during that time. In doing so, she replaces traditional explanations for this growth with one based on a change in rhetoric and attitudes, which she calls the Bourgeois Revaluation: a reappraisal of the status of bourgeois commercial activities such as trading and inventing. Starting in Holland and then in Britain, she argues, people throughout society, including within the aristocracy, no longer sneered at these activities — no longer saw them as vulgar — but instead saw them, and the bourgeoisie that carried them out, as having merit. The bourgeoisie had long had some degree of liberty: Now it had dignity. From dignity to economic growth, the transmission belt implied by McCloskey’s story is that talented yeomen who would have otherwise pursued traditional occupations such as farming or soldiering were drawn instead to the newly respected pursuits of trade and industrial innovation. Gentlemen and aristocrats were perhaps drawn to organizing ventures and investing. McCloskey’s thesis is intuitively appealing. In our own time, it is reasonably obvious that social prestige is commonly a factor in occupational choice and employer choice. Why not in the time of the Industrial Revolution too? The conversational narrative style of Bourgeois Dignity is appealing, as well. At times, it feels as if she is writing for a favorite niece. (“I wish you would pay attention,” she playfully chides the reader at one point.) Along the way, there are quick digressions on such varied subjects as the persecution of British mathematician Alan Turing under antigay laws, the animated film Ratatouille, and space telescopes. But setting out an attractive and stylishly told thesis is one thing; proving it is another. Here is where the book becomes frustrating. To be sure, hers is inherently a difficult T thesis to support using the conventional tools of economics: It is challenging to find reliable time series for ordinary economic aggregates going back 400 years, let alone proxies for intangibles like dignity. 
Her approach in this book is negative, considering and rejecting a series of alternative explanations for modern economic growth. If none of these adequately accounts for the sixteenfold-plus increase, she holds, that failure supports her theory as the residual. Among the explanations she finds lacking are foreign conquest and imperialism, foreign trade, science (as distinct from commercial innovation), savings, a rise in greed, economies of scale, natural resources, and railroads, canals, and improved roads. For McCloskey, none of these could have had more than a small part in the growth; each had too small an effect, started too early to explain the rising tide, or occurred in too many other places without a corresponding effect on growth. Responding to institutional theorists, such as Douglass North, she agrees that property rights and the rule of law were necessary for growth, but argues that they evidently were not sufficient, since both of these predate the period when growth started in Holland and Britain. “And what then of secure Italian or for that matter Byzantine or Islamic or Chinese property rights?” she asks. McCloskey’s approach seems unsatisfying in some respects, however. First, even if none of the traditional factors fully accounts for the growth, what about the interaction of them? She gives too little consideration to this possibility. Second, her treatments of some of the traditional explanations are somewhat cursory and derisive. Rightly or wrongly, she gives the impression that she has not presented those theories in their strongest form before attempting to knock them down. Her positive argument for her theory is set out briefly here in about 35 pages. (She promises that a follow-up volume, The Bourgeois Revaluation: How Innovation Became Virtuous, 1600-1848, will make the case in more detail.) It is primarily based on canvassing rhetorical sources of the period and showing the use of pro-bourgeois rhetoric. Yet it is hard to make the case based on rhetoric alone: The rhetoric of the period, as she is careful to note, is divided on the subject. Moreover, her rhetorical methodology does not here meet the standard to which she holds the theories she criticizes. “The assertion is without quantitative oomph,” she says of one opposing argument, “ and is not science, until it is actually measured.” Then, too, the causation could run in the other direction: Rising prosperity might have led to a rise in pro-bourgeois rhetoric. Her next volume undoubtedly will set out a more comprehensive case for her theory of bourgeois dignity in economic growth. RF Region Focus | First Quarter | 2011 39 DISTRICTDIGEST Economic Trends Across the Region Long-Term Industry and Occupation Outlook in the Fifth District BY A N N M AC H E R A S he Bureau of Labor Statistics (BLS) publishes projections biennially for the various occupations that produce all of the goods and services in our economy — from the bricklayer to the computer programmer. The occupational employment projections offer a view into expected changes in the number of people employed in each profession over a 10-year time horizon. (There are approximately 750 occupations in the classification system used by BLS and other federal agencies, known as the Standard Occupational Classification system, or SOC.) The current projection period covers 2008 to 2018. 
In contrast to forecasts of near-term economic activity, the long-term projections for growth in occupations convey valuable insight on growth or decline in occupations over a period of time that is sufficient to allow for planning and strategic decisionmaking. The development of long-term employment projections dates back nearly 60 years, to shortly after the end of World War II, and their original purpose was to provide career information for veterans returning to civilian life. Today, occupational employment projections provide valuable information that serves an even broader set of customers in three important areas: career advice and planning, curriculum planning for education and training institutions, and alignment of economic development planning with the workforce. Employment changes across occupations matter a great deal to guidance counselors in middle schools and high schools as they offer career advice to help students match their interests to potential opportunities, while also considering the availability of future job openings. Likewise, postsecondary institutions, whether four-year colleges and universities, community colleges, or technical institutes, use the projections as a valuable component to plan for the appropriate courses and majors or technical training that will serve the needs of their students over time. Finally, economic development agencies use the projections to set realistic targets for the types of industries they want to attract and promote in their region, so that the workforce needed by companies will match the availability of specific occupations. T How the Projections Are Derived The methodology and the timing of the projections have evolved over the years to the current two-year cycle. Development of the national projections involves several related steps, starting with estimates of the total labor force for the projection year. Census estimates of the size and demographic composition of the population are combined 40 Region Focus | First Quarter | 2011 with projected labor force participation rates to obtain an estimate of the labor force in 2018. In turn, the labor force projections feed a model of the U.S. economy to derive estimates of growth in the aggregate economy. Growth in the economy stems from demand for goods and services on the part of households, businesses, the government, and other countries. Together, these sources of demand, referred to as “final demand,” generate growth across many industry sectors within the economy. While some industry sectors expand output in direct response to final demand, others grow because they supply inputs to expanding industries. The resulting projections of industry output imply a level of industry employment, taking into account other factors, such as expectations of productivity growth. Finally, detailed occupational employment for each industry is estimated by applying an industry-occupation matrix, often called a staffing matrix, to the projected industry employment. The staffing matrix assigns industry employment to all of the occupations that are used in a particular industry; it is based on the BLS’ Occupation Employment Statistics (OES) survey, which collects data from employers on a triennial cycle. Throughout the entire estimation process, a number of assumptions are made regarding the path of the economy over the 10-year horizon, the effect of demographic changes on the rate of labor force participation, and technological advances in production. 
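The staffing-matrix step lends itself to a brief illustration. The sketch below is not BLS code; the industries, occupations, and employment shares are invented solely to show the mechanics described above — projected employment in each industry is spread across the occupations that industry uses, and each occupation's total is then summed across industries.

```python
# Illustrative sketch only (not the BLS model): applying a staffing matrix to
# projected industry employment. All numbers below are hypothetical.

projected_industry_employment = {          # projected jobs by industry
    "ambulatory_health_care": 120_000,
    "wired_telecom": 8_000,
}

# Staffing matrix: for each industry, the share of its jobs in each occupation.
staffing_matrix = {
    "ambulatory_health_care": {"registered_nurses": 0.18, "office_clerks": 0.07},
    "wired_telecom": {"computer_systems_analysts": 0.05, "office_clerks": 0.10},
}

occupational_employment = {}
for industry, total_jobs in projected_industry_employment.items():
    for occupation, share in staffing_matrix[industry].items():
        # Sum each occupation's jobs across all industries that employ it.
        occupational_employment[occupation] = (
            occupational_employment.get(occupation, 0) + total_jobs * share
        )

print(occupational_employment)
# {'registered_nurses': 21600.0, 'office_clerks': 9200.0, 'computer_systems_analysts': 400.0}
```

In the actual projections, the shares come from the Occupational Employment Statistics survey, and the matrix spans hundreds of industries and roughly 750 occupations rather than the toy handful shown here.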
To be sure, output by industry can move at a different pace and even in the opposite direction from employment by industry, reflecting the type of technological progress that allows for more output to be produced with fewer workers. This is an important distinction because growth, as measured by greater output, may not create additional job opportunities in every case. For this reason, careful estimates of future industrial production and employment, based on appropriate assumptions regarding technological developments and expected productivity growth, form the foundation for accurate projections of employment by occupation.

The information gained from the occupational employment projections differs in a meaningful way from the projections by industry. While industry-level projections tell us what firms will produce, the occupation-level projections tell us how labor will be combined to do the work. In addition, the occupational projections summarize the net employment change for a particular occupation across various industries. For example, we know the demand for computer systems analysts will grow, but the projections also tell us that the management, scientific, and technical consulting services industry will increase the number of analysts they employ, while wired telecommunications carriers will reduce their use of computer systems analysts by 2018. This type of information allows a job seeker to tailor his job search to a particular industry where chances of success are higher.

State workforce agencies and labor market information departments use the national projections from the BLS to prepare their own state and local area industry and occupational employment projections. Clearly, the more geographically focused projections provide great value to the customers of the information — students of all ages, guidance and career counselors, postsecondary education institutions, and economic developers — who are all involved in planning at a more local level. To the extent that the industry mix of the state or region differs from the industry composition of the nation, the occupational employment projections will reveal different trends in demand for occupations at the local level. So, what do the industry and occupational employment projections reveal for the states in the Fifth District?

Industry-Level Employment Projections
Most of the states in the Fifth District publish their long-term projections for employment by industry at the same time that they publish the occupational employment projections. For the top 10 fastest-growing industries in Maryland, West Virginia, Virginia, and South Carolina, some common trends emerged. Health care and social assistance industries are expected to show the fastest rates of job growth from 2008-2018 for the Fifth District states and for the nation. These industries include ambulatory health care services, hospitals, nursing and residential care facilities, and social assistance. (Industry-level projections for 2008-2018 were not available from North Carolina and the District of Columbia at press time.)

Virginia projected the fastest job growth in ambulatory health care services, nursing and residential care facilities, and social assistance, while South Carolina registered the fastest growth expected for hospitals. In addition to their high percentage growth rates, many of the health care-related industries will also provide the largest gain in the absolute number of jobs over the 10-year period. Other high-growth industries common to these four states include data centers and information technology, as well as professional, scientific, and technical services. In contrast to the other states, West Virginia’s projected fastest-growing industry over this period is the construction of buildings.

Overall employment is expected to grow in the Fifth District states for which we have data, but only Virginia’s is expected to grow at an average annual rate that exceeds the national growth rate of 1 percent.

Projections for Occupational Employment
The long-term industry employment projections are interesting in themselves, but also are critical as input to the projections of occupational employment. Shifts in the industrial structure of the economy translate into changes in demand for many occupations and, over time, even the emergence of new occupations. The occupational employment projections reveal how people are employed in the base year and how the composition of employment will change over the 10-year horizon.

For the nation as a whole, professional and service occupations already accounted for 40 percent of occupational employment in 2008 (the base year). By 2018, the share of these broad occupation groups is expected to increase to 43 percent. On the other hand, sales and office and administrative support occupations, which together accounted for more than 26 percent of employment in 2008, will likely make up a smaller share of employment in 2018, due to the application of technology that reduces the number of sales personnel and office clerks required to support a business. Likewise, the share of production workers is expected to decline, from 6.7 percent in 2008 to 5.9 percent in 2018, as manufacturing continues to implement technology that changes the quantity and the mix of workers.

Across the Fifth District, there is some variation in the projected composition of employment by occupation in 2018. Maryland and Virginia’s employment in professional and related occupations will exceed the national average, with shares of 24.4 percent and 23.9 percent, respectively, versus 21.8 percent for the nation. South Carolina expects a higher share of production workers relative to the other Fifth District states and the nation, but it will also have a higher share of service and sales-related occupations (see the accompanying table).

[Table: Projected Occupational Employment Composition in 2018 — Fifth District States (percent of employment)
Major Occupation Groups: MD / VA / NC / SC / Total / U.S.
Transportation and material moving: 5.1 / 6.3 / 5.9 / 6.7 / 5.4 / 6.1
Production: 8.8 / 7.3 / 3.1 / 4.4 / 5.8 / 5.9
Installation, maintenance, and repair: 4.6 / 4.0 / 4.0 / 3.7 / 3.9 / 3.8
Construction and extraction: 5.7 / 5.4 / 5.1 / 6.0 / 5.6 / 5.3
Farming, fishing, and forestry: 0.8 / 0.2 / 0.9 / 0.5 / 0.4 / 0.6
Office and administrative support: 14.4 / 14.9 / 14.3 / 14.5 / 14.8 / 15.6
Sales and related: 10.7 / 10.4 / 10.5 / 10.1 / 10.9 / 10.2
Service: 20.7 / 20.3 / 21.2 / 20.1 / 19.4 / 20.2
Professional and related: 22.0 / 18.8 / 23.9 / 20.1 / 24.4 / 21.8
Management, business, and financial: 10.9 / 10.3 / 12.3 / 11.7 / 8.5 / 10.5
NOTE: State total does not include all Fifth District jurisdictions. At press time, the District of Columbia has not yet published 2008-2018 statistics and West Virginia did not publish group-level data.
SOURCES: Bureau of Labor Statistics, Individual State Labor Market Information Offices]

Changing demand for occupations is best explored by a closer look at some of the 750 occupations for which long-term projections are available. It is helpful to consider the rate of change in employment by occupation, as well as the change in the number of jobs by occupation. The fastest-growing occupations do not necessarily create the greatest number of jobs, although their growth is important in terms of the needs of particular industries and the educational programs that generate the pipeline of future workers. Some of the fastest-growing occupations are also the most highly compensated. In contrast, some of the greatest job gains will be created in occupations with a modest growth rate (due to the high number of workers in those occupations), such as cashiers and food preparation, which will grow fairly steadily to match population growth, or registered nurses, whose numbers will grow to serve the rising share of older age groups within the population.

Within the Fifth District, biomedical engineers was the fastest-growing occupation, ranking first for every state, with an average annual growth of 5.9 percent expected between 2008 and 2018 in the District, compared to growth of 5.6 percent nationally. (See the accompanying chart.) Together, Virginia, Maryland, and North Carolina account for 95 percent of the demand for biomedical engineers in the Fifth District in 2018, although every District state registers a growth rate of at least 5 percent. Moreover, the median salary for biomedical engineers in the United States was $77,400 in 2008, making this a highly compensated occupation. Advances in technology and innovations in medicine will drive the high growth for biomedical engineers over the coming decade.

[Chart: Fastest-Growing Occupations — Fifth District States (average annual growth rate in percent, 2008-2018): Biomedical engineers 5.9; Network systems and data communications analysts 4.6; Mathematical scientists, all other 4.6; Personal and home care aides 4.1; Home health aides 3.9; Medical scientists, except epidemiologists 3.6; Veterinary technologists and technicians 3.3; Survey researchers 3.3; Biochemists and biophysicists 3.2; Financial examiners 3.2. NOTE: Does not include Washington, D.C. SOURCES: Bureau of Labor Statistics, Individual State Labor Market Information Offices]

Other occupations with high rates of expected growth in the Fifth District include network systems and data communications analysts, mathematical scientists, personal and home care aides, and home health aides, rounding out the top five growth occupations. The fastest-growing occupations derive directly from the fast-growing industries projected for the Fifth District, where health care, data processing, and professional and technical services topped the list.

[Chart: Occupations with Largest Gain — Fifth District States (employment change, 2008-2018): Registered nurses 61,241; Combined food preparation and serving workers, including fast food 52,555; Home health aides 50,781; Retail salespersons 38,090; Customer service representatives 37,952; Office clerks, general 36,911; Accountants and auditors 26,314; Nursing aides, orderlies, and attendants 25,468; Management analysts 25,073; Personal and home care aides 23,581. NOTE: Does not include Washington, D.C. SOURCES: Bureau of Labor Statistics, Individual State Labor Market Information Offices]
By comparison, the largest absolute gain in employment by occupation is in registered nurses, an occupation that is projected to grow by 61,241 in the Fifth District, far outnumbering other occupations in terms of total jobs created. (See adjacent chart.) Although employment of registered nurses is expected to grow in percentage terms at only a moderate rate of 2.2 percent, the growth in the number of registered nurses implies a greater need for education and training programs in Fifth District community colleges and four-year colleges and universities. Registered nurses also earn relatively high salaries, with the national median of $62,450 in 2008. Clearly, for individuals with an interest in nursing, the opportunities are abundant and the wage compensation is significant, especially compared to other occupations that require a similar education background. In general, however, many of the occupations with the greatest gains in employment in the Fifth District are not at the higher end of the compensation scale. Other occupations that stand to gain large numbers from 2008 to 2018 include food preparers, home health aides, retail sales persons, and customer service representatives. Many occupations will experience an outright decline in the number employed and perhaps a sharp contraction in their rate of growth. In the Fifth District, these occupations are employed primarily in industries that have been experiencing structural decline over the past few decades. The textile and apparel industries, as well as furniture manufacturing, have been particularly pressured by foreign competition, but also by labor-saving technological progress. Occupations such as sewing machine operators and other operators of textile-related machinery will experience a continued decline in employment from 2008 to 2018. Most of the declining occupations are concentrated in the production group, both within the Fifth District and nationally. At a local level, communities struggle to provide employment opportunities to individuals who have lost their jobs through these structural changes. Workforce development agencies can use the occupational employment projections to help steer displaced workers in perhaps a more appropriate direction for retraining in occupations that have solid growth prospects. Education and Training Requirements As an important component of the occupational employment projections, the BLS assigns an education or training category to each occupation to indicate the most significant source of postsecondary education or training among workers who have become fully qualified in that occupation. The education-related categories include first professional degree, doctoral degree, master’s degree, bachelor’s degree plus work experience, bachelor’s degree, associate degree, and postsecondary vocational award. The postsecondary vocational award refers to certificates or awards that can be earned in as long as a year or as short as a few weeks. The other education categories match the definitions used in the Census Bureau’s educational attainment data or other sources of education statistics. The work-related training categories include work in a related occupation, which mainly applies to supervisors or managers, and on-the-job training, which varies in length and accompanying instruction. Short-term on-the-job training applies to occupations in which the skills needed to be fully qualified can be acquired during one month or less of on-the-job experience and a short demonstration of job duties. 
Moderate-term on-the-job training involves a period of one to 12 months of on-the-job experience combined with informal training to be considered fully qualified in the occupation. Finally, long-term on-the-job training requires more than 12 months of on-the-job experience and formal classroom instruction and may take the form of formal or informal apprenticeships that last several years.

Employment in occupations that involve some level of postsecondary award or degree made up about a third of national employment in 2008, but higher education will become increasingly important as nearly half of all new jobs expected to be created from 2008 to 2018 fall in this category. The same trend holds true for the Fifth District, where education at the level of a postsecondary award or degree will account for a third of expected employment in 2018, but nearly half of the growth in employment over the 10-year period (see the following charts). Indeed, a bachelor’s degree will be the most significant source of education or training for 22 percent of the new jobs created from 2008 to 2018. Most of the fastest-growing jobs in the Fifth District require at least a bachelor’s degree and include such occupations as biomedical engineers, network systems and data communications analysts, and financial examiners. The occupation predicted to grow the most in absolute terms in the Fifth District, registered nurses, requires at least an associate’s degree to be fully qualified. Nonetheless, there will still be many jobs that require only short-term on-the-job training, as this category will account for 36 percent of projected employment in the Fifth District in 2018 and 30 percent of employment growth from 2008 to 2018.

As noted earlier, jobs requiring a higher level of education and training earn a higher median wage. Nationally, jobs requiring a bachelor’s degree paid a median wage of $57,770 in 2008, while jobs that required only short-term on-the-job training paid $21,320. Several occupations predicted to gain in great numbers in the Fifth District fall in the category of short-term on-the-job training, including food preparers and servers, home health aides, and retail salespersons.

[Chart: Fifth District — Education and Training Requirements for Employment Change, 2008-2018: Short-term on-the-job training 29.5%; Bachelor’s degree 22.3%; Moderate-term on-the-job training 10.4%; Associate degree 7.8%; Postsecondary vocational award 7.5%; Work experience in a related occupation 7.5%; Long-term on-the-job training 4.6%; Bachelor’s or higher degree, plus work experience 4.2%; Master’s degree 2.8%; First professional degree 1.7%; Doctoral degree 1.5%. NOTE: Does not include Washington, D.C. SOURCES: Bureau of Labor Statistics, Individual State Labor Market Information Offices]

[Chart: Fifth District — Education and Training Requirements for Projected Employment (2018): Short-term on-the-job training 35.5%; Moderate-term on-the-job training 15.2%; Bachelor’s degree 13.9%; Work experience in a related occupation 9.9%; Long-term on-the-job training 6.4%; Postsecondary vocational award 6.0%; Associate degree 4.6%; Bachelor’s or higher degree, plus work experience 4.6%; Master’s degree 1.7%; First professional degree 1.2%; Doctoral degree 0.9%. NOTE: Does not include Washington, D.C. SOURCES: Bureau of Labor Statistics, Individual State Labor Market Information Offices]

Conclusion
As a faster pace of job gains likely takes hold this year, the labor market will quickly reveal which occupations are in highest demand.
It is equally important, however, to look further ahead to understand the longer-term changes in our economy as they relate to the demand for specific occupations. Projections from the BLS provide 10-year industry and occupation projections at the national level, while the individual efforts of state labor market information agencies produce state- and local-level projections. In combination with information on education and training requirements, the occupational projections provide powerful information for individuals entering the labor market or considering a career change, as well as the counselors, advisers, and educational institutions who serve them. In addition, the long-term occupational employment projections provide a view of future labor demand so that local economic developers and providers of education and training can synchronize their efforts more effectively. RF Region Focus | First Quarter | 2011 43 State Data, Q3:10 DC MD NC SC VA WV 709.3 2,518.0 3,859.5 1,808.5 3,632.9 747.9 Q/Q Percent Change Y/Y Percent Change -0.8 -0.2 -0.4 0.0 -0.2 0.2 1.2 0.3 -0.5 0.4 0.3 1.0 Manufacturing Employment (000s) Q/Q Percent Change Y/Y Percent Change 1.2 -7.7 -14.3 114.3 -0.8 -2.0 431.6 -0.1 -1.4 207.5 0.2 -0.8 229.8 -0.8 -2.6 49.2 0.1 -0.1 Professional/Business Services Employment (000s) 149.3 Q/Q Percent Change 0.8 Y/Y Percent Change 1.7 386.0 -0.1 1.3 484.7 1.1 5.3 217.7 2.1 9.8 649.9 0.5 2.2 61.0 0.8 2.5 Government Employment (000s) Q/Q Percent Change Y/Y Percent Change 243.3 -1.6 0.4 502.5 -0.3 1.7 698.1 -2.4 -1.3 346.2 -1.6 -0.5 698.5 -1.8 0.2 152.5 -1.1 1.7 Civilian Labor Force (000s) Q/Q Percent Change Y/Y Percent Change 332.3 -1.0 0.4 2,978.1 -0.1 -0.5 4,486.9 -1.1 -1.1 2,159.7 -0.3 -0.6 4,177.3 -0.2 -0.1 779.2 -0.5 -2.2 9.8 9.9 10.0 7.4 7.4 7.4 10.1 10.8 11.0 11.0 11.2 11.7 6.8 7.0 7.1 9.2 8.8 8.4 38,232.6 0.4 256,038.4 0.5 306,949.6 0.5 138,294.7 0.5 324,061.8 0.5 54,408.5 0.5 2.7 1.8 3.1 2.9 2.1 2.3 Building Permits Q/Q Percent Change Y/Y Percent Change 235 658.1 44.2 3,124 -10.0 29.5 8,485 -11.9 -9.3 3,371 -14.9 -22.8 6,079 6.7 12.4 423 -27.1 -42.4 House Price Index (1980=100) Q/Q Percent Change Y/Y Percent Change 572.2 2.1 2.5 437.1 1.5 -1.6 321.3 0.4 -2.6 326.3 0.9 -2.3 415.1 0.9 -1.2 226.2 -0.2 1.0 Sales of Existing Housing Units (000s) Q/Q Percent Change Y/Y Percent Change 8.0 -23.1 -9.1 65.2 -24.2 -13.3 112.8 -30.5 -21.2 58.8 -30.7 -19.7 103.2 -12.8 -17.0 24.4 -14.1 -15.3 Nonfarm Employment (000s) Unemployment Rate (%) Q2:10 Q3:09 Real Personal Income ($Mil) Q/Q Percent Change Y/Y Percent Change 44 Region Focus | First Quarter | 2011 Nonfarm Employment Unemployment Rate Real Personal Income Change From Prior Year First Quarter 2000 - Third Quarter 2010 Change From Prior Year First Quarter 2000 - Third Quarter 2010 First Quarter 2000 - Third Quarter 2010 8% 7% 6% 5% 4% 3% 2% 1% 0% -1% -2% -3% 10% 4% 3% 2% 1% 0% -1% -2% -3% -4% -5% -6% 9% 8% 7% 6% 5% 4% 3% 00 01 02 03 04 05 06 07 08 09 10 00 01 02 03 04 05 06 07 08 09 Fifth District 10 00 01 02 03 04 05 06 07 08 09 United States Nonfarm Employment Metropolitan Areas Unemployment Rate Metropolitan Areas Building Permits Change From Prior Year Change From Prior Year First Quarter 2000 - Third Quarter 2010 First Quarter 2000 - Third Quarter 2010 First Quarter 2000 - Third Quarter 2010 7% 6% 5% 4% 3% 2% 1% 0% -1% -2% -3% -4% -5% -6% -7% -8% Change From Prior Year 30% 13% 12% 11% 10% 9% 8% 7% 6% 5% 4% 3% 2% 1% 00 01 02 03 04 05 06 07 08 09 Charlotte Baltimore 20% 10% 0% -10% -20% -30% -40% -50% 00 01 02 03 04 05 06 07 08 09 
10 Washington Charlotte Baltimore FRB—Richmond Manufacturing Composite Index First Quarter 2000 - Third Quarter 2010 First Quarter 2000 - Third Quarter 2010 30 30 20 20 10 10 First Quarter 2000 - Third Quarter 2010 16% 14% 12% 10% 8% 6% 4% 2% 0% -2% -4% -6% -8% -20 -10 -30 -40 -20 -30 -50 00 01 02 03 04 05 06 07 08 09 10 United States Change From Prior Year -10 0 Fifth District 10 House Prices 0 10 00 01 02 03 04 05 06 07 08 09 Washington FRB—Richmond Services Revenues Index 40 10 00 01 02 03 04 05 06 07 08 09 10 00 01 02 03 04 05 06 07 08 09 Fifth District 10 United States NOTES: SOURCES: 1) FRB-Richmond survey indexes are diffusion indexes representing the percentage of responding firms reporting increase minus the percentage reporting decrease. The manufacturing composite index is a weighted average of the shipments, new orders, and employment indexes. 2) Building permits and house prices are not seasonally adjusted; all other series are seasonally adjusted. Real Personal Income: Bureau of Economic Analysis/Haver Analytics. Unemployment rate: LAUS Program, Bureau of Labor Statistics, U.S. Department of Labor, http://stats.bls.gov. Employment: CES Survey, Bureau of Labor Statistics, U.S. Department of Labor, http://stats.bls.gov. Building permits: U.S. Census Bureau, http://www.census.gov. House prices: Federal Housing Finance Agency, http://www.fhfa.gov. Region Focus | First Quarter | 2011 45 Metropolitan Area Data, Q3:10 Washington, DC Nonfarm Employment (000s) Q/Q Percent Change Y/Y Percent Change Hagerstown-Martinsburg, MD-WV 2,417.5 -0.1 1.1 1,276.0 -0.8 0.3 97.4 -0.7 0.1 6.1 6.1 6.2 8.0 7.4 7.7 9.7 9.4 9.1 3,365 6.6 20.1 1,307 -0.5 18.6 148 -45.4 -28.8 Asheville, NC Charlotte, NC Durham, NC 167.1 -0.4 0.5 796.6 -1.1 -0.2 279.5 -0.7 -0.6 Unemployment Rate (%) Q2:10 Q3:09 7.9 8.5 8.9 11.0 11.2 12.1 7.2 7.4 8.3 Building Permits Q/Q Percent Change Y/Y Percent Change 552 41.9 81.6 1,235 -27.5 -38.1 628 24.4 57.8 Raleigh, NC Wilmington, NC 338.9 -0.5 -0.2 498.4 0.2 0.6 137.9 -0.6 -0.7 Unemployment Rate (%) Q2:10 Q3:09 10.3 10.8 11.6 8.1 8.4 9.1 9.3 9.7 10.1 Building Permits Q/Q Percent Change Y/Y Percent Change 536 3.5 -2.5 1,305 -16.7 -2.0 407 -30.7 -30.3 Unemployment Rate (%) Q2:10 Q3:09 Building Permits Q/Q Percent Change Y/Y Percent Change Nonfarm Employment ( 000s) Q/Q Percent Change Y/Y Percent Change Greensboro-High Point, NC Nonfarm Employment (000s) Q/Q Percent Change Y/Y Percent Change 46 Baltimore, MD Region Focus | First Quarter | 2011 Winston-Salem, NC Charleston, SC Columbia, SC 202.5 -1.5 -1.7 284.5 -0.4 1.0 340.9 -1.3 -0.6 Unemployment Rate (%) Q2:10 Q3:09 9.2 9.6 10.2 9.4 8.8 10.2 9.4 8.8 9.9 Building Permits Q/Q Percent Change Y/Y Percent Change 309 -1.3 -6.1 661 -10.8 -25.5 782 -7.1 -3.6 Greenville, SC Richmond, VA Roanoke, VA 294.8 -0.5 0.9 602.6 -0.8 0.7 153.4 -1.8 -0.5 9.8 9.5 11.1 7.8 7.7 7.8 7.3 7.3 7.5 318 -16.1 -19.9 1,033 0.4 6.1 113 -19.3 -3.4 Virginia Beach-Norfolk, VA Charleston, WV 738.3 -0.3 -0.3 148.7 0.4 0.4 113.4 -1.3 0.4 7.3 7.3 7.0 7.9 7.8 7.1 8.6 8.3 8.2 1,068 -8.1 -10.1 41 20.6 -12.8 9 12.5 28.6 Nonfarm Employment (000s) Q/Q Percent Change Y/Y Percent Change Nonfarm Employment (000s) Q/Q Percent Change Y/Y Percent Change Unemployment Rate (%) Q2:10 Q3:09 Building Permits Q/Q Percent Change Y/Y Percent Change Nonfarm Employment (000s) Q/Q Percent Change Y/Y Percent Change Unemployment Rate (%) Q2:10 Q3:09 Building Permits Q/Q Percent Change Y/Y Percent Change Huntington, WV For more information, contact Sonya Ravindranath Waddell at (804) 697-2694 or 
e-mail Sonya.Waddell@rich.frb.org

OPINION

Should the Financial Crisis and Historic Recession of 2007-2009 Change the Practice of Economics?
BY JOHN A. WEINBERG

Of course it should, and it will. As our cover story in this issue of Region Focus makes clear, the economics profession has always learned from events. And events of the magnitude and uniqueness as those seen in the period we’ve just come through can have a profound impact on how scholars and policy analysts frame and approach questions. For instance, the discipline is still learning about and adapting its work in response to the Great Depression, as debates continue regarding its causes, the effectiveness of policy responses, and how it compares to other significant contractions throughout history and around the world.

But I think those who foresee wholesale change in economic science are likely to be disappointed. The events of the last few years are not cause to throw out the prevailing paradigm in economic research — the notion that resource allocation can be understood in the context of individual optimization, with the reconciliation of conflicting goals being achieved through the equilibrium of a market mechanism. That itself is a pretty big tent, and it contains within it many lines of research and ideas that may prove useful in making sense of our extraordinary recent past.

Of particular interest in the wake of the financial crisis is the profession’s approach to the study of financial markets and institutions. A caricatured depiction of the discipline’s approach to financial market behavior would be to say that economists put too much weight on the efficient markets hypothesis and therefore missed indicators of dysfunction in markets. But the efficient markets hypothesis — roughly stated, that well-functioning markets do a good job of incorporating relevant information about fundamentals into asset prices — is really just a benchmark against which to measure observed financial market performance. One important line of research in the mainstream of financial economics is to take apparent deviations from market efficiency — evidence of mispriced assets — seriously, and to seek to understand the frictions that cause observed behavior to differ from the benchmark.

There are two narratives about the financial crisis that represent different views about which forces caused markets and institutions to function so poorly. Both of these narratives occupy places in the mainstream of financial economics. One is that financial fragility results from externalities in the distribution of risk in markets. These externalities have to do with the effects of one firm’s performance — especially when it incurs large losses — on another’s condition. Because firms don’t take these external effects into account, they take on more risk than they otherwise would. This results also in the mispricing of risk. This narrative is really just a version of a concept — externalities — which has been a part of mainstream economic thought since at least the late 1890s, when Alfred Marshall formally identified the issue and then his student Arthur Pigou further developed the idea and its potential implications for public policy. There is an active body of theoretical research articulating the conditions under which such systemic externalities might arise, although empirical validation has proved a challenge.

An alternative, although not mutually exclusive, narrative suggests that the mispricing of risk and the associated tendency of firms to take on too much risk is the result of government policy. In particular, if market participants believe that the government will protect firms or their creditors from severe losses in the event of a financial crisis, then they will tend to underweight risk in making their investment decisions. Just like externalities, this will lead to the underpricing of risk. This moral hazard view of financial market dysfunction has also been a part of mainstream research for a long time.

So economists were working out ideas about financial instability well before the crisis. Unfortunately, that work had not yet gotten us to the point of being able to quantify the effects of either externalities or moral hazard. The events of the last few years will have a powerful influence on how these and related lines of research continue, and should help us better understand the relative importance of alternative financial market imperfections. RF

John A. Weinberg is senior vice president and director of research at the Federal Reserve Bank of Richmond.

NEXT ISSUE

International Housing Policy
As the United States debates how to reform housing finance policy, some useful insights can be drawn from other developed nations — many of which direct far fewer resources toward subsidizing housing finance.

Interview
Bruce Yandle of Clemson University discusses environmental policy, the deficit, and his experience making the transition from business to academic economics.

Development Economics
Despite decades of effort and billions of dollars donated in foreign aid each year, 1.5 billion people still live in extreme poverty, on less than $1.25 per day. What do economists know about why some countries thrive economically while others struggle to develop? What should rich countries and international institutions be doing to help?

Economic History
For more than 180 years, railroads have played a vital role in the economic development of the Fifth District and the United States. Pioneering carriers, such as the Baltimore and Ohio Railroad, overcame many obstacles to increase productivity and expand trade throughout the region and country.

Disaster Relief
Is it unwise to target donations in the wake of a disaster? Some economists say such donations may pour money into affected regions inefficiently and divert money from other needy areas. Donations may be more effective when organizations with broader missions can direct funds as needed.

Upfront
Does success in high-profile athletic events at universities lead to an increase in applicants and alumni donations?

Visit us online: www.richmondfed.org
• To view each issue’s articles and Web-exclusive content
• To add your name to our mailing list
• To request an e-mail alert of our online issue posting

Federal Reserve Bank of Richmond
P.O. Box 27622
Richmond, VA 23261
Change Service Requested

Please send subscription changes or address corrections to Research Publications or call (800) 322-0565.

• “A Regional Look at the Role of House Prices and Labor Market Conditions in Mortgage Default” by Sonya Ravindranath Waddell, Anne Davlin, and Edward Simpson Prescott
• “Housing and the Great Recession: A VAR Accounting Exercise” by Samuel E. Henly and Alexander L. Wolman
• “Optimal Contracts for Housing Services Purchases” by Borys Grochulski
• “Mortgage Reform and Growing Exposure of the Federal Housing Administration’s Mortgage Mutual Insurance Fund” by Brent C. Smith

You can access Economic Quarterly and more at: www.richmondfed.org/publications