Of course, politics and psychology could not in the end be separated entirely from economics. What Americans think of themselves, their lives and their society has always reflected economic fortune. Inflation and disinflation had widespread effects. For most of the half-century cycle, there was an unmistakable pattern. In the first two decades or so, rising inflation weakened the economy. Then, over the next twenty years, disinflation strengthened it, until infectious optimism turned into unsustainable speculation and contributed to the ill-fated stock market and housing "bubbles."

With hindsight, we know that high inflation was inherently destabilizing. From 1969 to 1982, inflation averaged 7.5 percent, unemployment 6.4 percent. From 1990 to 2007, the comparable figures were 2.9 percent for inflation and 5.4 percent for unemployment. Similarly, productivity gains have crudely conformed to inflation's fluctuations. Recall that "productivity" is economic jargon for "efficiency" and that productivity growth (usually measured as output per hour worked) is the wellspring of higher living standards. The more efficient we are, the higher our incomes or the more time we have for leisure. Wages, salaries and fringe benefits all tend to rise with productivity. Without productivity growth, economic progress as most people understand the term would not exist. Since 1950, inflation and productivity have seemed to march hand in hand. From 1950 to 1965 (low inflation), annual productivity growth averaged 3.1 percent. When inflation rose in the late 1960s, productivity growth deteriorated (1965-70 average: 2.4 percent). In the years of the worst inflation, 1973-80, average productivity growth was a meager 1.1 percent. It improved in the 1980s to 1.8 percent, and when inflation returned to the 1950s level in the late 1990s, so did productivity growth (1995-2005 average: 2.9 percent).*18

This was not an accident. No one would contend that inflation single-handedly determined productivity trends. Indeed, productivity responds to so many influences—technology, government policies, competition, risk taking—that economists have never fully explained its movements. Some early gains after World War II undoubtedly reflected the adoption of new technologies whose introduction had been delayed by the war and the Depression: direct distance dialing (which reduced the need for operators); faster and bigger planes for commercial aviation; advanced machine tools. Some recent gains originated with computers and the Internet. Still, higher inflation hurt. One reason was the "money illusion": the tendency noted earlier to mistake price increases for real gains. As inflation rose, companies' sales and profits grew rapidly. Managers believed they were doing better than they were; they paid less attention to the many small daily operational matters that improve efficiency. From 1964 to 1974, after-tax profits jumped from $41 billion to $95 billion. Because profits are how most managers evaluate themselves (and are evaluated), what was the problem?

* In fairness, if I had chosen other end points, the productivity picture would look different. For example, from 1970 to 1995, annual productivity growth averaged 1.7 percent, making it seem that inflation had little effect. It is precisely because the numbers can be arranged to tell many different stories that the subject is controversial. However, my selection is hardly contrived. It is common. In a recent report, for example, the Congressional Budget Office picked similar time periods. (See table 2-2, "The Budget and Economic Outlook 2008-2018," January 2008.)

Simple: The gains mostly reflected inflation. When some companies voluntarily published inflation-adjusted financial statements in the late 1970s, the results were sobering. In 1978, General Motors reported that its sales and profits were up 77 percent and 46 percent, respectively, from five years earlier. Impressive, it seemed. But when adjusted for inflation, the sales increase dwindled to 20 percent, and the profit increase disappeared altogether. GM was no fluke. Though reported profits rose smartly in the 1970s, profit margins—profits as a share of sales—fell from 17 percent in the 1950s to 11 percent in the 1970s.19 Initially, many executives may not have appreciated what was happening. But by the late 1970s, only the dullest manager could not have suspected the reality. Yet many corporate managers were "not anxious to move to accurate profit reporting" by adopting inflation-adjusted accounting, The Wall Street Journal editorialized in 1979. "They would rather be publicly pilloried for [price] gouging than explain losses and low profits to shareholders." As late as 1981, BusinessWeek chided:

Through more than a decade of inflation, a generation of corporate managements has refused to admit that the earnings reported to shareholders—and frequently cited as "record profits"—are not all that they are cracked up to be. Double-digit inflation has rendered the traditional yardstick of company performance illusory or suspect.20
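The GM arithmetic above can be made explicit. This is a minimal sketch of deflating nominal growth to real growth: the 77 percent nominal and 20 percent real figures come from the text, and the cumulative 1973-78 inflation implied by them is derived here rather than taken from price data.

```python
# Deflating nominal growth to real growth, as in the GM example.
nominal_growth = 0.77   # reported five-year sales growth (nominal)
real_growth = 0.20      # the inflation-adjusted figure cited in the text

# Real growth satisfies (1 + nominal) = (1 + real) * (1 + inflation),
# so the implied cumulative price rise over the period is:
cumulative_inflation = (1 + nominal_growth) / (1 + real_growth) - 1

print(f"implied cumulative inflation: {cumulative_inflation:.1%}")  # ~47.5%
```

A roughly 47 percent price rise over five years is consistent with the high inflation of the mid-to-late 1970s, which is why the "impressive" nominal gain dwindled so sharply.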

Once inflation diminished, managers could no longer hide. By the 1990s, many firms complained that they'd lost "pricing power," even as pressures from Wall Street to increase profits intensified. Profits would now rise mostly from increased sales or reduced costs. Managers had to search for new ways to increase productivity. High inflation also hurt productivity growth in other ways. The most obvious was the added time and effort required to make frequent price changes—a phenomenon that economists call "menu costs." (The reference is to the costs of changing a menu.) As inflation rose, whether to increase prices 3 percent or 5 percent became as important to profits as productivity gains. Trying to comply with, or evade, the various forms of wage-price controls also consumed managers' time. Finally, inflation interacted with the tax code to reduce incentives for new, productivity-enhancing investments. Depreciation allowances—a noncash cost covering the aging of machinery and buildings—are intended to help companies pay for new equipment, machines, factories and offices. But the allowances are based on historic costs; inflation eroded their value. As inflation raised replacement costs for new investments, depreciation allowances were increasingly inadequate.
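The depreciation mechanism can be sketched numerically. All figures below are illustrative assumptions, not data from the text; the point is only that allowances based on historic cost fall short of replacement cost once prices have risen.

```python
# Sketch: inflation erodes historic-cost depreciation allowances.
# Figures are hypothetical for illustration.
historic_cost = 100_000     # machine bought ten years ago
annual_inflation = 0.08     # assumed 1970s-style inflation rate
years = 10

replacement_cost = historic_cost * (1 + annual_inflation) ** years
allowance = historic_cost / years    # straight-line, based on historic cost
needed = replacement_cost / years    # what replacing the machine implies

print(f"replacement cost after {years} years: ${replacement_cost:,.0f}")
print(f"annual allowance shortfall:          ${needed - allowance:,.0f}")
```

Under these assumptions the machine's replacement cost more than doubles, so the annual allowance covers less than half of what replacement requires.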

Interest rates were another crucial mechanism by which inflation reshaped the economy. Interest rates are the price of money: what people and firms pay to borrow. Of all the prices in the economy, interest rates are the most important, because they affect so much else. They overshadow other significant prices—say for wheat, oil or computer chips. In practice, many factors determine interest rates: the supply of savings; the demand for credit; the state of the business cycle; Federal Reserve policies; the nature of financial markets. But inflation and the expectations of future inflation play a large role, because lenders want to be protected against the possible erosion of the value of their money. Higher inflation causes—with an uncertain lag—interest rates to rise; and falling inflation causes—also with a lag—rates to fall. In 1965, 30-year fixed-rate home mortgages averaged 5.8 percent; by 1980, the average was 12.7 percent; and by 2005, it was back down to 5.9 percent. For the same years, commercial banks' "prime rate" offered to the best customers went from 4.5 percent to 15.3 percent to 6.2 percent. These dramatic swings profoundly affected credit markets, the stock market and the value of land and housing. There were enormous ripple effects.21
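To see what those mortgage-rate swings meant for a household, the standard fixed-rate amortization formula can be applied to the three rates just cited. The $100,000 loan size is an assumption for illustration; the rates come from the text.

```python
# Monthly payment on a fixed-rate mortgage (standard amortization formula).
def monthly_payment(principal, annual_rate, years=30):
    r = annual_rate / 12          # monthly interest rate
    n = years * 12                # number of monthly payments
    return principal * r / (1 - (1 + r) ** -n)

# 1965, 1980, and 2005 average 30-year rates from the text,
# applied to a hypothetical $100,000 loan:
for rate in (0.058, 0.127, 0.059):
    print(f"{rate:.1%}: ${monthly_payment(100_000, rate):,.2f}/month")
```

At 1980's 12.7 percent, the monthly payment is nearly double what it is at the 1965 or 2005 rate, which is the "ripple effect" on housing in concrete terms.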

Remember the S&L crisis? To most Americans, "savings and loan associations"—also known as S&Ls and "thrifts"—are now relics. In 2006, there were only 845 of them, with about $1.5 trillion in assets; by contrast, more than 7,000 commercial banks had nearly $10 trillion in assets. But in 1975, the roughly 5,400 thrifts made about half the nation's home mortgages. S&L managers often lived a "three-six-three" day, wrote economist Lawrence White. They "could take in money at 3 percent on deposits; they could lend it out at 6 percent on mortgage loans; and [they] could be on the golf course by 3:00 in the afternoon." But this cushy arrangement required stable prices, because S&Ls "borrowed short" (short-term depositors could withdraw their funds anytime) and "lent long" (mortgages had fixed interest rates and 30-year maturities). If higher inflation pushed up deposit rates, the S&Ls' borrowing costs might exceed repayment rates on older mortgages. That's what happened.22
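The "borrow short, lend long" squeeze reduces to simple spread arithmetic. In this sketch, the 6 percent mortgage yield echoes the "three-six-three" joke; the rising deposit rates are illustrative assumptions, not historical figures.

```python
# Sketch of the S&L squeeze: a fixed yield on old 30-year mortgages
# versus a short-term deposit cost that inflation keeps pushing up.
mortgage_yield = 0.06   # locked-in return on the old mortgage portfolio

for deposit_rate in (0.03, 0.06, 0.10):   # hypothetical rising funding costs
    spread = mortgage_yield - deposit_rate
    verdict = "profitable" if spread > 0 else "losing money"
    print(f"deposits cost {deposit_rate:.0%}: spread {spread:+.1%} ({verdict})")
```

The portfolio's yield is fixed for decades, so once short-term rates pass it, every dollar of deposits funds a loss.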

The S&L crisis is typically cast as a tale of inept government regulation and corrupt lending. S&Ls squandered their funds on ill-conceived housing projects, shopping malls and resorts. But the main story involves inflation's destructiveness. By 1981, 85 percent of thrifts were unprofitable. As short-term interest rates rose, they faced a dilemma: either raise their own deposit rates, which might make them unprofitable, or face a huge outflow of deposits, which would make them insolvent. The advent of money-market mutual funds in the late 1970s rendered government interest-rate ceilings on deposits at banks and S&Ls ineffective; savers could move their money elsewhere. Only after S&Ls faced this squeeze did inept regulation and lending mushroom. Government liberalized S&Ls' lending authority in the hope that profits on new loans for commercial real estate would offset losses on old mortgages; but speculative new loans simply compounded the losses. The S&Ls' collapse cost taxpayers about $160 billion—the difference between what depositors (protected by federal deposit insurance) were owed and what the failed S&Ls' assets were worth.23

The interaction of inflation and credit markets caused other convulsions. American farmers borrowed huge amounts, based on the inflation of crop and land prices. "Lenders would come out to the farm," one Iowa farmer told journalist William Greider, "and they would say, 'That tractor looks a bit aging.' So the farmer would buy a new one. Why not?" Farmers' income and wealth seemed ample. From 1972 to 1975, wheat went from $1.34 a bushel to beyond $4.00, corn from $1.08 to $3.02. In Iowa, land prices quintupled, from $319 an acre in 1970 to $1,697 in 1982. But when interest rates rose and crop prices collapsed in the 1980s, many farmers were crushed by debt. Widespread foreclosures ensued. So did "tractor-cades" to Washington and some well-publicized suicides among farmers who lost land that had been in their families for generations.24

The so-called Third World debt crisis followed a similar trajectory. In the 1970s, commercial bank loans to Latin American countries barely existed; by 1982, these debts totaled $327 billion. It was reasoned that the rising prices for commodities, a mainstay of their exports, would enable them to pay foreign debts, denominated mostly in dollars. Mexico and Venezuela had oil; Brazil, coffee; Argentina, wheat and meat. But rising interest rates (most loans had "floating" rates that changed automatically) and falling commodity prices in the early 1980s destroyed this logic. In August 1982, Mexico defaulted on $80 billion of loans. Fifteen other Latin countries followed suit. The following years are called "the lost decade," as many Latin nations—burdened with debts they could not repay—suffered slow economic growth and rising poverty.25

Unsurprisingly, the swings in interest rates also played havoc with stocks. Half of U.S. households now own stocks or mutual funds. It is conventional wisdom that the 1990s' "high tech" frenzy was responsible for luring people into the market. This is only half true.

Go back to the 1950s, and you discover that stocks and stock ownership also flourished. From the end of World War II until 1965, the Dow Jones Industrial Average quintupled. The number of shareholders jumped from 6.5 million in 1950 to 30.9 million in 1970. But higher inflation halted the market's rise and squelched the enthusiasm for stocks. In August 1979, BusinessWeek wrote an obituary. "The Death of Equities: How Inflation Is Destroying the Stock Market" was the cover line. "Have you been to an American stockholders' meeting lately?" asked one young corporate executive. "They're all old fogies." Among those under sixty-five, the number of shareholders had dropped 25 percent during the decade. Poor performance had alienated younger investors.26

In 1982, the Dow was actually lower than in 1965. Inflation made "investors very cautious," BusinessWeek argued. Americans had learned "that inflation will lead to an economic downturn that will wreck corporate profitability and stock prices. This happened in 1974, when the worst recession since the Depression followed the last burst of double-digit inflation." But in a larger sense, stocks had fallen victim to the merciless logic of rising interest rates. Stocks, bonds, bank deposits and money market mutual funds all compete for investors' dollars. As rates on bonds and other interest-bearing investments rose, stocks had to stay competitive. Their earnings yields had to rise; paradoxically, this put downward pressure on stock prices. In January 1973, the Dow had hit a record of 1,051.70. By December 1974, it had dropped by almost half to 577.60. In 1979, when adjusted for inflation, stocks were still down 50 percent from the 1973 peak.

A simple example shows why. Suppose a Treasury bond pays a 5 percent rate. Stocks, being riskier, have to offer a higher yield, say 7 percent. A stock's yield is the company's per share earnings (profits) divided by its stock price. A company with earnings of $7 a share and a $100 stock price would have an earnings yield of 7 percent. Now assume that the interest rate on the Treasury bond jumps to 10 percent. To maintain the 2 percentage point premium over bonds, the stock's earnings yield has to go to 12 percent. With $7 per share earnings, the stock's price would fall to about $58 ($7 is roughly 12 percent of $58). If profits had risen fast enough, stock prices might have increased; in practice, this didn't happen. Higher inflation sabotaged the stock market.*
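The worked example above amounts to one formula: price equals earnings divided by the required yield, where the required yield is the Treasury rate plus the text's assumed 2-point risk premium.

```python
# The earnings-yield example: price = earnings / required yield.
earnings_per_share = 7.00   # the $7 per share from the example
risk_premium = 0.02         # stocks' assumed premium over Treasury bonds

def implied_price(treasury_rate):
    """Stock price at which the earnings yield matches the required yield."""
    return earnings_per_share / (treasury_rate + risk_premium)

print(f"Treasury at 5%:  ${implied_price(0.05):.2f}")   # $100.00
print(f"Treasury at 10%: ${implied_price(0.10):.2f}")   # about $58.33
```

Doubling the Treasury rate thus cuts the implied stock price by more than 40 percent even though earnings are unchanged, which is the "merciless logic of rising interest rates."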

But when inflation broke in the early 1980s—and interest rates began to tumble—the process reversed with the same frenzied logic. Falling rates and rising profits propelled shares upward. By 1986, the Dow had doubled from 1982; by 1989, it had almost tripled; by 1996, it had more than sextupled. Investors flooded the market, well before the "high tech" obsession. In 1989, only 31.6 percent of households owned stocks or mutual funds; by 1998, the share was 48.8 percent. The market's relentless advance convinced many investors, often neophytes, that prices would move inexorably upward, even if there were periodic interruptions. It was this mass conviction that set the stage for the final speculative binge, the "tech bubble." Stocks rose to levels completely inconsistent with historic relationships. At one point, Yahoo!'s stock sold for 2,154 times earnings (profits); by contrast, the historical average price-to-earnings ratio of all stocks was closer to 14 or 15. Even so, the "bubble" was fairly short. One careful study dates its onset to between mid-1997 and late 1998; the market peaked in early 2000.28

* Many economists argue that this should not have been so—that investors misjudged inflation's effects on stocks. Inflation, they argued, benefited some companies by eroding the real value of their debt. Investors ignored that. Moreover, investors should have made decisions based on "real" (inflation-adjusted) and not nominal interest rates—and "real" rates were low in the 1970s and high in the 1980s. Perhaps. But two points need to be made. First, the actual connection between shifts in interest rates and changes in stock prices during much of this period (the 1970s, the 1980s and the early 1990s) is incontestable. Higher rates depressed stock prices; lower rates elevated them. The connection was not always automatic, but the broad relationship is obvious. Second, inflation made it harder to predict the future and to estimate "real" values over any extended period. Thus, investors may have been rational in comparing stocks and other investments based primarily on their present returns.27

Indeed, inflation helped transform the entire financial system—and not just the stock market. Though often arcane, high finance serves a simple purpose: to channel a society's savings into productive investment. From the Great Depression until the 1980s, much of the financial system was highly compartmentalized. Lending was dominated by banks and S&Ls, which provided most home mortgages and consumer and business loans. (Only blue-chip companies could raise capital by selling bonds; other firms borrowed from banks.) But inflation destroyed some of this system (the S&Ls) and damaged much of the rest (commercial banks—they suffered losses on bad loans to farmers, energy companies, real estate developers and developing countries). New ways had to be found to provide credit. What emerged was "securitization." Because banks and the few surviving S&Ls had limited funds, they—and others—increasingly originated loans but then bundled them into bondlike securities that were sold to pension funds, insurance companies, mutual funds, college endowments and other big investors. Thus were home mortgages, auto loans, credit-card debt and other types of loans increasingly financed.

In some ways, the new system was superior to the old. It tapped new sources of credit and spread risk. But in some ways, it was inferior. Some of the old system's safeguards were absent in the new. Banks and S&Ls were generally close to local borrowers—home owners, consumers, businesses—and held most of their loans in their portfolios. Therefore, they had an incentive to provide credit only to borrowers with good repayment prospects. With securitization, caution receded. Lenders and borrowers were often widely separated. An "originator" (say, a mortgage broker) might make a loan that would then be sold to an investment bank (say, Goldman Sachs) that would "securitize" it and sell it to final investors. All the middlemen had incentives to complete transactions from which they earned fees. Most middlemen did not hold final loans. With hindsight, we can see how the combination of overconfidence about home prices and careless lending practices fed the housing "bubble." Even today, inflation's side effects linger.

The American economy is always changing; that is probably its only permanent characteristic. Had there been no inflation, the economy would be different today from what it was fifteen years ago or fifteen years before that. But there were inflation and disinflation. They fostered the instability of the 1970s, the long expansions of the 1980s and 1990s, the swings in productivity growth, and the consumption, stock and housing booms. All of these changes had other causes, and some might have occurred in some fashion anyway. But all were also by-products of the inflationary experience.

To take even this truncated tour of the past half century is to confirm the huge effects of inflation—both its going up and its coming down—on national life. It was not a sideshow; it was part of the main show. It shaped politics, the economy, the national mood, financial markets and much more. Which highlights the central puzzle: If inflation's so important, why is it so ignored?

By now, there is a vast literature recounting the American journey in recent decades. Virtually all of it consigns inflation to a cameo appearance.* It doesn't matter whether the authors are historians or economists; whether they are liberal or conservative; whether they are critics or champions of various presidents; or whether they are the presidents themselves. In Morning in America: How Ronald Reagan Invented the 1980s by Gil Troy, inflation does not merit a chapter or an index entry. But neither does it figure much in journalist Haynes Johnson's harsher portrait of Reagan in Sleepwalking Through History. In his memoir Keeping Faith, Jimmy Carter says that "during the early months of 1980, the most serious domestic problem was

* A striking exception to this inattention is William Greider's masterful Secrets of the Temple, an exhaustive and engaging history of the Federal Reserve through the late 1980s.
