cycles, government would enable companies to become more caring. Relieved of fears of deep recessions or depressions, firms could establish long-term employment relationships that provided stable jobs, fair wages and ample fringe benefits. Perversely, this new social contract became a conveyor belt for higher inflation. Because the notion of "fair" wages compensated for past inflation, the reformed capitalism institutionalized a wage-price spiral. Not surprisingly, the repression of inflation—signaling that business cycles endured—undid some of these changes and practices.

Capitalism's early postwar reconstruction focused on jobs, because that was what concerned most people. Going back to the 1920s, some large firms (Eastman Kodak, Sears, Metropolitan Life) had experimented with "welfare capitalism," which aimed to provide job stability and more benefits, mainly pensions and profit-sharing plans. These companies were exceptions. In general, the prewar job market was treacherous. "People moved around a lot," the economic historian Sanford Jacoby has said. "You got laid off and you moved on. And workers were wont to quit just as employers were quick to lay off." Wages were set mainly according to local labor market conditions (if there were surplus welders, their wages suffered, regardless of skill). Many foremen could hire and fire at will. Turnover was high, and hardly anyone had "rights" on the job or the "right" to a job. One autoworker recalled bitterly:

The annual layoff during the model change was always a menace to the security of workers. Along about June or July it started. The bosses would pick the men off a few at a time. . . . In October and November, we began to trickle back into the plants. Again the bosses had full say as to who was rehired first.

Years of service with the company meant nothing. . . . [W]orkers had no assurance of that he [sic] would be called back at any specific time.4

In the 1950s and 1960s, these arbitrary practices receded. There was a greater faith that workers' "relationship with the Organization [was] for keeps" because if they were "loyal to the company ... the company would be loyal" to them, as William Whyte wrote in his 1956 bestseller The Organization Man. This partly reflected the increasing power of unions, which had received new legal protections in the Depression and World War II to organize and conduct collective bargaining. By 1950, union membership reached 15 million; that was more than four times its 1930 level and more than a quarter of nonfarm employment. Terrified of being organized, many nonunion firms mimicked union bargaining preferences in their employment policies, emphasizing job security, seniority, increased fringe benefits and pay scales that narrowed the gap between the top and bottom.5

Particularly influential were settlements in the massive auto industry, starting with landmark agreements between the United Auto Workers and General Motors in 1948 and 1950. The 1950 contract guaranteed automatic wage increases covering inflation (a cost-of-living adjustment, or COLA) and an "annual improvement factor" (first 2 percent, then 3 percent after 1955), a pension and half the costs (later increased) of health insurance. The union gave up the right to national strikes between contracts for these hefty benefits; GM could make large investments knowing that, once a contract was in place, the investments could be used. When the 1950 five-year contract was signed, Fortune magazine dubbed it the "Treaty of Detroit": "GM may have paid a billion for peace but it . . . has regained control [over the] long range scheduling of production, model changes, and tool and plant investment." Ford and Chrysler made similar agreements, as did other unionized industries (steel, aluminum, chemicals). The resulting "pattern bargaining" in these industries meant that major firms didn't compete on labor costs. Auto companies had similar labor costs; steel companies had similar costs; tire companies had similar costs.6
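To see how such an escalator compounds, here is a minimal sketch in Python. The hourly wage and the inflation rates are hypothetical illustrations; only the 3 percent improvement factor echoes the post-1955 contract terms above.

```python
# A minimal sketch of a COLA-plus-improvement-factor escalator.
# The starting wage and inflation rates are hypothetical; the 3 percent
# "annual improvement factor" matches the post-1955 figure cited above.

def next_wage(wage, inflation, improvement=0.03):
    """One year's raise: cost-of-living adjustment plus improvement factor."""
    return wage * (1 + inflation) * (1 + improvement)

wage = 2.00  # hypothetical hourly wage
for year, inflation in enumerate([0.02, 0.04, 0.06], start=1):
    wage = next_wage(wage, inflation)
    print(f"Year {year}: ${wage:.2f}/hour")

# Whatever prices do, wages follow automatically, which is how "fair"
# contracts hard-wired the wage-price spiral described earlier.
```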

Many large nonunion firms followed suit. Wage and salary increases compensated for inflation plus something. There were more generous fringe benefits: paid vacations, health insurance, pensions, sick leave. From 1950 to 1970, the number of Americans with group health insurance quadrupled from 22 million to 83 million. Wages and salaries were increasingly set through so-called internal labor markets in an effort to achieve results that seemed "fair." For many nonunion workers, that meant being paid according to elaborate job evaluations, an approach pioneered by the Hay Group, a consulting company. Companies adopted point systems to set comparable compensation for comparable workers. Points were awarded for skill, seniority and the nature of the job. The idea was for, say, a mechanical engineer and accountant with similar education levels, responsibilities and experience to be paid roughly the same, so that neither felt misused.7
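As an illustration of how such a point system might work, here is a minimal sketch. The factors follow the ones named above (skill, seniority, nature of the job), but the weights and scores are hypothetical, not the Hay Group's actual proprietary method.

```python
# A minimal sketch of a job-evaluation point system.
# Weights and scores are hypothetical illustrations.

WEIGHTS = {"skill": 5, "seniority": 2, "job_nature": 3}  # points per unit of score

def job_points(scores):
    """Sum weighted points across evaluation factors (scores on a 0-10 scale)."""
    return sum(WEIGHTS[factor] * score for factor, score in scores.items())

engineer = job_points({"skill": 8, "seniority": 5, "job_nature": 7})
accountant = job_points({"skill": 8, "seniority": 6, "job_nature": 6})

# Similar totals put both jobs in the same pay band, so the engineer
# and the accountant end up paid roughly the same.
print(engineer, accountant)  # 71 and 70: the same band
```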

Corporate executives saw themselves as managers—not capitalists—who harnessed the productive potential of huge organizations for the public good. A DuPont advertising slogan captured this attitude: "Better Living Through Chemistry." As early as 1927, speaking at the Harvard Business School, General Electric chairman Owen Young had compared large firms to public utilities with responsibilities to the whole society. After the war, many executives embraced this way of thinking. Aside from shareholders, companies had to satisfy other "stakeholders"—workers, local communities and political leaders (who represented public goals). There was an ideology of management. Studies found a gap in opinion between business leaders in large and smaller firms. The first accepted social responsibilities, while the second "repeated free market rhetoric and denied any commitment to a broader group of stakeholders," wrote Ernie Englander of George Washington University and Allen Kaufman of the University of New Hampshire.8

The best-known theoretician of the ideology of management was Harvard economist John Kenneth Galbraith, whose 1967 bestseller The New Industrial State synthesized its central assumptions. Galbraith split the modern economy into two distinct sectors: a traditional sector of small and often family-owned businesses—stores, farms, dry cleaners, machine shops; and the "new industrial state" of megacorporations. The traditional sector abided by the standard laws of economics. Competitive markets set prices; firms were born and died; businesses were at the mercy of the market. By contrast, megacorporations enjoyed virtual immortality. With only a few big firms, many industries benefited from near-monopoly market power. Through massive advertising, companies could condition consumers to buy their products and could—without many competitors—set optimal prices. They controlled new technologies, because only they could muster the resources to hire the required engineers and scientists, undertake research and development and build new plants. Finally, they could pay for most new investments with retained profits and depreciation, as opposed to borrowing from banks or selling stock. Thus, managers escaped much discipline from lenders or shareholders.

Old-style capitalism was dead. Profits were assured. Demand would remain high, because government economic management would keep it high. If costs rose (often reflecting higher wages, imposed by unions), they could be passed along to consumers. Modest inflation was inconvenient, not crippling. These large firms were "the heartland of the modern economy . . . nearly all communications, nearly all production and distribution of electric power, much transportation, most manufacturing and mining, a substantial share of retail trade, and considerable amount of entertainment. . . . [M]ost work [is] done by five or six hundred firms." Their triumph, Galbraith wrote:

assaults the most majestic of all economic assumptions, namely that man in his economic activities is subject to the authority of the market. Instead we have an economic system which, whatever its formal ideological billing, is in substantial part a planned economy. The initiative in deciding what is to be produced comes not from the sovereign consumer who, through the market, issues the instructions that bend the productive mechanism to his ultimate will. Rather it comes from the great producing organization which reaches forward to control the market that it is presumed to serve and, beyond, to bend the customer to its needs.9

Hardly anyone talks this way anymore. Some of Galbraith's arguments were simply wrong. He contended, for example, that entrepreneurs—individuals who invent or commercialize new technologies, products or services—were economic relics. Megacorporations controlled innovation. That was never true. In 1920, start-up RCA (not General Electric) pioneered radio; in the 1960s, start-up Xerox (not IBM) pioneered paper copying. Even as Galbraith wrote, Ray Kroc was starting the giant-to-be McDonald's, and David Packard and Bill Hewlett were creating a major electronics firm. Later, Bill Gates and Steve Jobs ended IBM's domination of the computer industry, and Sam Walton revolutionized retailing with Wal-Mart. All were classic entrepreneurs. Similarly, Galbraith underestimated the power of consumer sovereignty. A case in point: In 1985, Coca-Cola tried to replace its long-standing formula with something that, its marketers thought, had more appeal. Customers rebelled; the company sheepishly restored the old formula, renamed Coke Classic.10

But inflation and its side effects also demolished much of Galbraith's intellectual superstructure—and the parallel assumptions held by many corporate managers. From the Volcker-Reagan recession, many Americans, and particularly corporate managers, learned that the business cycle had not yet been tamed. Firms could no longer assume they could pass higher costs, including higher labor costs, on to customers. If they tried, profits might suffer, because some customers could not afford higher prices and other customers might buy less. So there were practical pressures to hold down prices, and high-cost firms faced a profit squeeze. In extreme cases, they might go bankrupt. The whole cost-plus mind-set of managers began to submit to new realities. There was a hardening of thinking.

Greater competition reinforced the effect. "[I]t is impossible to understand why the American economy was so good in the 1990s—and why America did better than other countries—without understanding the role that more intense competition has played," writes economist Paul London. Much of this resulted from deliberate government policies to check inflation. Until the late 1970s and early 1980s, the railroad, trucking, phone and airline industries were regulated. Government agencies restricted competition and set prices. These industries were considered to be "natural" monopolies, or something close, which would operate more efficiently with limited or no competition.* Many economists disputed this logic, which often dated from the Great Depression when prices were falling and government policies tried to prop them up. Heavy regulation, the economists argued, suppressed innovation, encouraged inefficiency, and led to cost-push price increases that the government agencies ratified, because the alternative—letting companies go bankrupt—was unthinkable. By the mid-1970s, presidents, members of Congress and policy makers, desperate to control inflation, began to listen. Gradually, regulation of these industries was abolished.11

* Truckers and airlines were regulated during the Depression, when falling prices plagued both industries. The Interstate Commerce Commission regulated railroads and truckers; the Civil Aeronautics Board, the airlines; and the Federal Communications Commission, AT&T, the nation's telephone near-monopoly.

As a result, huge segments of the economy that had been sheltered from competition now faced lower-cost rivals. Foreign competition did the same for other industries: steel, automobiles, machine tools, televisions, clothing. In the early decades after World War II, few Americans imagined being challenged by Europeans, let alone the Japanese (known for cheap toys and transistor radios) or, heavens, the Chinese (then called "the Red Chinese," a sworn enemy). Galbraith essentially ignored the whole subject of foreign competition. Some increase in competition was inevitable as Europe and Japan recovered from the destruction of World War II. But the great intensification of the 1980s stemmed from the dollar's steep ascent on foreign exchange markets. As inflation fell, the dollar's exchange rate rose, because overseas investors regained confidence in the currency. A higher dollar made foreign imports cheaper and U.S. exports more expensive. The upshot: severe pressures on many U.S. industries to reduce costs and increase efficiency.
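A small worked example shows the mechanism; the prices and exchange rates below are hypothetical illustrations.

```python
# A minimal sketch of why a stronger dollar squeezes U.S. producers.
# The car price and exchange rates are hypothetical.

import_price_yen = 2_000_000  # a Japanese car priced in yen

for yen_per_dollar in (200, 250):  # the dollar strengthens from 200 to 250 yen
    print(f"At {yen_per_dollar} yen per dollar, the import costs "
          f"${import_price_yen / yen_per_dollar:,.0f}")

# At 200 yen/$ the car costs $10,000; at 250 yen/$ it costs $8,000.
# The same appreciation raises the foreign-currency price of U.S. exports,
# pressuring American industries to cut costs and raise efficiency.
```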

Even without these changes, executives encountered new threats from the stock market. Galbraith was correct in concluding that managers were insulated from shareholders. In their classic 1932 book The Modern Corporation and Private Property, Adolf A. Berle, Jr., and Gardiner Means had noted that most managers could ignore disgruntled shareholders. The rules for electing corporate boards were rigged in favor of management. Unhappy shareholders couldn't easily evict a firm's directors or executives; the simplest solution was to sell their stock. But this insulation thinned in the 1980s with the emergence of the "market for corporate control." As interest rates fell and stock prices rose—again, reflecting lower inflation—investment syndicates borrowed huge amounts and bought all the stock of underperforming firms. The idea was to flip the company: overhaul management, improve performance and resell the shares at a higher price. (Because borrowing is known as "leverage," these transactions are called "leveraged buyouts," or LBOs.)* Some conglomerates—firms with many separate businesses—were tempting targets; they could often be operated more efficiently if split up into smaller operating units. But the threat of being acquired made most corporate chiefs feel vulnerable. They had "to reduce waste and boost productivity and profitability" or face a takeover, economist Roger Alcaly has noted. A low stock price could jeopardize their jobs. By one study, a quarter of 1,000 large firms received a hostile takeover bid sometime in the 1980s.12
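The arithmetic that made these deals attractive is easy to sketch. All the numbers below are hypothetical; the point is only that borrowing most of the purchase price magnifies the syndicate's return on its own equity.

```python
# A minimal sketch of leveraged-buyout arithmetic with hypothetical numbers.

price = 1_000          # purchase price of the target ($ millions)
debt = 900             # borrowed portion (the "leverage")
equity = price - debt  # the syndicate's own money: $100 million

resale = 1_200                          # resale price after the overhaul (hypothetical)
profit = resale - price - debt * 0.10   # net of one year's interest at 10%

return_on_equity = profit / equity
print(f"{return_on_equity:.0%}")  # 110%: a 20% rise in firm value, levered tenfold
```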

* This strategy is known as "private equity." Major private equity firms include Kohlberg Kravis Roberts & Co., the Carlyle Group and the Blackstone Group.

In some sense, companies became more capitalist because they had no choice. They either adapted—or faded and failed. The belief structure held by corporate managers and popularized by Galbraith crumbled as competition increased and corporate independence decreased. Top corporate executives, who once enjoyed the tenure of college professors, could be dumped. In October 1992, Robert C. Stempel, General Motors' CEO, resigned after only twenty-seven months on the job, dispatched by disgruntled directors. The company wasn't moving fast enough, the directors thought, to reverse big losses. The change, said The New York Times, fulfilled what many experts had urged "for hidebound American corporations: the breakup of the clubby atmosphere in corporate board rooms, where top executives rarely face tough grading on their performance and directors rarely take direct action." In January 1993, IBM CEO John F. Akers was forced out. Despite 100,000 job cuts since 1986 (so much for lifetime employment!), the firm had lost $5 billion in 1992. In March, IBM named its first outside CEO, Louis Gerstner, Jr. Just a few years earlier, the expulsion of top executives at two blue-chip firms would have been unimaginable.13

As competition increased and shareholder passivity decreased, companies were quicker to cut jobs and costs. There was a de-emphasis of "internal labor markets" in favor of local wages, which were often lower. In one well-publicized case in 1995, IBM cut the salaries of executive secretaries, because they were "way out of kilter" with local rates. Most of all, unions lost power because they were concentrated in manufacturing and service industries (autos, steel, trucking, telecommunications), which faced heavy competition from imports or nonunion companies with lower labor costs. Unionized firms shed workers and curbed wages and benefits; so the union sector contracted and the remaining unions were less influential. By 2005, only one in thirteen private industry workers belonged to a union, down from one in six in 1983.14

A new economic order had come into being mostly as a reaction to unanticipated events. The "mixed economy" that had seemed fairly placid and predictable increasingly resembled a Darwinian free-for-all. The softening of capitalism that had started after World War II stopped and, to some extent, went into reverse. Companies revised hiring, firing and compensation practices. Workers shifted assumptions about what they could expect. Inflation was usually not the immediate cause of these changes, but its side effects—on regulation, stock prices, exchange rates and the business cycle—often were. The result was to strip away many illusions that, in the first decades after the war, had fostered the belief that capitalism had been so thoroughly improved that it had changed into something else entirely. Once these illusions disappeared, many of capitalism's basic characteristics reemerged: intense competition, constant change, the clamor for higher returns. And all these developments were amplified by a parallel set of changes abroad: what we now call "globalization."

To a degree unimaginable in 1980, capitalism has gone global. Supply chains and production networks span continents. We live in a world where an American can buy a Ford assembled in Mexico with a transmission from Japan and half its other parts from non-U.S. sources. Finance spills across national boundaries. A routine Wall Street Journal story in 2005—"Foreign Stocks Get New Push"—would have been a fairy tale a quarter century earlier. Most surprisingly, countries such as China and Russia have adopted some form of capitalism. In our mind's eye, globalization is easily explained. Capitalism bested communism, and nations copied the winner. Lower transportation and communication costs tied countries closer together. But this story has glaring omissions. Globalization required a strong America, and a strong America required that inflation be subdued. As a result, capitalism's prestige increased, and
