A century and a half of the U.S. economy, from the railroad revolution to the information revolution
June 2001
Volume 52, Issue 4
1. Colossus
Henry Luce, a founder of Time Magazine, coined the term “American century” to describe the twentieth century in which he lived and in which the United States became the world’s foremost power. But not even Luce could have foreseen America’s global position at the dawn of the twenty-first century, only 33 years after his death. Today, the United States is pre-eminent in the world militarily, technologically, scientifically, and culturally as no state has been since the apogee of the Roman Empire 1,800 years ago.
But this immense relative influence is possible only because the American economy makes it so. In 1998 the United States had a gross domestic product of $8.511 trillion. The second largest economy in the world, China’s, had just over half that, $4.420 trillion, if figured on a purchasing-power parity basis, which takes the local cost of living into account. Japan, which ranks number three in GDP, has an economy only about a third as large as that of the United States, $2.903 trillion. The greatest economic power of the nineteenth century, Great Britain, now ranks only seventh in GDP, at $1.252 trillion, only $152 billion more than that of the state of California alone. Despite a recent modest slowdown and stock-market tumble from once unimaginable highs, the American economy remains by far the strongest in the world.
It is tempting to say that the wealth machine called the American economy was inevitable. After all, the country is singularly blessed with a vast and fruitful national territory, immense natural resources, and a large and well-educated population. But Argentina has all these assets and was one of the leading nations economically a century ago, yet it has struggled through most of the twentieth just to avoid slipping to Third World status. Its GDP per capita is less than a third that of the United States.
How did the United States become the economic colossus it is today? To be sure, our natural assets had a great deal to do with it. But so too did our entrepreneurial spirit, perhaps an inheritance from our ancestors who had the courage to leave everything they had known and venture across the sea to a new land. Our faith in the capitalist system, respect for the rule of law, and usually stable politics are the gifts of England. And by no means the least of the reasons have been our great good luck and our singularly fortunate—but seldom remarked on—geographic position on the globe.
How we went from being a sprawling exporter of agricultural produce and raw materials, dependent on Europe for markets, manufactured goods, and technology, to the most advanced economy on earth is a story worth the telling. It makes sense to begin a century and a half ago, when the nation had evolved from a loosely joined collection of colonies to an equally loosely joined revolutionary enterprise, to a fledgling republic trying the levers of its newly invented government, to something any of us would recognize as the ancestor of the country we inhabit today.
2. Cotton, Gold, and Flesh
By 1851, the United States had nearly reached the full extent of its contiguous territory. But while the territory east of the Mississippi had all been formed into states, west of the Mississippi only California, Texas, and the states bordering the river had been admitted to the Union. Much of the rest was still unorganized and even unexplored. And most Americans lived in the East. Indeed, the nation’s center of population lay in what is now West Virginia.
The economy of the United States in the middle of the nineteenth century was sharply divided, on a line along which the nation itself would nearly cleave a few years hence. The Northern economy was characterized by agriculture based on the family farm, commerce (America in 1850 had a merchant marine second only to Britain’s), finance, and, increasingly, industry. The most important Northern export, however, odd as it may sound to our ears, was ice. Cut from ponds in winter and stored beneath mounds of sawdust, a byproduct of the lumber industry, ice was shipped as far away as India.
Textiles were the primary industry, centered in New England, where there were plenty of clear, fast-running streams to power the mills. Few of these textiles were exported, and often they were competitive with European textiles in the domestic market only because they were protected by high tariff walls. And the country still imported most of its manufactured goods from Europe, especially Britain.
But while the North had a mixed economy, the South remained overwhelmingly agricultural. The reason was simple: Much of the South—with its rich soil, abundant rainfall, and warm climate—was the best place in the world to grow one of the mainstays of the nineteenth-century world economy. Cotton had been a luxury fabric in the eighteenth century because the fibers of the cotton plant are difficult to separate from the seeds and, once separated, hard to weave into cloth. Textile machinery solved the latter problem, and when Eli Whitney’s cotton gin solved the former one as well, the demand for cotton exploded as its price plummeted.
By the Civil War, the United States was exporting about four million bales a year. But much of this vast cotton trade was brokered through New York, already the country’s dominant financial center and leading port. Southern financial institutions such as banks and brokerages were small and usually weak.
Thus the Southern economy was greatly dependent on a single cash crop. And while immensely profitable, there was a terrible price to be paid for this dependence. Slavery had been a declining institution in the early days of the Republic, but cotton, a labor-intensive crop, changed that dramatically, and the slave population in the South began to grow quickly just at the time when moral opposition to slavery was growing equally quickly in the North. By 1860 the price of a prime field hand was six times what it had been at the turn of the century, and the South had much of its capital invested in human flesh.
The United States was prosperous in 1851, especially compared with the previous decade, when it had been mired in a depression that started in 1837 and lasted longer than any other in American history. But while the depression had been lifting in the late forties, it was the California gold strike of 1848 that transformed the American economy.
The Gold Rush moved the country’s center of political gravity westward in a historical instant. Indeed, it was in the year 1851 that John B. L. Soule wrote in the Terre Haute Express, “Go West, young man, go West.”
The economic effects of California gold were equally important. Gold was at the very center of the international economy in the nineteenth century. Britain, the leading economic power, had been on a pure gold standard since 1821, buying and selling unlimited quantities of pounds sterling for a fixed price in gold, and the Bank of England was the world’s de facto central bank.
But America’s money supply in mid-century was a hodgepodge. The federal government minted gold and silver coins, but state banks—thousands of them—provided the paper money. Sound banks backed the notes they issued with gold and government bonds in their vaults. Less sound banks often did not. A New York agency issued a newsletter called “The Bank Note Detector” to help the wary tell good notes from dubious ones.
California gold allowed a great increase in the money supply by letting sound banks issue more bank notes. In 1847, a typical pre-California year, the United States produced 43,000 ounces of gold. In 1848, thanks to California, the country produced 484,000 ounces; the next year the output was 1,935,000 ounces; and by 1853 it was up to 3,144,000 ounces, worth almost $65 million—$17 million more than the federal government’s total expenditures in that year.
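That dollar figure follows directly from the official mint price of gold in those years, $20.67 an ounce (a figure not given above, but the standard one from 1837 until 1933):

3,144,000 ounces × $20.67 per ounce ≈ $65 million.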
The economy expanded rapidly under this freshet. Government revenues, a rough measure of economic activity in those years, were $29 million in 1844. A decade later they were $73 million. There were 9,021 miles of railroad track in the United States in 1850—and 30,626 in 1860. The telegraph spread even faster, and together with the railroad it began to knit the country together in a way that had never before been possible, and would remake the economy in the ensuing decades.
Pig-iron production, also an important measure of economic activity, soared from 63,000 tons in 1850 to 883,000 tons a mere six years later. Foreign investment poured into the country to finance this development, as American securities held overseas increased from $193.7 million in 1847 to $383.3 million in a decade.
But these ebullient numbers hid a problem: The American economy was on what a later generation would call autopilot. President Andrew Jackson had vetoed the recharter of the central bank, the Second Bank of the United States, which dissolved in 1836. The country would not have another until 1913, when the Federal Reserve came into being. As a result of having no institution to check “irrational exuberance,” economic expansions in the nineteenth century tended to run out of control and to end in financial crashes. This boom was no exception.
Wall Street had become the dominant financial market in the previous decade, as the telegraph allowed people in distant cities to trade in New York. By 1856 there were 360 railroad stocks, 985 bank stocks, and 75 insurance stocks being traded on Wall Street, as well as hundreds of corporate, state, and federal bonds. The following year, the broker George Francis Train summarized the vagaries of Wall Street in rhymed couplets:
Monday, I started my land operations;
Tuesday, owed millions by all calculations;
Wednesday, my brownstone palace began;
Thursday, I drove out a spanking new span;
Friday, I gave a magnificent ball;
Saturday, smashed—with just nothing at all.
The “Saturday” came in the summer of that year. “What can be the end of this but another general collapse?” asked James Gordon Bennett in the New York Herald on June 27th. “… Government spoliation, public defaulters, paper bubbles of all descriptions, a general scramble for western lands and town and city sites, millions of dollars, made or borrowed, expended in fine houses … silks, laces, diamonds and every variety of costly frippery are only a few among the many crying evils of the day.”
The crisis came in late summer, when Wall Street crashed and most of the country’s banks temporarily suspended payments in gold and began to call in loans as fast as they could.
The result was four years of depression, a depression that would be ended only by the Civil War.
3. Paying For Union
By many measures—not least the total number of men killed and wounded—the Civil War was the greatest one that this nation has ever fought. So it is not surprising that while it transformed the country politically, it also transformed it financially and economically.
From the first, both sides confronted desperate financial problems. Because of the depression that had begun in 1857, the government in Washington had been operating in the red, borrowing mostly short-term to make up the deficit. The national debt, only $28.7 million in 1857, had more than doubled to $64.8 million in 1860. In the last month of that year, as the states in the Deep South began to secede one by one, there was not even enough money in the Treasury to pay December’s congressional salaries.
The federal government’s expenses averaged only about $170,000 a day in the late 1850s. By early summer 1861, after the war began, they were running at $1,000,000 a day. By the end of the year they were up to $1,500,000. In December 1861, most Northern banks stopped paying their debts in gold, and the federal government was forced to follow suit a few days later. The country had gone off the gold standard and Wall Street panicked. “The bottom is out of the tub,” Lincoln said. “What shall I do?”
There are basically only three ways to finance a great war. First, the government can raise taxes. By the end of the war, the federal government was taxing nearly everything that could be taxed, including, for the first time, incomes. Roughly 21 percent of the cost of the war was raised by taxation. The Bureau of Internal Revenue, the ancestor of today’s IRS, is by no means the least of that war’s legacies.
But the most important tax was the tariff. The South, dependent on buying manufactured goods from elsewhere, had always pushed for as low a tariff as possible; the North, with burgeoning industries to protect, wanted a high one. With the withdrawal of the South from the Union and the need to finance the war, the tariff was raised to unprecedented heights, making many American markets unprofitable for foreign producers. This, together with the war itself, acted as a huge stimulus to American industry, which began to grow as never before, both in absolute terms and as a percentage of the total economy.
The second way to finance a war is by issuing printing-press money, the principal means by which we paid for the Revolution. In the course of the Civil War, the federal government issued $450 million in so-called greenbacks, financing about 13 percent of war costs and triggering an inflation that drove prices to about 180 percent of their antebellum levels. The South, with far fewer financing options, was forced to use printing-press money to pay for more than half of its wartime expenses. This caused a virulent inflation that had reached 9,000 percent by war’s end. The collapse of the South’s economy under this kind of financial pressure would be no small factor in its defeat.
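To translate those percentages into prices, using only the figures already given above:

North: $1.00 in 1861 × 1.8 ≈ $1.80 by 1865.
South: $1.00 in 1861 × (1 + 9,000/100) ≈ $91 in Confederate currency by war’s end.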
The third method is to borrow. This the federal government proceeded to do, and on a scale no sovereign power before had ever contemplated. The national debt, $64.8 million in 1860, had by 1865 reached $2.677 billion, an increase by a factor of over 40. Total annual government expenses before the war had never exceeded $74 million. By 1865 interest expense alone was almost twice that amount.
This huge growth in the national debt had two great economic consequences. Before the war, far fewer than 1 percent of all Americans owned securities of any kind. But during the war, the government sold bonds to about 5 percent of the population of the North. Thus the war enlarged what Karl Marx—who in the war’s penultimate year helped found the “First International”—would have called the “capitalist class.” For the first time, a large number of middle-class people became investors.
Second, this vast influx of bonds, and bondholders, into the country’s financial markets, transformed Wall Street almost overnight. Although the stock market had plunged at the outbreak of war—as stock markets almost always do—investors began to realize that the conflict would be a long one and not only would there be a vast increase in the amount of securities to be traded but much of the money that the government was spending would go to firms such as railroads, iron mills, textile manufacturers, and munitions companies, whose profits would be invested in, and capital needs met by, Wall Street.
The biggest boom the Street had ever known was soon under way. In just a year, this patch of Lower Manhattan went from being a relatively provincial place to being the second-largest market on earth. Only London, the financial capital of the world, exceeded it. But although now a very large market, it was not a well-regulated one. In fact, it wasn’t regulated at all. For a few brief years, Wall Street saw capitalism red in tooth and claw. This was exciting, to be sure, but no capital market can operate wholly untrammeled for long and survive. Customers will seek safer places to do business.
Fortunately, Wall Street began to regulate itself. When the New York Stock Exchange, the oldest and most prestigious exchange, and the Open Board of Stock Brokers, which was at various points in the war years the largest exchange in volume, merged in 1869, the new organization was powerful enough to impose order on the Street and enforce rules that made it a far safer place to invest.
The American money supply had also become safer. In 1863 Congress had established nationally chartered banks, which had to meet stringent auditing requirements and were allowed to issue bank notes only within carefully prescribed limits and to a uniform design. The thousands of state-chartered banks were driven out of the bank-note business by a stiff tax on their issuance.
After the war, the country began to move back toward the gold standard, which was fully restored in 1879. By preventing excess money creation, the gold standard makes inflationary government policies impossible. But the gold standard, which was popular in the Northeast, where the large banks and a good deal of the nation’s wealth were concentrated, was deeply resented in much of the rest of a country that was still dominated by agriculture. Farmers are chronic debtors, and debtors like inflation because it allows them to pay their debts in cheaper money.
So, while the political interests of the industrialists and bankers, increasingly identified at this time by the rubric “Wall Street,” were addressed by the return to the gold standard, those of the debtors were met by the free coinage of silver. Great silver strikes were made in the West in the 1870s. Just as the country was about to return fully to the gold standard, Congress passed the Bland-Allison Act in 1878. It required the Treasury to purchase between two and four million dollars’ worth of silver every month and to mint it as coins valued at the ratio of 16 to 1 with gold.
At first that was about the free-market price of silver compared with gold. But as silver continued to pour out of Western mines, its price sank to about 20 to 1. In 1890 Congress required the Treasury to buy 4.5 million ounces of silver a month—not far from the total output—and mint it, still at the ratio of 16 to 1. This was a sure recipe for disaster, as people spent the silver and kept the gold. The yellow metal began to trickle out of the Treasury. When the terrible depression of 1893 began, Congress hastily repealed these laws. But silver’s appeal to debtors continued, and this inflationary policy would propel the 36-year-old William Jennings Bryan to the 1896 Democratic nomination for President when he electrified the convention with his “Cross of Gold” speech.
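A rough illustration of why the mismatched ratios were a sure recipe for disaster, using only the two ratios given above:

At the legal ratio of 16 to 1, a silver dollar contained silver worth, at the market ratio of 20 to 1, only about 16/20 = 80 cents in gold.
A gold dollar contained a full 100 cents’ worth of metal.

With both coins passing at par, people naturally paid their bills in 80-cent silver dollars and kept, or exported, the 100-cent gold ones, which is Gresham’s law in action.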
There had been two serious economic casualties of the Civil War, however. One was the American merchant marine. In 1860, 66.5 percent of American foreign trade was carried in American ships. But Confederate commerce raiders forced Northern shipowners to transfer their vessels to British registry. Most of them never returned. By the end of the war, only 27.7 percent of the country’s trade was carried in American bottoms. By 1912 it would be less than 10 percent.
The other casualty was the devastated American South. Always short of capital, the region had seen its holdings obliterated by the war and by the freeing of the slaves. With the withdrawal of the last federal troops after the election of 1876, the old landowning class reasserted its control and a system of sharecropping evolved. The South would remain impoverished for generations. The poorest of all, the descendants of the freed slaves, began to migrate north to major cities after World War I, in search of both opportunity and greater freedom than could be found in the Jim Crow South.
But except for the South and the merchant marine, the American economy was far larger and stronger than it had been before the war. Its development over the next four decades would be one of the wonders of the world.
4. The Age of Steel
With the war over, many of the new federal taxes were repealed or allowed to expire. The income tax disappeared in 1872, not to return permanently for nearly half a century. But the high tariff remained. The greatly expanded manufacturing sector, fearing European competition, fought hard to keep it and succeeded. As a result, the government had revenues far in excess of its expenses and would not run a deficit until the deep depression of the 1890s.
Protected by the tariff wall, manufacturing grew rapidly as the country developed at a furious pace. The railroad mileage that had stood at 30,626 miles in 1860 reached 166,703 in 1890. Only one thin strand of rails had connected the West Coast with the East in 1869. By 1900, four lines reached to the Pacific. And these roads, of course, also served the Midwest, allowing larger and larger grain harvests to be sent farther and farther away.
“Two generations ago,” Arthur T. Hadley wrote in his classic 1886 work of economics,
The United States, blessed with some of the finest grain-growing areas in the world, was more than able to hold its own in this new global competition. In 1866 the nation harvested 15 million acres of wheat. By 1900 the count was 49 million. Corn and oats (both major agricultural products in the age of the horse) also saw their acreage more than triple.
But the measure of economic power in the last part of the nineteenth century was steel, the miracle metal of the age. Steel, which is iron with a carefully controlled amount of carbon added, had been known since at least 1000 B.C. Its advantages over iron are many. It is harder, takes a better edge, and is much less brittle, making it better able to withstand shock. But steel was very expensive to manufacture until, in 1857, an Englishman named Henry Bessemer developed his converter, which could turn large quantities of iron to steel quickly and easily. The Bessemer process and, a few years later, the Siemens open-hearth method made steel cheap.
Whenever a new invention transforms an important commodity that was previously very expensive into something that isn’t, it is likely also to transform the world. That was true of the railroads, which made freight haulage cheap in the middle third of the nineteenth century, and it is true of the computer, which made calculation and information storage and retrieval cheap in the last third of the twentieth century. It was true of steel in the post-Civil War era.
Steel rails for railroads were far more durable than the wrought iron ones they replaced, and in less than 20 years the manufacture of wrought iron rails ceased, the last being produced in this country in 1884, a year that witnessed the forging of 1,145,000 tons of steel rails. By then, those steel rails cost only one-third what the iron rails had in the 1860s. Cheap, high-quality steel track fueled the boom that would by 1900 give us the largest rail network in the world.
Steel also made tall buildings possible. The first human structure to rise higher than the Great Pyramid of Egypt, built about 2800
Steel became the engine driving American industrialization in the late nineteenth century. In 1860 the country produced only 13,000 tons. Ten years later output had risen to 77,000 tons, and, just a decade after that, to 1,397,000 tons. By the end of the century, the United States was turning out 11,227,000 tons, more than Britain and Germany combined.
This astonishing growth was possible only because of the vast size of the American market and the fact that it was entirely a common market. The Constitution gave control of interstate commerce to the federal government, and the U.S. Supreme Court case of Gibbons v. Ogden made clear early on that the states could not interfere with that commerce.
Another reason for the success of the American steel industry was men like Andrew Carnegie. Born in Scotland in 1835, Carnegie arrived in this country with his impoverished parents as a boy of 13. After the Civil War, he saw opportunity in the burgeoning steel business in Pittsburgh. Within two decades, the Carnegie Steel Company was the largest in the world. Carnegie’s basic business principles were simple: In good times, plow profits back into the company to acquire the latest technology and to be the low-cost producer in order to remain profitable during downswings; in bad times, use the cash surplus to buy up less efficient competitors and expand market share. Many other American businessmen in other industries would emulate Carnegie’s techniques.
At this time too, Thomas Edison, not usually thought of as a businessman at all, invented one thing that has proved of singular importance to maintaining and extending American economic leadership, the research laboratory. His laboratory in West Orange, New Jersey, was the first in the world to be devoted to developing new technology in a systematic way. The idea was soon copied by corporations, notably General Electric (Edison was among its founders), AT&T (which set up Bell Laboratories), and DuPont. These laboratories would produce an endless stream of technology in the twentieth century that has remade the world.
Another rising industry in post-Civil War America was petroleum. Although petroleum had been known of since ancient times in areas where it came to the surface naturally, it was only in 1859 that Edwin Drake successfully drilled for it, in northwest Pennsylvania. The business grew as kerosene, the main petroleum byproduct in the nineteenth century, rapidly replaced the ever more expensive whale oil as the major illuminant in areas not served by gaslight, which is to say outside of cities. Petroleum production was only 2,000 barrels in 1859 but a decade later reached 4,215,000 barrels. By the turn of the century, American oil production was 63,621,000 barrels.
At first the petroleum business, centered in Cleveland, near the Pennsylvania oil field, was chaotic. There were more than 30 oil refiners in Cleveland alone in the 1860s, and prices fluctuated wildly, from a high of $13.75 a barrel to as low as ten cents. These huge swings made it very difficult for businesses to plan for the future and to control costs, and one of the Cleveland refiners set out to bring order to the industry. In 1870 John D. Rockefeller and his partners, notably Henry Flagler, formed a corporation, Standard Oil. That year, it controlled about 10 percent of the American oil-refinery business. Ten years later it had about 80 percent of a much larger industry, having bought up many of its competitors and driven others into bankruptcy.
Standard Oil always paid a fair price for its acquisitions, but if a firm declined to accept the deal offered, the company did not hesitate to use its power to ruin the competitor. And it employed other tactics, such as secret deals with railroads to ship Standard Oil products at low rates while charging its rivals high ones until they had no choice but to sell out. Learning how to regulate these new economic giants effectively, without destroying their economic utility in the process, occupied much of American politics in the twentieth century.
The new corporate empires would not have been possible without capital. The United States had always been a capital importer, selling corporate bonds and stock to European investors. But after the Civil War, Wall Street increasingly was able to supply the needed capital. The New York Stock Exchange, which had often traded fewer than 5,000 shares a day in the 1850s, had its first million-share day in 1886. Along with the rise of the stock exchange, Wall Street’s banks were becoming major forces in the American economy. As the new industrial corporations grew and combined into continent-wide concerns, they had no choice but to come to Wall Street to get the needed capital. This gave the Street the opportunity to impose standards. Before the Industrial Revolution, most businesses had been family-owned, so the managers were also the owners. But as corporations grew larger and larger, the distance between the people who ran the companies and those who owned them increased, and their interests diverged. Owners wanted to know how well the managers were doing. The managers wanted to present their efforts in the best possible light.
But like the old family firms, the earliest of these publicly traded corporations did not issue reports of their affairs, even to their stockholders. And there was no general agreement about how the bookkeeping should be done.
Although there had been accountants for centuries, it was only in the 1880s that the profession began to organize formally. In that decade, Wall Street banks increasingly required companies that sought financing to have their books certified by independent accountants, and the New York Stock Exchange required that listed companies publish annual reports.
The most prestigious bank on Wall Street by the early 1890s was J. P. Morgan and Company, whose headquarters had been at 23 Wall Street, at the corner of Wall and Broad Streets, since the 1860s. It had become known simply as the “Corner.” By the turn of the century, J. P. Morgan was the most powerful banker not only on Wall Street but in the entire world. The absence of a central bank compelled the government in Washington in times of financial crisis to turn to Wall Street and, more and more, that meant turning to J. P. Morgan. In 1895, when a severe depression nearly forced the United States off the gold standard, it was Morgan who arranged a specie loan to prevent that outcome. In the banking crisis of 1907, again it was J. P. Morgan who summoned the other New York bankers and devised a means to prevent the closing of sound banks, aborting the crisis and averting another depression.
However, the recurring need to rely on a private individual in times of crisis slowly eroded the longstanding political opposition to a central bank. The Federal Reserve came into existence in 1913, the same year that J. P. Morgan died. Unfortunately, the Fed as it was originally designed was so hobbled by restrictions on its freedom of action that it proved unable to perform effectively in the great financial crisis of 1929–1933.
The United States, which had long been a major exporter of agricultural products and raw materials, remained so (cotton would be the country’s biggest export until the 1930s). But it increasingly became an exporter of manufactured goods as well. In 1865 only 22.78 percent of American exports were manufactured goods. By 1900 the proportion had risen to 31.65 percent. While the country exported only $6 million worth of iron and steel products in the year before the Civil War, in 1900 the United States exported $122 million worth of locomotives, stationary engines, rails, electrical machinery, wire, pipes, metal-working machinery, boilers, and other goods.
Europe suddenly woke up to the economic colossus that was building across the Atlantic and competing ever more effectively with European manufacturers, and there was an alarmed flurry of books with titles like The American Invaders, The Americanization of the World, and The American “Commercial Invasion” of Europe. With the country exporting more and more, the high tariff that industry had backed since Civil War days became a liability, as it tended to generate retaliatory tariffs in other countries. In 1913, with the traditionally anti-tariff Democrats in power in both the White House and Congress for the first time in a generation, the tariff was significantly reduced. That year also saw the reintroduction of a then very modest federal income tax to help make up for the lost revenues.
5. "We Want to Stop It"
The sudden rise of these vast, and vastly wealthy, industrial empires and Wall Street banks produced a sea change in American politics. At the dawn of the Republic, the country had been overwhelmingly agrarian. But as industry developed, more and more people began migrating to the cities to seek jobs in the cash economy. The flood of immigrants from Europe tended to gravitate to the cities as well. At the end of the eighteenth century, New York had had a population about equal to present-day Altoona, Pennsylvania. A hundred years later, it was one of the great cities of the world.
Corporations also swelled in size and power. Before the Industrial Revolution there had been no large corporations at all. In the early days of the new economy, only the railroads had had work forces of more than a few hundred employees. As late as the end of the Civil War, there was not a single industrial corporation listed on the New York Stock Exchange. By 1900 there were dozens, many of them employing tens of thousands of people.
This growing army of industrial workers was at a great disadvantage relative to the managers of the corporation in deciding how to divide between labor and capital the wealth that was created by both of them together. The managers of a corporation spoke in one voice, but the workers spoke in their thousands. This made it easy for management to impose its own ideas as to wages, hours, and working conditions. When the workers tried to organize, in order to bargain more effectively, management, naturally, resisted furiously.
As early as 1677, 12 New York cartmen became the first strikers to be prosecuted in the colonies. But it was only after the Civil War era that the union movement really got under way. In 1886 the American Federation of Labor was formed, and an increasing number of strikes roiled the nation’s industry. Some of these strikes, such as the one at Andrew Carnegie’s Homestead Steel Works, are remembered for their violence even today.
But government at this time was firmly on the side of capital (the governor of Pennsylvania sent in about 8,000 state militia to protect replacement workers at the Homestead works, and broke the strike). Not until the 1930s would labor become a major force in the American economy.
Curiously, the American labor movement never adopted the socialism that European labor unions so wholeheartedly embraced. Even in the depths of the Great Depression, in 1932, the Socialist candidate for President, Norman Thomas, received only about 2 percent of the popular vote. This antipathy to socialism would prove to be a great advantage to the economic vitality of the United States in the late twentieth century.
In 1901 J. P. Morgan assembled the United States Steel Corporation out of Andrew Carnegie’s holdings and numerous other steel companies. The new corporation, capitalized at $1.4 billion (almost three times the federal government’s annual budget), controlled about 40 percent of the domestic steel market. A joke went around about the schoolboy who was asked when the world began. “God created the world in 4004 B.C.,” he answered, “but it was reorganized in 1901 by J. P. Morgan.”
But the public anxiety regarding large corporations had begun long before U.S. Steel was created. Although the railroads competed fiercely on the trunk lines for freight business, on the branch lines where they had monopolies, they often engaged in shameless price gouging. Several states tried regulating these rates, but in 1886 the U.S. Supreme Court ruled that this violated the interstate commerce clause of the Constitution. The political fight to rein in the railroads moved to Washington. In 1887 Congress created the Interstate Commerce Commission, the first federal regulatory agency. The bill doing so required that “all charges … shall be reasonable and just” but did not define that vague language.
Three years later, Congress passed the Sherman Antitrust Act to control the proliferating combinations among industrial companies. The trust form of organization had been invented by Standard Oil to get around archaic state incorporation laws, which often forbade corporations to own the stock of other companies or to own property beyond state borders. As a single national market emerged, these restrictions made less and less sense, but they were politically difficult to change.
Corporations increasingly resorted to subterfuge to evade them, and in 1882 Standard Oil reorganized itself so that the stock of subsidiary corporations was held by a nine-man board of trustees. But in 1889 the state of New Jersey rewrote its incorporation laws to allow holding companies and out-of-state property ownership. Corporations rushed to be incorporated there, and the trust form of organization quickly disappeared—although the “trusts” have remained as a bogeyman of American politics ever since.
The Sherman Antitrust Act was as vague in its language as the Interstate Commerce Act, and several unfriendly court decisions in the 1890s rendered it nearly moot. But when Theodore Roosevelt became President in 1901, the situation began to change.
In 1901 J. P. Morgan organized the Northern Securities Corporation, which combined several railroads into a new company that dominated transportation in the upper Midwest. The Roosevelt administration announced the next year that it was suing the corporation under the Sherman Antitrust Act to break it up.
Morgan was stunned by the news and hastened to Washington to try to settle the matter quietly, as had been the practice in the past. “If we have done anything wrong,” Morgan said to Roosevelt, “send your man to my man and they can fix it up.”
“That can’t be done,” Roosevelt said.
“We don’t want to fix it up,” the Attorney General, Philander Knox, explained, “we want to stop it.”
From that point on, the government would play an ever-larger regulatory role in the American economy. This involvement is, perhaps, the largest single difference between the nineteenth-century American economy and the twentieth-century one.
6. The Street Comes of Age
As the twentieth century dawned, the American economy had become, by far, the largest national economy in the world. And while the country produced and exported vast and growing quantities of agricultural products and raw materials, it was also a thoroughly modern economy in terms of its manufacturing potential and technological abilities. Wall Street was now equal in size to London as a capital market and, once a major capital importer, was now a capital exporter.
The United States in a little more than a century had been transformed from a nearly empty wilderness to the equal of Europe, and the world’s economic center of gravity, long in Europe, was now in the mid-Atlantic.
Then, in 1914, Europe blew up. The politics surrounding a relatively trivial event, the assassination of the little-respected Archduke Franz Ferdinand, heir to the throne of Austria-Hungary, spun out of control, and by August 1st of that year all the Great Powers of Europe were at war.
Politically, the United States, following a century-old policy of remaining aloof from European affairs, was determined to stay neutral. But economically, the country, now deeply enmeshed in a global economy, panicked. The New York Stock Exchange, like all the other major exchanges, did not open for business on the morning of August 1st. It would not be fully operational again until the following spring. A broker, sent out to investigate the rumor of an illicit market operating on New Street, just behind the Exchange, reported back that all he could find on New Street were “four men and a dog.”
It was widely believed by both economists and politicians that a general European war would be disastrous for the American economy: Gold owned by foreigners would be withdrawn from American banks and repatriated, causing the American money supply to contract and forcing the banks to call in loans while interest rates soared. Further, investors in Britain, France, and Germany held about $5 billion in American securities, and it was feared that these would be dumped on the market to facilitate weapons purchases, causing the market to crash. American agricultural exports would decline sharply as Britain shut off the sea lanes to Germany and Austria.
Rarely, even by the standards of the famously cloudy crystal balls of economists, have predictions proved so wrong. Instead of gold leaving the country, it flooded in as European nations sent their supplies of the precious metal to the United States for safekeeping. Much of it remains here to this day, now tucked away in the vaults 80 feet below street level at the Federal Reserve Bank of New York, in lower Manhattan. Allied investments in American securities were indeed liquidated, but this was handled slowly and skillfully, mostly by J. P. Morgan and Company, and the markets were not disrupted. As a result, the United States, a net debtor for its entire history, became the greatest creditor nation in the world.
Agricultural exports, instead of declining, soared. Russia had been one of the world’s great grain exporters before World War I. But with its access to world markets through the Black Sea cut off by Turkey, Russian exports slowed to a trickle, and it was mostly American grain exports that replaced them. With millions of young men called up for military service in other countries, agricultural production declined in all of them, spurring still more American exports. Between December 1913 and April 1914, the United States had exported 18 million bushels of wheat. In the same period a year later, it exported 98 million.
Horses, needed to haul guns and wagons and slaughtered at the front by the tens of thousands, were purchased at top-dollar prices by the Allied armies and exported by the shipload. So many horses were shipped overseas that many American farmers switched over to tractors, and draft animals began to disappear from American farms, a fact that would have great consequences in ensuing decades.
But it was American industry that benefited most from the Great War. Stocks had plunged in late July 1914, as war seemed imminent. General Motors had fallen from 58 7/8 to 39, nearly 34 percent, on the last day of trading before the Exchange closed. Bethlehem Steel, the nation’s second-largest steel company, fell 14 percent. But by the following spring it was clear that, however disastrous the war might be for Europe, it would be a bonanza for U.S. manufacturers.
The biggest contract in Bethlehem’s history had been for $10 million. But in November 1914, the company signed a contract with the British Royal Navy to supply it with a total of $135 million in armor plate, gun barrels, and ships. DuPont, a good-sized powder company before the war, would provide the Allies with no less than 40 percent of their munitions during the conflict. Its annual military business rose by a factor of 276; its total business, by a factor of 26. It would emerge from the war an industrial giant.
The result on Wall Street, once the market reopened, was the greatest boom in the Street’s history as measured by the Dow Jones Industrial Average. General Motors rose from its war-panicked 39 to end 1915 at 500. Bethlehem surged from 46 1/8 to 459 1/2.
The United States, whose economic interests had become more and more linked to Britain and France as it made massive loans to the Allies, entered the war in 1917. Although it suffered relatively few casualties compared with the carnage endured by European combatants, the war caused the national debt, which had been hovering around $1 billion since 1890, to rise to $25 billion by 1919. Taxes, especially income taxes, were raised sharply. The percentage of the United States GDP that flowed through the federal government increased commensurately and would never return to pre-war levels.
Even so, in November 1918, when the Central Powers asked for an armistice, the United States was, in a very real sense, the only victor. The economies of Germany, Austria, and Russia were devastated by the war and by subsequent revolution. Even the other nominal victors, Britain, France, and Italy, had been exhausted and nearly bankrupted. But the United States economy had flourished as never before. New York had replaced London as the financial capital of the world, and the dollar had become the most important currency.
The American Century had begun.
7. Internal Combustion
By 1920 the appearance of American city streets, along with their noise and even their smell, had changed from what it had been 20 years earlier. The horse, which had been the chief means of land transportation for 3,500 years, had given way to the automobile, and the country’s largest industry had been born.
The dream of a self-propelled vehicle had been around at least since the middle of the eighteenth century, but it was only toward the end of the nineteenth, when the internal combustion engine approached practicability, that the dream started to be a reality. Europe, especially France and Germany, made many of the first technical breakthroughs (such as the carburetor), and France was the largest automobile producer until 1904, when it was overtaken by the United States, which has led the world ever since.
The brothers Charles and Frank Duryea set up a small factory in Springfield, Massachusetts, in 1896. They manufactured and sold 13 automobiles that year, beginning the American automobile industry. By the turn of the century both the variety of cars and the companies that manufactured them were proliferating. In 1903—a year by no means atypical of the early days of the automobile business—57 automobile companies came into existence (Cadillac among them) and 27 went bankrupt.
At first, automobiles were manufactured for the rich. The Duryea brothers sold their cars for $1,300 each. In 1896 that was a very good annual wage for a skilled workman. Cheaper cars came on the market, but it was Henry Ford who took up the automobile and changed the world with it.
Ford’s Model T was a revolutionary concept in automobiles in that it was designed not as a rich man’s showboat or a little fair-weather putt-putt but as basic transportation for the common man. As such it changed both the American economy and, indeed, the entire landscape of the country. It made the automobile the engine of change in the twentieth century, just as the railroad had been in the nineteenth.
The raw numbers show vividly just how quick and how sweeping was the impact of the Model T. In 1900 America produced 4,100 automobiles; in 1908, the year of the Model T’s advent, the number had risen to 63,500; in 1909 it had nearly doubled, to 123,900. In 1916 it stood at 1,525,500. Having designed the Model T, Ford put all his energies into finding cheaper and better ways to produce it. The Model T was introduced at a price of $850; by the 1920s a new one could be bought for $275, despite the inflation caused by World War I. The most important innovation was the moving assembly line, instituted in 1914.
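A rough sense of what that price cut meant in real terms, assuming (an outside estimate, not a figure from this article) that consumer prices roughly doubled between 1908 and the mid-1920s:

$275 in mid-1920s dollars ≈ $140 in 1908 dollars, a real reduction of more than 80 percent from the original $850.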
Previously, automobiles had mostly been built one by one. But Ford had visited a meatpacking plant in Chicago and had been deeply impressed by the speed and efficiency with which steers were transformed into steaks. He reasoned that if cattle could be disassembled this way, he could reverse the process with automobiles in order to assemble them more quickly and cheaply. He was right, and the assembly-line method quickly spread not only to other car companies but to nearly every industry engaged in manufacturing complex things. And so the American passion for “productivity,” the output of factories as measured against man-hours of labor, really got under way.
The apostle of productivity had been a man named Frederick Winslow Taylor. Born in 1856 into a wealthy Philadelphia family, he went to work in a factory at 18 instead of going to college and soon began developing his principles of time and motion. He used a stopwatch to study exactly how workers did their jobs, in order to devise ways for them to do those jobs more quickly and better. In 1911, toward the end of his life, he published The Principles of Scientific Management.
There was a price to be paid for the efficiency of the assembly line, however. The workers on the line spent their days attaching the same type of bolt to the same type of nut for hour after hour. The work was stultifyingly boring. The turnover in workers at Ford plants began to rise alarmingly, and Ford sought to reverse that trend by paying his workers five dollars a day, an astonishing wage for an industrial worker. Many of Ford’s fellow industrialists foresaw disaster, but in fact the wage hike more than paid for itself in increased productivity and lower turnover. It also, not entirely coincidentally, created many more customers for Ford, as many of his workmen could now afford to buy a car.
By the 1920s, Ford was producing well over half the cars in America, which, in turn, had on its roads more than 80 percent of all the cars in the world. Between 1922 and 1930, the United States turned out an astounding 30 million passenger cars. This vast production stimulated many other businesses: The steel, rubber, and glass industries flourished. Construction companies boomed as highways and garages were built. Oil companies, rapidly losing their kerosene business to the spread of electricity, more than made up for it with increased gasoline sales.
Henry Ford, however, stayed too long with his momentous car. Owning 100 percent of the stock in the Ford Motor Company, Ford could do as he pleased. He was sure that the Model T—cheap, basic transportation—was what the country wanted. For years he was right, and his company manufactured more than 15 million Model T’s between 1908 and 1926. But his major competitor, General Motors, tried a different approach.
General Motors had been created by William C. Durant in the first decade of the century by acquiring numerous small automakers. But while Durant was a great visionary, he was a poor manager. In the 1920s he lost control of General Motors to the DuPont Company and the Morgan Bank. Alfred P. Sloan became GM’s president and proceeded to create the largest and most powerful industrial enterprise on the face of the earth. He recognized that he couldn’t compete head-to-head with the Model T, so he went it one better. His low-end car, the Chevrolet, cost a little more but offered amenities the Model T did not have—different colors, for instance.
Sloan realized that automobiles had become more than basic transportation. They were symbols that expressed both personality and status. GM produced a carefully calibrated series of cars to appeal to every economic level, from the Chevrolet to the Cadillac. By 1927 GM had surpassed Ford in auto production, and Ford was forced to abandon the Model T and retool. The company would never regain its primacy.
8. Power and Peril
The other great engine of change that made the twentieth-century American economy so different from that of the nineteenth was electricity. Electricity began to be investigated scientifically in the eighteenth century and had its first major practical application in the telegraph, which started to spread in the 1840s. Edison’s electric light, far superior in both illumination and safety to gaslight, spurred the demand for electricity, which quickly spread to the major cities. But it would be the middle of the twentieth century before the New Deal’s Rural Electrification Administration made it available nearly everywhere. The astonishing increase in our electricity consumption in the past hundred years is, in very important ways, a measure of the American economy over that time. In 1902 the country produced 6.0 billion kilowatt-hours of electricity. By 1929 the figure was 116.7 billion; in 1970, 1.4 trillion; and in 1997, 3.1 trillion.
Because steam engines have to be large to be efficient, and electric motors can be made in nearly any size required, the latter began to replace the former in factories in the 1890s, markedly increasing productivity and thus lowering manufacturing costs. Small electric motors also made possible the powered household appliances—vacuum cleaners, refrigerators, washing machines—that became widely available in the 1920s and greatly contributed to that decade’s boom. Those appliances also replaced domestic servants, who were becoming harder and harder to find, especially after immigration was sharply curtailed in 1921. In 1900 domestic service was the single largest nonagricultural job category in the United States; by 1950 the housemaid was (proportionately) nearly extinct.
Advertising—using newspapers, magazines, billboards, and, more than ever, the new medium of radio, which exploded in popularity in the 1920s—became an important force in the American economy. New advertising techniques created demand for newer and better products, driving the economy forward. Credit, once limited to the wealthy, became more widely available as banks and, increasingly, manufacturing companies financed such major purchases as cars and household appliances.
This surging prosperity was, of course, reflected on Wall Street. There had been a short depression as the economy adjusted to peace after World War I, but by 1922 it was over, and Wall Street and the national economy began to soar. The gross national product, $59.4 billion in 1921, rose 47 percent, to $87.2 billion, by 1929. Per capita income rose by more than a third, and federal taxes were sharply reduced.
Stocks rose along with the gross national product. General Motors, which had dipped as low as 14 in the Depression of 1920-21, hit 210 by 1926. The Dow Jones Industrial Average, which had been at 75.2 in the beginning of 1921, would reach 381.17 in 1929. The number of shares listed on the New York Stock Exchange tripled between 1924 and 1929, and loans made by brokerage firms to finance stock purchases nearly quadrupled.
But while the cities grew increasingly prosperous, the rural areas of the country, largely unnoticed by the big-city media, were sliding into economic trouble. The seller’s market in agricultural products brought about by World War I dried up with the armistice, and exports returned to more normal levels. Still, American agricultural production continued to climb. One major reason for this was the spread of the tractor, which not only made farmers much more productive but freed up land once used to grow feed for draft animals. In 1900 a third of the country’s cropland had been used for fodder. As the horses and mules disappeared, this land was increasingly used to grow human foodstuffs, and farm prices began a relentless decline.
This hurt the rural banks. In 1921 the United States was home to an astounding total of 29,788 banks, the vast majority of them small, one-branch affairs in country towns. State laws often made it impossible for banks to combine into larger, more financially stable institutions, so they were tied to local business and agricultural loans. As farm prices dropped, farmers defaulted on their obligations. At the same time, the automobile made it increasingly possible for people to shop—and bank—beyond the nearest village. Small-town banks, losing their local deposit monopolies on one side of the ledger and their loan business on the other, began to go under. These failures averaged 550 a year in the 1920s. That trickle would soon become a flood.
9. Bad Medicine
As the bull market in Wall Street increased in intensity, the Federal Reserve moved to dampen it. It raised the discount rate (what it charged member banks to borrow at the Fed) three times in 1928, until it reached a then-high 5 percent. This increase in the cost of money had the effect of slowing down the economy noticeably in early 1929. But Wall Street was by now in a world of its own, and the bull market turned into a classic bubble. The Fed did nothing to intervene, even though banks were borrowing at 5 percent in order to loan money to speculators at 12 percent.
As all bubbles do, this one finally burst. The Dow Jones Industrial Average reached its high on September 3, 1929, a peak it would not attain again for 25 years. The next day the market broke sharply. It trended downward for several weeks until October 24, known afterward as Black Thursday, when prices went into free fall. Bankers raised a pool of $20 million to try to steady the market, and it worked for a few days. But on Tuesday, October 29, there was no stopping the collapse. By the time the dust settled, the Dow index had dropped about 29 percent.
Although the stock market crash changed the psychology of the country, it did not precipitate the Great Depression that followed. Indeed, the crash was an artifact of the contracting economy, not its cause. Instead, several mistakes in Washington converted an ordinary recession into the greatest economic calamity in the nation’s history.
First, the new President, Herbert Hoover, had promised to seek relief for hard-pressed farmers by raising tariffs on agricultural products. When the bill reached Congress, a special-interest feeding frenzy developed, and what emerged, called the Smoot-Hawley tariff after its sponsors, was the highest tariff in our history. A thousand economists, predicting disaster, petitioned Hoover not to sign the bill.
This time the economists were right. Other countries immediately retaliated. Tariffs rose around the globe, causing a near collapse in world trade, which would remain lower in 1939 than it had been in 1914. In 1929 the United States exported $5.2 billion worth of goods; in 1932, a mere $1.6 billion. Smoot-Hawley, instead of protecting American jobs, was destroying them by the hundreds of thousands.
The Hoover administration’s second mistake was its insistence on trying to balance the federal budget in the face of steeply falling revenues as the economy contracted. The tax increase of 1932 was the greatest in percentage terms in American history. And of course it did not balance the budget. Instead, it caused the economy to contract even more severely.
Finally, the Federal Reserve did nothing. It kept interest rates high while the money supply shrank by a third. In other words, it continued to treat the patient for fever long after he had begun to freeze to death.
The result of these policy missteps was calamity. The gross national product, which reached $87.2 billion in 1929, had by 1932 fallen by more than half, to $42.8 billion. Unemployment, a mere 3.2 percent in 1929, rose to 24.9 percent in 1933. Nearly one worker in four had no job to go to. The stock market, over 380 in September 1929, reached as low as 41.22 in the summer of 1932, a plunge of nearly 90 percent. All the gains in the market since the first day the Dow Jones had been calculated, in 1896 (when it had closed at 40.94), were gone.
So desperate were things on Wall Street that in late 1932 the interest rate paid on U.S. Treasury bills—short-term obligations—turned negative. Treasury bills are sold at a discount and mature at face value. But in late 1932 they were selling for above 101, to mature at par. The only explanation is that people with capital to invest were so worried about the future that they were willing to pay a premium in order to put their money into the safest of all possible investments, the short-term obligations of a sovereign power.
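A rough illustration of that arithmetic, using the price quoted above and setting aside the exact term of the bills: an investor who paid 101 for a bill redeemable at a par of 100 was locking in a small but certain loss,

\[ \text{yield} \;=\; \frac{100 - 101}{101} \;\approx\; -1\ \text{percent}, \]

a premium paid, in effect, for nothing more than safekeeping.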
The collapse of the nation’s banking system shows most vividly how close the American economy came to irretrievable disaster in those terrible years. More than 1,300 banks closed their doors in 1930, unable to meet their obligations. Two thousand more followed in 1931. In 1932 no fewer than 5,700 banks failed, carrying with them into oblivion the hopes, dreams, and economic security of millions of American families.
By March 1933 a nationwide banking panic was under way, as people withdrew their money from all banks, sound and unsound alike. One of the first acts of the new Roosevelt administration, inaugurated on March 4, was to shut all the banks in the country until they could be examined. The hope was that once sound banks were found to be in good condition and allowed to reopen, depositors would leave their money in them, and the banking system could function once again. It worked, and a national financial catastrophe was narrowly averted.
10. Deliberate Deficits
The country that elected Franklin D. Roosevelt President in 1932 was a very different place from the one that had elected Herbert Hoover four years earlier. The severity of the Depression that had engulfed it had overwhelmed the mostly informal social services that were available to deal with poverty. And that poverty was everywhere. The homeless threw up ramshackle collections of huts, known as Hoovervilles, in places as visible as New York's Central Park. Furthermore, the traditional government fiscal policies of avoiding deficits and paying down the debt had proved impossible to achieve as the economy spiraled downward, and the pursuit of them had greatly worsened the situation.
These new realities produced a fundamental change in American politics. Before the Depression, a balanced budget had always been the number-one goal of government fiscal policy; ever since, the goal has been to avoid a new Great Depression. In the years before 1930, the government had an annual surplus twice as often as it ran a deficit. In the years since, it has run deficits seven times as often as surpluses, despite the return of prosperity.
Indeed, Roosevelt made deficits a matter of deliberate policy for the first time, instituting an array of projects, collectively known as the New Deal, to get people back to work. These programs, often known by their initials (the CCC, the WPA, and so on), inundated Washington with what came to be called alphabet soup and, in a wartime mutation, gave rise to the word acronym (which entered the language only in 1943, by which time the military had created hundreds of them). But these programs couldn't end hard times. Unemployment, which reached a staggering 24.9 percent in 1933, was still at 17.2 percent as late as 1939, higher by far than it has ever been since.
Federal spending more than doubled between 1933 and 1940, from $4.6 billion to $9.6 billion (and the debt nearly doubled to finance it). Federal revenues as a percentage of the gross national product rose sharply as well. Revenues were only 3.6 percent of the gross national product in 1933. By 1940, they were 6.9 percent. This was the start of a continuing trend in which a steadily growing share of the nation's wealth would pass through the budget of the federal government every year. Whatever the politics this change engendered, the economic effect has been to make the federal budget act more and more like a fiscal flywheel, automatically supplying stimulus to the economy when it slows down.
The other great change brought about by the Depression and the New Deal was in the area of regulation. The banking system was overhauled. The Federal Reserve, which had failed to act as the Depression deepened, and in many cases couldn't have because of restrictions in the law, was reorganized and given broad new powers to oversee credit and the money supply. Learning from its mistakes, the Fed was to use these powers very effectively in the stock market crash of 1987. That is why today, only 14 years later, that crash is nearly forgotten.
The national banks lost their power to issue notes, and the Federal Reserve became the sole supplier of paper money (except for the Treasury's one-dollar silver certificates, circulated until the 1960s). The Fed was also given increased power to regulate credit, including the setting of margin requirements on Wall Street.
The Glass-Steagall banking act of 1933 forced the great Wall Street banks, which were often conveniently blamed for all the economic troubles, to become either depository banks or investment banks. In other words, it separated the banking business from the securities business. J. P. Morgan and Company, which remained a depository bank, was forced to spin off Morgan Stanley as an independent investment bank and would never again have the extraordinary power it had possessed.
And Glass-Steagall established the Federal Deposit Insurance Corporation, to insure bank deposits up to $5,000 (now $100,000). The purpose was to remove any incentive for depositors to panic and suddenly withdraw funds based on rumors of impending bank failures. It has certainly been a success; the country has not experienced a serious banking panic since. But it also, in the long term, made banks less cautious about where and to whom they loaned money. This would have severe consequences half a century later, when the savings-and-loan industry vanished in a sea of red ink, and the federal government had to pay depositors billions.
Wall Street brokerage firms as well came under federal regulation for the first time, with the establishment of the Securities and Exchange Commission, intended to curb the excesses that had plagued Wall Street in the 1920s. The New York Stock Exchange had become one of the most important financial institutions in the country, but it had remained, in both form and substance, a private organization, run for the benefit of its members.
This had led to many abuses, as the members, operating on the floor of the exchange, formed pools and otherwise manipulated stocks at the expense of the general public. Roosevelt appointed Joseph P. Kennedy, himself a major stock speculator in the 1920s, as the SEC's first chairman. Many thought this meant the fix was in, but Kennedy proved an able choice and got the SEC off to a good start. It was the commission's third chairman, however, William O. Douglas (who later sat on the Supreme Court for more than 30 years), who instituted major reforms and persuaded the Stock Exchange to write a new constitution. That transformed it into a quasi-public institution, one far better able to meet the public's needs than ever before.
The labor movement was also transformed by the New Deal. In the prosperous 1920s, membership in labor unions had shrunk considerably. Then, as the Depression deepened, the conflict between labor and capital intensified. The Roosevelt administration was strongly on the side of organized labor, and in 1935 the National Labor Relations Act reshaped the situation.
It guaranteed the right to organize and bargain collectively and outlawed many practices companies had been using to prevent union activity. The country's largest industries were unionized in the late 1930s, although not without considerable resistance and some real violence. On Memorial Day 1937, police fired on demonstrators outside the Republic Steel plant in Chicago, killing 10 and wounding 84.
By the end of the New Deal era, organized labor had become a major part of the economy, with political influence to match. By the 1940s, more than a third of America's workers would be unionized.
11. Arsenal
If the New Deal did not succeed in ending the Great Depression, the Second World War most certainly did. On December 29, 1940, FDR gave his famous “Arsenal of Democracy” speech, describing massive aid to Britain and China as the best way for America to stay out of the fight. The United States needed a huge rearmament program anyway, though, because except for our navy, we were a third-rate military power.
With the attack on Pearl Harbor, the United States became in fact the arsenal of democracy. In the four and a half years after Roosevelt’s speech, American industry turned out 296,400 airplanes; 86,330 tanks; 64,546 landing craft; 3,500,000 jeeps, trucks, and personnel carriers; 53,000,000 deadweight tons of cargo vessels; 12,000,000 rifles, carbines, and machine guns; and 47,000,000 tons of shells.
The nation accomplished this feat, perhaps the most remarkable one of the entire industrial era, by temporarily converting its economy from free-market to centrally planned. The War Production Board, headed by Donald Nelson, a former Sears, Roebuck executive, controlled all vital resources and allocated them according to the most urgent needs, creating entirely new industries when required. Synthetic-rubber production was nonexistent in 1939; in 1945 the country turned out 820,000 tons.
And this huge production was accomplished without harming the civilian economy. While certain things were rationed, Americans were able to maintain a more normal existence than the citizens of any of the other major combatants. The United States had the strength to simply build a war economy on top of its civilian one, and the gross national product expanded by 125 percent between 1940 and 1944, growing from $88.6 billion to $198.7 billion.
At war’s end, the United States was truly the only Great Power in the world, militarily as well as economically. The defeated powers, Germany and Japan, had seen their economies destroyed along with their capacities to wage war. The Soviet Union had suffered more than 20 million killed. Britain and France were effectively bankrupt.
All of them had endured terrible damage to their homelands. But the continental United States, sheltered by vast oceans, had been physically untouched by the war. Its economic capacities, far from being diminished, had been greatly enlarged. America accounted for more than half the world’s production. The center of international economic gravity was now in the American heartland.
Fortunately both for itself and for the rest of the world, the United States did not try to retreat once more into isolation, as it had after World War I. As the world’s greatest power, it had an obligation to lead, and it did so. A new world monetary order was established even before the war was over, at a New Hampshire resort called Bretton Woods. The agreement made there set up the International Monetary Fund and the World Bank to help stabilize currencies. The dollar would be fully exchangeable for gold, but only by foreign governments, not by citizens, and it would function as the world’s primary reserve currency. This meant that it also became the primary currency of world trade, as it has remained. Today, if, say, Saudi Arabia sells a tanker-load of oil to Japan, the deal is done in dollars, not in yen or riyals.
Also, the United States forgave the war loans it had made to its devastated allies, greatly improving their economic situations. And it provided massive aid not only to those allies, but equally to its defeated enemies through such programs as the Marshall Plan and the World Bank. This helped get these countries back on their feet while also providing large and growing markets for American goods. World trade began to grow quickly and leave behind its near collapse of the 1930s. And the United States led the way to continuing tariff reductions. In 1953 world trade totaled $167 billion. Just 17 years later, it was $639 billion. Today it is over $1.7 trillion.
The Second World War had a far greater effect on the American economy than had the First, and there was much foreboding regarding the country’s economic future after it. Although unemployment had almost vanished, memories of the Great Depression were still painfully fresh. Many economists, fearing the consequences of the end of war production, predicted that a renewed depression was inevitable. After all, there had been a sharp recession after the end of the first war.
But it didn’t happen. Instead, pent-up demand more than kept the economy humming. During the war, nearly everyone who had wanted a job that paid good wages had had one, in industries largely given over to military production. Civilian durable goods and housing had just not been produced. In 1943 the United States auto industry turned out a grand total of 100 cars, all for government use.
But because everyone had been working, per capita disposable income had doubled during the war years, and that increased income had gone almost entirely into savings. Americans put away $14.3 billion in 1940 and $44.7 billion in 1945, a savings rate never approached before (or, for that matter, since). Now, with industry switching back to civilian production, the demand for automobiles, washing machines, and so on, kept the factories busy.
The demand for housing, the construction of which had lagged seriously during the Depression and ceased entirely during the war, also drove the economy. Increasingly this housing was of an entirely new type, suburban single-family houses on small plots. Since the dawn of the Republic, the major trend in population movement had been from country to city. Cities had taken the lead in population by 1920, and that lead had kept increasing. But after the war, the suburbs began growing quickly, ultimately becoming the dominant political power centers.
With the country suburbanizing, the population of cities began to decline. Indeed, every major city in the Northeast except New York—ever the exception—has seen steep drops in population since 1950, often by as much as half. The surrounding areas, once farmland, are now tessellated with tract housing. So while cities have often remained the economic engines of the country, with their workers commuting in from the suburbs, their political clout has dwindled.
Also, the rapid spread of air conditioning following the war greatly facilitated migration to the warmer parts of the country, causing the economies of these areas to both grow and diversify. Migration was from south to north in the first half of the twentieth century, but the current reversed in the second half. Nothing exemplifies this shift so much as the decline of the political power of the self-described “Empire State.” In 1940, with a population of 13.5 million, New York ranked number one in population by a comfortable margin and had 47 electoral votes. After the 2000 census, New York, with a population of 18 million, stood only third and will have just 31 electoral votes in the next presidential election. Although the state’s population grew by 38 percent over those 60 years, other states had grown so much faster that New York has lost 34 percent of its electoral votes.
None of this is to say that the late 1940s were a period free of major problems for the American economy. Both inflation and labor unrest bedeviled the country in the immediate aftermath of the war. Strict wage and price controls had been imposed shortly after America entered the fight, and now there was intense political pressure to dismantle these wartime regulations. It was done perhaps too rapidly. Had Roosevelt lived, he might have been able to use his vast prestige to slow the process. But his successor, virtually unknown to most Americans before he entered the White House, was helpless to do so. Nevertheless, he got most of the blame for what ensued, and “to err is Truman” became a national joke.
With the controls lifted, prices adjusted to market realities suddenly, and inflation roared to life. Farm prices shot up no less than 14 percent in a single month and were 30 percent higher by the end of 1946. These increases rippled through the rest of the economy too. Meanwhile, with corporations having prospered mightily during the war, enjoying a 20 percent increase in profits, labor wanted its share of the pie. Strikes, forbidden during the war, multiplied so fast that by January 1946, fully 3 percent of the American labor force was on the picket line.
Many people felt the labor movement had become too powerful, and in response Congress passed in 1947, over President Truman’s veto, the Taft-Hartley Act. This gave companies many rights denied them by the Wagner Act of 1935, such as the right to express their opinion on organizing efforts (as long as no intimidation was involved). It also outlawed the closed shop, in which workers must belong to the union before they can be hired, and allowed states to outlaw the union shop, where workers are obligated to join after being hired.
Many states, especially in the South, did so. Companies attracted by the low cost of living in the South and its right-to-work laws began building factories in the old Confederacy. The American South, for 80 years almost a Third World country inside a First World one, began to catch up with the rest of the nation in economic development.
But by the time Taft-Hartley became law, labor unrest was already waning. There had been 116 million man-days (1.43 percent of total days) lost to strikes in 1946. In 1947 only 34.6 million were lost. By the 1990s, the average number of man-days lost to strikes had dropped to a mere 5.1 million (0.02 percent of total days). Both labor and management had learned how to maximize their broad common interests and minimize their inevitable conflicts.
The mid-1940s proved the high-water mark for organized labor, with unions representing 35.5 percent of the nonfarm work force in 1945. Today they represent only 13.5 percent.
One cause of the fall in union membership was the revolution in American higher education made possible by the GI Bill, one of the most consequential pieces of legislation in American history. Among its provisions, the GI Bill provided generous tuition allowances for veterans wanting to attend college. The strength of the response far exceeded anyone’s predictions, and millions of families that had never had a college graduate now sent their boys and, increasingly, girls to college. Today women make up more than half of all graduating seniors. In the 1930s only about 3 percent of the population had college degrees. Today nearly 25 percent do.
The new economy that began to dawn after World War II would empower individual workers to an unprecedented extent. One way it did this was by the spread of pension and profit-sharing plans offered by businesses and by a rise in the number of individual investors. Even at the height of the 1920s boom, less than 10 percent of the American population had any investments beyond savings accounts. But when Charles Merrill formed Merrill Lynch in 1940, he set out to tap the potential market of the average middle-class citizen. He succeeded beyond his wildest dreams, and by 1960 Merrill Lynch was bigger than the next four firms on Wall Street combined.
Merrill Lynch offered fully trained representatives and careful research to reassure customers that their money was in a safe place and being invested wisely. Slowly, the public, so scarred by the memories of 1929, began to return to Wall Street, and families that had never invested before began to do so. Many put money into mutual funds, which allowed them to invest in stocks and bonds without having to pick them individually. The mutual funds and corporate and government pension funds became major players on the Street, fueling a rising volume that continues to this day. The first year to see a billion shares traded on the New York Stock Exchange was 1959. Today, average daily trading exceeds 1.2 billion shares.
Wall Street abounded in opportunities at the time, for it had not fully shared in the postwar boom. On December 31, 1949, the Dow Jones stood at 200, only twice what it had been in 1940, even though the gross national product had nearly tripled. Many blue-chip stocks were selling for less than four times earnings, despite paying dividends of 8 to 12 percent.
Finally, in 1954, the Dow began to climb sharply and passed its old high of 381.17. It would continue up for another decade and more, passing 1,000 for the first time in 1967.
12. Tech, High and Higher
No small part of the new prosperity was generated by new technology. Although World War II was the greatest human disaster of the twentieth century, it was not an unalloyed one. The enormous pressure of total war always accelerates technological development, and what emerges often turns out to have major civilian applications. The development of radar and of very large airframes for bombers made the modern air-travel industry possible years before it would otherwise have grown up.
The jet engine, developed too late to be important in the war, revolutionized air travel a decade later (in the process killing both the ocean liner and the long-distance passenger train). Not only did air travel become one of the driving forces of the postwar American economy, but aircraft construction became a major enterprise and a vital part of America’s exports. American planes, especially those manufactured by the Boeing Corporation, continue to dominate this extremely capital-intensive industry.
The jet also shrank the world by an order of magnitude, as the railroads had done a century earlier. Traveling from New York to Los Angeles had taken three days in the 1930s. By the 1960s it required only five hours. Europe, nearly a week’s journey from the East Coast by ship, was only about seven hours away by plane. Foreign travel, heretofore the privilege of the rich, became commonplace.
Out of the V-2 rocket, developed by Germany as a terror weapon, emerged the modern space industry, which has become nearly as vital a part of the American economy as agriculture or automobiles. Hundreds of satellites carry vast data streams, knitting the country and the world together in ways never possible before, and at a fraction of the price of undersea cables.
The fall in the cost of moving data is vividly illustrated in the number of overseas telephone calls originating in the United States. In 1950 we placed about one million overseas calls. By 1970 the number had risen to 23 million. In 1980 it was 200 million. By 1994 the number was up to 3 billion. This is all the more remarkable when one considers that the first year in which more than half of American households had any telephone at all was 1946.
Space has also become a platform from which to measure and monitor the earth, both as a whole and nearly every square inch of it separately. Weather satellites now allow more careful storm tracking and far more accurate long-range predictions than ever before possible. Other satellites keep track of land use, forest fires, pack ice, many forms of traffic, and a thousand other things, including, of course, the activities of potential enemies.
Top-of-the-line automobiles these days come with geo-positioning systems that determine the car’s exact location by using signals from satellites and then give the driver directions to his destination. Farmers now use satellite-derived data to tell them precisely where extra fertilizer is needed as they tend their fields with tractors linked to sensors in space.
And, of course, the technology of the rocket ended this country’s long immunity to foreign attack. As a consequence of this, and because the Soviet Union proved an aggressively hostile power, the United States was forced to spend billions on something it had never needed before, a vast peacetime military establishment. In the 1950s, at the height of the Cold War, 58.7 percent of the federal budget was devoted to military spending, up from just 15.6 percent in 1939.
But none of this extraordinary new technology would have been possible except for World War II’s greatest gift to the future, the computer. The word computer has been in the English language since the middle of the seventeenth century, but before the middle of the twentieth it meant a person who did calculations for a living, compiling such things as actuarial tables for life-insurance companies.
The problem was that human computers (usually women, by the way, who were thought to be steadier and more accurate workers) could not perform calculations fast enough, and of course made mistakes. (The great nineteenth-century mathematician William Shanks calculated the value of pi to 707 decimal places in 1873. Seventy-two years passed before anyone found out that he had made a mistake after digit 527 and the next 180 were therefore wrong.)
The idea of a calculating machine went back as far as Charles Babbage, in early-nineteenth-century England (his machine, a wonder of intricate brass gearing, is on display at the Science Museum in London). But only in World War II did a pressing need, vast government money, and the requisite underlying technology combine to make possible a true electronic computer. The first successful one, called ENIAC, for Electronic Numerical Integrator And Computer, was completed at the University of Pennsylvania by J. Presper Eckert and John Mauchly in 1946, after three years of effort.
ENIAC was the size of a bus, filling 40 filing cabinets, each nine feet high, with 18,000 vacuum tubes and uncounted miles of wiring. It sucked up enough electricity to power a small town. It was, by modern standards, glacially slow. To program it, its operators had to change its wiring by hand on switchboard-like grids. But it worked (although people had to stand by constantly to replace vacuum tubes as they blew and to remove the occasional errant insect, the origin of the term debugging). Computers rapidly shrank in size, especially after the far smaller, far more reliable, and far cheaper transistor, invented at Bell Labs in 1947, replaced the vacuum tube.
Computers quickly spread to laboratories and military installations that needed the ability to do millions of calculations quickly. But they also found uses in offices that had to handle heavy amounts of data processing. By the 1960s, banks, insurance companies, and large corporations were depending on computers, which replaced hundreds of thousands of mind-numbing clerical jobs.
But computers remained big and mysterious, hidden away in air-conditioned rooms and tended by technicians in white coats. And they remained very expensive. The reason they were so expensive is what is known as the tyranny of numbers. A computer’s power depends not only on the total number of transistors it has but also on the number of connections between them. If there are only 2 transistors, only 1 connection is needed to link them. But 3 transistors need 3 connections to be fully linked, 4 transistors need 6, 5 need 10, 6 need 15, and so on. As long as these connections had to be made by hand, the cost of building more powerful computers increased far faster than did their power, limiting how effective they could be.
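The arithmetic behind the tyranny of numbers is simply the count of pairs: fully linking $n$ transistors, one connection per pair, requires

\[ \binom{n}{2} \;=\; \frac{n(n-1)}{2} \quad\text{connections:}\qquad \frac{3\cdot 2}{2}=3,\quad \frac{4\cdot 3}{2}=6,\quad \frac{5\cdot 4}{2}=10,\quad \frac{6\cdot 5}{2}=15, \]

so the hand-wiring burden grew roughly with the square of the transistor count.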
The microprocessor changed everything. The integrated circuit, first developed in 1959, allowed many transistors to be wired together simultaneously when they were manufactured. In 1971 Intel marketed the first microprocessor, in effect a small computer on a chip of silicon built from integrated circuits. Although the price of designing a microprocessor and of building the machine tools necessary to produce each design was very high indeed, once that investment was made, microprocessors could be turned out like so many high-tech cookies. The tyranny of numbers was broken, and the computer age began. Intel’s first chip had 2,300 transistors on it. Its newest one for personal computers, the Pentium 4, has 42,000,000.
The computer age developed with astonishing speed, as the cost of computing, thanks to the microprocessor, collapsed. Computing power that would have cost a thousand dollars in the 1950s today costs a penny or two. Microprocessors are now found in nearly everything more complex than a reading lamp. And the personal computer, something nonexistent outside of science fiction 30 years ago, now resides in more than half of all American homes. American schools have one computer for every five children, 25 times as many as they did as recently as 1983.
The most important spinoff of the computer revolution, the Internet, spread explosively in the 1990s and now envelops the globe in a wholly new communications medium of vast and still barely discerned potential. Its influence on the future will be at least as great as the most important spinoff of the steam engine, the railroad, had on its time. Already we are deep into the information age, and it drives the American economy. Manufacturing, the very heart and soul of the economy as late as the 1960s, now accounts for only 14 percent of the gross national product. And while agriculture remains the single largest component of American foreign commerce, information and expertise dominate the country’s exports. Legal and accounting services run rich surpluses; Hollywood dominates the world markets in movies and television the way Southern cotton once dominated the world market in fibers.
The change wrought by the microprocessor can be easily grasped. As recently as the 1960s, if every computer in the world had suddenly shut down, the average person would not have noticed. Today, civilization would collapse. Telephones would not work, cars would not run, household appliances would freeze. Banks, stock markets, and governments would fail to operate. And broadcasters and newspapers would be unable to gather, write, and distribute the news of the calamity.
13. Grappling With the Globe
The Great Depression and the Second World War not only changed the American economy profoundly, they also changed the discipline of economics. The classical economics first developed by Adam Smith looked on government more as an obstacle to achieving and maintaining prosperity than as a means to it. But Britain’s John Maynard Keynes, the most influential economist since Smith, saw it differently. Keynes was interested in the big picture, aggregate demand and supply in an entire national economy.
In the long run, these two must, obviously, balance out. But as Keynes explained in his famous aphorism, in the long run “we are all dead.” In the short run, demand can outstrip supply, and inflation results. Or demand can lag, and depression occurs. Keynes felt that government should take an active role in regulating supply and demand by manipulating both fiscal policy (how much the government taxes and spends) and monetary policy (how much money creation is allowed). Keynes also held that the size of a country’s internal national debt (debt held by its own citizens, instead of foreigners and their governments) did not much matter, as the pluses and minuses would automatically balance out.
Keynesianism, as it came to be called, quickly dominated the economics profession. It is not hard to see why. Before Keynes, politicians didn’t need economists to help them run the country any more than they needed astronomers. Keynesian economics made them indispensable. By the 1960s, Keynesian thinking would completely dominate in the halls of government, and Richard Nixon would admit that “we are all Keynesians now.”
The Kennedy administration was the first to wholeheartedly adopt a Keynesian model of the American economy, and Kennedy’s chief economic adviser, Walter Heller, boasted of being able to “fine-tune” the economy by artful policy moves. It was not to be. President Johnson’s attempt to have both guns (the Vietnam War) and butter (his Great Society programs) brought on serious inflation while at the same time the economy stalled. This unprecedented situation, dubbed “stagflation,” was supposedly impossible under Keynesian economic models.
The Nixon administration tried to deal with the inflation by employing a remedy that went back to the Code of Hammurabi. For the first time in peacetime history, the federal government imposed wage and price controls. They dampened inflation temporarily, but they proved impossible to sustain in the long run and were soon abandoned. At the same time, President Nixon unilaterally broke the link between the dollar and gold. Foreign governments would no longer be able to count on converting their dollar reserves into gold should they so choose. Inflation, already bad, became much worse not only in the United States but around the world.
The federal budget, as a consequence, began to slide out of control. The national debt, which had reached $269 billion in 1946, equal to nearly 130 percent of the gross national product, stabilized after the war and, as the economy grew quickly, shrank rapidly as a percentage of gross national product. By 1970 it was only 39 percent of the country’s output of goods and services. But it nearly tripled in the next decade as the government borrowed heavily to maintain social programs. Thanks to the worst inflation since the Civil War, the debt as a percentage of GNP held steady. But inflation, which in 1980 reached a staggering 13.5 percent, ravaged the savings and investments of American citizens and pushed them into ever-higher tax brackets.
The stock market, which had briefly penetrated the 1,000 mark on the Dow in the late sixties and again in the early seventies, entered a ferocious decline. In December 1974, the Dow hit bottom at 577.60, wiping out all the gains since the Eisenhower administration. The Wall Street writer Andrew Tobias, in an article titled “The Bulls of Wall Street (Both of Them),” noted that if greed and fear were the only two emotions known in the stock market, then “I think it’s time we put in a good word for greed.” Although the market recovered in the latter half of the seventies, it remained far below its earlier peaks in real terms.
A major problem besides inflation was the serious erosion of the extraordinary dominance the United States had enjoyed in the world economy right after World War II. The economies of the other belligerents had recovered, and so too had their economic competitiveness. Indeed, they had one advantage over the United States. Having had, in many cases, to start again from scratch, they could incorporate the latest equipment and technology in their new factories and become the low-cost producers.
With world trade expanding, and shipping costs and U.S. tariffs down, these countries could now compete effectively in the American market. Nowhere was this more clear than in the linchpin of the twentieth-century American economy, the automobile industry. American cars had grown ponderously large and thirsty while innovation had faltered in both manufacturing techniques and technology. By the 1960s, the Volkswagen Beetle was commonplace on American roads, and while not a major threat to Ford, Chrysler, or GM, it proved a harbinger.
The United States, where the petroleum industry was born and which remained a petroleum exporter until the 1950s, had become a major importer of the vital fuel by the 1970s. When the Yom Kippur War broke out in the Middle East in 1973 and the Arab countries boycotted exports to the United States, the resulting oil shortages came as a profound shock. Long lines and rising prices at gas stations severely disrupted the economy and produced a surge in demand for more fuel-efficient foreign automobiles.
Foreign cars turned out to be better made too, and the new competitive pressure they introduced forced the domestic auto makers to shape up. It was not an easy or painless process. Many people who had earned good livings for decades found themselves out of jobs as the car industry made itself more efficient. This was repeated in nearly every other major industry. The American steel industry employed 398,000 workers in 1980, and they produced 95.2 million tons. By 1997, production had increased to 131 million tons, but the industry needed only 112,000 workers to turn it out.
Many of these jobs went abroad to less developed countries where labor costs were much lower. The communications and air-travel revolutions were allowing companies to operate as unified organizations all around the globe, just as the railroad and telegraph had promoted the growth of national markets a century earlier.
In 1980, the country, wearied of all the inflation and unemployment, unseated an elected President, Jimmy Carter, for the first time since Herbert Hoover, and put Ronald Reagan in the White House. Reagan, together with the chairman of the Federal Reserve, Paul Volcker, proceeded to break the back of the chronic inflation by taking interest rates to unprecedented levels. This induced a severe recession that pushed unemployment over 10 percent for the first time since the Great Depression and caused the stock market to fall below 800. But the medicine, distasteful as it was, worked. Inflation subsided, and the economy took off. The stock market began to move up sharply in 1982, and the Dow crossed 1,000 for the third and so far final time.
But because inflation fell while President Reagan insisted on building up the military and Congress insisted on maintaining social programs, federal budget deficits climbed to levels unknown in peacetime, and the national debt exceeded $5 trillion by the mid-1990s, 20 times what it had been at the end of World War II.
One major consequence of the globalization of markets has been a severe loss of sovereign governments’ power to control their internal economies. When France elected a socialist government in 1981, the new president, François Mitterrand, tried to implement a traditional socialist program, including nationalizing the banks. Currency traders, who by then operated around the globe as a single unified market, often trading as much as a trillion dollars a day, began dumping the franc until the French government had no choice but to reverse course. It was a pivotal moment in world history. For the first time, a free market had dictated policy to a Great Power.
For the second greatest power in the world, the Soviet Union, the new high-tech, market-driven world economy was a death knell. Its command economy was able to produce a first-rate military, but the repressive Soviet system could not compete with the increasingly high-tech economies that were rapidly developing in the world’s other major nations, which depended on the free flow of information by means of the Internet, faxes, photocopiers, and telephones.
Attempts at reform in the late 1980s proved impossible to control. The Soviet empire in Eastern Europe collapsed in 1989, the Soviet Union itself two years later. Its death ensured that capitalism and, to a greater or lesser extent, democracy were the models for developing nations to emulate.
Japan had already re-established itself as a leading economic power by the 1970s, and many of its East Asian neighbors, including China, Taiwan, Hong Kong, and Singapore, began exponential growth that made them major forces in world markets. The United States, facing both Europe across the Atlantic and the new “Asian tigers” across the Pacific, was uniquely well placed geographically to benefit from the new globalization. In 1993 it signed a treaty with Canada and Mexico, NAFTA, to establish a North American common market that has grown vigorously in the years since. Other nations in the Western Hemisphere will likely join in the future.
14. Promontory
And so, by the late 1990s, all the pieces had come together to give the United States the strongest economy not only in the world but in its own history as well. The creative destruction of the old manufacturing-based economy had resulted in the birth of a new information-based economy, one ever more integrated with the rest of the globe. Because we were the first nation to begin the process of converting to the new economy, we were the first to emerge from the often painful process and to enjoy the benefits it ultimately brings.
Will the current prosperity last? That is not for a historian to say, although history certainly teaches that the business cycle cannot be repealed. But history also shows that we can survive, and triumph over, whatever the business cycle might bring our way. The twentieth century was born in an era of prosperity, progress, and optimism, but it was all nearly destroyed before the century was half over by two military conflicts that were unimaginable in scale and scope to those who greeted their new era on January 1, 1901. Nor could they, on the other hand, have imagined the world in which we, their great-grandchildren, now live, or the level of prosperity and technological opulence that nearly all members of American society now enjoy as we enter the twenty-first century.
We can only hope that our great-grandchildren will be able to look back on us from similar heights, however exciting and sometimes painful the journey there may turn out to be.
John Steele Gordon writes the Business of America column for American Heritage and is the author of The Great Game: The Emergence of Wall Street as a World Power, 1653-2000.