How a debt-ridden banana republic became the greatest economic engine the world has ever known
February/March 2007
Volume 58, Issue 1
It has been 400 years since European settlement began in what is now the United States. In that time, a land occupied by a few million Neolithic hunter-gatherers has been transformed into the mightiest economy ever known, producing nearly one-third of the world’s goods and services. There are few economic sectors, indeed, from agricultural exports to jet-aircraft production to entertainment, in which the United States does not lead.
In these four centuries of economic history, there have been many turning points that changed the future of American business. Some of these turning points were for the better, some for the worse, and some for both. Here are 10 of the most significant:
Robert Morris, who had helped greatly in financing the Revolution, turned George Washington down when the President offered him the post of Secretary of the Treasury. Morris wanted to be free to speculate in land and other opportunities to make money. It was a poor decision on Morris’s part (he would end up in debtor’s prison), but it was very good for the country because Washington then turned to Alexander Hamilton. Still in his early thirties, Hamilton was both a genius and a prodigious hard worker. There was much work to do, because the national financial situation was desperate.
The old federal government under the Articles of Confederation had lacked the power to tax. Instead it depended on requisitions from the states, which were sometimes forthcoming and sometimes not. The massive debt left over from the Revolutionary War was unpaid, as was the interest due on it. The money supply was chaotic: a hodgepodge of foreign coins and “continentals,” the paper money issued by the Continental Congress during the war, which had depreciated rapidly and traded at pennies on the dollar. In 1789, the United States was, financially and economically, nothing more than a very large banana republic.
Hamilton had to accomplish four things to transform it: (1) develop a system of taxation to fund the government and establish a customs service to collect the tariff, destined to be the main federal tax; (2) organize a monetary and banking system; (3) refund and rationalize the national debt in ways that would gain the confidence of the marketplace; and (4) devise a mechanism to allow the government to borrow as necessary.
Hamilton accomplished all this in the first two years of his tenure. And though the Treasury was the biggest of the new government departments (it had 40 employees to the State Department’s mere 5), the program was largely Hamilton’s own work, in both conception and political execution.
The results were astonishing. The American economy, which had been mired in depression for much of the 1780s, revived wonderfully (helped, to be sure, by the outbreak of war in Europe). Federal revenues were a meager $3.6 million in 1792, the first year for which statistics are available, but, by 1800, they topped $10 million. Government bonds began selling at a premium in Europe. The banking system grew rapidly, centered on the Bank of the United States, established by Hamilton under a federal charter, and its notes traded at par throughout the Union. For the first time since colonization had begun 200 years earlier, the United States had a reliable and convenient money supply.
Hamilton was fought, tooth and nail, by the developing political opposition under Thomas Jefferson, and parts of his program, especially the Bank of the United States, would later be dismantled (Hamilton’s shade might take comfort in the fact that his 1784 creation, the Bank of New York, the very first corporate stock to be traded on the New York Stock Exchange, continues to flourish). Still, thanks to Hamilton, the economy of the new nation was off to the races and began the growth that has been the wonder of the world to this day.
Cotton was an expensive fabric at the end of the 18th century, despite the mechanization of the English cloth industry. In the new United States there was little cotton grown, except on the Sea Islands, along the coast of Georgia and South Carolina. This long-staple cotton (now often called Egyptian cotton) requires a very lengthy growing season and sandy soil in order to flourish. Short-staple or upland cotton is much less demanding. It needs only 200 frost-free days, and any good soil will do.
Short-staple cotton had a big problem, however. Unlike Sea Island cotton, upland cotton’s seeds are sticky and deeply embedded in the cotton boll. While a field hand could pick perhaps 50 pounds of cotton a day, it took fully 50 man-days to handpick the seeds from that amount of cotton fiber. As long as ginning—as removing the seeds is called—was this labor-intensive, short-staple cotton could not compete in the world market with other fibers.
Then a young man with a gift for tinkering named Eli Whitney changed that and the American economy and American history as well. After graduating from Yale in 1793, Whitney accepted a job as a tutor in South Carolina, and, after the job did not work out, he visited a friend in Georgia. There, he saw cotton growing for the first time and heard how much the laborious ginning process limited demand.
As Whitney recollected, he immediately “struck out a plan of a machine in my mind.” The machine was simplicity itself. Whitney took a wooden roller and studded it with nails. As the roller was turned, the nails picked up the cotton fiber from a compartment above and, when they passed through a comb, pulled the fibers through it, leaving the seeds behind. A rotating brush then swept the cotton off the nails into another compartment.
Whitney’s first crude gin immediately allowed a single laborer to do in one day what had previously taken him 50 days. In other words, it reduced the cost of ginning cotton by 98 percent.
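A quick arithmetic check of that figure, using only the numbers already given: one worker-day now did what had taken 50, so the share of ginning labor eliminated was

\[
\frac{50 - 1}{50} = 0.98,
\]

that is, a 98 percent reduction.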
The economic utility of this device was so obvious that the first gin Whitney built was stolen. And, while he patented an improved model the next year, he was not able to enforce the patent, as any competent carpenter could build a cotton gin in an afternoon out of readily available materials. Whitney would realize only a few thousand dollars from his epochal invention.
But, if the effect of the cotton gin on Whitney’s personal finances was relatively modest, its effect on the economy of the country can hardly be overstated. In 1793, the United States exported only about 488,000 pounds of cotton, less than one percent of total world production. The next year the total more than tripled to 1.6 million pounds and, by 1801, exports reached almost 21 million pounds. Southern cotton soon began supplying New England’s rapidly growing textile industry as well, greatly fueling this country’s nascent Industrial Revolution.
The Deep South, especially in Alabama’s Black Belt and the rich alluvial soils of the Mississippi Delta, turned out to be an ideal place in which to grow cotton. Production doubled on average in every succeeding decade, as more and more land was devoted to the highly profitable crop, until it reached two billion pounds in 1860. By 1830, the United States was producing half the world’s cotton; 20 years later, it was 70 percent. Cotton would be the single most valuable export of the United States until the 1930s.
But, while Whitney’s invention made ginning much less labor-intensive, cotton was still a highly demanding crop, requiring about 70 percent more labor than corn to produce a good yield. Fortunately there was a ready supply of low-cost laborers; unfortunately they were slaves.
Slavery had been on the wane in the United States since the middle of the eighteenth century, as the idea that it was morally wrong began to spread. Vermont had been the first place in the Western Hemisphere to outlaw it, and most Northern states quickly followed suit. In 1787, the Northwest Ordinance forbade slavery north of the Ohio River. In the South, while there was little political push to abolish slavery, manumission became fashionable. George Washington freed his slaves in his will, as did many other planters of his generation.
But, with the birth of the “Cotton Kingdom,” the demand for slaves increased sharply, and the price of a good field hand with it. The slaveholders, who made up about 5 percent of the Southern population but had disproportionate political power, found themselves possessed of an ever more valuable capital asset in their slaves. As a consequence, they became more and more resistant to abolition in any form.
Slaveholding areas that were too far north to grow cotton, such as Virginia and Maryland, began to supply the burgeoning demand for slaves in newly opened areas in the Deep South. Between 1790 and 1860, 835,000 slaves were “sold south,” at an incalculable cost in human misery, as families were broken up.
While cotton enriched the country economically, it greatly deepened the political divide between North and South. Had Eli Whitney invented the cotton gin only a decade later than he did, the quickly growing abolitionist movement might have had enough time to gain irresistible momentum. In that case, the curse of slavery could have been lifted from the land decades earlier than it was and the most terrible war in American history avoided. But, of course, we can never know that.
The steam engine, designed in England by Thomas Newcomen in the early 18th century and greatly improved by James Watt in the 1760s, was the first new source of work-doing energy since the windmill, invented a thousand years earlier. In 1807, Robert Fulton, financed by his partner Robert Livingston, built the world’s first practical steamboat, remembered as the Clermont.
Livingston was a member of one of New York’s most politically influential families, and he had persuaded the New York state legislature to grant him and Fulton a monopoly on steamboat navigation in New York waters—provided they built a boat capable of traveling four miles an hour. When the Clermont averaged four and a half miles an hour on its first trip from New York City to Albany, the monopoly was theirs. By 1812, they had six boats plying New York waters, which the state defined as running up to the high-tide mark in neighboring states.
Needless to say, entrepreneurs in nearby states who saw opportunity in steamboats didn’t like the monopoly. Nor did the passengers who had to pay higher fares because of it. In 1819, one of these entrepreneurs, Thomas Gibbons of New Jersey, decided to do something about it.
Gibbons put a steamboat on the New York–New Brunswick run, the first leg of the fastest route to Philadelphia. As captain, he hired a young man in his twenties named Cornelius Vanderbilt.
Vanderbilt would tie up at whatever pier seemed to be free of New York authorities and then disappear into the city until just before departure time in order to avoid being arrested. The authorities did not dare seize the boat itself, knowing that New Jersey would quickly retaliate by seizing the first monopoly steamboat it could lay its hands on.
While Vanderbilt played cat and mouse with the New York authorities, Gibbons went to court, and eventually, in February 1824, the case was heard in the U.S. Supreme Court.
Gibbons’s lawyer, Daniel Webster, argued that because the Constitution’s interstate commerce clause, which gives Congress the power to “Regulate Commerce … among the several States,” was both sweeping and exclusive, the monopoly was unconstitutional as a state encroachment on federal power.
A unanimous Supreme Court agreed. Today, this is taken for granted, but at the time it was a breathtaking expansion of federal power. The decision was greeted with public jubilation, and for good reason. Thanks to competition, fares quickly fell on average by 40 percent, and in just two years the number of steamboats working New York waters increased from 6 to 43.
The long-term effects were even more profound: States stopped granting monopolies to influential local citizens, as they all were now presumptively unconstitutional, while other barriers to interstate commerce fell as well.
In his classic work The Supreme Court in United States History, Charles Warren calls Gibbons v. Ogden the “Emancipation Proclamation of American Commerce.” That is not an exaggeration. With the decision, the United States became the world’s largest truly common market, its goods free to move throughout vast territories unhindered by parochial concerns and regulations.
And the timing could not have been better. The power of steam to move goods cheaply over long distances, merely hinted at by the steamboat, was soon to grow by orders of magnitude. The railroad, beginning less than a decade thence, would make an integrated national economy a reality. Thanks to Gibbons v. Ogden, American businessmen would be able to take full advantage of it, and did they ever.
The cost of overland transportation had been a limiting factor in the world economy since time immemorial. Any material with a low value-to-weight ratio, such as foodstuffs, that couldn’t be transported to distant markets by water couldn’t be sold in those markets at a price anyone would pay. This meant that national economies were fragmented into an infinity of local ones.
Until the Industrial Revolution, there was only one way to reduce these transportation costs: build artificial rivers. By the end of the eighteenth century England was well laced with canals, greatly facilitating industrialization as factories could sell their goods profitably throughout the entire country.
But the new United States was 10 times the size of England and far less developed. And a considerable mountain range divided the more developed eastern seaboard from the fertile, resource-rich, and rapidly growing West. Settlers west of the Appalachians had no choice but to send their crops down the Mississippi to market.
Along the whole great chain of mountains that stretched from Maine to Alabama, there was only a single gap—where the Mohawk River tumbles into the Hudson near Albany—at which a canal was even theoretically possible.
The idea of building a canal to connect the Hudson with the Great Lakes there had been around for many years but always dismissed as hopelessly impracticable. Even Thomas Jefferson thought the idea “little short of madness.” DeWitt Clinton, however, did not. Born into a prominent New York family (his uncle had been governor of New York and then Vice President under James Madison), Clinton would be the mayor of New York City and governor of the state for most of the first quarter of the nineteenth century. A shrewd politician, he built public support for the canal and pushed it through a reluctant state legislature.
One can understand the reluctance, for the project was huge by the standards of the day. At 363 miles, the Erie would be, by far, the longest canal in the world. It would require moving, largely by hand, 11.4 million cubic yards of earth and rock—well over three times the volume of the Great Pyramid of Egypt—and building 83 locks in what was still a semiwilderness. The budget, seven million dollars, was about equal to one percent of the gross domestic product of the entire country. Nonetheless, when the federal government refused to help, New York decided to go it alone. It was a gigantic roll of the economic dice, but one that paid off beyond even Clinton’s dreams. The Erie Canal put the Empire in the Empire State.
The canal was a success even before it fully opened, as traffic burgeoned on the completed parts, helping fund continuing construction. When it was finished in 1825, ahead of schedule and under budget, traffic was tremendous from the start. It is not hard to understand why. Before, it had taken three weeks and cost $120 to ship a ton of flour from Buffalo to New York City. With the canal, it took eight days and cost $6.
Produce that had gone down the Mississippi to New Orleans now began to flow eastward. In a few years the Boston poet and physician Oliver Wendell Holmes (father of the Supreme Court justice) described New York as “that tongue that is licking up the cream of commerce and finance of a continent.” In 1800 about 9 percent of American exports passed through the port of New York. By 1860, it was 62 percent.
With the opening of the Erie Canal, New York became the greatest boomtown the world has ever known. The population of New York had been increasing by about 30,000 every decade since 1790, with 123,000 inhabitants in 1820. By 1830, however, New York’s population had reached 202,000; by 1840, 313,000. It was 516,000 in 1850 and 814,000 in 1860. Development roared up Manhattan Island, at the astonishing rate of about two blocks a year.
Thanks to the Erie Canal, by the 1840s, New York’s financial market was the largest in the country. In that decade the telegraph began to spread quickly, allowing more and more people to trade in the New York market, which has dominated American financial activity ever since.
Even so, perhaps the greatest consequence of the Erie Canal was that its success made the country far more receptive to other projects of unprecedented scale and scope and encouraged its entrepreneurs and politicians to think big. The result was a still-continuing string of megaprojects—the Atlantic cable, the Brooklyn Bridge, the Panama Canal, Hoover Dam, the interstate highway system, the Apollo missions—that have marked the economic history of the United States and shaped the national character.
Nothing has characterized the Industrial Revolution as much as the ever-increasing consumption of energy. As steam-powered presses brought down the costs of books, magazines, and newspapers, the demand for cheap interior lighting increased dramatically. In the cities, gaslight became available beginning early in the 19th century. But the gasworks that transformed coal into gas were expensive to build, and the network of pipes that distributed the gas could operate profitably only in densely populated central cities.
For those beyond the reach of gas, whale oil was the illuminant of choice. But, as the demand for whale oil steadily increased in the early 19th century, the supply of whales declined rapidly. This, of course, caused the price to soar. In the 1850s, when a dollar a day was a good wage, a gallon of whale oil cost $2.50.
Other illuminants were utilized. One was camphene, made from turpentine. It produced a bright light but had a nasty habit of exploding. Another was kerosene, which could be made from coal, but the process was expensive.
The solution to the need for a cheap, abundant illuminant came from an unexpected source: rock oil. Petroleum, which means “rock oil” in Latin, had been known since ancient times from areas where it seeps to the surface naturally. But its chief use had been medicinal.
In 1853 a Dartmouth graduate named George Bissell happened to be visiting his old school and noticed in a laboratory a bottle of rock oil. He knew that it was flammable and suddenly he wondered if it could be turned into a marketable illuminant. He asked Benjamin Silliman, Jr., one of the country’s leading chemists, to investigate the possibilities while he organized a few investors to form a company. Silliman soon reported that rock oil was easily fractionated into various substances, including kerosene. Silliman was sufficiently impressed with the possibilities that he bought 200 shares in Bissell’s company.
But, while it was now clear that there was a market for products made from rock oil, there was as yet no good supply. Most rock oil in this country came from northwestern Pennsylvania, where it was skimmed off ponds. Then, in 1856, Bissell had a second bright idea. On a hot summer day, he was shading himself under a druggist’s awning when he saw a bottle of medicine made from rock oil that featured on its label a derrick of the sort used to drill for salt. Bissell wondered if one could drill for oil.
Bissell’s company hired a man named Edwin Drake to go to northwestern Pennsylvania and find out. Drake, who seems to have awarded himself the title of colonel by which he is often known, had a great deal of trouble persuading a salt-drilling crew to try to drill for oil, but on August 27, 1859, he struck it at 69 feet. Once a pump was attached to the well, Drake found himself with more oil than he had barrels to store it in.
In 1859 the total American production of petroleum was 2,000 barrels. Ten years later, it was 4.2 million barrels. By the turn of the 20th century, American production was 60 million barrels, and the United States was the world’s leading exporter of petroleum and its byproducts.
With the development of the gasoline-powered automobile in the last decade of the nineteenth century, the demand for petroleum would soar. Today, the American economy consumes 7.6 billion barrels of petroleum a year, 30 times the per capita consumption in 1900, and petroleum is one of the country’s largest and most capital-intensive industries. More than that, petroleum has become the linchpin of the world economy and thus of international politics. Armies march to defend or acquire its sources.
Thomas Edison is the very symbol of Yankee ingenuity. While most great inventors are known for one invention—Alexander Graham Bell for the telephone, Charles Goodyear for vulcanized rubber—Edison is remembered for dozens of them. Among many others, he invented or made fundamental improvements to electric light, the phonograph, the stock ticker, the movie camera, and the telephone, each of which fathered or greatly facilitated major segments of the American economy.
But, perhaps because it could not be patented, Edison is seldom remembered for what is probably the most significant of all his inventions, the industrial research laboratory. Edison in effect invented the industrialization of the process of invention, with consequences that continue to this day.
Already famous for his many contributions to the telegraph industry and not yet 30 years old, Edison was living in a brownstone in Newark, New Jersey, with his wife, father, and daughter and was manufacturing telegraph equipment. He needed more room and bought a house and two tracts of land in Menlo Park, New Jersey, then a farm town 25 miles southwest of New York City.
On a dirt road called Christie Street, he ordered the construction of a building, two stories tall and 100 feet long by 30 wide. In it he installed a laboratory, machine shop, and carpentry shop. Equally important, he employed people able to turn his ideas into reality, including machinists, carpenters, and a glass blower. It was in Menlo Park that Edison would develop some of his most famous inventions, including electric light and the phonograph.
But even more important than the inventions themselves was the process. Laboratories in the past had mostly pursued pure research, with little or no regard for the practical applications that might flow from that research. Menlo Park was all about practical application, turning ideas into products that would have commercial potential.
For instance, the light bulb by itself isn’t good for much; it needs a source of electricity. With the resources of Menlo Park, Edison was able not only to invent the light bulb but to design a complete system of generating plants, wires, and meters—in truth the modern electric power industry. Edison perfected the light bulb in 1879. By 1882, with his electrical generating plant on Pearl Street in lower Manhattan, he brought electric light to an urban neighborhood.
The new corporate giants that were emerging in the last two decades of the nineteenth century, such as General Electric, of which Edison was a founder, quickly adopted the idea of the industrial research laboratory. An incredible outpouring of inventions has resulted ever since. Just a few of the products to emerge from this process include artificial rubber and nylon (Du Pont), the transistor (AT&T), and the microprocessor (Texas Instruments and Intel).
Edison’s original idea, the systematizing of the process of research and development, has been the single most important factor behind American dominance of emerging technology in the twentieth century. The wholehearted embrace of the concept ensures that that dominance will continue. In 2006, the United States spent as much on research and development as did the European Union and Japan combined.
In 1893, the carburetor was invented in Europe. It was the last piece of the puzzle needed to make the automobile with a gasoline-powered internal-combustion engine a practical technology. Mechanics and amateur tinkerers in both Europe and America began building automobiles, essentially by hand. These vehicles were expensive and sold only to the rich.
By 1900, the United States was producing about 4,000 automobiles a year, and the companies producing them multiplied in a classic economic Darwinian competition. In 1903 alone no fewer than 57 car makers opened for business, and 27 went bankrupt. One of the automobile manufacturers that opened that year was the Ford Motor Company, whose principal owner (sole owner after 1915) was Henry Ford.
Ford’s father was a farmer near Dearborn, Michigan, and Henry received only a modest rural education. Ford hated farming, but he proved a born mechanic. In 1896 he built his first automobile in the carriage house behind his house, and in the next few years he built racecars that broke speed records. But Ford wasn’t really interested in racing or crafting cars for the wealthy. He wanted to produce cars for the average man.
It was a revolutionary concept that would have consequences Henry Ford never imagined and made him one of the most famous people in the world. In 1932 Aldous Huxley published his classic novel of the future, Brave New World, in which the people of that world reckon time not from the birth of Christ but from the birth of Henry Ford.
In 1908, Ford introduced the Model T. It was designed to be both rugged—to handle the usually awful roads of that time—and cheap to manufacture. At $850 not only was it much less expensive than the average automobile, it also cost only about a penny a mile to run. It was an instant success: 10,607 Model T’s sold that year, more than two and a half times as many cars as had been sold in America just eight years earlier.
Ford, having designed what he regarded as the perfect vehicle (and for its time it was), bent all the company’s efforts to reducing manufacturing costs to make the Model T accessible to an ever-larger segment of the population. In 1913 he introduced the assembly line, a fundamental concept in manufacturing ever since.
By 1916, the price of a Model T had dropped to $360, and Ford sold 730,041 of them that year. By 1920 Ford was building half the cars in the world, and the 5,000-year reign of the horse as the prime local mover of humans and freight had come to an end.
The cheap car remade the American economy. By the 1920s automobile manufacture was consuming 20 percent of the nation’s steel production, 80 percent of its rubber, and 75 percent of its plate glass. The need for roads gave an enormous boost to the construction industry and stimulated quarrying and cement manufacture. By the 1920s automaking had become the country’s largest manufacturing industry. It still is.
No country in world history has seen the economic growth the United States experienced in the first 150 years of its existence. And no industrialized nation has experienced a depression to equal what the United States endured from the fall of 1929 to the winter of 1932–33. In those three and a half terrible years the country went from blue-sky prosperity to near collapse. Unemployment increased from 3.1 percent to 26 percent, a figure that would have been far worse had not many jobs been made part time. Hours worked declined by 30 percent. Industrial production dropped by nearly half, as did GDP. The country produced 4 million automobiles in 1929 but only 1.7 million in 1931. The federal government’s budget went from a surplus of $734 million in 1929 to a deficit of $2.7 billion in 1932.
By the end of that year, as the nation waited for Herbert Hoover to finish his term on March 4, economic matters were deteriorating nearly by the minute. The index of industrial production declined 12.5 percent between December and March. Farm mortgages were being foreclosed at the rate of 20,000 a month. Then, in February, the banking system began to collapse. Were it to go, everything else would go too, for a modern economy can no more function without a banking system than a living body can function without a circulatory system.
More than 5,000 banks, mostly small and rural, had collapsed since 1929, and the sight of crowds outside banks demanding to withdraw their money had become common. But now the panic became general, and every bank was feared to be failing. On February 14, 1933, the governor of Michigan ordered all the state’s banks closed for eight days to prevent a fast-spreading panic from engulfing the entire state banking system.
Panic, of course, is highly contagious, and state after state followed Michigan in ordering its banks closed. By March 4, every bank was shut in 32 states, and most were in 6 more. In the remaining 10 states withdrawals were sharply limited, in Texas to no more than $10 a day. On inauguration day the New York Stock Exchange announced it would not reopen that morning and did not say when it would. The mightiest national economy on the face of the earth nearly ceased to function.
If one needs proof of the importance of psychology as a force in the economic universe, the first week of the Roosevelt administration provides it. On March 4 Roosevelt gave one of the very few inaugural addresses that have been remembered. The very first paragraph contains a line that has become an enduring part of the American political lexicon: “Let me assert my firm belief that the only thing we have to fear is fear itself—nameless, unreasoning, unjustified terror which paralyzes needed efforts to convert retreat into advance.”
The American people, listening by radio, responded. In the first week of the Roosevelt Presidency, the White House received 450,000 letters and cards. The Hoover White House had needed one clerk to handle its mail; Roosevelt needed 70. The next day the President ordered all the nation’s banks closed and called Congress into special session. Congress met the following Thursday, March 9, to consider the Emergency Banking Relief Act. The House passed it by acclamation in 38 minutes. It passed the Senate with equal speed, and Roosevelt signed it into law that evening.
On Sunday, Roosevelt gave his first fireside chat to an enormous radio audience. He told the country that when the banks began to reopen the following day, they would have had their books examined and would be a safer place to keep one’s money than under the mattress. The country believed him, and money and gold began to flow back into the banking system. The heart of the American economy, stilled for a week, began to beat again.
Roosevelt’s great inaugural speech proved the turning point of the Great Depression. Prosperity would not return until World War II, but all the economic indicators, which had been declining for three and a half years, turned around one by one. The year 1933 turned out to be one of the best years, in percentage terms, in Wall Street history, the Dow Jones rising 60 percent. The gross domestic product in 1934 was 17 percent higher than in 1933.
History, of course, abounds with what-ifs, and there is a very large one here. On February 15, the day after Michigan had closed all the state’s banks, President-elect Franklin Roosevelt came within inches of being assassinated (the bullets meant for him instead fatally struck the mayor of Chicago, with whom he was talking). One can only wonder what would have happened to the American economy and thus to the Republic itself had his vice president, the fiscally conservative, pay-as-you-go, and uninspiring John Nance Garner, given the inaugural address on March 4, 1933.
The Erie Canal was the most successful government-funded capital project of the early nineteenth century, with incalculable economic and social effects, almost all of them for the good. The GI Bill may well be the most successful government-funded capital project of the mid-twentieth. The capital, of course, was not invested in ditches and locks; it was invested in people. By doing so, it created the much larger middle class of educated people with financial assets that has powered the American economy ever since.
All these good results were entirely unforeseen. Perhaps never has there been a greater or happier example of the law of unintended consequences.
As the end of World War II approached, economists and business leaders alike were nearly unanimous in predicting renewed depression. Since Pearl Harbor the American economy had almost doubled as civilian needs were met, while a vast war machine poured out matériel for the Allied cause. But with victory in sight, it was feared that declining government spending (the federal budget would in fact drop by nearly two-thirds over the next three years) and 12 million soldiers and sailors pouring back into the job market would drive down wages and send unemployment (virtually nonexistent during the war) soaring.
To meet that anticipated crisis, on June 22, 1944, President Roosevelt signed into law the Servicemen’s Readjustment Act, which had passed both houses of Congress unanimously. Ostensibly it was intended to reward servicemen and women for their bravery and sacrifice, but its unstated purpose was to slow their return to the job market.
To do that, it provided generous benefits for veterans who chose to pursue more education, and 8 million of them attended college and technical schools. In 1950, nearly half a million college degrees were awarded, twice the number of a decade earlier. Between 1945 and 1952, the federal government spent $14 billion (not that far from $100 billion in today’s money) on GI Bill educational benefits, hugely increasing the country’s “intellectual capital,” just as the information age was dawning. And because historically each generation has tended to have about two more years of formal education than its parents, this investment has paid off ever since.
The GI Bill also revolutionized home ownership. It provided for the Veterans Administration to guarantee home mortgages, at first up to $2,000 but soon as much as $25,000 or 60 percent of the loan, whichever was less. Protected against loss if a borrower defaulted, many banks were willing to make housing loans with no money down. Millions of young families, happily creating the baby boom, were thus able to have something their parents had not: substantial financial assets. Instead of paying rent, they were building equity.
People such as William Levitt adapted Henry Ford’s idea of the assembly line to building houses. (The houses remained still, of course, while crews swarmed from one to the next, performing the same task over and over.) As GI Bill families poured into these new housing developments that grew up around every American city, the suburbs quickly became the pivot of American politics.
Thanks in surprisingly large measure to the GI Bill, in a generation the country was transformed from a nation of haves and have-nots to one of haves and have-mores, with consequences for American business that are still playing out.
The first digital computers appeared immediately after the end of World War II. ENIAC (an acronym for Electronic Numerical Integrator and Computer), which came online in 1946, was the size of a bus and used 18,000 vacuum tubes and miles of wiring. The programming was done by physically switching the wires on a switchboard-like grid. All those tubes and the necessary cooling system used as much electricity as a small town. By modern standards it was glacially slow.
Computers quickly shrank in size and grew in power, especially after the invention of the transistor at Bell Labs in 1947. The transistor does exactly what the vacuum tube does but is much smaller, cheaper to manufacture, and far more durable.
Still, computers remained hugely expensive both to buy and to operate, requiring highly trained personnel. They were restricted to governments and large corporations with equally large data-processing needs, such as insurance companies and banks. Twenty years after ENIAC, had every computer in the world suddenly shut down, the average man in the street wouldn’t have noticed until his bank statement failed to show up at the end of the month.
Today, little more than a generation later, if every computer in the world were to stop operating, civilization would collapse in seconds. Automobiles and household appliances would not run, ordinary purchases could not be paid for, business offices would cease to function, communications beyond direct conversations would become impossible. Airplanes would fall out of the sky.
What happened? Simple: the microprocessor. A computer’s power depends on both the number of transistors and the number of connections between them. If there are only two transistors, then 1 connection is needed. If there are three, then 3 are needed to connect them all. Four transistors require 6 connections, five need 10, six need 15, and so on. If these connections must be made, essentially, by hand, then the cost of building more powerful computers grows far faster than their computing power. Engineers called this the tyranny of numbers.
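The pattern in those figures is simply the count of pairs, an aside offered here for clarity rather than anything stated in the original: wiring each of $n$ transistors to every other requires

\[
\binom{n}{2} = \frac{n(n-1)}{2}
\]

connections, which gives 1, 3, 6, 10, and 15 for $n = 2, 3, 4, 5, 6$, and which grows roughly as the square of the number of transistors, far faster than the transistor count itself.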
In 1959, Jack Kilby of Texas Instruments and Robert Noyce of Fairchild Semiconductor developed the integrated circuit, a series of interconnected transistors laid down on a chip of silicon. In other words, the transistors and their connections were manufactured simultaneously. The tyranny of numbers was overcome. In 1971 Intel introduced the first commercial microprocessor, which is nothing less than a small computer on a silicon chip, to power a handheld calculator.
While the investment needed to design a microprocessor is very high, as is the cost of the machinery needed to manufacture it, once those investments are made, microprocessors can be turned out like so many high-tech cookies. This reduces the cost of each one by orders of magnitude.
They quickly increased in both power and speed, and Gordon Moore, in his famous Moore’s law, correctly predicted that the number of transistors on a chip would double every 18 months. The first Intel microprocessor had 2300 transistors. An Intel Core 2 has as many as 290 million, and there is no end in sight.
As the power of computers increased, the cost per calculation collapsed. What cost a thousand dollars in the 1950s costs a fraction of a cent today. As the price of computers declined, their use began to increase at a fantastic rate. The cheap computer caused (and is continuing to cause) an economic revolution, just as the steam engine did 200 years earlier and for precisely the same reason: It drastically reduced the cost of a fundamental economic input.
In the steam engine’s case, it was the cost of work-doing energy; in the computer’s, the cost of storing, retrieving, and manipulating information. Previously, only human beings could perform these tasks. Now, computers could increasingly do them at lower cost and at far higher speed and accuracy. Many products that could not exist without computers—cordless phones, cell phones, DVDs, CDs, digital cameras, GPS systems—started to flow into the marketplace, and endless new commercial uses for them began to be exploited.
And just as the railroad proved to be the most economically important subsidiary invention of the steam engine, so the Internet is proving to be that for the microprocessor. The Internet has already caused a revolution in publishing, news transmission, advertising, retail sales, and many other important parts of the economy. As a result, economic and political power is being redistributed to a degree we have not seen in this country since the days of Jacksonian democracy and the birth of the mass media 150 years ago.
The opportunities for entrepreneurs today are greater in number than they have ever been, even in this most entrepreneurial and opportunity-rich of countries.