Outline of the U.S. Economy
2012 EDITION
CONTENTS
CHAPTER 1: The Challenges of this Century
The world’s largest and most diverse economy currently faces the most severe
economic challenges in a generation.
CHAPTER 2: The Evolution of the U.S. Economy
The economy has expanded and changed, guided by some unchanging principles.
CHAPTER 3: What the U.S. Economy Produces
Large U.S. multinational firms have altered their production strategies and their roles
in response to globalization as they adapt to increasing competition.
CHAPTER 4: Competition and the American Culture
Competition has remained a defining characteristic of the U.S. economy, grounded in
the American Dream of owning a small business.
CHAPTER 5: Geography and Infrastructure
Education and transportation help hold together widely separated and distinct regions.
CHAPTER 6: Government and the Economy
Much of America’s history has focused on the debate over the government’s role in the
economy.
CHAPTER 7: A U.S. Economy Linked to the World
Despite political divisions, the United States shows no sign of retreat from global
engagement in trade and investment.
CHAPTER 8: A New Chapter in America’s Economic Story
The United States, in its democratic way, faces up to immense economic challenges.
PREFACE
“The panic itself was felt in every part of the globe,” The Wall Street Journal reported. “It was as if
a volcano had burst forth in New York, causing a tidal wave that swept with disastrous power over
every nation on the globe.” One of the after-effects: “an accumulation of idle money in the banking
centres.” The date of this item? January 17, 1908.
Given the sobering news that of late has arrived with distressing frequency, preparing this edition
of Outline of the U.S. Economy has been a real challenge. We have tried to approach the task with a
sense of historical consciousness. In addition to the 1908 events depicted above, the United States has
endured a Great Depression (began 1929), a Long Depression (began 1873), a Panic of 1837—“an
American financial crisis, built on a speculative real estate market,” says Wikipedia—and assorted
other recessions, panics, bubbles, and contractions, and emerged from each with its economic vigor
restored and its republican institutions vibrant.
We hope that our readers will find this new entry in our Outline series frank, informative, and
above all useful. We offer it in the spirit of optimism embedded deeply in American life.
—The Editors
CHAPTER 1
The Challenges of this Century
The world’s largest and most diverse economy currently faces the most severe economic challenges
in a generation.
From left, Vice President-elect Joe Biden and his wife, Jill, President-elect Barack Obama and his wife, Michelle, stop in
January 2009 on their way to inauguration and big challenges.
© AP Images
“We are still the nation that has overcome great fears and improbable odds.”
PRESIDENT BARACK OBAMA
United States of America
2010
The economy of the United States, which generates nearly $15 trillion a year in goods and services, is
the largest in the world and, by most measures, the most innovative and productive. American
households and employers make millions of daily decisions about what to spend, invest and save.
Many layers of laws, policies, regulations and court decisions both constrain and stimulate these
decisions. The resulting economy reflects market and individual choices but is also structured and
shaped by politics, policies and laws.
This edition of Outline of the U.S. Economy, updated in 2012, offers historical context for
understanding the interplay of individual economic decisions and the legal and political framework
that surrounds them. It is a primer on how the U.S. economic system emerged, how it works and how
it is shaped by American social values and political institutions.
The United States’ entrepreneurial and opportunistic culture supports competition and risk taking in
the economy, but many Americans also rely on government social “safety nets” to help them through
unemployment and retirement. These conflicting currents shape the U.S. economy. The most
fundamental questions about how the U.S. economy works and which policies best serve the nation
have been debated since the nation’s founding. Today’s economists and political leaders continue the
debate.
For more than two centuries the U.S. economy has responded to new opportunities and rewarded
long-term investment—but it has also proved vulnerable to booms and crashes. The cycle of highs
and lows swung violently in the first decade of the 21st century, culminating in the global financial
panic of 2008 and the “great recession” that followed.
Harper’s Weekly published scenes of U.S. farm life in the 1860s, years when America was poised to become a world
manufacturing power.
Courtesy of Library of Congress
“Those who labor in the earth are the chosen people of God, if ever he had a chosen people.”
THOMAS JEFFERSON
1787
By the time that General George Washington took office as the first U.S. president in 1789, the young
nation’s economy was already a composite of many diverse occupations and defined regional
differences.
Agriculture was dominant. Nine of 10 Americans worked on farms, most of them growing the food
their families relied on. Only one person in 20 lived in an “urban” location, which then meant merely
2,500 inhabitants or more. The country’s largest city, New York, had a population of just 22,000
people, while London’s population exceeded one million. But the handful of larger cities had a
merchant class of tradesmen, shopkeepers, importers, shippers, manufacturers, and bankers whose
interests could conflict with those of the farmers.
Thomas Jefferson, a Virginia planter and principal author of America’s Declaration of
Independence, spoke for an influential group of the country’s Founding Fathers, including many from
the South. They believed the country should be primarily an agrarian society, with farming at its core
and with government playing a minimal role. Jefferson mistrusted urban classes, seeing the great
cities of Europe as breeders of political corruption. “Those who labor in the earth are the chosen
people of God, if ever he had a chosen people,” Jefferson once declared.
Opposing Jefferson and other supporters of a farm-based republic was a second powerful political
movement, the Federalists, often favored by northern commercial interests. Among its leaders was
Alexander Hamilton, one of Washington’s principal military aides in the American Revolutionary War
(1775-1783), in which the American colonies had won recognition of their sovereignty from Britain.
Hamilton, a New Yorker who was the nation’s first secretary of the Treasury, believed that the young,
vulnerable American republic required strong central leadership and federal policies that would
support the spread of manufacturing.
In 1801, Jefferson became the third U.S. president and headed the Democratic-Republican political
party, later to be called the Democratic Party. In 1828, war hero Andrew Jackson from Tennessee
won election as the candidate of Jefferson’s wing, becoming the first U.S. president from a frontier
region. His combative advocacy for “ordinary” Americans became a main theme of the Democrats.
He declared in 1832 that when Congress acts to “make the rich richer and the potent more powerful,
the humble members of society—the farmers, mechanics, and laborers” who lack wealth and
influence—have the right to protest such treatment.
Hamilton argued that America’s unbounded economic opportunities could not be achieved without
a system that created capital and rewarded investment. Hamilton’s Federalists evolved into the Whig
Party and then the Republican Party. This major branch of American politics generally favored
policies to spur the growth of U.S. industry: internal infrastructure improvements, protective tariffs on
the import of goods, centralized banking, and a strong currency.
A BALANCING OF INTERESTS
The U.S. Constitution, ratified in 1788, sought to ground the new nation’s experiment in democracy in
hard-won compromises of conflicting economic and regional interests.
“The framers of the Constitution wanted a republican government that would represent the people,
but represent them in a way that protected against mob rule and maximized opportunities for careful
deliberation in the best interests of the country as a whole,” says professor Anne-Marie Slaughter of
Princeton University. “They insisted on a pluralist party system, a bill of rights limiting the power of
the government, guarantees for free speech and a free press, checks and balances to promote
transparent and accountable government, and a strong rule of law enforced by an independent
judiciary.”
The lawmaking power was divided between two legislative houses. The Senate, whose
membership was fixed at two senators from each state (and whose members, until 1914, were chosen by the state
legislatures rather than by direct election), was assumed to reflect business and landholder interests.
The Founders created the House of Representatives, with membership apportioned among the states
by population and elected directly by the people, to adhere more closely to the views of the broader
public.
Another essential constitutional feature was the separation of powers into three governmental
branches: legislative, executive, and judicial. James Madison, a primary author of the Constitution
and, beginning in 1809, the nation’s fourth president, said that “the spirit of liberty…demands checks”
on government’s power. “If men were angels, no government would be necessary,” he wrote, in
defense of the separation principle. But Madison also believed that the separations could not be
absolute and that each branch ought properly to possess some influence over the others.
The president thus appoints senior government leaders, chief federal prosecutors, and the top
generals and admirals who direct the armed forces. But the Senate may accept or reject these
candidates. Congress may pass bills, but a president’s veto can prevent their becoming law unless
two-thirds of each congressional house votes to override the veto. The Supreme Court successfully
claimed the right to strike down a law as unconstitutional, but the president retains the ability to
nominate new Supreme Court justices. The Senate possesses an effective veto over those choices, and
the Constitution assigns to Congress the power to fix the size of the Supreme Court and to restrict the
court’s appellate jurisdiction.
The Constitution outlined the government’s role in the new republic’s economy. At Hamilton’s
insistence, the federal government was granted the sole power to issue money; states could not do so.
Hamilton saw this as the key to creating and maintaining a strong national currency and a creditworthy
nation that could borrow to expand and grow.
There would be no internal taxes on goods moving between the states. The federal government
could regulate interstate commerce and would have sole power to impose import taxes on foreign
goods entering the country. The federal government was also empowered to grant patents and
copyrights to protect the work of inventors and writers.
The initial U.S. protective tariff was enacted by the first Congress in 1789 to raise money for the
federal government and to provide protection for U.S. manufacturers of glass, pottery, and other
products by effectively raising the price of competing goods from overseas. Tariffs immediately
became one of the young nation’s most divisive regional issues.
Hamilton championed the tariff as a necessary defensive barrier against stronger European
manufacturers. Hamilton also promoted a decisive federal hand in the nation’s finances, successfully
advocating the controversial federal assumption and full payment of the states’ Revolutionary War
debt, much of which had been acquired at low prices by speculators during the war. These measures
were popular among American manufacturers and financiers in New York, Boston, and Philadelphia,
whose bonds paid for the country’s industrial expansion.
But the protective tariff infuriated the predominantly agricultural South. It raised the price of
manufactured goods that southerners purchased from Europe, and it encouraged European nations to
retaliate by reducing purchases of the South’s agricultural exports. As historian Roger L. Ransom
observes, western states came down in the middle, objecting to high tariffs that raised the prices of
manufactured goods but enjoying the federal tariff revenues that funded the new roads, railroads,
canals, and other public works projects that their communities needed. The high 1828 barriers,
dubbed the “Tariff of Abominations” by southern opponents, escalated regional anger and contributed
to sectional tensions that would culminate in the U.S. Civil War decades later.
By 1800, the huge tracts of land granted by British kings to colonial governors had been dispersed.
While many large landholdings remained, particularly the plantations of the South, by 1796 the
federal government had begun direct land sales to settlers at $2 per acre ($5 per hectare),
commencing a policy that would be critical to America’s westward expansion throughout the 19th
century. The rising tide of settlers pushed the continent’s depleted Native American inhabitants
steadily westward as well. President Jackson made the displacement of Indian tribes government
policy with the Indian Removal Act of 1830, which led to the forced relocation of the Choctaw tribe to
the future state of Oklahoma along what came to be called the “Trail of Tears.”
The first regional demarcations followed roughly the settlement patterns of various ethnic
immigrant groups. Settlers from England followed the path of the first Puritans to occupy New
England in the northeastern part of the country. Pennsylvania and other Middle Colonies attracted
Dutch, German, and Scotch-Irish immigrants. There were French farmers in some of the South’s
tidewater settlements while Spain provided settlers for California and the Southwest. But the sharpest
line was drawn by the importation of African slaves, which began in America in 1619.
In the South, slave labor underpinned a class of wealthy planters whose crops—first tobacco, then
cotton, sugar, wool, and hemp—were the nation’s principal exports. Small farm holders were the
backbone of many new settlements and towns and were elevated by Jefferson and many others as
symbols of an “American character” embodying independence, hard work, and frugality.
Some of the Founding Fathers feared the direction in which the unschooled majority of Americans,
a “rabble in arms” in one author’s famous description, might take their new country. But the image
that prevailed was that of the farmer-patriot, once captured by the 19th-century philosopher Ralph
Waldo Emerson’s depiction of the “embattled farmers” who had defied British soldiers, fired “the
shot heard round the world,” and sparked the American Revolution.
President Jefferson’s purchase of the Louisiana territory in 1803 from France doubled the nation’s
size and opened a vast new frontier that called out to settlers and adventurers.
A SPIRIT OF INVENTION
Across the country, a flow of inventions sparked dramatic increases in farm output. Jefferson himself
had experimented with new designs for plow blades that would cut the earth more efficiently, and the
drive to improve farming equipment never slackened. In Jefferson’s time, it took a farmer walking
behind his plow and wielding his sickle as many as 300 hours to produce 100 bushels of wheat. By
the eve of the Civil War, well-off farmers could purchase John Deere’s steel plows and Cyrus
McCormick’s reapers, which cut, separated, and collected farmers’ grain mechanically. Advanced
windmills were available, improving irrigation.
In the next 40 years, steam tractors, gang plows, hybrid corn, refrigerated freight cars, and barbed
wire fencing to enclose rangelands all appeared. In 1890, the time required to produce 100 bushels of
wheat had dropped to just 50 hours. In 1930, a farmer with a tractor-pulled plow, combine, and truck
could do the job in 20 hours. The figure dropped to three hours in the 1980s.
Eli Whitney’s cotton gin, introduced in 1793, revolutionized cotton production by mechanizing the
separation of cotton fibers from sticky short-grain seeds. Cotton demand soared, but the cotton gin
also multiplied the demand for slave labor. Whitney, a Massachusetts craftsman and entrepreneur,
fought a long, frustrating battle to secure patent rights and revenue from southern planters who had
copied his invention, one of the earliest legal struggles over the protection of inventors’ discoveries.
Whitney did succeed on another front, demonstrating how manufacturing could be dramatically
accelerated through the use of interchangeable parts. Seeking a federal contract to manufacture
muskets, Whitney, as the story was told, amazed Washington officials in 1801 by pulling parts at
random from a box to assemble the weapon. He illustrated that the work of highly trained craftsmen,
turning out an entire product one at a time, could be replaced with standardized processes involving
simple steps and precision-made parts—tasks that journeymen could handle. His insights were the
foundation for the emergence of a machine tool industry and mass production processes that made
U.S. manufacturing flourish, eventually producing “a sewing machine and a pocket watch in every
home, a harvester on every farm, a typewriter in every office,” journalist Harold Evans notes.
The 19th century delivered other startling inventions and advances in manufacturing and
technology, including Samuel Morse’s telegraph, which linked all parts of the United States and then
crossed the Atlantic, and Alexander Graham Bell’s telephone, which put people in direct contact
across great distances. In 1882, Thomas A. Edison and his eclectic team of inventors introduced the
first system for generating and distributing electric energy to homes and businesses, lighting offices
along New York’s Wall Street financial district and inaugurating the electric age.
And a transportation revolution was launched with the completion of the first transcontinental
railroad, when converging rail lines from the East and the West met in Utah in 1869.
“The American economy after the Civil War was driven by the expansion of the railroads,” writes
historian Louis Menand. During the war, Congress made 158 million acres (63 million hectares)
available to companies building railroads. Railroad construction fed the growth of iron and steel
production. Following the first connection, other lines linked the country’s Atlantic and Pacific
coasts, creating a national economy able to trade with Europe and Asia and greatly expanding U.S.
economic and international political horizons.
The Richest Man in the World
In the post-Civil War Gilded Age, a generation of immensely wealthy industrialists rose to
prominence. Hailed as “captains of industry” by admirers and as “robber barons” by critics, these
titans dominated entire sectors of the American economy. By the end of the 19th century, oil had its
John D. Rockefeller, finance its J. Pierpont Morgan and Jay Gould, and tobacco its James B. Duke
and R. J. Reynolds. Alongside them were many others, some born into wealthy families, and some
who personified the self-made man.
None climbed further than Andrew Carnegie. He was the son of a jobless Scottish textile worker
who brought his family to the United States in the mid-1800s in hopes of better opportunities.
From this start, Carnegie became “the richest man in the world,” in the words of Morgan, who along
with his partners would in 1901 purchase what became U.S. Steel. Carnegie’s personal share of the
proceeds was an astonishing $226 million, the equivalent of $6 billion today, adjusted for inflation,
but worth much more than that as a percentage of the entire U.S. economy then.
Carnegie’s life exemplifies how an industrializing America created opportunities for those smart
and fortunate enough to seize them. As a teenager in Pennsylvania, Carnegie taught himself the
Morse code and became a skilled telegraph operator. That led to a job as assistant to Thomas A.
Scott, a rising executive in the Pennsylvania Railroad, one of the nation’s most important lines. As
Scott advanced, becoming one of the most powerful railroad leaders in the country, his valued
protégé Carnegie advanced too, sharing lucrative financial investments with Scott before going into
business himself to build iron bridges for the railroad. By the age of 30, Andrew Carnegie was a
wealthy man.
After quitting the railroad, Carnegie also prospered in oil development, formed an iron and steel
company, and shrewdly concentrated on steel rails and steel construction beams as railroad, office,
and factory construction soared. His manufacturing operations set standards for quality, research,
innovation, and efficiency. Carnegie also availed himself of secret alliances and advance knowledge
of business decisions, practices forbidden by today’s securities laws as “insider” transactions but
legal in Carnegie’s era.
Andrew Carnegie was a study in contrasts. He fought unionization of his factories. As other
industry leaders did, Carnegie imposed hard, dangerous conditions on his workers. Yet his concern
for the less fortunate was real, and he invested his immense wealth for society’s benefit. He financed
nearly 1,700 public libraries, purchased church organs for thousands of congregations, endowed
research institutions, and supported efforts to promote international peace. When his fortune proved
too great to be dispensed in his lifetime, Carnegie left the task to the foundations he had created,
helping to establish an American tradition of philanthropy that continues today.
CONVULSIVE CHANGES
Convulsive changes caused by industrialization and urbanization shook the United States at the end of
the 19th century. Labor movements began and vied for power, with immigrants helping to adapt
European protest ideologies into American forms.
By the 1880s, manufacturing and commerce surpassed farm output in value. New industries and
railroad lines proliferated with vital backing from European financiers. Major U.S. cities shot up in
size, attracting immigrant families and migration from the farms. A devastating depression shook the
country in the first half of the 1890s, forcing some 16,000 businesses to fail in 1893 alone. The
following year, as many as 750,000 workers were on strike, and the unemployment rate reached 20
percent.
Farmers from the South and West, battered by tight credit and falling commodity prices, formed a
third national political organization, the Populist Party, whose anger focused on the nation’s bankers,
financiers, and railroad magnates. The Populist platform demanded easier credit and currency
policies to help farmers. In the 1894 congressional elections, Populists took 11 percent of all votes
cast.
But American politics historically has coalesced around two large parties—the Republican and
Democratic parties have filled this role since the mid-1800s. Smaller groupings served mostly to
inject their issues into either or both of the main contenders. This would be the fate of the 1890s
Populists. By 1896, the new party had fused with the Democrats. But significant parts of the Populist
agenda subsequently found their way into law by way of the trans-party Progressive movement of the
20th century’s first two decades. Among the innovations were direct popular election of senators and
a progressive national income tax.
American Progressivism reflected a growing sense among many Americans that, in the words of
historian Carl Degler, “the community and its inhabitants no longer controlled their own fate.”
Progressives relied on trained experts in the social sciences and other fields to devise policies and
regulations to rein in perceived excesses of powerful trusts and other business interests. Writing in
1909, Herbert Croly, author of the hugely influential The Promise of American Life and first editor of
the New Republic magazine, expressed the Progressives’ credo in this way: “The national government
must step in and discriminate, not on behalf of liberty and the special individual, but on behalf of
equality and the average man.”
The influence of Progressive thought grew rapidly after the assassination of President William
McKinley in 1901 thrust Vice President Theodore Roosevelt into the White House. Adventurer,
naturalist, and scion of wealth, “Teddy” Roosevelt believed the most powerful corporate titans were
strangling competition. Businesses’ worst excesses must be restrained lest the public turn against the
American capitalist system, Roosevelt and his allies argued.
The New York World newspaper, owned by the influential publisher Joseph Pulitzer, editorialized
that “the United States was probably never nearer to a social revolution than when Theodore
Roosevelt became president.” Roosevelt responded with regulations and federal antitrust lawsuits to
break up the greatest concentrations of industrial power. His administration’s antitrust suit against the
nation’s largest railroad monopoly, Northern Securities Company, was a direct attack on the nation’s
foremost financier, J.P. Morgan. “If we have done anything wrong,” Morgan told Roosevelt, “send
your man to my man and they can fix it up.” Roosevelt responded, “That can’t be done.” The Supreme
Court’s ultimate decision against Northern Securities was a beachhead in the government’s campaign
to restrict the largest businesses’ power over the economy.
DEREGULATING BUSINESS
The 1980s tax cuts were only one part of a broad movement to reduce government’s economic role.
Another was deregulation.
During the 1970s, a number of thinkers attributed some of the nation’s economic sluggishness to the
web of laws and regulations that businesses were obliged to observe. These regulations had been put
in place for sound reasons: to prevent abuse of the free market and, more generally, to achieve greater
social equity and improve the nation’s overall quality of life. But, critics argued, regulation came at a
price, one measured by fewer competitors in a given industry, by higher prices, and by lower
economic growth.
During the economically trying 1970s and early 1980s, many Americans grew less willing to pay
that price. President Gerald R. Ford, a Republican who succeeded Richard M. Nixon in 1974,
believed that deregulating trucking, airlines, and railroads would promote competition and restrain
inflation more effectively than government oversight and regulation. Ford’s Democratic successor,
Jimmy Carter, relied heavily on a key pro-deregulation adviser, Alfred E. Kahn. Between 1978 and
1980, Carter signed into law important legislation achieving substantial deregulation of the
transportation industries. The trend accelerated under President Reagan.
The intellectual and political trends favoring deregulation were not limited to the United States.
Movements to empower private businesses and reduce government’s influence gained momentum in
Great Britain, Eastern Europe, and parts of South America. In the United States, courts and legislators
continued to carve away government regulations in important industries, including
telecommunications and electric power generation.
The most dramatic step was the 1984 breakup of the American Telephone and Telegraph Company,
the nationwide telephone monopoly. Prior to the government’s action, AT&T dominated all phone
service, both local and long-distance, and it argued that admitting new service providers would
threaten network reliability. AT&T obliged Americans to rent their telephones from its Western
Electric subsidiary, a monopoly that stifled the development of innovative types and styles of phones.
A far smaller rival, MCI Communications, contended that technology advances would enable
competition to flourish, benefiting consumers.
The federal government took up MCI’s cause, filing an antitrust suit asking a federal judge to end
AT&T’s monopoly. AT&T capitulated, agreeing to split off its local telephone service into seven new
regional phone companies. This began an era of intense competition and innovation around the
convergence of phones, computers, the Internet, and wireless communications. (AT&T maintained its
long-distance network, but in 2005 the company was purchased by one of its former local phone
subsidiaries.) While many American consumers found the changes in phone service confusing, they
eagerly snapped up a speedy parade of new communications products.
The loosening of regulations on electric power service in the 1990s has been far more
controversial, and its benefits disputed. For a century following Thomas Edison’s time, most
Americans purchased electricity from companies that operated legal monopolies in their regions.
State commissions regulated these utilities’ local rates, while federal regulators oversaw wholesale
sales across state lines. Prices were generally based on the costs of making electricity, plus a
“reasonable” profit for the utility.
About half of the U.S. states chose to open electric service to competition in the hope that new
products and lower prices would result. But these moves coincided with sharp increases in energy
prices beginning in 2000. A political backlash against electricity deregulation ensued, worsened by a
scandal surrounding the failure of Enron Corporation, a Texas-based energy company that had been a
key promoter of competitive electricity markets.
The deregulation movement stopped in midstream after 2000, leaving an electricity industry
partially regulated and partially deregulated, and divided by divergent regional agendas. Some areas
of the country rely on coal to generate electric power. Elsewhere, natural gas turbines, hydro-dams,
or nuclear plants are important sources of electricity, and in the 2000s, wind-generated power began
to grow. These differing regional interests slowed movement toward a national response to climate
change issues, including such possible measures as the development of renewable electricity
generation and an expanded power transmission grid. Instead, state governments have been the
principal policy innovators.
TECHNOLOGY’S UPHEAVAL
Technology is changing the fundamentals of economic competition, often faster than government,
political leaders, and the public can keep pace. The computer age grew out of a confluence of
discoveries on many fronts, including the first computer microprocessor, created in 1971. This
breakthrough combined key functions of computer processing that had been separate operations—the
movement of data and instructions in and out, the processing of data, and the electronic storage of
results—onto a single silicon chip no bigger than a thumbnail. It was the product of scientists at Intel
Corporation, a three-year-old start-up technology company that had attracted the support of wealthy
venture capitalists willing to bet large investments on new, unproven entrepreneurs. The raw material
for semiconductors gave the name Silicon Valley to the California region south of San Francisco that
became the center of U.S. computer innovation.
Before the invention of the silicon computer chip, computers were massive devices serving
government agencies and large businesses, and operated by specialists. But in 1976, two college
dropouts, Steve Jobs and Steve Wozniak, developed a small computer complete with
microprocessor, keyboard, and screen. They called it the Apple I, and it began the age of personal
computing and the dispersal of computer power to every sector of the economy.
The personal computer rapidly became an indispensable communications, entertainment, and
knowledge tool for homes and offices. IBM, the computer giant that had dominated mainframe
computers since the 1950s, produced a personal computer in the 1980s that quickly overtook Apple’s
lead. But IBM, in turn, was driven from PC manufacturing by competitors in the United States and
Asia who outsourced component fabrication to lowest-cost manufacturers and minimized production
costs of an increasingly low-margin item.
The biggest winner in this competition was Microsoft, a Redmond, Washington-based start-up
grounded in software, not manufacturing. Its founder, Bill Gates, had seized on the importance of
dominating the internal operating software that made the personal computer work. As rival computer
manufacturers rushed to copy the IBM model, Microsoft’s software became the standard for these
machines, and it steadily and relentlessly gained market share at the expense of other operating
system vendors. Gates’s company wound up collecting half of every dollar of sales by the PC
industry.
Gates moved into a realm of wealth comparable to that of John D. Rockefeller and Andrew
Carnegie, two titans of an earlier age of dynamic economic growth. Like his two predecessors’
companies, Gates’s Microsoft was attacked by competitors and governments for its dominance. And
Gates, like Rockefeller and Carnegie, became one of history’s most generous philanthropists,
committing billions of dollars to long-term campaigns to fight illnesses in Africa, improve education
in America, and support other humanitarian causes.
Rivaling the impact of the personal computer was another epochal breakthrough. The Internet,
including the searchable World Wide Web, accelerated a global sharing of information of every form,
from lifesaving technologies to terrorists’ plots, from dating services to the most advanced financial
transactions.
Like much American innovation, the Internet had roots in U.S. government science policy. The idea
of a self-standing, highly redundant network to link computers was conceived as a way to defend
government and research computers against a feared nuclear attack on the United States. But despite
its ties to government, the Internet achieved its global reach thanks to pioneering scientists such as Sir
Tim Berners-Lee and Vinton Cerf, who insisted that it must be an open medium that all could share.
Unlocking the Internet
This Google logo commemorates the visit by Britain’s Queen Elizabeth II to Google’s London office.
© AP Images
In 1998, two graduate students at Stanford University in California thought they saw how to unlock
the Internet’s rapidly expanding universe of information. A decade later, Google—as they called
their invention—had become the dominant Internet search engine in most of the world. Its revenue
topped $20 billion in 2008, half from outside the United States, and its employees numbered 20,000.
Its computers could store, index, and search more than one trillion Web pages. So
ubiquitous had this search engine grown that its very name had become a verb: When most people
want to find something on the Internet, they “google” it.
Although this astonishing success has rarely been matched, its ingredients are a familiar part of
the U.S. economic story. Google illustrates how ideas, entrepreneurial ambition, university research,
and private capital together can create breakthrough innovations.
Google’s founders, Sergey Brin and Larry Page, started with particular advantages. Brin, born in
Moscow, and Page, a midwesterner, are sons of university professors and computer professionals.
“Both had grown up in families where intellectual combat was part of the daily diet,” says David
Vise, author of The Google Story. They met by chance in 1995 at an orientation for new doctoral
students at Stanford University’s graduate school, and by the next year they were working together at
a new Stanford computer science center built with a $6 million donation from Microsoft founder
Bill Gates.
As with other Internet users, Brin and Page were frustrated by the inability of the existing search
programs to provide a useful sorting of the thousands of sites that were identified by Web queries.
What if the search results could be ranked, they asked themselves, so that pages that seemed
objectively most important were listed first, followed by the next most important, and so forth?
Page’s solution began with the principle that sites on the Web that got the most traffic should stand at
the top in search reports. He also developed ways of assessing which sites were most intrinsically
important.
At this point, Stanford stepped in with critical help. The university encourages its PhD students to
use its resources to develop commercial products. Its Office of Technology Licensing paid for
Google’s patent. The first funds to purchase the computers used for Google’s searches came from a
Stanford digital library project. The search engine’s first users were Stanford students and faculty.
The linkages between university research and successful business innovation have not always
thrived in regions where technology industries are not well rooted. But Stanford, in Palo Alto,
California, stands at the center of Silicon Valley, a matrix of technology companies, investment
funds, and individuals with vast personal fortunes that accumulated over the decades of the computer
industry’s evolution.
In 1998, Brin and Page met Andy Bechtolsheim, a co-founder of Sun Microsystems, an
established Silicon Valley leader. Bechtolsheim believed that Brin and Page could succeed. His
$100,000 personal check helped the pair build their computer network and boosted their credibility.
A year later, Google was handling 500,000 queries a day and winning recognition across the Internet
community. Google’s clear advantages over its rivals and the inventors’ commitment attracted $25
million in backing from two of Silicon Valley’s biggest venture funds. And the founders got the
money without having to give up control of the company.
A decade after its founding, Google’s goals have soared astronomically. As Randall
Stross, author of Planet Google, puts it, the company aims to “organize everything we know.” Its
initiatives include an effort to digitize every published book in the world.
Google has emerged as a metaphor for the openness and creativity of the U.S. economy, but also
for the far-ranging U.S. power that so worries foreign critics. Human rights advocates and
journalists blasted Google’s 2006 agreement to self-censor its search engine in China at the
direction of Beijing’s government. Google answers that these kinds of restrictions will fade with the
spread of democracy and individual freedoms. If that proves true, this example of American
entrepreneurship will have been an agent of that change.
GOVERNMENT IN ACTION
The emergency responses by the U.S. government across a broad front—the White House, Congress and
the Federal Reserve—were among the most dramatic in history, according to economists Alan S.
Blinder and Mark Zandi. The federal government and the Federal Reserve (central bank) seized
control of the two largest U.S. home mortgage firms and bailed out leading banks and a major
insurance company—actions that would have been politically unthinkable before the crisis. An initial
$700 billion bank rescue plan proposed by President George W. Bush won bipartisan support in the
U.S. Congress.
Americans elected new national leadership in the midst of the crisis, choosing Barack Obama as
their new president. President Obama and the 111th Congress adopted a stimulus bill at the beginning
of 2009 that included an estimated $787 billion in tax cuts and targeted government spending on
infrastructure and energy—the largest economic rescue measure ever.
The financial intervention is credited with averting a catastrophe. Blinder and Zandi estimate that,
without the government’s response, 8.5 million more jobs would have been lost in 2010 and the
economy would have suffered a widespread price collapse.
The massive economic stimulus plan passed by the U.S. Congress early in the Obama
administration also sought to fuel expansion of new, technologically advanced energy and
environmental initiatives. These developments, it was hoped, would create new markets at home and
overseas for American companies and millions of jobs for workers across a wide range of skill
levels.
The Obama administration invested an unprecedented $32 billion in stimulus funds, and billions
more in tax credits and loan guarantees, in a wide range of clean-energy research and development
initiatives in 2009 and 2010. The ventures spanned many fronts: advanced nuclear reactors, wind and
solar generation, advanced storage batteries, “smart” electricity meters and electricity grid
monitoring equipment, and biomass and greenhouse gas sequestration from coal plants. Many projects
combined research from U.S. universities and national laboratories with financial backing from
private venture investors, augmented by government grants in a characteristic synergy of U.S.
innovation.
Job growth resumed in 2010. The stock market recovered slowly. Prices of large U.S. company
securities had fallen by more than half between January 2008 and March 2009. By mid-2011, rising
stock prices had erased the losses from 2008.
The dollar maintained its reputation as a safe haven for investors throughout the crisis. But the
government’s actions to stimulate the economy did not trigger the hoped-for strong rebound. Cautious
U.S. corporations were holding cash rather than spending money on expanding production and hiring
workers. Although the recession ended in June 2009, according to the U.S. National Bureau of
Economic Research, the U.S. unemployment rate remained near 10 percent in 2009 and 2010 and
about 9 percent in 2011.
At the end of 2010 both monetary policy and fiscal policy were straining to keep the economy from
faltering. With short-term interest rates already near zero, the Federal Reserve used a controversial
initiative to buy $600 billion worth of bonds in an attempt to drive down long-term interest rates. The
Federal Reserve has signaled that it intends to keep interest rates low into 2013.
In the meantime President Obama negotiated a stimulus package that extended expiring 2001 tax
cuts for two more years through 2012 and extended unemployment insurance payments through 2011.
Passed by a divided Congress, the package was projected to increase the national debt by $900
billion. At the beginning of 2012 a divided Congress was still struggling to agree on tax policy.
Some positive economic developments happened at the end of 2011, notably a drop in the
unemployment rate to 8.5 percent in December, the lowest level since February 2009. The U.S.
economy had added jobs for 15 months in a row.
While the U.S. economy appeared to continue strengthening as 2012 began, many uncertainties
remained, including a possible recession in Europe, a slowdown in China and other emerging
markets, and continued wrangling over U.S. tax policy.
CHAPTER 3
What the U.S. Economy Produces
Large U.S. multinational firms have altered their production strategies and their roles in response to
globalization as they adapt to increasing competition.
Standing by itself, U.S. manufacturing would be the eighth largest economy in the world.
U.S. MANUFACTURING INSTITUTE
2006
The U.S. economy is in the midst of its second radical conversion. The first represented a shift from
agriculture to manufacturing. The past quarter-century has witnessed a further evolution toward
finance, business services, retailing, specialized manufacturing, technology products, and health care.
The first revolution mated European capital to America’s burgeoning 19th-century expansion, while
the current transition reflects Americans’ response to unprecedented global competition in trade and
finance.
Like other economies, the U.S. economy comprises a circular flow of goods and services between
individuals and businesses. Individuals buy goods and services produced by businesses, which
employ individuals and pay them wages and benefits, providing the income that individuals use to
make new purchases of goods and services and investments, or to save.
The most common measure of the U.S. economy is the federal government’s report on the gross
domestic product (GDP). GDP records the dollar value of all final goods and services produced in the
United States: purchases by consumers, private investment, government spending, and exports minus
imports. (It counts production within U.S. borders, including that of foreign-owned firms operating in
the United States, but excludes the output of American companies operating in foreign countries.)
GDP is made up both of goods and services for final sale in the private-sector market and
nonmarket services, such as education and military defense, provided by governments. In principle,
the value of goods and services in the market reflects an exchange between willing buyers and sellers
and is not fixed by government, with some notable exceptions such as government farm and energy
subsidies.
In 2011, the $15.1 trillion U.S. gross domestic product comprised approximately $10.7 trillion in
personal spending by American consumers; $1.9 trillion in private investments for homes, business
equipment, and other purposes; and $3 trillion spent by governments at all levels, minus an
international trade deficit of $578 billion—the difference between what the United States imported and
what it exported in goods and services.
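These components follow the standard expenditure identity for GDP. As a rough back-of-the-envelope check using the rounded figures above (not official data; the small gap from the reported $15.1 trillion reflects rounding of the published components):

\[
\text{GDP} = C + I + G + (X - M) \approx 10.7 + 1.9 + 3.0 - 0.578 \approx \$15.0\ \text{trillion},
\]

where C is personal consumption, I private investment, G government spending at all levels, and X − M net exports of goods and services.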
Looking at GDP another way, in 2010 governments collected $2.7 trillion in taxes, roughly 60
percent of that on personal income and the rest on production and business profits. Governments paid
out $3.2 trillion in benefits, primarily to individuals, and $202 billion in interest to holders of
government debt. (The United States places near the middle of major economies in its overall tax
burden, ranking 18th out of 35 nations surveyed in 2009 by the Organization for Economic
Cooperation and Development.)
GDP sources are broken down into major economic sectors such as manufacturing and retail sales.
Comparing the 2010 output of these sectors with 1980 shows the magnitude of the shift from goods to
services over the past 30 years. In 2010, manufacturing provided 12 percent of total U.S. domestic
output of goods and services. In 1980, its share was 20 percent. Finance and real estate services
overtook manufacturing, contributing 21 percent of the U.S. economic output in 2010 versus 16
percent in 1980. Suppliers of professional business services, including lawyers and consultants,
contributed as much value as manufacturing—12 percent of the domestic economy. This figure was
only 7 percent in 1980. Retail and wholesale trade, at 12 percent, was slightly lower than in 1980.
The category of health care and private educational services was 9 percent in 2010, compared to 4
percent in 1980. Government at all levels accounted for 14 percent of the country’s economic output
in 2010, essentially unchanged from 1980. Oil and gas production dropped to just over 1 percent of
the nation’s output in 2010, from 2 percent in 1980.
Excluding government’s share of the economy, goods-producing companies made up 21 percent of
total private-sector output in 2010, down from 34 percent in 1980. The services sector climbed from
67 percent to 79 percent during that period.
A “greeter” awaits customers entering one of the stores of the chain Wal-Mart, the largest private employer in the United States.
Courtesy of Wal-Mart
The story of Wal-Mart’s stunning rise within a single generation from a commonplace, low-price
variety store in Arkansas to the world’s largest and most powerful retailer illustrates many
fundamental shifts taking place in the U.S. economy. Wal-Mart’s fixation on beating competitors’
prices and squeezing its operating costs to the bone year after year has proved to be a potent
strategy. By 2006, The Wal-Mart Effect author Charles Fishman reported, more than half of all
Americans lived within eight kilometers of a Wal-Mart store.
Although Wal-Mart typically sought out U.S. manufacturers to stock its shelves, as the company
grew, Wal-Mart management accelerated its search for lower-cost products and components in
overseas markets. Today, Wal-Mart has become the most important single conduit for foreign retail
goods entering the U.S. economy.
Wal-Mart’s spread across the American landscape has provoked intense opposition from critics,
led by labor organizations fighting what they view as the company’s antiunion policies. Wal-Mart
workers make half the wages of factory workers, or less, and have sometimes had wages capped to
hold down store costs. Personnel turnover is relatively high, but the company reports it routinely
gets 10 applications for every position when a new store opens. The company is using its economic
clout to promote energy-efficient products, solar energy installations at its stores, and fuel
conservation by its truck fleet, and has urged employees to support its “green” strategies. Its “big
box” stores, exceeding 13,000 square meters in size, have been vilified by some for overwhelming
nearby small-town merchants.
However, retailing in the United States has always been intensely competitive, with losing
technologies and strategies falling by the wayside. The spread of electricity in cities and the
invention of the elevator in the 1880s enabled retailing magnate John Wanamaker and imitators to
create the first downtown department stores. Then Sears and other catalog stores opened a new
retailing front—shopping from home. The movement of Americans who followed the Interstate
Highway System to ever more distant suburbs undermined local merchants long before Wal-Mart
reached its leviathan size. And Wal-Mart’s recent U.S. growth has slowed, as it and other big
retailers face competition from Internet shopping and specialty marketers.
The older, simpler U.S. retail model of a century ago, when community-based merchants sold
largely made-in-America products, might have provided a more stable economic base for some
communities. But this static model often failed to adapt to new conditions generated by the nation’s
dynamic economic, social, and political institutions.
Some of the wealth amassed in the economy goes to good causes. Microsoft founder and billionaire Bill Gates, shown here with
a Mozambique vaccine trial patient, has made philanthropy his new job.
© AP Images
“Americans…are also hustlers in the positive sense: builders, doers, go-getters, dreamers, hard
workers, inventors, organizers, engineers, and a people supremely generous.”
WALTER McDOUGALL
2004
Joseph Schumpeter, an Austrian-born economist, coined the term “creative destruction” in 1942 to
describe the turbulent forces of innovation and competition in Western economies. He called it the
“essential fact about capitalism.” The “incessant gales” of markets cull out failing or
underperforming companies, clearing the way for new companies, new products, and new processes,
as he put it.
Creative destruction was a philosophy that appealed to critics of the New Deal social and
economic intervention that took hold during the Great Depression, and it maintains an influential
following today. “I read Schumpeter in my 20s and always thought he was right,” said former Federal
Reserve Chairman Alan Greenspan, “and I’ve watched the process at work through my entire career.”
Today “disruptive technology” is the label for change-forcing innovation.
The juxtaposition of creation and destruction captures the ever-present tension between gains and
losses in the American market economy. The process has never been without critics and political
opponents. But because the winners have substantially outnumbered the losers, the churn of
competition remains a defining characteristic of the U.S. economy.
Outsiders often equate the U.S. economy with its largest corporations and what they make and do.
They may be surprised, then, by the vital part that small businesses play. Napoleon is said to have
dismissed England as “a nation of shopkeepers.” The phrase could also be applied in considerable
degree to the United States, whose shop owners and other small businesses account for over half of
the private-sector U.S. workforce and economic output, excluding farming. (“Small” businesses are
defined as having fewer than 500 employees.)
A typical American town or suburb of more than 10,000 people is populated with individual
business owners and small firms—car dealers; accountants and lawyers; physicians and therapists;
shoe repairers and cleaning establishments; flower and hardware stores; plumbers, painters, and
electricians; clothing boutiques; computer repair shops; and restaurants of a half-dozen ethnic flavors.
Many of the small retailers compete with national chains boasting billions of dollars in revenue and
thousands of employees.
Despite the odds against them, small businesses account for a vast majority of job growth,
particularly as major manufacturing companies trim employment in the face of stiff global
competition. In 2004, for example, the number of jobs in small businesses grew by 1.9 million overall
from the year before. Larger companies with 500 employees or more lost 181,000 net jobs.
(Economists point out that many small businesses provide goods and services to large companies and
thus are tied to their fortunes.)
PRAISING WORK
The original contours of the American economy were defined by a culture that elevated conscientious
work into a national value. “In the beginning America was the land and the land was America,” wrote
anthropologist and businessman Herbert Applebaum. Unlike Britain, the New World offered the
promise of landownership to the typical settler, at least once the Native American peoples had been
driven off. But the land was useless without an investment in “backbreaking and continuous work,”
Applebaum added. The farmer had to master a dozen tradesman’s skills. The tradesman had to farm.
Necessity bred a deep strain of individualism within the communal settlements that spread across the
land.
As the American colonies prospered and then combined in their unlikely Revolutionary War
victory, Americans increasingly viewed work not merely as a requisite of survival but as the path to
success.
“Significant numbers of Americans believe that anyone, high or low, can move up the economic
ladder as long as they are talented, hardworking, entrepreneurial, and not too unlucky,” wrote Yale
University law professor Amy Chua. This belief helps explain the relative weakness of class-based
political movements in the United States and the acceptance—however grudgingly—by most
Americans of greater disparities in wealth than are found in other developed nations, Chua and other
commentators say.
The sociologist and political economist Max Weber, writing a century ago in his influential The
Protestant Ethic and the Spirit of Capitalism, argued that Protestant religions helped build
capitalism’s foundation by endorsing hard work, honesty, and frugality. That spirit survives, but in
changing forms, says the urban studies theorist Richard Florida.
In his 2005 book, The Flight of the Creative Class, Florida argues that the protest movements of
the 1960s and 1970s eventually sparked new perceptions of work. Increasingly not just hard work,
but fulfilling, interesting, fun work became the goal of the baby-boom generation that dominated the
U.S. economy in the last third of the 20th century.
But even this cultural turn reflected traditional American traits. A streak of pragmatism, skepticism,
and contrariness runs deep in the American character, historians say. “The American’s attitude toward
authority, rules, and regulations was the despair of bureaucrats and disciplinarians,” writes
historian Henry Steele Commager.
American history suggests that whatever future form it takes, the individualism and contrariness
that seem wired into the national culture will continue to fuel Americans’ hustling, striving nature.
CHAPTER 5
Geography and Infrastructure
Education and transportation help hold together widely separated and distinct regions.
Pittsburgh, Pennsylvania, became a steelmaking center at the confluence of rivers, coal beds, and rail.
© Gianna Stadelmyer/Shutterstock
“It is one of the happy incidents of the federal system that a single courageous state may… serve
as a laboratory and try novel social and economic experiments…”
JUSTICE LOUIS BRANDEIS
U.S. Supreme Court
1932
As a continental nation spanning much of the territory between two great oceans, the United States is
blessed with tremendous natural resources: a treasure of forests, seacoasts, arable land, rivers, lakes,
and minerals. School atlases of North America once located important economic resources with
simple icons placed on a map: office skyscrapers marking the Eastern Seaboard’s metropolitan
centers; factories flanking the Great Lakes industrial belt; stacks of wheat and grazing livestock on the
Great Plains; cotton in the Old South and eastern Texas; coal in the Appalachian Mountains of the
East and on the eastern slopes of the Rocky Mountains; iron ore in Minnesota’s Mesabi Range; oil
wells in the Southwest, California, and Alaska; timber and hydropower in the Southeast and
Northwest.
Of course these resources were found in many places. The area around Pittsburgh, Pennsylvania,
became a center of steelmaking because of the nearby coal deposits and its rail and river connections
to the rest of the country. Gary, Indiana, and Birmingham, Alabama, were big steel cities, too. John D.
Rockefeller’s oil fortunes were made in Pennsylvania, but Texas’s plains, the coastal states along the
Gulf of Mexico, southern California, and Alaska also held large oil reserves. Even so, those
old schoolbook maps correctly pinpointed the different centers of America’s resource wealth from
which the economy grew.
A similar 21st-century economic map would look very different. Old manufacturing cities around
the Great Lakes have lost hundreds of thousands of production jobs over the past two decades. Other
metropolitan areas have grown on the strength of their technology and finance sectors. Even so, the
American economy retains its strongly regional character.
A NATION OF REGIONS
Distinct regions emerged in America’s first century as immigrants from different lands moved to parts
of the country where their skills might best be suited and their families welcomed. Scandinavian
farmers landed in Minnesota; Jewish immigrant tradesmen from Europe’s cities settled in New York
and other major northern cities; Mexican farm workers beat a path to California’s orchards and fields.
Settlers followed kinsmen, creating clusters of common customs that took root in each region.
Journalist Dan Morgan has observed that orderly New England “Yankees” moving from their homes
in the northeastern United States to Ohio laid out plans for future towns with schools and courthouses
“before the first harvest was in.” German immigrants erected sturdy dairy barns in Pennsylvania, built
to last, and they did, as one generation followed another. Farmers and townspeople in the East sought
land or fortune on western frontiers, braving life-threatening challenges. Those who made it
implanted a strong individualistic strain that still characterizes the western outlook.
This clustering of people, skills, and resources fostered the emergence of distinct regional
identities and personalities. Journalist Joel Garreau, in his book The Nine Nations of North America,
suggests that the United States, Canada, Mexico, and the Caribbean contain separate North American
regions with different, defining characteristics. The U.S. regions are New England; the old industrial
states around the Great Lakes; the South with its historical legacies and new economic dynamism; the
breadbasket of farmlands from the Midwest to the Great Plains; the thinly settled wilderness and
desert regions along the Rocky Mountains; the center of Latino presence in Texas and the Southwest;
the nucleus of environmental activism along the Pacific Coast; and the tip of Florida with its ties to
the Caribbean.
“Some are close to being raw frontiers; others have four centuries of history. Each has a peculiar
economy; each commands a certain emotional allegiance from its citizens. These nations look
different, feel different, and sound different from each other,” Garreau wrote. “Some are clearly
divided topographically by mountains, deserts, and rivers. Others are separated by architecture,
music, language, and ways of making a living. Most importantly, each nation has a distinct prism
through which it views the world.”
Differences in character affected how each region developed. An example is water. The first
settlers reaching America from Britain brought with them the traditions of English common law.
Owners of “riparian” property—on the banks of lakes and rivers—had the right to claim use of the
“natural flow” of water past their lands. But this principle was tested by economic competition. Mill
owners, key players in the northern colonies’ economy, could claim competing rights to the same
river.
To settle these disputes, American courts created the doctrine of “reasonable use.” It is, in effect, a
requirement that users fairly share water resources. What was reasonable in these disputes varied
from state to state and region to region, but it often meant that a bigger mill or factory could make a
greater claim on a river’s flow than a smaller one. The factory cities that sprung up along the rivers of
the northeastern United States owed their existence to shared water supplies.
The California gold rush of 1848 led to an entirely different doctrine, one that met the miners’
needs and would shape the uses of water throughout the West. A miner finding a gold seam would
claim the land and water from the nearest creek to wash dirt away from the precious nuggets. The
miner’s claim established a “first-in-time, first-in-use” priority allowing him to take as much water as
he required.
After the gold rush ended, the miners’ approach to water rights became an established custom.
Unlike the principle of shared resources in the East, the miners’ “prior appropriation” doctrine, as it
became called in the West, allowed pioneering developers to claim vast amounts of water to support
the expansion of cities in arid Southern California and other southwestern states and to help western
farmers grow crops on dry land by tapping immense underground water aquifers without limitations.
Los Angeles and Las Vegas exist as metropolitan cities today because of the western water rights
doctrine.
The example of water rights illustrates the variety of regional policies, laws, and practices that
emerged within a diverse Union. U.S. Supreme Court Justice Louis D. Brandeis framed the case for
the diversity of state policies in a widely noted dissenting opinion on a 1932 case before the court: “It
is one of the happy incidents of the federal system that a single courageous state may, if its citizens
choose, serve as a laboratory, and try novel social and economic experiments without risk to the rest
of the country.” States remain laboratories of policy innovation in education, energy supply, and
public transportation.
UNIFYING FORCES
The landscape of U.S. history is covered with travelers’ paths. The economic blight throughout the
South after the U.S. Civil War sent thousands of Scotch-Irish immigrants and their children drifting
westward to find open farms in Texas and the Indian Territory. “When conditions became
intolerable, they exercised their ultimate right as Americans—the right to move on,” Dan Morgan
wrote. They chalked “GTT” on abandoned front doors and departed. Their neighbors knew the
initials meant “Gone to Texas.”
The Great Depression and dust storms of the 1930s forced the greatest migration in the nation’s
history, as 300,000 people from Oklahoma, Texas, Missouri, and Arkansas headed for California’s
fertile central valley. Fearful California authorities raised a sign in Tulsa, Oklahoma, warning, “No
Jobs in California. If you are out of work keep out!” But the Okies, as they were called, went anyway.
The movement of people was triggered by both opportunity and necessity. A long-running migration
of African Americans out of the South continued throughout the 20th century as farm mechanization
displaced hand labor. The greatest transition began during World War II, when northern steel and auto
factories offered jobs to African Americans to fill wartime vacancies. Economic necessity prevailed
over traditions of racial bias.
New England’s textile industry over the past century gradually moved to the South, where land was
cheaper and labor unions weaker. In recent decades, foreign auto and truck companies have set up
factories across the South, welcomed by growth-minded business and civic leaders. Today, once-
empty towns in Wyoming are filling up with newcomers taking jobs in the state’s expanding coal
industry.
The mobility of American workers is well documented. One study in the past decade reported that,
on average, U.S. college graduates would work for 11 employers before retirement. The U.S. Bureau
of Labor Statistics calculated that college graduates would hold 13 different job positions, counting
promotions and changes of employers, before reaching 38 years of age.
The willingness of Americans to “get up and go” is recorded by the national census taken every 10
years. The 1990 U.S. census found that just 60 percent of the country’s people were living in the same
state where they were born. And that average concealed considerable variations among the states.
Eighty percent of Pennsylvanians surveyed in that census, and more than 70 percent of residents of
other states, including Iowa, Louisiana, Michigan, Minnesota, and Mississippi, were living in their
birth state. But only 30 percent of Florida’s residents could say the same.
Migration continued in the beginning of the 21st century. From 2000 to 2004, the northeastern
United States lost a net average of 246,000 residents a year, and the Midwest’s population declined
by an average 161,000 people a year. But the South gained 352,000 people a year on average. In the
West, Pacific Coast states lost an average 75,500 residents a year, but the Rocky Mountain states
gained an average 130,000.
REGIONAL CENTERS
As international competition and foreign trade became larger factors in the U.S. economy during the
first decade of the 21st century, a shift of jobs away from the older centers of factory production
accelerated. The regions gaining jobs have been regional centers where technology and finance are
strongest, as shown by government data on job gains and losses for major U.S. cities from 2000 to
2007.
While job growth throughout the United States averaged less than 1 percent a year during those
seven years, Huntsville, Alabama, a center of U.S. space technology, had a 42 percent increase in
“professional, scientific, and technical” jobs. Austin, Texas, where semiconductor production has a
strong footing, had a 22 percent gain in the same category of technology jobs. In Northern Virginia,
whose economy is built on the presence of major contractors who work on the federal government’s
technology missions, jobs in the professional and scientific category expanded by 31 percent from
2000 to 2007, and computer system design jobs grew by the same percentage.
In contrast, Chicago, America’s “second city” and the centerpiece of the old manufacturing
Midwest, lost 19 percent of its goods-producing jobs over those seven years. South Bend, Indiana,
another old factory city, lost 18 percent of its goods-producing jobs. Detroit, Michigan, home of the
U.S. car industry, suffered a 35 percent drop in goods-producing jobs.
Well before the start of the 21st century, many had concluded that America’s economy could no
longer prosper simply by employing Yankee ingenuity to convert its wealth of natural resources into
products for sale at home and abroad. Nor could it rely on older industries that had been centerpieces
of state and regional economies to hold their places in competitive markets.
Since the 1980s, many local officials have tried to stimulate their economies by investing in their
region’s education and technology resources. Some governors have created technology
“greenhouses”—giving space in research facilities to help entrepreneurs develop new products and
processes. Universities have developed courses to equip scientists and engineers with specific skills
needed by local companies.
Such regional strategies lost momentum in the 2000s decade as the economy grew and
unemployment shrank. But the steep recession that began in 2008 was expected to renew interest in
these policies.
CHAPTER 6
Government and the Economy
Much of America’s history has focused on the debate over the government’s role in the economy.
Rachel Carson, a government scientist, raised concerns about pesticide use that led to government environmental regulation.
© Underwood & Underwood/Corbis
“Then a strange blight crept over the area and everything began to change....There was a
strange stillness....The few birds seen anywhere were moribund; they trembled violently and
could not fly. It was a spring without voices. On the mornings that had once throbbed with the
dawn chorus of scores of bird voices there was now no sound; only silence lay over the fields and
woods and marsh.”
RACHEL CARSON
Silent Spring
1962
The United States was established on the mutually reinforcing principles of individual enterprise and
limited governmental influence. The rage of the American colonists over a range of taxes imposed by
the British Crown helped trigger the Revolutionary War in 1775. “Taxation Without Representation”
was a battle cry. The new republic’s first secretary of the Treasury, Alexander Hamilton, succeeded
in establishing a national bank but lost his campaign for a federal industrial policy in which
government would promote strategically important industries to strengthen the nation’s economy and
its military defense.
But this predisposition toward free enterprise was not absolute. From the beginning, the country’s
governments—federal, state, and local—have protected, regulated, and channeled the economy.
Governments have intervened to aid the interests of regions, individuals, and particular industries.
Just how far the government should go in doing this always has been a central political issue.
The legal justification for economic regulation rests on a few sections of Article I of the U.S.
Constitution. These give Congress authority to collect taxes and duties, borrow on the credit of the
nation, pay the federal government’s debts, create and regulate the value of U.S. currency, and
establish national laws governing bankruptcies and the naturalization of immigrants. States were
barred from taxing trade with other states. The Constitution’s authors recognized that the young
country had far to go to match European scientific and industrial leadership; in part for this reason,
they empowered Congress to give authors and inventors exclusive rights to profit from their creations
for a limited period.
The most general—and controversial—constitutional language on the economy lies in the 16 words
of Article I, Section 8, which authorize Congress to “regulate commerce” with foreign nations, with
the Indian tribes, and among the states. Its application to commerce among the states has been used during the past century to justify far-reaching government programs on issues
the Founding Fathers could never have imagined.
Interpretation of the commerce clause divides Americans who want an activist federal government
from those who advocate a more limited central authority. The U.S. Supreme Court has often been
called on to resolve disputes over the reach of the commerce clause. Some of the important 19th-
century decisions interpreted the clause narrowly, finding that, while shipments of goods along rivers
that passed several states were covered by the commerce clause, manufacturing was a local activity
and not covered.
But the court’s decisions grew more expansive in the 20th century, upholding important New Deal
programs affecting employment and agriculture. In the 1960s, the judiciary broadly interpreted the
term “interstate commerce,” as it held that Congress did possess the power to pass the landmark civil
rights laws that forbade private businesses from engaging in racial discrimination. In these cases the
courts carefully scrutinized the evidentiary record for ties to interstate commerce, in one instance
finding it in the wheat used in the hot dog rolls served by a “private” club that practiced
discrimination in membership. Beginning in the 1990s, a number of Supreme Court rulings sought to
narrow those earlier decisions by focusing the commerce clause on controversies directly centered on
economic activities.
Although economic regulation has diminished since the 1970s, its protections still play an essential
role, affecting the health of workers; the safety of medicines and consumer products; protection of
motorists and airline passengers, bank depositors and securities investors; and the impact of business
operations on the environment.
Organizers for the Office Workers Union stage a rally on Wall Street in New York City in 1936.
© Time & Life Pictures/Getty Images
When President Woodrow Wilson traveled to the 1919 Paris Peace Conference at the end of World
War I, the U.S. delegation he assembled included Samuel Gompers, the slight, 69-year-old son of
poor Jewish immigrants from Holland by way of Britain. Gompers had risen from an apprentice
cigar maker in New York City to become president of the American Federation of Labor, the
country’s largest union organization.
Gompers’s leadership of the AFL during the turbulent birth of the union movement defined the
unique role of labor organizations in the United States. For most of the century that followed, despite
periods of violent conflicts with company managements, U.S. labor leadership never frontally
attacked the capitalist market structure of the nation’s economy. Its goal was a greater portion of the
economy’s fruits for its members. “We shall never cease to demand more until we have received the
results of our labor,” Gompers often said. But he also held that “the worst crime against working
people is a company which fails to operate at a profit.”
Although today these goals sound as if they fall within the boundaries of mainstream political debate,
labor’s efforts to organize railroad, mine, and factory workers a century ago produced constant
confrontations, many of them violent and some deadly. The strike by steelworkers at Andrew
Carnegie’s Homestead, Pennsylvania, plant in 1892 caused a bloody fight pitting workers and their
families and friends against company-hired guards, and ultimately state militia. The core of the
dispute was a power struggle between workers and management over work rules governing the
plant’s operations. Although Carnegie said he favored unions, he backed the goal of his deputy,
Henry Clay Frick, of regaining unchallenged control over the plant. After a series of assaults,
gunfights, and an attempted assassination of Frick, the strike was broken. Gompers’s AFL would not
take the strikers’ side, and the plant remained non-union for 40 years.
But over the following decades, labor’s demand for a larger share of the economic pie and relief
from often brutal working conditions were adopted increasingly by political reformers and then
national political candidates. Even in the darkest years of the Great Depression, when a quarter of
the nation’s workforce was unemployed, American labor unions mostly concentrated on securing
higher wages and better working conditions and not on assuming traditional management
prerogatives to make fundamental business decisions. Nor did U.S. labor unions follow the example
of European unions by embracing radical politics or forming their own political party. American
labor instead typically used its financial and organizational clout, greatest in the industrial states of
the Northeast and the Midwest, to back pro-labor political candidates.
The legitimacy of organized labor was guaranteed by the National Labor Relations Act of 1935,
commonly known as the Wagner Act. Part of President Franklin D. Roosevelt’s New Deal, the law
established the rules under which workers could form unions and employers would be required to
bargain with them, and also established a National Labor Relations Board to enforce those rules.
During the prosperous years following World War II, U.S. labor unions enjoyed their greatest
success. Automobile manufacturers, to cite one example, found it preferable to negotiate generous
wages and benefits, passing through the costs to American consumers.
But global and domestic developments gradually changed the economic climate in ways
unfavorable to industrial unions. Many U.S. manufacturers expanded or shifted operations to
southern states, where labor unions were less prevalent. Beginning in the 1980s, manufacturers
turned increasingly to foreign sources of products and components. When steel and other
manufacturing plants closed down across the northeastern and midwestern states, people started
calling the region the Rust Bowl, an echo of the devastating 1930s’ Dust Bowl erosion of
midwestern farmland. In the southern Sun Belt, much domestic industrial job growth focused on new,
nonunion factories established by foreign manufacturers, Japanese and German carmakers prominent
among them.
One symbolic moment in the relative decline of organized labor occurred early in the first
administration of President Ronald Reagan (1981-1989). Ironically, Reagan came from a union
background; a successful actor, he rose to head the Screen Actors Guild, where he led a campaign to
block communist efforts to infiltrate the union. In 1981, Reagan confronted a strike by the
Professional Air Traffic Controllers Organization. The strike was illegal, as federal employees
were by law permitted in many cases to unionize but prohibited from striking “against the public
interest,” as the commonly used phrase went. Reagan gave the controllers 48 hours to return to their
jobs, then fired the 11,000-plus who refused to return, replacing them with new workers and
breaking the union.
The outcome reflected the American public’s lack of sympathy for public employee strikes, and it
also reflected waning union membership. At the end of World War II, one-third of the workforce
belonged to unions. By 1983, it was 20 percent, and by 2007, the figure had dropped to 12 percent.
One bright spot for organized labor was growth in the services sector, particularly among public
service employees such as teachers, police officers, and firefighters, whose jobs could not easily be
outsourced. This trend is illustrated by the growth of the Service Employees International Union,
whose ranks nearly doubled between 1995 and 2005 to reach 1.9 million members at a time when
industrial union rolls were shrinking. The SEIU represents workers at the bottom of the income
scale, including janitors, nurses, custodial workers, and home-care providers. Many of their jobs
lack health insurance and other benefits that come with higher-paid work. Another major union, the
National Education Association, represents more than 3 million public school teachers and
employees.
Labor organizations such as the AFL-CIO (an umbrella organization of many unions), SEIU, and
NEA assisted President Barack Obama’s successful 2008 election, helping staff his voter
registration and turnout drives. The unions hoped that the incoming Obama administration would
advance new legislation strengthening their efforts to organize workplaces.
THE ANTITRUST LAWS
The government’s antitrust authority came from two laws, the Sherman Antitrust Act of 1890 and the
Clayton Act of 1914. These laws, based on common law sanctions against monopolies dating from
Roman times, had different goals. The Sherman Act attacked conspiracies among companies to fix
prices and restrain trade, and it empowered the federal government to break up monopolies into
smaller companies. The Clayton Act was directed against specific anticompetitive actions, and it
gave the government the right to review large mergers of companies that could undermine
competition.
Although antitrust prosecutions are rare, anticompetitive schemes have not disappeared, as
economist Joseph Stiglitz says. He cites efforts by the Archer Daniels Midland company in the 1990s
in cooperation with several Asian partners to monopolize the sale of several feed products and
additives. ADM, one of the largest agribusiness firms in the world, was fined $100 million, and
several executives went to prison.
But the use of antitrust laws outside the criminal realm has been anything but simple. How far
should government go to protect competition, and what does competition really mean? Thinkers of
different ideological temperaments have contested this, with courts, particularly the Supreme Court,
playing the pivotal role. From the start, there was clear focus on the conduct of dominant firms, not
their size and power alone; Theodore Roosevelt famously observed that there were both “good
trusts” and “bad trusts.”
In 1911, the Supreme Court set down its “rule of reason” in antitrust disputes, holding that only
unreasonable restraints of trade—those that had no clear economic purpose—were illegal under the
Sherman Act. A company that gained a monopoly by producing better products or following a better
strategy would not be vulnerable to antitrust action. But the use of antitrust law to deal with dominant
companies remained an unsettled issue. Federal judges hearing cases over the decades have tended to
respect long-standing legal precedents, a principle known by its Latin name, stare decisis.
Court rulings at times have reflected changes in philosophy or doctrine as new judges were
appointed by new presidents to replace retiring or deceased judges. And the judiciary tends also to
reflect the temperament of its times. In 1936, during the New Deal era, Congress passed a new
antitrust law, the Robinson-Patman Act, “to protect the independent merchant and the manufacturer
from whom he buys,” according to Representative Wright Patman, who co-authored the bill. In this
view, the goal of antitrust law was to maintain a balance between large national manufacturing and
retailing companies on one side, and the small businesses that then formed the economic center of
most communities on the other.
This idea—that the law should preserve a competitive balance in the nation’s commerce by
restraining dominant firms regardless of their conduct—was reinforced by court decisions into the
1970s. At the peak of this trend, the U.S. government was pursuing antitrust cases against IBM
Corporation, the largest computer manufacturer, and AT&T Corporation, the national telephone
monopoly.
Rising imports from Asia such as these cargo containers unloaded in Tacoma, Washington, created political tension in the United
States.
© AP Images
Open trade “dovetailed with peace; high tariffs, trade barriers, and unfair economic
competition, with war.…”
SECRETARY CORDELL HULL
U.S. Department of State
1948
Trade ties the United States’ economy inextricably to the markets and economies of the rest of the
world. In 2010, the U.S. gross domestic product—the output of U.S.-based workers and property—
totaled nearly $14.5 trillion. Of that, $1.8 trillion came from exports to foreign destinations. Imports
into the United States were significantly higher, totaling $2.4 trillion.
In addition to traded goods and services, huge tides of financial transactions flow across global
borders. U.S. companies and individuals directly invest more than $2 trillion abroad annually, making
the United States the world’s largest direct investor in foreign economies. It also receives more
investment from outside its borders than any other nation. As a world financial capital, New York is
the center of an international hedge fund industry of private investors that amassed nearly $1.5 trillion
in assets at the end of 2006.
While U.S. exports add to the nation’s gross domestic product, the larger volume of imports
reduces it. The trade imbalance over the past decade has created a politically sensitive tradeoff: The
surplus of imports tended to lower prices paid by American consumers, but it also depressed wages
for some workers in industries facing foreign competition. The U.S. trade deficits have also
undermined the value of the U.S. dollar compared to other major currencies, increasing concerns
about the stability of the world’s financial markets, as described in chapter 8.
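As a rough illustration of the arithmetic behind that tradeoff, the short Python sketch below combines the approximate 2010 figures cited above (about $14.5 trillion in output, $1.8 trillion in exports, and $2.4 trillion in imports). The simplified net-exports calculation and the variable names are illustrative assumptions, not official statistical methodology.

# Illustrative sketch using the approximate 2010 figures cited in this chapter.
# All values are in trillions of U.S. dollars; only the trade component of the
# expenditure identity GDP = C + I + G + (exports - imports) is shown here.
gdp = 14.5        # approximate U.S. gross domestic product, 2010 (assumption)
exports = 1.8     # exports of goods and services
imports = 2.4     # imports of goods and services
net_exports = exports - imports              # negative value: a trade deficit
deficit_share = abs(net_exports) / gdp       # deficit as a share of output
print(f"Net exports: {net_exports:+.1f} trillion dollars")      # -0.6
print(f"Trade deficit as a share of GDP: {deficit_share:.1%}")  # about 4.1%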
What does the United States export? The largest single category in 2010 was motor vehicles and
their parts and engines, totaling $112 billion. A group of petroleum-based products was high on the list: plastic materials ($33 billion), fuel oil ($33 billion), and other petroleum products ($33 billion).
Semiconductors ($47 billion), pharmaceuticals ($47 billion), industrial machines ($43 billion),
organic chemicals ($34 billion), electrical apparatus ($32 billion), telecommunications equipment
($32 billion), medicinal equipment ($30 billion), and civilian aircraft ($30 billion) followed on the
list of major export industry categories.
U.S. crude oil and gas imports totaled $282 billion in 2010. Americans imported $225 billion
worth of motor vehicles, engines, and parts that year, along with $117 billion in computers and
computer accessories, $81 billion in various kinds of apparel and textiles, $85 billion in
pharmaceuticals, $48 billion in telecommunications equipment, $38 billion in televisions and VCRs,
and $35 billion worth of toys and games. The variety of traded items spans virtually everything
Americans make, wear, use, or consume.
The United States is the world’s largest agricultural exporter, with one out of every three acres
planted for export, according to U.S. government surveys. The value of U.S. exports of farm products,
animal feeds, and beverages came to $108 billion in 2010. Imports were lower at $92 billion. The
total volume of U.S. farm exports rose by 17 percent between 1997 and 2007, and in that period,
American farmers exported 45 percent of their wheat, 33 percent of their soybean production, and 60
percent of their sunflower oil crops.
As economist Paul M. Romer has observed, imports rose from 12 percent of the U.S. gross
domestic product in 1995 to about 17 percent a decade later. Foreign money provides about one-third
of U.S. domestic investment, up from 7 percent in 1995. In other words, Romer says, “The U.S. is
more open to the global economy than ever before, and the links run in both directions.”
A commitment to expand global trade has been a cornerstone of U.S. policy since the final years of
World War II, when the United States and other victorious nations adopted a series of international
compacts to promote economic stability and growth. Trade restrictions and currency devaluations
were widely considered to have worsened the 1930s Great Depression by stifling international
commerce.
Through the formation of the United Nations and the agreements on international economic policies
reached at the 1944 Bretton Woods Conference in the United States, the allied powers hoped to
replace the militant nationalism that led to the war with cooperative economic policies. During the
Cold War between the Soviet bloc and the West, trade liberalization with Europe and Asia became an
instrument of U.S. foreign policy and a way to promote market capitalism in emerging nation
economies.
The U.S. steel industry survives in a reduced size, continuing research and development at this facility in Monroeville,
Pennsylvania.
© AP Images
The U.S. steel industry has faced a series of crises since the mid-1970s, when steel producers
engaged in a global battle for market share, profitability, and survival. The industry’s struggles
graphically illustrate the impact—both positive and negative—of creative destruction on American
manufacturing.
Benefits have accrued to the nation as a whole. The U.S. steel industry and its workers are three
times more productive today than in the 1970s. American steel companies have invested in
advanced processes that have dramatically boosted energy efficiency while reducing pollution and
health threats to steelworkers. The sharp rise in coal and other energy prices since 2000 has helped
U.S. steel producers that process their own raw materials.
On the ledger’s other side, steel industry employment plunged from 531,000 in 1970 to 150,000 in
2008. Steelmaking cities in the American industrial heartland were battered over these decades. In a
2006 interview, Nobel Prize-winning economist Joseph Stiglitz recounted the impact of the
industry’s fall on his hometown of Gary, Indiana, a city founded by U.S. Steel Corporation a century
ago. The city “reflects the history of industrial America. It rose with the U.S. steel industry, reached
a peak in the mid-’50s when I was growing up, and then declined very rapidly, and today is but a
shell of what it was.”
In Europe and Asia, governments have directly intervened for more than a quarter-century to help
fund a massive expansion of steelmaking capacity. They have supported both official and unofficial
import barriers and turned a blind eye to secret market-sharing agreements, according to evidence
before the U.S. International Trade Commission and the European Union’s competition authorities.
While the United States has sporadically restricted imports, it has never developed a long-term
policy to bolster the American steel industry’s competitiveness.
International trade rules permit countries to defend domestic industries against the “dumping” of
imports in their home markets at “less than normal” prices. When recessions and financial crises left
world markets filled with surplus steel, the U.S. industry sought dumping penalties to combat low-
priced imports. In response, U.S. presidents tended to impose temporary limits on imported steel, or
arrange voluntary restraints, to ease the damage to American steel firms. But the U.S. steel industry
rarely got the sustained protection it sought. For a range of political and economic reasons, U.S.
policy has tended to resist tough trade sanctions. Cheaper steel imports benefited the auto industry
and other steel users and helped restrain inflation. And Washington has been sensitive to the outcry
from foreign governments against proposed U.S. trade penalties.
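To make the “less than normal” pricing standard concrete, here is a minimal sketch of a simplified dumping-margin calculation. The prices are hypothetical and the single-formula approach is an assumption for illustration; actual antidumping investigations involve many detailed price adjustments.

# Hypothetical, simplified dumping-margin calculation.
# normal_value is the assumed home-market price; export_price is the assumed
# price charged in the importing country. Both are in dollars per ton of steel.
normal_value = 600.0
export_price = 480.0
dumping_margin = (normal_value - export_price) / export_price
print(f"Dumping margin: {dumping_margin:.0%}")   # 25% in this hypothetical case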
The result is a U.S. steel market that is more open to foreign ownership and imports than those of any of its major rivals. In 2007, more than 30 percent of U.S. steel consumption was imported, a far
higher import share than one finds in the markets of major U.S. steel competitors Japan, Russia,
China, and Brazil.
U.S. Steel Corporation, the company that J.P. Morgan founded in 1901, remains the country’s largest steel manufacturer and is ranked 10th in the world based on 2007 output. Nucor, the upstart U.S. producer that challenged “Big Steel” by fabricating new steel from scrap melted in high-efficiency furnaces, is third in the United States and 12th in the world.
The other major U.S. steel concern is a collection of commonly owned historic companies headed
by the former Bethlehem Steel, a major producer that sank into bankruptcy in 2001. They
were bought at severely discounted prices by an American investor, Wilbur L. Ross, a specialist in
distressed asset acquisitions. Ross’s approach to buying failing companies and reclaiming the salvageable parts is “a Darwinian thing,” he told Fortune magazine in 2003. “The weaker parts get
eliminated, and the stronger ones come out stronger. Our trick is to figure out which is which, try to
climb on to the ones that can be made into the stronger ones, and then try to facilitate the demise of
the weaker ones.”
In 2004, Ross sold the U.S. plants to India’s Lakshmi Mittal and his Mittal Steel company, which
then became part of the world’s largest steel producer in 2006 when Mittal merged with Europe’s
leading steelmaker, Arcelor. Today, U.S. Steel, Arcelor Mittal, and Nucor control more than half of
U.S. production. Ten percent is owned by Russian steel interests, another beneficiary of the
relatively open U.S. steel market.
Following the late 1990s’ financial crises, when low-cost foreign steel flooded the U.S. market,
more than 40 steelmakers, distributors, and fabricators filed for bankruptcy. At that time, the U.S.
steel industry owed more than $11 billion in “unfunded” pension obligations to a growing
population of retirees, debts that it could not pay. Bankruptcy was a way out.
U.S. bankruptcy law allows companies to revoke certain contracts, including pension
commitments, which can then be passed on to the Pension Benefit Guaranty Corporation, a federal
agency that insures certain pension plans and pays promised benefits upon a company’s failure.
Steelworkers retired from the insolvent companies held on to most of their pension benefits thanks to
the PBGC, but they lost the retiree health insurance coverage also promised by their former
employers.
Trade restrictions imposed by President George W. Bush, coupled with relief from some
industry retiree health care commitments, helped the U.S. steel industry recover during the economic
boom of the early 2000s. But the recession that began in 2008 has revived fears of steel surpluses,
particularly with the growth of state-supported steelworks in Brazil, India, and China. Steelmaking
capacity in those three countries now equals one-third of the world’s total, and the debate over fair
trade in steel is back on the world’s agenda.
President Barack Obama, shown with former Federal Reserve Chairman Paul Volcker, faces the greatest economic challenges in
a generation while working with a Congress that is sharply divided politically.
© AP Images
“The hard truth is that getting this deficit under control is going to require broad sacrifice.”
PRESIDENT BARACK OBAMA
United States of America
2010
The United States and much of the developed world escaped the worst of the possible outcomes
associated with the 2008 financial crisis. But the United States and other industrial nations still faced
high unemployment and unsatisfactory economic growth. Financial emergencies in several European
nations in 2010-2011 suggested that parts of the world’s banking system might remain vulnerable.
Several conclusions seemed inescapable. Economic globalization, which has linked banking and
trade on every continent and supplied real benefits to many, also enabled the financial market
contagion to spread worldwide. Leaders of the United States and other major economies agreed that a
new system of financial market supervision and regulation was needed to restore investors’ battered
confidence in markets and to revive investment.
In 2010 Congress passed and President Obama signed the Dodd-Frank Act covering banks
operating in the United States. This law is designed to:
Prevent banks and other financial firms from becoming “too big to fail” and thus requiring a
government bailout should they fall into financial difficulty.
Give regulators authority to take over and shut down troubled financial firms in an orderly way
before they threaten economic stability.
Prohibit banks from engaging in speculative investments with their own accounts as opposed to
executing instructions issued by a customer.
Identify and address risks posed by complex financial products and practices.
Give the Federal Reserve authority to regulate non-bank businesses such as insurance companies
and investment firms that predominantly engage in financial activities.
Regulate such potentially risky practices as over-the-counter derivatives, mortgage-backed
securities and hedge funds.
Protect consumers from hidden fees and deceptive practices in mortgages, credit cards and other
financial products.
Protect investors through tougher regulation of credit rating agencies.
The legislation left regulators to work out key details, and their actions would determine Dodd-
Frank’s effectiveness. Despite the recognition that leading economies should harmonize their bank
regulations, this goal had not been fully achieved as of early 2012.
SOARING DEFICIT
The emergency measures taken to stimulate the economy and shore up threatened financial institutions
drastically increased the federal budget deficit, which represents the difference between federal
spending and revenue. The federal budget had already gone into deficit during the George W. Bush
administration, starting in the 2002 fiscal year. President Obama’s 2009 stimulus package of new
government spending and tax cuts brought the deficit, as measured in proportion to the entire
economy, to a level not seen since the end of World War II. The deficit for fiscal year 2011 came to
$1.3 trillion, about 8.7 percent of economic output, down from 9 percent in 2010 and 10 percent in
2009.
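The deficit ratios quoted above follow from simple division. The sketch below reproduces the fiscal year 2011 figure from the numbers in this section; the roughly $15 trillion output figure is an assumption chosen to be consistent with the reported 8.7 percent, not an official budget statistic.

# Reproducing the reported deficit-to-output ratio from approximate figures.
# Values are in trillions of dollars; the output figure is an assumption.
deficit_fy2011 = 1.3
output_fy2011 = 15.0
deficit_share = deficit_fy2011 / output_fy2011
print(f"Fiscal 2011 deficit as a share of output: {deficit_share:.1%}")  # about 8.7%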
A bipartisan National Commission on Fiscal Responsibility and Reform appointed by Obama
concluded in 2010 that the nation was on “an unsustainable fiscal path.”
The commission noted that in 2011 the first of the Baby Boom generation of 78 million citizens was
becoming eligible for Social Security and Medicare (the health program for the elderly), increasing
the cost of these programs. If U.S. deficits continue to grow at the current pace, by 2025 federal tax
collections and other revenue would cover only interest payments on the federal debt and
“entitlement” programs (Social Security; Medicare; Medicaid, the health program for the poor;
veterans’ pensions and benefits). Nothing would be left for defense programs or federal support for
education, transportation, housing, research and all the rest of government services.
As the 2000s decade proceeded, foreign investors financed an increasing share of U.S. government
debt. In mid-2000, foreign holdings of that debt totaled $1 trillion. Eight years later, the total was $2.7 trillion, with
foreign government-owned banks or “sovereign” investment funds holding the fastest-growing share.
Foreign entities used the U.S. dollars flowing overseas for manufactured goods and oil to purchase
U.S. Treasury securities and other U.S. government debt. The United States, in essence, was
borrowing from the future to finance current consumption.
U.S. government officials across the political spectrum agreed on the need to realign spending with
revenues although they disagreed over the best strategy for doing so. After Republican Party gains in
the November 2010 elections, passing legislation on spending and taxes became more protracted and
difficult. “The hard truth is that getting this deficit under control is going to require broad sacrifice,”
President Obama said. He proposed a policy of combining spending cuts with a tax increase for a
relatively small number of families with the highest incomes, but Republicans in Congress blocked
any tax rise.
INCOME DISPARITY
Another challenge facing economic policymakers and legislators was mounting evidence that
economic growth increasingly has concentrated income and wealth gains among a small minority of
the U.S. population.
Possible factors for this shift include the decline in well-paid manufacturing jobs and a shift toward lower-paid service employment, the growing employment disadvantages of less-educated workers in a highly technical economy, and the burden of rising medical care costs for America’s
lower- and middle-income families. Because of these and other factors, the average wage of U.S.
non-farm workers has not increased appreciably since 1980, after taking inflation into account.
Optimistic observers noted that the United States still could bring important resources to bear on
the economic challenges, among them its entrepreneurial culture, the depth and breadth of its
educational system and the freedom it afforded capital to seek the highest returns.
Applying these real strengths to the nation’s equally real challenges will be a great test for the
current generation of Americans. As Kent H. Hughes of the Woodrow Wilson International Center for
Scholars writes, “It is hard to see how the United States will win the contest of ideas in the 21st
century without continued economic growth, technological innovation, improved education, and
broad-based equality of opportunity.”
Hughes adds that “the country will need to take steps to restore national trust in key institutions,
rediscover a sense of national purpose, restore its commitment to shared gains and shared sacrifices,
and renew its sense of American identity.” But it also is true that Americans have faced and
surmounted such challenges in the past, as President Obama reminded the nation in his 2009 inaugural
address. “Starting today,” he said, “we must pick ourselves up, dust ourselves off, and begin again the
work of remaking America.”
Outline of the U.S. Economy
2012 Updated Edition
Published in 2012 by: Bureau of International Information Programs United States Department of
State
Bureau of International Information Programs
Coordinator: Dawn McCall
Executive Editor: Nicholas Namba
Editor in Chief: Michael Jay Friedman
Managing Editor: Bruce Odessey
Design: David Hamill
Graphs: Erin Riggs
Photo Editor: Maggie Sliker
FRONT COVER
Top illustration © Dave Cutler / Stock Illustration Source
Bottom illustration © Jane Sterrett / Stock Illustration Source
ABOUT THIS EDITION
This edition updates the 2009 revision by Peter Behr, a former business editor and reporter for the
Washington Post. Previous editions of this title were published by the U.S. Information Agency
beginning in 1981 and by the U.S. State Department since 1999.