1 From Dot-com to Dot-bomb
The first decade of the 21st century began with worldwide celebrations at midnight on January 1, 2000. As the world spun on its axis, citizens were treated to spectacular displays of fireworks from notable landmarks: the Harbour Bridge in Sydney, Australia, which would host the Summer Olympics later that year; the Eiffel Tower in Paris; and the Washington Monument in the nation's capital, covered in scaffolding designed by architect Michael Graves. The celebrations were broadcast worldwide, meaning you could celebrate the New Year in every time zone from the comfort of your living room.
Or did we have it all wrong? Did the new millennium actually start a year later? Technically, yes, the third millennium started in 2001, but we still celebrated in 2000, as the counter clicked over from 19 to 20.
That counter was actually a core computer issue known as Y2K (that is, the Year 2000). Many early computers stored the year as two digits rather than four, and thus many feared that the power grid, airplane tracking systems, water pumping stations, and more would fail when the year rolled over from 99 to 00. They called it the Y2K bug, and companies and governments spent billions upgrading their equipment and software to prepare for New Year's Eve. But when the clock ticked over from 1999 to 2000, nothing happened. It was a huge relief. The world's computing systems in fact didn't fail; however, another crisis was just around the corner.
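The mechanics of the bug are simple to illustrate. As a hypothetical sketch (not drawn from any actual legacy system), here is how two-digit year arithmetic of the sort described above breaks at the 1999-to-2000 rollover:

```python
def age_in_years(birth_yy: int, current_yy: int) -> int:
    """Naive age calculation using two-digit years,
    as many legacy systems of the era did."""
    return current_yy - birth_yy

# In 1999 ('99), someone born in 1960 ('60) computes correctly:
print(age_in_years(60, 99))  # 39

# In 2000 ('00), the same subtraction yields a negative age:
print(age_in_years(60, 0))   # -60, the Y2K bug
```

Any calculation that subtracted two-digit years this way (interest accrued, license expiry, scheduled maintenance) would suddenly produce nonsense, which is why remediation focused on widening date fields to four digits.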
Computers were the key reason why the American economy grew so strongly in the 1990s. The technology boom had pushed productivity ever higher, and the economy in turn boomed at a 4 percent annual growth rate. "It made America's freewheeling, entrepreneurial, so-what-if-you-fail business culture the envy of the world," explained Federal Reserve chair Alan Greenspan. "U.S. information technology swept the global market, as did innovations ranging from Starbucks lattes to credit derivatives."1
The nation was prosperous. Rising productivity produced a bonus of tax revenue for the federal government, which in 1998 suddenly found itself with a budget surplus for the first time in thirty years. The surplus measured $70 billion in 1998, $124 billion in 1999, and $237 billion in 2000, and was projected to grow to $270 billion in 2001. A debate ensued over how to spend it, or whether to return it to the taxpayers. Greenspan preferred a fiscally conservative policy of paying down the national debt, which then stood at $3.7 trillion. This was an opportunity to prepare for the retirement of the massive Baby Boom generation, starting in a decade.2
The technology boom was in part driven by the adoption of the Internet. The Department of Defense had created the network in the 1970s as a communications system that could survive a nuclear war. By the 1990s it had broadened to the private sector with the widespread adoption of email and the World Wide Web, which gave people a graphical interface for finding information. Websites were born, soon followed by electronic commerce. Consumers became comfortable with online transactions, such as buying books on Amazon. Internet-based companies, known as "dot-coms" for the .com in their web addresses, claimed that they were the face of the New Economy, and that the business cycle was now a thing of the past. And presumably, so were recessions. It was a cocky time for people who worked in technology. They thought they could conquer the world.
A bull market erupted, a stock-buying binge that was nothing more than a bubble. And like all bubbles, it would burst. Investors scooped up shares in initial public offerings (IPOs). Internet browser Netscape's IPO in 1995 touched off the Internet stock surge on the technology-heavy NASDAQ exchange. The discussion at cocktail and dinner parties was all about the latest stock tip. Amazon, AOL, eBay, and Yahoo were all darlings of the era. Valuations soared far beyond those of profitable brick-and-mortar businesses, hyped by the hubristic belief that stocks could only go up.
In January 2000, a New Economy company, America Online (AOL), merged with an Old Economy cable television company, Time Warner, in what was thought to be a harbinger of things to come. The deal was valued at a shocking $350 billion. It was poor timing (the dot-com bubble burst two months later), and an even poorer decision, as the merger turned out to have few synergies, and AOL's dial-up Internet business was fading. Ego-driven acquisitions made little business sense, but who cared when even secretaries were becoming millionaires with their stock options?
The Super Bowl, America's most watched television event, had possibly its most interesting commercials in 2000. Many of these were for dot-com companies that used humor and entertainment, such as the beloved sock puppet from Pets.com, cowboys herding cats for EDS, and a risqué "money out the wazoo" ad from E*Trade. Many of these companies soon would be out of business.
The dot-com boom was really two bubbles: Internet and telecom. Telecommunications companies required massive nationwide infrastructure: building a network was expensive, and thus investment in telecom was actually far greater than in dot-coms, which tended to be small startups. The hype was that you couldn't have enough bandwidth. Fiber-optic cables had dramatically increased capacity as the Internet grew, but far more bandwidth was built than anyone needed. There were also too many competitors, all of them overly leveraged, having borrowed staggering amounts of money to build their networks.
Part of what fueled the dot-com boom were financial analysts like Henry Blodget of Merrill Lynch and Jack Grubman of Salomon Smith Barney, who hyped stocks in public while panning them behind closed doors. There was supposed to be a firewall (what many referred to as a Chinese wall) in financial firms between analysts and traders, a wall that turned out to be nonexistent. Securities analysts became cheerleaders for stocks, knowing their firms would rope in juicy underwriting contracts and they'd get a fat bonus. They were hardly neutral players in an industry that needed dispassionate analysis. The conflicts of interest were legion.
"Shareholder value" was the mantra of CEOs at every publicly traded company. Driving the stock price up became the primary goal, not a secondary reflection of the company's merits. The stock option became a tool to reward managers and executives if they pushed the stock price up. This was especially popular at Silicon Valley technology companies, but others soon joined. Executives smelled money like sharks smell blood, and they demanded options. Stock options were handed out to everyone from CEOs to secretaries. The rising stock market made everyone feel rich. Day-trading stocks became possible from home, and some adopted the get-rich-quick ethos that so often proves destructive in human behavior. "The degree of hype was surreal," observed Alan Greenspan.3
CEOs and corporate boards engaged in peer benchmarking, comparing their pay to the median pay of other CEOs. As every executive believed they were above average, boards raised executive pay through the roof, often without the company's actual performance in mind. At the same time, worker compensation over the decade declined in real terms. This greatly widened the inequality gap and further concentrated national wealth at the top of the pyramid.4 "Of course, the CEO was nominally supervised by the directors," noted Wall Street historian Roger Lowenstein. "But the typical board was larded with the CEO's cronies, even with his golfing buddies. They were generally as independent as a good cocker spaniel."5
Internet business centers developed around the country: the traditional technology incubator in Silicon Valley north of San Jose, California; Tysons Corner, Virginia; Boston; Raleigh-Durham; Seattle; and Silicon Alley in Lower Manhattan. These were technology hubs that attracted talent. As technology worker pay was so much higher, and often inflated through stock options, this pushed up the cost of living in technology-focused cities. California's Bay Area became especially unaffordable. Author Chrystia Freeland called the emergence of the technologists the "triumph of the nerds."6
The age of the Internet brought about a permanent shift in the office dress code. Through the 1990s, people generally dressed up for work: men wore suits and ties, while women wore dresses, skirts, and jewelry. But with the advent of the dot-com era, business casual came into the workplace, and khakis, polos, sneakers, hoodies, and even jeans became normal. Every day became casual day, not just casual Friday. While some expected the shift to be temporary, it proved permanent: the suit and necktie were relegated to the back of the closet, rarely to emerge again. Companies realized that relaxing work-related dress codes was good for employee morale, cost them absolutely nothing, and created a casual, hipster-friendly environment that attracted new talent. Millennials graduating from college barely had to change out of their sweatshirts and jeans to fit right into the new workforce. And they were appropriately attired for the Ping-Pong table in the breakroom.
In the dot-com boom, companies that had no earnings and no prospect of profitability saw their shares soar through the roof. Investors were simply infatuated with anything Internet-related, a mania reminiscent of the Dutch tulip craze of the 1630s. It was hubris to believe that the Internet party would never end, and hubris to believe that we had somehow conquered the business cycle. But end it did. The New Economy, as it turned out, was pretty indistinguishable from the Old.
On March 10, 2000, the tech-heavy NASDAQ reached an all-time high after doubling in one year. This was just five weeks after all those fabulous Super Bowl commercials ran. Stocks were far too expensive, and companies with no profits were heavily leveraged. Then someone took the punch bowl away. Overnight the dot-com revolution turned into a dot-bomb. The Internet bubble burst swiftly as investors raced for the lifeboats. Between the March high and year-end 2000, the NASDAQ fell 50 percent. The rest of the market was down as well, but not nearly as much: the Dow fell 3 percent, while the S&P 500 fell 14 percent.7
Things didn't improve in 2001: the selloff continued into its second year and spread through the broader market as the economy went into recession. By October 2002, the NASDAQ had fallen 78 percent from its March 2000 high. The S&P 500 fell 50 percent, and it took six years to return to its former high. The Dow Jones Industrial Average fell 40 percent to just above 7,000. An estimated $6.5 trillion in investment had been wiped out when the dot-com bubble burst.8
A huge shakeout took place as many Internet-based startups collapsed. Venture capital dried up. Hundreds of thousands of layoffs rippled through the economy in 2000 and 2001 as dot-coms folded. Sharks circled to snap up the salvageable remnants. It turned out that the Old Economy way was the only way to do business: you still needed a business plan, paying customers, and profits to survive.
Fortunately, the dot-com collapse didn't take down the broader economy; only a mild recession ensued, though many investors were hit hard. This was especially painful for future retirees, given that a growing number of Americans held stocks in their retirement savings thanks to the 401(k). However, after the bubble burst there was no return to the heady economic growth of the 1990s. Economic growth slowed throughout the 2000s as productivity gains faded.
The Internet survived the meltdown, of course, and many dot-coms like Amazon, eBay, and Netflix continued and thrived. The survivors had good business models. The Internet launched many successful new businesses, and it became a new channel for many existing ones. It had permanently changed how companies interact with consumers, how consumers interact with one another, and how we research and share information. And most of all, pornography. Yes, pornography. The off-Broadway musical Avenue Q had a famously ribald song called "The Internet Is for Porn." Dr. Cox from the television comedy Scrubs said with sardonic seriousness: "I am fairly sure that if they took porn off the Internet, there would only be one website left and it would be called 'Bring Back the Porn.'"
Indeed, the Internet had changed things. Customer service, information, music, publishing, research, shopping: so many things shifted online. By the end of the decade, for example, the album on compact disc would be nearly extinct as consumers shifted to downloading music. People shifted from newspapers to the Web for their news. But these changes evolved over time rather than happening overnight.
In 1996, Federal Reserve chairman Alan Greenspan had warned about "irrational exuberance" in the stock market, but it took another four years before his warning came true. The phrase would be a hallmark of the decade, not just in the bursting of the dot-com bubble in 2000, but in the housing market collapse of 2007 and the stock market panic of 2008.
The New Economy meant a pink slip, a box to carry your stuff out of the office, and a humble phone call to ask if you could move back in with your parents till you got back on your feet. The thousands of stock options that were going to let you retire at thirty turned out to be worthless. The business cycle had not been conquered after all.
2 Dubya
The year 2000 marked many things: the supposed start of the new millennium, the Olympics, a leap year, and, importantly for Americans, a presidential election. Bill Clinton, a charismatic but controversial Democrat, had been in the Oval Office for eight years that coincided with the dot-com boom. His vice president, Al Gore, was widely derided for supposedly claiming to have invented the Internet, which was in fact created by a Pentagon agency; the mockery stuck even though Gore had said no such thing.
Every president elected since Bill Clinton has been a polarizing figure, in part because of the increasing partisanship that poisoned the well of comity and good feelings that had largely prevailed since World War II. Americans became extremely polarized during the Vietnam War era, when the country split sharply over the war and nearly tore itself apart. The Watergate scandal created an enormous crisis of confidence and trust in the government, since President Richard Nixon had sabotaged a political opponent and subverted the Constitution to get reelected in 1972. Still, politicians continued to act in a fairly bipartisan manner until the 1990s. The end of the era of good feelings in Congress coincided with the end of the Cold War.
Clinton badly stumbled in his first two years in the White House before hitting his stride. The result was a Republican takeover of Congress in 1994 led by Newt Gingrich, who turned the GOP into a hyper-partisan organization. Gingrich was House Speaker for only four years (1995–1998) before an ethics scandal sank him, but he forever changed the GOP, shedding its past as a mainstream, pro-trade, chamber-of-commerce party and remaking it as a tribal organization geared more toward power. Four decades of Democratic dominance in Congress came to an end. With it came the rise of right-wing media such as Fox News, which functioned effectively as a propaganda machine.
Clinton himself was certainly not innocent of partisanship or political shenanigans. He had an affair with a White House intern, which he lied about to the special counsel investigating him. The Gingrich-led House of Representatives impeached Clinton in 1998, a step that proved deeply unpopular with the nation. Clinton bounced back after the Senate failed to convict him, but the act of impeachment cemented Democrats and Republicans into their ideological corners. Impeachment is foremost a political act, and this one backfired against Republicans, who were scolded for prosecuting iniquity rather than illegality. "Partisan warfare had been the permanent condition of the 1990s," observed author Steve Kornacki.1
It was against this highly charged partisan environment that the presidential campaign of 2000 began to take shape. George "Dubya" Bush, the governor of Texas and son of former president George H. W. Bush, emerged as the Republican frontrunner. Bush had scored an upset victory over Ann Richards in 1994, despite never having been elected to public office before, and served two terms as governor of the Lone Star State. He championed education reform, something Texas was failing at badly at the time. He roped in Karl Rove to serve as his political architect, a man who would follow him to the ...