1
A PEASANT'S KINGDOM
I moved to New York City in 1999 just in time to see the dot-com dream come crashing down. I saw high-profile start-ups empty out their spacious lofts, the once ebullient spaces vacant and echoing; there were pink-slip parties where content providers, designers, and managers gathered for one last night of revelry. Although I barely felt the aftershocks that rippled through the economy when the bubble burst, plenty of others were left thoroughly shaken. In San Francisco the boom's rising rents pushed out the poor and working class, as well as those who had chosen voluntary poverty by devoting themselves to social service or creative experimentation. Almost overnight, the tech companies disappeared, the office space and luxury condos vacated, jilting the city and its inhabitants despite the irreversible accommodations that had been made on behalf of the start-ups. Some estimate that 450,000 jobs were lost in the Bay Area alone.1
As the economist Doug Henwood has pointed out, a kind of amnesia blots out the dot-com era, blurring it like a bad hangover. It seems so long ago: before tragedy struck lower Manhattan, before the wars in Afghanistan and Iraq started, before George W. Bush and then Barack Obama took office, before the economy collapsed a second time. When the rare backward glance is cast, the period is usually dismissed as an anomaly, an embarrassing by-product of irrational exuberance and excess, an aberrational event that gets chalked up to collective folly (the crazy business schemes, the utopian bombast, the stock market fever), but "never as something emerging from the innards of American economic machinery," to use Henwood's phrase.2
At the time of the boom, however, the prevailing myth was that the machinery had been forever changed. "Technological innovation," Alan Greenspan marveled, had instigated a new phase of productivity and growth that was "not just a cyclical phenomenon or a statistical aberration, but ... a more deep-seated, still developing, shift in our economic landscape." Everyone would be getting richer, forever. (Income polarization was actually increasing at the time, the already affluent becoming ever more so while wages for most U.S. workers stagnated at levels below 1970s standards.)3 The wonders of computing meant skyrocketing productivity, plentiful jobs, and the end of recessions. The combination of the Internet and IPOs (initial public offerings) had flattened hierarchies, computer programming jobs were reconceived as hip, and information was officially more important than matter (bits, boosters liked to say, had triumphed over atoms). A new economy was upon us.
Despite the hype, the new economy was never that novel. With some exceptions, the Internet companies that fueled the late-nineties fervor were mostly about taking material from the off-line world and simply posting it online, or about buying and selling rather ordinary goods, like pet food or diapers, prompting Internet users to behave like conventional customers. Due to changes in law and growing public enthusiasm for high-risk investing, the amount of money available to venture capital funds ballooned from $12 billion in 1996 to $106 billion in 2000, leading many doomed ideas to be propped up by speculative backing. Massive sums were committed to enterprises that replicated one another's efforts: multiple sites specialized in selling toys or beauty supplies or home improvement products, and most of them flopped. Barring notable anomalies like Amazon and eBay, online shopping failed to meet inflated expectations. The Web was declared a wasteland and investments dried up, but not before many venture capitalists and executives profited handsomely, soaking up underwriting fees from IPOs or exercising their options before stocks went under.4 Although the new economy evaporated, the experience set the stage for a second bubble and cemented a relationship between technology and the market that shapes our digital lives to this day.
As business and technology writer Sarah Lacy explains in her breathless account of Silicon Valley's recent rebirth, Once You're Lucky, Twice You're Good, a few discerning entrepreneurs extracted a lesson from the bust that they applied to new endeavors with aplomb after the turn of the millennium: the heart of the Internet experience was not e-commerce but e-mail, that is to say, connecting and communicating with other people as opposed to consuming goods that could easily be bought at a store down the street. Out of that insight rose the new wave of social media companies that would be christened Web 2.0.
The story Lacy tells is a familiar one to those who paid attention back in the day: ambition and acquisitions, entrepreneurs and IPOs. "Winning Is Everything" is the title of one chapter; "Fuck the Sweater-Vests" another. You'd think it was the nineties all over again, except that this time around the protagonists aspired to market valuations in the billions, not millions. Lacy admires the entrepreneurs all the more for their hubris; they are phoenixes, visionaries who emerged unscathed from the inferno, who walked on burning coals to get ahead. After the bust, the dot-coms and venture capitalists were "easy targets," blamed for being "silly, greedy, wasteful, irrelevant," Lacy writes. The "jokes and quips" from the "cynics" cut deep, making it that much harder for wannabe Web barons "to build themselves back up again." But build themselves back up a handful of them did, heading to the one place insulated against the downturn, Silicon Valley. "The Valley was still awash in cash and smart people," says Lacy. "Everyone was just scared to use them."
Web 2.0 was the logical consequence of the Internet going mainstream, weaving itself into everyday life and presenting new opportunities as millions of people rushed online. The "human need to connect" is "a far more powerful use of the Web than for something like buying a book online," Lacy writes, recounting the evolution of companies like Facebook, LinkedIn, Twitter, and the now beleaguered Digg. "That's why these sites are frequently described as addictive ... everyone is addicted to validations and human connections."
Instead of the old start-up model, which tried to sell us things, the new one trades on our sociability (our likes and desires, our observations and curiosities, our relationships and networks), which is mined, analyzed, and monetized. To put it another way, Web 2.0 is not about users buying products; rather, users are the product. We are what companies like Google and Facebook sell to advertisers. Of course, social media have made a new kind of engagement possible, but they have also generated a handful of enormous companies that profit off the creations and interactions of others. What is social networking if not the commercialization of the once unprofitable art of conversation? That, in a nutshell, is Web 2.0: content is no longer king, as the digital sages like to say; connections are.
Though no longer the popular buzzword it once was, "Web 2.0" remains relevant, its key tenets incorporated not just by social networking sites but by virtually all cultural production and distribution, from journalism to film and music. As traditional institutions go under (consider the independent book, record, and video stores that have gone out of business), they are being replaced by a small number of online giants (Amazon, iTunes, Netflix, and so on) that are better positioned to survey and track users. These behemoths "harness collective intelligence," as the process has been described, to sell people goods and services directly or indirectly. "The key to media in the twenty-first century may be who has the most knowledge of audience behavior, not who produces the most popular content," Tom Rosenstiel, the director of the Pew Research Center's Project for Excellence in Journalism, explained.
For those who desire to create art and culture (or "content," to use that horrible, flattening word), the shift is significant. More and more of the money circulating online is being soaked up by technology companies, with only a trickle making its way to creators or the institutions that directly support them. In 2010 publishers of articles and videos received around twenty cents of each dollar advertisers spent on their sites, down from almost a whole dollar in 2003.6 Cultural products are increasingly valuable only insofar as they serve as a kind of "signal generator" from which data can be mined. The real profits flow not to the people who fill the platforms where audiences congregate and communicate, the content creators, but to those who own them.
The original dot-com bubble's promise was first and foremost about money. Champions of the new economy conceded that the digital tide would inevitably lift some boats higher than others, but they commonly assumed that everyone would get a boost from the virtual effervescence. A lucky minority would work at a company that was acquired or went public and spend the rest of their days relaxing on the beach, but the prevailing image had each individual getting in on the action, even if it was just by trading stocks online.
After the bubble popped, the dream of a collective Internet-enabled payday faded. The new crop of Internet titans never bothered to issue such empty promises to the masses. The secret of Web 2.0 economics, as Lacy emphasizes, is getting people to create content without demanding compensation, whether by contributing code, testing services, or sharing everything from personal photos to restaurant reviews. "A great Web 2.0 site needs a mob of people who use it, love it, and live by it, and convince their friends and family to do the same," Lacy writes. "Mobs will devote more time to a site they love than to their jobs. They'll frequently build the site for the founders for free." These sites exist only because of unpaid labor, the millions of minions toiling to fill the coffers of a fortunate few.
Spelling this out, Lacy is not accusatory but admiring, awestruck even. When she writes that "social networking, media, and user-generated content sites tap into (and exploit) core human emotions," it's with fealty appropriate to a fiefdom. As such, her book inadvertently provides a perfect exposé of the hypocrisy lurking behind so much social media rhetoric. The story she tells, after all, is about nothing so much as fortune seeking, yet the question of compensating those who contribute to popular Web sites, when it arises, is quickly brushed aside. The "mobs" receive something "far greater than money," Lacy writes, offering up the now-standard rationalization for the inequity: entertainment, self-expression, and validation.7 This time around, no one's claiming the market will be democratized; instead, the promise is that culture will be. We will "create" and "connect" and the entrepreneurs will keep the cash.
This arrangement has been called "digital sharecropping."8 Instead of the production or distribution of culture being concentrated in the hands of the few, it is the economic value of culture that is hoarded. A small group, positioned to capture the value of the network, benefits disproportionately from a collective effort. The owners of social networking sites may be forbidden from selling songs, photos, or reviews posted by individual users, for example, but the companies themselves, including user content, might be turned over for a hefty sum: hundreds of millions for Bebo and Myspace and Goodreads, one billion or more for Instagram and Tumblr. The mammoth archive of videos displayed on YouTube and bought by Google was less a priceless treasure to be preserved than a vehicle for ads. These platforms succeed because of an almost unfathomable economy of scale; each search brings revenue from targeted advertising and fodder for the data miners: each mouse click is a trickle in the flood.
Over the last few years, there has been an intermittent but spirited debate about the ethics of this economic relationship. When Flickr was sold to Yahoo!, popular bloggers asked whether the site should compensate those who provided the most viewed photographs; when the Huffington Post was acquired by AOL for $315 million, many of the thousands of people who had been blogging for free were aghast, and some even started a boycott; when Facebook announced its upcoming IPO, journalists speculated about what the company, ethically, owed its users, the source of its enormous valuation.9 The same holds for a multitude of sites: Twitter wouldn't be worth billions if people didn't tweet, Yelp would be useless without freely provided reviews, Snapchat nothing without chatters. The people who spend their time sharing videos with friends, rating products, or writing assessments of their recent excursion to the coffee shop: are they the users or the used?
The Internet, it has been noted, is a strange amalgamation of playground and factory, a place where amusement and labor overlap in confusing ways. We may enjoy using social media while also experiencing them as obligatory; more and more jobs require employees to cultivate an online presence, and social networking sites are often the first place an employer turns when considering a potential hire. Some academics call this phenomenon "playbor," an awkward coinage that tries to get at the strange way "sexual desire, boredom, friendship" become "fodder for speculative profit" online, to quote media scholar Trebor Scholz.10 Others use the term "social factory" to describe Web 2.0, envisioning it as a machine that subsumes our leisure, transforming lazy clicks into cash. "Participation is the oil of the digital economy," as Scholz is fond of saying. The more we comment and share, the more we rate and like, the more economic value is accumulated by those who control the platforms on which our interactions take place.11
Taking this argument one step further, a frustrated minority have complained that we are living in a world of "digital feudalism," where sites like Facebook and Tumblr offer up land for content providers to work while platform owners expropriate value with impunity and, if you read the fine print, stake unprecedented claim over users' creations.12 "By turn, we are the heroic commoners feeding revolutions in the Middle East and, at the same time, 'modern serfs' working on Mark Zuckerberg's and other digital plantations," Marina Gorbis of the Institute for the Future has written. "We, the armies of digital peasants, scramble for subsistence in digital manor economies, lucky to receive scraps of ad dollars here and there, but mostly getting by, sometimes happily, on social rewards: fun, social connections, online reputations. But when the commons are sold or traded on Wall Street, the vast disparities between us, the peasants, and them, the lords, become more obvious and more objectionable."13
Computer scientist turned techno-skeptic Jaron Lanier has staked out the most extreme position in relation to those he calls the "lords of the computing clouds," arguing that the only way to counteract this feudal structure is to institute a system of nano-payments, a market mechanism by which individuals are rewarded for every bit of private information gleaned by the network (an interesting thought experiment, Lanier's proposed solution may well lead to worse outcomes than the situation we have now, due to the twisted incentives it entails).
New-media cheerleaders take a different view.14 Consider the poet laureate of digital capitalism, Kevin Kelly, cofounder of Wired magazine and longtime technology commentator. What looks to critics like feudalism and exploitation is, he argued in a widely circulated essay, the emergence of a new cooperative ethos, a resurgence of collectivism, though not the kind your grandfather worried about. "The frantic global rush to connect everyone to everyone, all the time, is quietly giving rise to a revised version of socialism," Kelly raves, pointing to sites like Wikipedia, YouTube, and Yelp.