The second generation of the Internet has arrived. It's worse than you think.
11:00 PM, Feb 14, 2006 • By ANDREW KEEN
THE ANCIENTS were good at resisting seduction. Odysseus fought the seductive song of the Sirens by having his men tie him to the mast of his ship as it sailed past the Sirens' isle. Socrates was so intent on protecting citizens from the seductive opinions of artists and writers that he banished them from his imaginary republic.
We moderns are less nimble at resisting great seductions, particularly those utopian visions that promise grand political or cultural salvation. From the French and Russian revolutions to the counter-cultural upheavals of the '60s and the digital revolution of the '90s, we have been seduced, time after time and text after text, by the vision of a political or economic utopia.
The grand utopian movement of our contemporary age is headquartered not in Paris, Moscow, or Berkeley, but in Silicon Valley, whose great seduction is actually a fusion of two historical movements: the counter-cultural utopianism of the '60s and the techno-economic utopianism of the '90s. Here in Silicon Valley, this seduction has announced itself to the world as the "Web 2.0" movement.
LAST WEEK, I was treated to lunch at a fashionable Japanese restaurant in Palo Alto by a serial Silicon Valley entrepreneur who, back in the dot.com boom, had invested in my start-up, Audiocafe.com. The entrepreneur, a fellow Silicon Valley veteran, was pitching me his latest start-up: a technology platform that creates easy-to-use software tools for online communities to publish weblogs, digital movies, and music. It is technology that enables anyone with a computer to become an author, a film director, or a musician. This Web 2.0 dream is Socrates's nightmare: technology that arms every citizen with the means to be an opinionated artist or writer.
"This is historic," my friend promised me. "We are enabling Internet users to author their own content. Think of it as empowering citizen media. We can help smash the elitism of the Hollywood studios and the big record labels. Our technology platform will radically democratize culture, build authentic community, create citizen media." Welcome to Web 2.0.
Buzzwords from the old dot.com era--like "cool," "eyeballs," or "burn-rate"--have been replaced in Web 2.0 by language which is simultaneously more militant and absurd: Empowering citizen media, radically democratize, smash elitism, content redistribution, authentic community . . . . This sociological jargon, once the preserve of the hippie counterculture, has now become the lexicon of new media capitalism.
Yet this entrepreneur owns a $4 million house a few blocks from Steve Jobs's house. He vacations in the South Pacific. His children attend the most exclusive private academy on the peninsula. But for all of this, he sounds more like a cultural Marxist--a disciple of Antonio Gramsci or Herbert Marcuse--than a capitalist with an MBA from Stanford.
In his mind, "big media"--the Hollywood studios, the major record labels and international publishing houses--really did represent the enemy. The promised land was user-generated online content. In Marxist terms, the traditional media had become the exploitative "bourgeoisie," and citizen media, those heroic bloggers and podcasters, were the "proletariat."
This outlook is typical of the Web 2.0 movement, which fuses '60s radicalism with the utopian eschatology of digital technology. The ideological outcome may be trouble for all of us.
SO WHAT, exactly, is the Web 2.0 movement? As an ideology, it is based upon a series of ethical assumptions about media, culture, and technology. It worships the creative amateur: the self-taught filmmaker, the dorm-room musician, the unpublished writer. It suggests that everyone--even the most poorly educated and inarticulate amongst us--can and should use digital media to express and realize themselves. Web 2.0 "empowers" our creativity, it "democratizes" media, it "levels the playing field" between experts and amateurs. The enemy of Web 2.0 is "elitist" traditional media.
Empowered by Web 2.0 technology, we can all become citizen journalists, citizen videographers, citizen musicians. Empowered by this technology, we will be able to write in the morning, direct movies in the afternoon, and make music in the evening.
Sound familiar? It's eerily similar to Marx's seductive promise of individual self-realization in The German Ideology:
Whereas in communist society, where nobody has one exclusive sphere of activity but each can become accomplished in any branch he wishes, society regulates the general production and thus makes it possible for me to do one thing today and another tomorrow, to hunt in the morning, fish in the afternoon, rear cattle in the evening, criticise after dinner, just as I have a mind, without ever becoming hunter, fisherman, shepherd or critic.
Just as Marx seduced a generation of European idealists with his fantasy of self-realization in a communist utopia, so the Web 2.0 cult of creative self-realization has seduced everyone in Silicon Valley. The movement bridges counter-cultural radicals of the '60s such as Steve Jobs with the contemporary geek culture of Google's Larry Page. Between the bookends of Jobs and Page lies the rest of Silicon Valley, including radical communitarians like Craig Newmark (of Craigslist.com), intellectual property communists such as Stanford Law Professor Larry Lessig, economic cornucopians like Wired magazine editor Chris "Long Tail" Anderson, and new media moguls Tim O'Reilly and John Battelle.
The ideology of the Web 2.0 movement was perfectly summarized at the Technology, Entertainment, and Design (TED) conference in Monterey last year, when Kevin Kelly, Silicon Valley's über-idealist and author of the Web 1.0 utopian tract New Rules for the New Economy, said:
Imagine Mozart before the technology of the piano. Imagine Van Gogh before the technology of affordable oil paints. Imagine Hitchcock before the technology of film. We have a moral obligation to develop technology.
But where Kelly sees a moral obligation to develop technology, we should actually have--if we really care about Mozart, Van Gogh and Hitchcock--a moral obligation to question the development of technology.
The consequences of Web 2.0 are inherently dangerous for the vitality of culture and the arts. Its empowering promises play upon that legacy of the '60s--the creeping narcissism that Christopher Lasch described so presciently, with its obsessive focus on the realization of the self.
Another word for narcissism is "personalization." Web 2.0 technology personalizes culture so that it reflects ourselves rather than the world around us. Blogs personalize media content so that all we read are our own thoughts. Online stores personalize our preferences, thus feeding back to us our own taste. Google personalizes searches so that all we see are advertisements for products and services we already use.
Instead of Mozart, Van Gogh, or Hitchcock, all we get with the Web 2.0 revolution is more of ourselves.
STILL, the idea of inevitable technological progress has become so seductive that it has been transformed into "laws." In Silicon Valley, the most quoted of these laws, Moore's Law, states that the number of transistors on a chip doubles roughly every two years, and with it the power and memory capacity of the personal computer. On one level, of course, Moore's Law is real, and it has driven the Silicon Valley economy. But there is an unspoken ethical dimension to Moore's Law. It presumes that each advance in technology is accompanied by an equivalent improvement in the condition of man.
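For readers who want to see the arithmetic of the law itself, here is a minimal sketch; the 1971 baseline figure (the Intel 4004's roughly 2,300 transistors) and the strict two-year doubling period are illustrative assumptions, not claims made in this essay:

```python
# A back-of-the-envelope sketch of Moore's Law: transistor counts
# doubling every two years. The 1971 baseline (Intel 4004, ~2,300
# transistors) is an illustrative assumption.
BASELINE_YEAR = 1971
BASELINE_TRANSISTORS = 2_300
DOUBLING_PERIOD_YEARS = 2

def projected_transistors(year: int) -> float:
    """Project the transistor count for a given year under strict doubling."""
    doublings = (year - BASELINE_YEAR) / DOUBLING_PERIOD_YEARS
    return BASELINE_TRANSISTORS * 2 ** doublings

for year in (1971, 1986, 1996, 2006):
    print(f"{year}: ~{projected_transistors(year):,.0f} transistors")
# By this arithmetic, 2006 lands in the hundreds of millions of
# transistors: thirty-five years of doubling, compounded.
```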
But as Max Weber so convincingly demonstrated, the only really reliable law of history is the Law of Unintended Consequences.
We know what happened the first time around, in the dot.com boom of the '90s. At first there was irrational exuberance. Then the dot.com bubble popped; some people lost a lot of money and a lot of people lost some money. But nothing really changed. Big media remained big media and almost everything else--with the exception of Amazon.com and eBay--withered away.
This time, however, the consequences of the digital media revolution are much more profound. Apple and Google and Craigslist really are revolutionizing our cultural habits, our ways of entertaining ourselves, our ways of defining who we are. Traditional "elitist" media is being destroyed by digital technologies. Newspapers are in freefall. Network television, the modern equivalent of the dinosaur, is being shaken by TiVo's overnight annihilation of the 30-second commercial. The iPod is undermining the multibillion-dollar music industry. Meanwhile, digital piracy, enabled by Silicon Valley hardware and justified by Silicon Valley intellectual property communists such as Larry Lessig, is draining revenue from established artists, movie studios, newspapers, record labels, and songwriters.
Is this a bad thing? The purpose of our media and culture industries--beyond the obvious need to make money and entertain people--is to discover, nurture, and reward elite talent. Our traditional mainstream media has done this with great success over the last century. Consider Alfred Hitchcock's masterpiece Vertigo, and two other brilliant works that share its name: the 1999 book Vertigo by the Anglo-German writer W.G. Sebald, and the 2004 song "Vertigo" by Irish rock star Bono. Hitchcock could never have made his expensive, complex movies outside the Hollywood studio system. Bono would never have become Bono without the music industry's super-heavyweight marketing muscle. And W.G. Sebald, the most obscure of this trinity of talent, would have remained an unknown university professor had a high-end publishing house not had the good taste to discover and distribute his work. Elite artists and an elite media industry are symbiotic. If you democratize media, then you end up democratizing talent. The unintended consequence of all this democratization, to misquote Web 2.0 apologist Thomas Friedman, is cultural "flattening." No more Hitchcocks, Bonos, or Sebalds. Just the flat noise of opinion--Socrates's nightmare.
WHILE SOCRATES, in Plato's Republic, rightly warned about the dangers of a society infatuated with opinion, more modern dystopian writers--Huxley, Bradbury, and Orwell--got the Web 2.0 future exactly wrong. Much has been made, for example, of the associations between the all-seeing, all-knowing qualities of Google's search engine and Big Brother in Nineteen Eighty-Four. But Orwell's fear was the disappearance of the individual right to self-expression. Thus Winston Smith's great act of rebellion in Nineteen Eighty-Four was his decision to pick up a rusty pen and express his own thoughts:
The thing that he was about to do was to open a diary. This was not illegal, but if detected it was reasonably certain that it would be punished by death . . . Winston fitted a nib into the penholder and sucked it to get the grease off . . . He dipped the pen into the ink and then faltered for just a second. A tremor had gone through his bowels. To mark the paper was the decisive act.
In the Web 2.0 world, however, the nightmare is not the scarcity but the over-abundance of authors. Since everyone will use digital media to express themselves, the only decisive act will be not to mark the paper. Not writing as rebellion sounds bizarre--like a piece of fiction authored by Franz Kafka. But one of the unintended consequences of the Web 2.0 future may well be that everyone is an author, while there is no longer any audience.
SPEAKING OF KAFKA, the back cover of the January 2006 issue of Poets & Writers magazine carries a seductive Web 2.0-style advertisement which reads:
Kafka toiled in obscurity and died penniless. If only he'd had a website . . . .
Presumably, if Kafka had had a website, it would have been located at kafka.com, an address owned today by a mad left-wing blog called The Biscuit Report. The front page of this site quotes some words written by Kafka in his diary:
I have no memory for things I have learned, nor things I have read, nor things experienced or heard, neither for people nor events; I feel that I have experienced nothing, learned nothing, that I actually know less than the average schoolboy, and that what I do know is superficial, and that every second question is beyond me. I am incapable of thinking deliberately; my thoughts run into a wall. I can grasp the essence of things in isolation, but I am quite incapable of coherent, unbroken thinking. I can't even tell a story properly; in fact, I can scarcely talk . . .
One of the unintended consequences of the Web 2.0 movement may well be that we fall, collectively, into the amnesia that Kafka describes. Without an elite mainstream media, we will lose our memory for things learned, read, experienced, or heard. The cultural consequences of this are dire, requiring the authoritative voice of at least an Allan Bloom, if not an Oswald Spengler. But here in Silicon Valley, on the brink of the Web 2.0 epoch, there are no longer any Blooms or Spenglers. All we have is the great seduction of citizen media, democratized content, and authentic online communities. And weblogs, of course. Millions and millions of blogs.
Andrew Keen is a veteran Silicon Valley entrepreneur and digital media critic. He blogs at TheGreatSeduction.com and has recently launched aftertv.com, a podcast chat show about media, culture, and technology.