The decline of Western civilization, 140 characters at a time
May 6, 2013, Vol. 18, No. 32 • By MATT LABASH
“The Machine,” they exclaimed, “feeds us and clothes us and houses us; through it we speak to one another, through it we see one another, in it we have our being. . . . [T]he Machine is omnipotent, eternal; blessed is the Machine.” —E.M. Forster, “The Machine Stops” (1909)
You can try, if so inclined. But unlike Kim Kardashian, Lady Gaga, the pope, the Dalai Lama, and the Church of England (which invited Twitter users to help select the next archbishop of Canterbury), you won’t find me there. I’m not on it, and hope never to be. I say hope, because the clip at which the Twidiocracy has insinuated itself into every crevice of society might leave me no choice. In the dystopian future—which in the age of Google Glass is starting to feel like the dystopian present—I might be forced to join Twitter in order to, say, collect my Social Security e-check when the time comes. Though the likelihood of there still being Social Security in 25 years is much less than the likelihood of people endlessly tweeting about how there’s no more Social Security.
If you’re not following this, there’s an outside chance you still have an analog life that unfolds beyond the glow of a screen. That you remember a time, not all that long ago, when the social-media contagion of FacebookTwitterPinterestInstagram hadn’t yet made us wonder how we used to talk to each other. A time when a phone was considered a communication device, not an extra limb. (A Stanford study found 75 percent of iPhone users fall asleep with their phones in their beds, just 2 points shy of the percentage of spouses who sleep with each other.) More likely, it just means you’ve been in a deep coma since Twitter’s birth in 2006. In which case, I envy you.
If you haven’t gathered by now, I’m not a Twitter fan. In fact, I outright despise the inescapable microblogging service, which nudges its users to leave no thought unexpressed, except for the fully formed ones (there’s a 140-characters-per-tweet limit). I hate it not just because the Twidiocracy constantly insists I should love it, though that certainly helps. Being in the media profession (if “profession” isn’t overstating things), where everyone flocked en masse to the technology out of curiosity or insecurity or both, I’ve hated it reflexively since its beginning. But with time’s passage and deliberation, I’ve come to hate it with deeper, more variegated richness. I hate the smugness of it, the way the techno-triumphalists make everyone who hasn’t joined the Borg feel like they’ve been banished to an unpopulated island, when in fact the numbers don’t support that notion. Even after seven years of nonstop media hype, only 16 percent of Internet users tweet, the same as the percentage of 14- to 49-year-olds who have genital herpes. The difference being that the latter are not proud of their affliction, while the former never shut up about theirs.
I hate the way Twitter transforms the written word into abbreviations and hieroglyphics, the staccato bursts of emptiness that occur when Twidiots who have no business writing for public consumption squeeze themselves into 140-character cement shoes. People used to write more intelligently than they speak. Now, a scary majority tend to speak more intelligently than they tweet. If that’s a concern—and all evidence suggests it isn’t—you can keep your tweets private, readable only by those you invite. But that reduces your number of “followers,” so almost nobody does it. A private Twitter account cuts against the whole spirit of the enterprise—a bit like showing up at a nude beach in a muumuu.
There are admittedly pockets of genius on Twitter, as anyone who’s ever visited the Goldman Sachs Elevator Gossip page knows. (@GSElevator: “If you can only be good at one thing, be good at lying. . . . Because if you’re good at lying, you’re good at everything.”) But most often, Twitter makes otherwise great writers good, and good writers bad. It’s a format that encourages even the pros to locate the mediocrity within. From @salmanrushdie: “One year today since I joined Twitter. I’m amazed to have 400,000+ of you with me. Thanks for following, everyone. It has been fun.”
I hate the way Twitter turns people into brand managers, their brands being themselves. It’s nearly impossible now to watch television news without an anchor imploring you to “follow me on Twitter,” even as you’re already following him on television. You couldn’t do this much following in the physical world without being slapped with a restraining order.
Though I’ve just catalogued much to hate about Twitter, there’s plenty more. I hate that Twitter makes the personal public. That conversations between two intimates that formerly transpired in person or by email become conversations between two intimates for the benefit of their followers. I’ve actually been to lunch with people who have tweeted throughout, unbeknownst to me. (The fact that they only looked up from their iPhone twice in two hours might’ve been a tipoff. Though that’s pretty much par for the course, even with untweeted lunches these days.)
I hate that formerly respectable adults now think it’s okay to go at each other like spray-tanned girls on Jersey Shore, who start windmill-slapping each other after they’ve each had double-digit cherry vodkas and one calls the other “fat.” None of which gives onlookers pause. After all, it’s only a Twitter fight. The Pulitzer Prize-winning writer of Friday Night Lights, Buzz Bissinger, recently made headlines for penning a lengthy GQ article about his Gucci shopping addiction (75 pairs of boots! 41 pairs of leather pants!), which set him back over half a million dollars. But long before that, he’d humiliated himself on Twitter, machine-gunning all comers with one f-bomb tweet after another. Not only was Bissinger unabashed, he wrote a piece boasting about it for the New Republic (appropriately titled “Twidiot”). Buried deep in his GQ piece, however, was an admission more troubling than an addiction to overpriced clothing that makes him look like the interior of a 1982 Crown Victoria. Once considered to be a fine long-form writer, Bissinger now found himself losing focus: “I f—ed around more and more—nasty guillotine rants on Twitter going after everything and everyone, Googling my name six or seven times a day, craving crumbs of attention.”
Being driven to distraction by the steady dopamine-drip of attention on Twitter and other social-media sites is hardly unique to megalomaniacal leather enthusiasts. A recent survey by Boost Mobile found 16- to 25-year-olds so addicted that 31 percent of respondents admitted to servicing their social accounts while “on the toilet.” And a Retrevo study found that 11 percent of those under age 25 allow themselves to be interrupted by “an electronic message during sex.”
A technology that incentivizes its status-conscious, attention-starved users to yearn for ever more followers and retweets, Twitter causes Twidiots to ask one fundamental question at all times: “How am I doing?” That’s not a question most people can resist asking, even in their offline lives, but on Twitter, where tweeters are publicly judged by masses of acquaintances and strangers alike, the effect tends to be intensified. Even the most independent spirit becomes a needy member of the bleating herd. It’s the nerd incessantly repeating what the more popular kids say. It’s the pretty girl, compulsively seeking compliments.
As a friend of mine says, “It’s addictive and insidious. I see it even with smart people who ought to know better but can’t help themselves. They give wildly disproportionate weight to the opinions they read on Twitter, mostly because they’re always reading Twitter. Which fills them with anxiety, distorts their perceptions, and makes it almost impossible for them to take the long view on anything. Every crisis is huge, ominous, and growing. Every attack requires an immediate response.”
A yearlong Pew study reinforces this. It found that Twitter users tend to be considerably younger and more liberal than the general public. But whether tweets tended to skew liberal or conservative was almost immaterial. Twitter reaction to current events was often at odds with overall public opinion, and it was “the overall negativity that stands out.”
Another friend, who has seen her industry overrun by Twitter, puts it like this: “It’s the constant mirror in front of your face. The only problem is that it’s not just you and the mirror. You’re waiting for the mirror to tell you what it thinks. The more you check for a response, the more habituated you become to craving one. It’s pathetic, because at the end of the day, a Twitter user is asking, ‘Am I really here, and do you love me?’ And there will always be someone else who gets more approbation for their 140 characters to make you feel like you’re never quite good enough. The whole thing is like being in the worst years of one’s adolescence.”
Perhaps nowhere is Twidiot adolescence more pronounced than in the junior-high cafeteria that is the new media. Consider that the Washington Post recently ran a piece on the death of the metronome, which musicians are replacing with iPhone apps that perform the same function. Except that the entire story began as a series of 17 tweets, which the paper then republished verbatim. (And you wonder why print is dying.) Then there’s Nick Bilton, the New York Times “Bits Blog” writer, who lives on the cutting edge of connectedness. Or perhaps disconnectedness, as he recently proclaimed how rude it is to leave voicemails, which burden the recipient with picking up a phone and having—brace yourself—an actual conversation. He ignores the voicemails of his father, and communicates with his mother, he admitted without shame, mostly through Twitter.
How desperate are journos to prove how au courant they are? Well, look no further than Mediaite’s media correspondent, Tommy Christopher. In 2010, he live-tweeted his own heart attack, while paramedics were working on him. Actual tweet: “I gotta be me. Livetweeting my heart attack. Beat that!”
Unfortunately, a lot of people are trying. Here’s a short list of how pathological the Twitterfication of the world has become: A Houston hospital felt it necessary to live-tweet a brain surgery. A second-grade class in Buffalo corrected the misspelled tweets of NFL players as a grammar exercise. A Washington, D.C., hotel promised a “dedicated social media butler” as part of its $47,000 Obama inaugural package, to chronicle the experience “so your friends and family can follow your adventures on Twitter.” And real-life pimps and prostitutes are regularly found soliciting on Twitter, perhaps thinking it affords them cover among all the attention whores.
The British media announced the appointment by David Cameron of a “Twitter Tsar,” to be paid nearly as much as the prime minister himself. (Cameron once considered temporarily shutting down Twitter, after mobs of looters used it to organize during the 2011 riots.) The Israel Defense Forces became the first military force to declare war on Twitter (against Hamas). Its declaration-of-war tweet earned 430 retweets, which wasn’t as many as its “Happy #passover!” message (434 retweets). When Pope Benedict, for God only knows what reason, felt it necessary to join, he was given a typical warm Twitter welcome with “now let’s hit this bitch up with some hate tweets.”
Twitter celebrity death hoaxes are staples. Adam Sandler supposedly died four times in four months in the same snowboarding accident. Not to be confused with Twitter death threats, which are also hardy perennials. Twitter lynch mobs have threatened the lives of everyone from Wisconsin governor Scott Walker to NFL commissioner Roger Goodell.
The state of Ohio toyed with announcing executions on Twitter. And the city of Chicago’s “social media director” (yes, it has one) decided to get a handle on the city’s out-of-control murder rate by asking followers to tweet recommendations tagged #whatifchicago. (Here’s an idea: #whatifchicago hired more police instead of social-media directors?)
Don’t think you have enough Twitter followers? Well, apparently, neither did Barack Obama, Mitt Romney, and Newt Gingrich, all of whom have been accused of inflating the numbers with legions of fakes. A web tool called “Fake Follower Check” determined that nearly 70 percent of Obama’s didn’t actually exist. But if you’re undeterred by being followed by people who aren’t, technically speaking, people, you can buy them. In order to write about it, Slate’s Seth Stevenson bought 27,000 mass-produced fake zombie followers for a cool $202 from sketchy Internet middlemen who procure them from suppliers in India. Even our fake people, sadly, are outsourced.
While Twitteristas love to champion Twitter as freedom’s trainbearer, seldom mentioned is that the bad guys love Twitter too—as a tool of propaganda, surveillance, and intimidation. Al Qaeda and the Taliban are on Twitter. China launched a copycat “Red Twitter” service to promote revolutionary spirit, though it still uses regular ol’ Twitter to spy on and punish its citizenry, sentencing a woman to a year in a labor camp for retweeting a post that mocked Chinese protesters who destroyed Japanese products.
Of course, most tweets don’t land you in prison. Most of them, in fact, are just inconsequential crap. Don’t take my word for it. Take science’s. A Proceedings of the National Academy of Sciences paper said upwards of 80 percent “of posts to social-media sites (such as Twitter) consist simply of announcements of one’s own immediate experiences.” Rutgers researchers found that 51 percent of mobile-posted Twitter messages were “me now” messages, and that 80 percent of tweets analyzed could be classified as “meformers” (informing about yourself). After Pear Analytics collected thousands of tweets over two weeks and broke them down into six categories, the leader at 40.5 percent was “pointless babble.” Even Twitter users, in a study conducted by MIT, Carnegie Mellon, and Georgia Tech researchers, said only a little over a third of the tweets they receive are worthwhile.
Though Twitter made about $140 million in ad revenue in 2011, its most recent valuation was pegged at a whopping $10 billion ahead of its expected 2014 IPO. Color me skeptical (see Facebook IPO crash, circa 2012), but the true value of Twitter might have best been captured by the Annenberg School for Communication. It polled 1,900 subjects, asking if they’d be willing to pay for Twitter. The result? There were 0.00 percent takers. As in NONE.
Not that this deters the Twitter triumphalists for a second. In fact, after all the Twitter-bashing I’ve just engaged in, I’m starting to feel unfair. Yes, researching Twidiots for too long can put you in mind of Thoreau, who said, “But lo! Men have become the tools of their tools.” Still, knocking the Twidiocracy from afar seems like the kind of behavior I condemn on Twitter. So I went to the one place where I knew I’d find the highest concentration of Twitter triumphalists. The place that gets credit for originally taking Twitter viral. The place where men and tools are indistinguishable from each other: Austin’s South by Southwest Interactive Festival.
In the ’90s, SXSW was known for the kind of kicked-back atmosphere where hipster aesthetes in pearl-button hillbilly shirts could go eat’n’drink their weight in barbecue brisket and Lone Star beer, while checking out the sleeper independent film or unsigned band that would be launched as the next big thing. The old-timers tell me it was nice while it lasted.
SXSW is now gut-to-butt crowded and ultra-corporate. These days, a musical “discovery” is stumbling upon LL Cool J performing on “The Jacked Stage by Doritos”—an edifice literally made to look like a giant Doritos vending machine. But something else changed, too. Around the time the entire world decided, “Why pay directors and musicians for their work when we can just watch it on YouTube?” the Interactive portion of SXSW kicked into high gear. The music and film festivals are still a big draw, certainly. But the tech industry has pretty much taken over this world, just as they have all the others. Even the dinosaur rock stars of yesteryear are instructed by the new rock stars of the tech industry, in special editions of the Social Media Monthly, that they should “curate and share content every day.”
This year, SXSW Interactive, with 25,000 attendees, ran during 5 of the 11 days of the larger festival. Local tourism boosters like to say “Keep Austin weird”—the inscription is on every T-shirt and shot glass in hotel gift shops. And you will still see the occasional bohemian pushing a painted-pink tumbleweed down Sixth Street, or you’ll get a pedi-cab driver wheeling you around in a bike chariot shaped like Darth Vader, complete with lightsaber and speakers in his helmet blasting Procol Harum’s “A Whiter Shade of Pale.” But the real weirdness is that everyone thronging the streets during this Geek Mardi Gras seems to be from Silicon Valley, or wants to meet a venture capitalist who is. Even a tattoo artist at True Blue Tattoos tells me how soulless Austin becomes during Interactive. Customers ask to be inked with their company logos or, worse, the little blue bird that serves as Twitter’s trademark.
Like any religious cult, the tech world prizes its own inscrutable language (words like “optimization” and “curate our stories” are prominent). So to keep up, I consult a few online “Web 2.0 B.S. generators.” I figure if someone at a party asks what I do, I can earn a valuable status upgrade by saying that I digitize tag clouds to e-enable infomediaries and engage data-driven long-tail folksonomies which harness RSS-capable platforms and envisioneer cross-media functionalities.
Not that anyone would ask. They’re too busy peddling their own Web 2.0 B.S. And a good thing too, since my lingo is probably badly dated. Things age fast, here. On a downtown convention center escalator, I actually hear a woman say, in regular conversation, “Last year’s innovation is this year’s old news.”
My mission is narrow. I’m not here to scope out 3-D printers or smart contact lenses or whatever the bleeding edge of tomorrow is. I’m here to attend every dopey social-media/Twitter event I can find. While social media arrived many innovation cycles ago in SXSW time, it’s clear that, like an inoperable tumor, it’s here to stay. You see every kind of app getting pushed here, from the “Hater” iPhone app (“share the things you hate with people you love”) to the “Bang with Friends” app, which lends “social” new meaning as it promises to “anonymously find SXSW’ers who are down for the night.”
The problem with selecting social-media panels is that there are so damn many. In four days of attending them from morning until night, I will get to about a fifth of the ones offered. Almost all of them have gas-baggish titles like “Black Twitter Activism, Bigger than Hip Hop” or “One Million Strong: Social Media and the U.S. Army.” I finally end up finding a great use for Twitter when I check out the cracks of a few techie wisenheimers who did not come to SXSW out of pure loathing, but who are hashtagging “#betterSXSWpanels” with made-up titles such as “How to Be Pretentious Without Being Smart” or “My Agency Just Did A Harlem Shake Video, Now What?”
Of course, I’m forced to keep up with the fake titles at night, on my laptop. After all, I wouldn’t dare bring my PC into the land of iGadgetry like some sort of philistine, and I can’t follow them on my dumbphone—or, as a horrified David Carr of the New York Times calls it when spying it during a party he throws at his hotel (the week’s finest, since he serves brisket from Franklin Barbecue), “Look at your mom phone!”
Nothing against moms. But Carr’s right—my phone isn’t sexy. It’s an old clamshell flip job that I’ve carried around since last decade, an eternity in phone world. I’ve resisted entreaties from our office manager to take a smartphone instead. Not only because I wish to avoid the electronic monitoring bracelet that I see everyone else wearing. But as a self-regulation mechanism, so that I stay mindful that there is still flesh-and-blood life outside of the Internet. At least for the moment.
As I walk through the convention center to attend my first panel, I see that exactly nobody else has my smartphone reticence. Everybody is on theirs, pretty much full time. Entire hallways and lounges are silent as the inhabitants ignore each other, lost in their own iWorlds. Their heads are tucked and rocking like those of trance-induced madrassa students, their thumbs pistoning as fast as they think, tweeting and Foursquaring and iHate-ing and working any number of other apps that will go from being the World’s Greatest Innovation to MySpace (the universal term of derision for all things obsolete) before you’ve ever heard of them.
I arrive at one session five minutes early, but it turns out to be way too late. The cavernous conference room, which looks to hold about 500, is already packed. Several hundred others congregate outside the door. In front of a spillover hallway speaker, they sit on the floor wordlessly and in unison, all of the same hive-mind. They start thumb-clacking on their iPhones and iPads, live-tweeting the speaker, or maybe surfing for nerd porn or Googling themselves, who knows? To turn out a crowd of this number and intensity, you’d think the panel was titled “Finally: a Cure for Cancer” or “See this Sack of Money?—Take It!” But, no. It’s “How Twitter Has Changed How We Watch TV.”
I take a seat on the hallway floor with the rest of the hive to listen to Jenn Deering Davis, cofounder of Union Metrics. If you’re a committed Twidiot, you might still want to go back and read all the live-tweets from the session, so consider this a spoiler alert. It turns out, a lot of people tweet while they watch TV. What the pros call “the second screen experience.”
Davis is one of those people who say “that’s interesting,” whenever she doesn’t have anything interesting to say. So she finds it “interesting” that the teen drama Pretty Little Liars is the most-tweeted-about show on television. It’s also “interesting” that it’s harder for viewers to live-tweet House of Cards, because Netflix released all 13 episodes of its original series simultaneously. Likewise, it’s “interesting” that the voice-actors from animated shows like Archer actually maintain a Twitter presence in character, so that you feel like you’re “part of the conversation” by carrying out a “para-social relationship” with a TV character, even if that character doesn’t actually exist. When the new Hawaii Five-0 asked its fans to vote on an ending via Twitter, showing one ending to the East Coast, while airing a different one on the West Coast? Well, that’s almost too much for Davis, as far as revolutionizing television goes. “It’s fascinating,” she says, mixing things up.
After an hour of this, I feel depleted, as if brain cells have died and I’ve just shed 30 IQ points. I ask a perky Australian techie, sitting Indian-style on the floor next to me, if all the panels are this overpopulated. “Yeah,” she says. “Anything that’s got a good title, like ‘Top 10 Ways to Go Viral,’ that kind of shit, you’ve gotta get there real early.”
“Hmm,” I respond, applying my new knowledge. “That’s interesting.”
Evan Fitzmaurice, an Austin-based lawyer and longtime friend who until recently was the Texas Film Commissioner, has attended many a SXSW. He tells me one night over dinner that while he’s wired to the hilt (“I’ve gotta connect to the Matrix”), he sees the downside of perpetual connectedness. “You’re truncating natural thought. Things don’t gestate anymore. It’s instantaneous, without the benefit of reflection. And everything’s said at volume 10. Nothing’s graduated anymore. It’s a clamor.” Though not religious himself, he says what I witness at SXSW would be recognized by any religious person. “They’re trying to supplant deliverance and redemption through religion with civil religion and technological redemption—the promise of a sublime life on a higher plane.”
In one instance, the Twidiocracy tries to have it both ways. I attend a Sunday morning session called “Transcendent Tech: Is G-d Rebooting the World?” It’s a discussion headed by a bearded Mordechai Lightstone, in full Hasidic regalia as the director of social media for the Lubavitch News Service, and Seth Cohen, director of network initiatives at the Charles and Lynn Schusterman Family Foundation. “God,” Cohen says, “was a coder. She was a hacker. She saw a plan for the world.” An element of those plans, he says, was the Ten Commandments. Though now, “we are in a 2.0 phase.”
Our group then contemplates the 2.0ness of it all. Cohen, though Jewish, wonders what it would be like if the Catholic church “came out with a chief technology officer” who said “we’re going to reboot the Catholic church. And we actually decided to have someone design apps and take a technological approach to changing the paradigm.” A man sitting next to me would like to see “an Amazon of the Catholic church” since there’s a “distribution of specialized services problem” and he wants to know how the church will be “brought to my front doorstep.” A man in thick geek glasses says he sees the Bible as the “first great example of opensourcing.” Cohen adds that he still thinks there are prophets, as he sees “the prophetic voice” when he reads friends’ comments on his Facebook page. Another gent says his problem with the Bible is there’s no “error correction.” Paul, for instance, was a homophobe, so he’d like to see more wiki-style group editing. One woman, who has 33,000 Twitter followers, says she writes Jewish tweets. She thinks that’s the wave of the future, since “people aren’t going to houses of worship anymore.”
This kind of talk could send even a believer like me running into Richard Dawkins’s arms. If God is indeed rebooting the world in this vein, here’s hoping His hard drive crashes.
Not everybody at SXSW thinks 140 characters are the answer to everything. For some, that sort of sustained thought is heavy sledding. Oxford University Press lexicographers calculate that the average tweet is 14.98 words. If a picture is worth 1,000 words, that means that a picture is also worth 66.8 tweets.
Of course, you can share pictures on Twitter. And that may be the direction in which we need to head, since attention spans are shrinking and words are so wordy. Which is the reason for panels like “Smile: People Like Your Picture More than Words.” Chas Edwards, the chief revenue officer at Luminate, gives us some mind-blowing numbers: With so many phone cameras, 10 percent of all the photos ever taken have been snapped in the last 12 months; 70 percent of all social-media activity involves a photo; people who read news in newspapers spend an average of 25 minutes reading, while people who read news online spend an average of 70 seconds.
Lesson: We’ve got to work fast. Words are slow. Pictures, fast. As Chas speaks, most of the room is looking down into their iAbysses, thumb-pistoning away. He observes that “only 10 percent of you are actually consuming me. What I’m hoping is that the other 90 percent of you are online enjoying more fully this experience and tweeting it.” There’s no way for me to tell if they are, since I’m stuck in the lousy real world with my dumbphone. But I feel for Chas. Maybe he needs to think about talking faster. Or about streaming his presentation in little chunks the tweeters can post on Vine (Twitter’s new six-second video clip-sharing app).
Some Twidiots have an easier time paying attention, especially if it’s to themselves. Witness Cory Booker, a politician who is so baldly self-aggrandizing, so intent on “telling my truth to the world,” so emblematic of our social-media age, that he will almost surely become president of the United States someday. When not tweeting, Booker is the mayor of Newark. (As of this writing, he’s tweeted 27,319 times and has 1,382,151 followers.)
Booker, who has become a media darling (he’ll end up being voted best speaker at SXSW), is smart, warm, and a shockingly effective suck-up (show me another politician in America who follows 71,529 people on Twitter). Even a decade ago, when Booker was a lowly city councilman, I used to get press releases about his birthday party. But now, he’s no longer confined by the straitjacket of a press release. He can tell the truth—his truth—sometimes 40 or 50 times a day on Twitter. And that truth isn’t just between Cory Booker and his followers. No, that truth is between Cory Booker and his followers and whoever retweets them. We’re talking multiples of truth, here.
Now, when Booker needs to plug a talk-show appearance or quote Oprah Winfrey (“True forgiveness is when you can say, ‘Thank you for that experience’ ”) or get a pothole fixed that he heard about from a constituent on Twitter (he might’ve heard it from his staff, if they could get a word in edgewise between his tweets), he just fires away with his 140-character truth cannon.
Booker tells us that we are “all syndicators of information. We are media outlets.” Some more than others. He lets slip that he gets “more consumer impressions from one tweet than [does] my state newspaper.” Which is why, he announces to a rapturous SXSW audience, he’s cofounded #waywire, a social-video sharing service that features news that’s important to you, as well as lots of Cory Booker videos.
“If you want to see my microblog identity, you could just go through my tweets—but now you can go see my video identity,” Booker says, before reverting to talking about himself in the third person. “What music videos does Cory like? Go to the inspirational videos that really move him. This morning, I tweeted out a #waywire video of Nina Simone.” (Not being a Booker follower, I missed it. But how great would it be if it were Simone’s 1974 song “Funkier than a Mosquito’s Tweeter.” O sweet synergy!)
All over SXSW, Twidiots are thick on the ground. At a sports panel, “Integrating Digital Into the Live Game Experience,” representatives of the NBA and NASCAR talk about everything from fans interactively posting messages on the arena Jumbotron to concession stand apps to tweeting from your car during a race (one NASCAR driver who tweeted from the cockpit after a Daytona 500 crash gained over 100,000 followers in two hours). They talk about just about everything except what you’re purportedly there to do—watch a game or a race. Or, rather, “an experience,” as the digerati call games and races.
As Jayne Bussman-Wise, the robotic digital director of the Brooklyn Nets/Barclays Center, puts it, “We’re really monitoring analytics. We work closely with our research analytics team. Everything’s worth pumping into our CRM system. . . . We’re listening to the conversation on social and sort of reacting to that.” The expression “it’s not all fun and games” has never been more true.
At a session entitled “The Tangled Web We Leave: Digital Life after Death,” we’re warned to get our online affairs in order. (Give those passwords to your loved ones, because if you get electrocuted dropping your iPad in the tub tomorrow, how will your family access your Instagram account?) But we’re also told of a new app called LivesOn, the logical terminus of the Twidiocracy. It’s a service that studies your pre-mortem Twitter feed for tastes and syntax, and then keeps tweeting in what it assumes is your voice after you expire. (Company slogan: “When your heart stops beating, you’ll keep tweeting.”)
But even that might not be the apex of strangeness. At a 90-minute session at Pete’s Dueling Piano Bar, I listen to a stage full of advertising honchos bloviate on the “Power of Microcontent and Marketing in the Moment.” They boisterously joke and cajole, cut each other off, slap each other’s backs, toss off profanities, and generally inflate themselves like pufferfish. And what is all the excitement over? A single tweet put out by Oreo during this year’s 34-minute Super Bowl blackout in New Orleans’s Superdome.
Featuring many of the people on the team that had a hand in it (“emerging media” types from places like Mondelez International and 360i), the panel provides all kinds of bluster about “authenticity” and “identifying all relevant streams” and “a snack conversation” and “real-time marketing” and “transformation” and “bellwether moments” and “eyes and ears . . . shifting at scale.”
The content of the tweet, it should be noted, is never even spelled out. It doesn’t need to be explained to this insider crowd. The tweet is simply known as “the Dunk in the Dark.” Explaining what it is to a roomful of “real-time marketers” is like explaining who L. Ron Hubbard is to a roomful of Scientologists, since it is, quite possibly, the tweet that saved and/or relaunched an entire industry. A tweet that had the Washington Post asking, “Can Twitter replace the Super Bowl ad?” In case you missed it (and I did; like most Americans I was watching the Super Bowl, not Oreo’s Twitter feed), here it is in its entirety: “Power out? No problem.” A link is provided to a photo of a lighted Oreo in a dark room with the tagline: “You can still dunk in the dark.”
A clever use of improvisational advertising during a freak occurrence? Sure. Though you would think, from the reaction both of this room and the media (the latter of which are always eager to sing hosannas to anything with the prefix “social”), that electricity had been discovered or the automobile had been invented. All except for one lonely columnist, that is. Mark Ritson, an associate professor of marketing at the Melbourne Business School, wrote a column for BRW, an Australian business magazine, in which he did some back-of-the-envelope calculations.
How much carry did the universally praised Oreo tweet actually have? Well, Ritson figured, Oreo had 65,000 followers on Twitter at the time of the tweet. The average click-through rate of followers on any tweet is a mere 2 percent. Crunching the click-through rates and adding the retweets with their potential reach, he generously estimated that “the Dunk in the Dark” reached about 150,000 people, in a country where some 80 million people eat Oreos in a given year, and where traditional Super Bowl ads (which Oreo also ran) would get approximately 250 views for every view garnered by Oreo’s tweet. That Twitter audience, I should add, is about the size of the Fort Worth Star-Telegram’s circulation.
Granted, the tweet garnered all kinds of free media, and was free itself. But Ritson’s problem, he said, wasn’t with Oreo. It was with all the “lazy journalists” who failed to “look behind all the hype about social media for the numbers that tell the real story.” Of course, Ritson’s column was retweeted only 125 times. In the land of the blind, the most retweeted are king.
With all the panel’s talk of creating a corporate culture at Oreo that allowed for this singular work of genius, I’m still not quite sure who wrote it. Afterwards, I corner Steve Doan, a senior associate brand manager with Oreo. “Did you write it?” I ask him. “C’mon, don’t be modest.”
No, Doan replies. But he was “in the war room.” And how many people did it take in the war room to carry off the 11-word cultural and literary event of our era? “A good 15 people,” he says.
While I’ve painted the Twidiocracy with a broad brush as cultists—mostly because they are—there are notes of dissent even among disciples. I attend a “Twitter for Tough Guys” panel featuring several of the skippers from Deadliest Catch, Discovery Channel’s long-running hit reality show about crab fishermen working the Bering Sea. Though all the captains tweet, to the approval of the network’s social-media team buzzing around them like digital babysitters at SXSW, you can smell the whiff of heresy. These are men who have one of the most dangerous jobs in the world, and who do something concrete for a living—pulling food from the ocean to feed people (albeit under the gaze of reality-show cameras)—who have been reeled into the company of social-media gurus, attention barnacles, and Information Economy grifters. As a gruff Johnathan Hillstrand, captain of the Time Bandit, says under his breath from the stage, he misses the days when his entire crew didn’t have smartphones. Now, he says, “they’re walking around . . . not looking where they’re at. I’d rather see them on drugs. At least look out the f—in’ window.”
When I catch up with Hillstrand and the other captains later that night at a Deadliest Catch party, he declares his feelings for Twitter straightaway: “I f—in’ hate it. It takes all your time, and now people expect you to be doing it. I work my ass off, the last thing I need is another multimedia activity to do. . . . I see people who will go to lunch, and the four of them will be typing the whole time. And they probably leave and type, ‘that was the greatest lunch, let’s do this again.’ And they didn’t even f—in’ talk!”
One night, I run into an advocacy campaign strategist friend from back in D.C. at a hotel bar. In this eye of the hipsterville hurricane, it can be hard to find a simple Budweiser among the preciously named craft beers (“Saint Arnold’s Fancy Lawnmower”). Therefore, whiskey flows freely as we hash out our differences over the Twidiocracy.
Jake Brewer is a card-carrying member of the digerati, the chief strategy officer at Fission Strategy in Washington. He doesn’t take social media too seriously. Still, he takes them seriously enough to suggest I should perhaps be a tad more dispassionate about them. Social media, he says, are just a tool like any other, a chance “to have a shared experience at a scale never possible before.” There’s a dark side to that, he admits. “And it’s human beings. Period. . . . I can use a knife to cut bread and serve a great meal. I can also stab you with it.”
Jake, however, also admits that people are addicted to the dopamine-drip of What Is Happening Now. The pros, he says, call it “FOMO”—fear of missing out. Consequently, he says, people are always checking out “what else is going on versus just being where they are.” Or, as Douglas Rushkoff puts it in his book Present Shock: “Our culture becomes an entropic, static hum of everybody trying to capture the slipping moment. Narrativity and goals are surrendered to a skewed notion of the real and the immediate; the Tweet; the status update.”
As Jake and I talk, a man I mistake for a techie hipster in a sunken leather chair across from us fins his way in, uninvited. His name is Todd Butler, and I’m disabused of his annoyingness when he starts holding forth: “People judge success on social media not necessarily by the quality of the work, but by how many will follow. Which skews and diminishes the ability of people who actually want to put quality out there because they’re like, ‘Nobody cares if it’s quality.’ They care if they get ‘liked’ 5,000 times.” Jake has to excuse himself to make another appointment. But Todd now has my attention. He picks up the conversation, apologizing for what he knows must look like his signifying hipster-wear, the slouch hat, the hoodie under the sports jacket: “Honestly, this isn’t even my normal attire,” he says, “I look like f—in’ Don Draper when I go to work every day. I just had to be hipster for this.”
A digital strategist in his day job, Todd has also just released an iPhone app called “GONO,” which he describes as a “social decision-making app.” It lets users put anything in their life up for a vote among their social network. “Five years ago,” Todd explains, “whether I should buy this purse or car or should I do this blonde or brunette—I wouldn’t care [what anybody thinks]. But now more than ever, people are attuned to putting it on Twitter and whatever else. So it allows people to have that layer of assurance that the world likes their decision.”
I tell him his app sounds cynical, like he’s preying on the insecurity of those who are constantly looking over their shoulder for approval. Guilty, he transparently admits. That’s reality, though it’s a reality he himself loathes. Todd seems different to me from the tech triumphalists, and he is. Nine years ago, he was the sole survivor of a small-plane crash that killed his girlfriend and his pilot father. When I express gape-mouthed sympathy, he shrugs it off matter-of-factly, mentioning his fake teeth and the rods in his limbs. Though he looks healthy and hearty, he says, “I’ve had more plastic surgery than any girl you’ll ever meet.”
He asks if I’d care to see the pictures of the wreckage, and before I can answer, he pulls out his tablet, nonchalantly paging through photos of crash-scene debris. “Shit happens,” he says, his emotions in check. The crash, it seems, has given him a sort of direction, an urgency that’s sped up his metabolism. A pilot himself, he decided to fly again. He backpacked through Australia. He’s climbing Mt. Kilimanjaro this summer. He went to Nepal, where his app developers work. They were grateful to see him, since “Kathmandu is not a big hub for a lot of business. So it was fun. I got blessed, they made me sit around the prayer wheel and all that.”
Todd himself plunges ahead wherever his impulses lead. “I really don’t question a lot of my decisions,” he says. But his social app targets a hive-minded generation which often doesn’t know how to live apart from the constant, shoulder-sitting echo of its social network. “You can polish it up and say whatever you want,” he admits. “But honestly, this app literally focuses on the fact that people nowadays don’t want to make decisions on their own. People go to Twitter [instead]. . . . Paralyzed is probably the best word: ‘I don’t want to make this decision until I know what 5,000 anonymous people think.’ ” Who needs to be secure in their own point of view? That’s why the tech gods invented “likes” and retweets.
Not that Todd is a stranger to insecurity himself. He’s turning 30 during SXSW and says candidly that, while he’s good at his job and knows what he’s doing, in techie world, “I’m on the way out! I’m only 30 years old and feel like I’m having my midlife crisis now.” His world, the tech world and the world of the Twidiocracy, is forever shifting. His 19-year-old interns have ideas about his social app that don’t occur to him. “It’s moving so fast,” he says. Sometimes, Todd hits pause by watching Law & Order marathons. Yes, it feels very 1990 to him. “But it’s static,” he says, “the one thing I can watch that doesn’t change.”
As the hour grows late, and the amber-induced maudlin sets in, we speak of the larger-than-life men who fully inhabited the present tense. Gary Cooper. Frank Sinatra. “They wouldn’t give a s— what you think. They would never go on this app!” Todd says. “And that’s the difference. Now we have Kim Kardashian,” who lives and dies by her social network. The 15th-most-followed woman on Twitter even asked her followers to pick a first-dance song at her wedding, the song lasting nearly as long as the marriage did.
It’s all pretty dumb, Todd says. But just wait—it’ll get dumber. His interns recently informed him that Facebook and Twitter are passé—probably because too many cool moms and middle-aged journalists are on them. They prefer the picture-based Instagram. 140 characters? “Whoa, that’s way too much,” Todd laughs. “We found a way to make it even easier. Before, you’d tweet, ‘I don’t like the way Justin Bieber wears his hair.’ Now, I can just take a picture of Justin Bieber’s hair and be like, ‘Justin Bieber sucks,’ hashtag it, and that’s it.”
“It really simplifies things,” he adds, sardonically. “You basically put Robert Frost on Twitter and put Forrest Gump on Instagram.”
After several days, I finally find a panel that poses an intriguing question: “Are Social Media Making Us Sick?” The verdict, handed down by a couple of social-media hands from the firm Abelson Taylor, is apparently: no. Social media amplify whatever mood we’re already in, they say. Happy people tend to stay happy, depressive people, depressed. There’s a lengthy slideshow of a poll they conducted to back this up. It has the feel of tobacco company “scientists” telling us smoking increases lung capacity. Never mind that when they ask who thinks social media are making us sick, three-fourths of this tech-savvy, uber-connected SXSW crowd raise their hands in the affirmative.
And never mind a Michigan State study that found excessive media use/media multitasking can lead to symptoms associated with depression and anxiety. An Oxford University scientist said Facebook and Twitter are leading to narcissism and an “identity crisis” in users, while a Nominet Trust study found four-fifths of U.K. parents fear their children are getting addicted to social networking sites. A Western Illinois University study, as the Atlantic reported, “found a high correlation between Narcissistic Personality Inventory scores and Facebook activity.”
A Retrevo study showed 28 percent of iPhone users check/update Twitter before they get out of bed, while 48 percent do the same after they’ve gone to bed. A University of Chicago study found that tweeting can be more addictive than cigarettes and alcohol, and that while sleep and sex can be stronger urges, people are more likely to give in to their urge to use social media. German university researchers found one out of three people who visited Facebook felt more dissatisfied with their lives afterwards, owing to feelings of envy and insecurity. And a study of 120,000 people by the media company Vuclip found 61 percent of men said their phone was the first thing people noticed about them (dire news for me and my mom phone).
In fact, a SXSW meetup that I attend on the back patio of an Austin bar—titled “I Am My Own Social Network”—proves just how acute the problem has become. Dave Hepp, the creative director at CreativeFeed in New York, has decided to conduct what these days passes for a brave and radical social experiment: He forces attendees to hand over their phones for 45 minutes and actually talk to each other. There are placards posted all over the patio with helpful conversation-starters for the human-interaction-impaired, such as “What is your earliest memory?” or “What did you want to be when you grew up?” Questionnaires are handed out, so that people can catalog their conversations, forcing them to listen.
People circle each other warily, their iThumbs twitching, yearning to make contact with their newly amputated digital appendages. But for most, the old muscle memory of analog life gradually returns. It becomes too much for one guy, who has to grab his iPhone and bolt—a pending dinner reservation hangs in the balance—but he quickly returns, re-checking his phone at the confiscation desk. People introduce themselves, leaning into sometimes awkward small talk, tepidly feeling their way around each other, like accident victims learning to walk again. After about 15 minutes, they make what passes for real-live human connection. Nobody is looking over his shoulder for someone more important, since talking about your job is forbidden.
Many talk of their experiences at SXSW. How impossible it’s been to strike up conversations with the iDistracted. How at panels they’ve sat through, they’ve admired the live-tweets of people they know are sitting feet away from them, but how they wouldn’t think of introducing themselves afterwards.
Keith Kurson, who works for Agoge Inc. in San Francisco, tells me how he is never, ever disconnected. The other day, he used an app that allowed him to order McDonald’s and a bottle of Jack Daniel’s, and to have both delivered by the same guy on a bike. “I never had to leave my living room!” he says with astonished horror. His phone is forever on. “I take a lot of pride in my personal brand,” he allows. But a group of his friends will leave a club, and at the end of the night, will stand on the sidewalk, hashtagging whether anyone wants to go to IHOP, with their buddies only feet away. He has answered texts during sex, he says, ashamed. He pines for the 1980s, which he regards as a golden era, “a different world . . . when you left your job at your job.” Keith is 22 years old.
His buddy, who works for GaymerConnect (a company which puts on conventions for gay gamers), is even more nihilistic. He wears a straw hat, and a sign taped to his shirt that says, “I AM A DOUCHEBAG.”
In a corner of the patio, I eavesdrop as a bubbly black woman from Brooklyn named Kerry Coddett, who runs a sketch comedy web series, makes contact with a mild-mannered country boy from Pennsylvania named Andrew. She asks him for his story, and whether it’s true what she’s heard—that everyone in the country has lots of babies and takes meth. “Not as a rule,” Andrew laughs.
Then she tells a story of her own, which could be the story of nearly everyone here. “You know, it’s funny,” she says. “I went to Costa Rica for my birthday. And I looked up, and I said, ‘What the f— is that? Oh s—, those are stars!’ I haven’t seen stars in a really long time. Like, it’s sad.”
Matt Labash is a senior writer at The Weekly Standard.