Two Miserable Decades
Don’t worry, it was even worse in the 1970s. Or was it?
Sep 30, 2013, Vol. 19, No. 04 • By JONATHAN V. LAST
Against that litany of horribles, you might think we have it pretty good these days. Whatever else there is to be said about the ’00s, at least some degree of sanity returned to public life. Granted, we have our oddballs, too. But by the time Al Gore revealed himself definitively to be a crank, he was out of office and relegated to film festivals and a minor cable TV channel. Ron Paul never found a following beyond the fringe. Belief in conspiracy theories—about Israel or vaccines or black helicopters—is now automatically disqualifying. There are standards for public life these days, and anyone who suggested outlawing the internal combustion engine or tacking a luxury tax on diapers would be ignored by the White House. Three years before being elected president of the United States, Jimmy Carter filed an official UFO report. Imagine how that might go over with voters today.
In other ways, however, we are less well off. If the ’70s started in 1967, the current age began on September 11, 2001. It has been an era marked, first and foremost, by large-scale terrorism. The Cold War was terrifying, and we should not discount the fears of the 1970s just because we know how the story ended. But the abstract threat of a global nuclear holocaust is different from the possibility that a jetliner will be flown into your office building, or that the passenger sitting next to you on your flight will try to set off a bomb in his shoe, or that a coworker will yell “Allahu Akbar” before opening fire in your workplace.
The Cold War and the (unfortunately misnamed) war on terror are not entirely dissimilar. Both involve a long, subterranean, ideological struggle pockmarked by shooting wars. What’s more, the two ideological struggles center on the same question: Does liberalism have the cultural confidence to insist on its own survival? The weak horse/strong horse paradigm is broadly applicable to both conflicts.
Economic life during the ’00s hasn’t been much fun, either. The attacks of 9/11 were followed by one long downturn. Strictly speaking, it has not been a single recession, but the periods of relief we have seen were undergirded not by real growth, but by bubbles—the tech bubble, which burst in 2001-02, and the housing bubble, which burst in 2007-08. Inflation was kept under control in the aughts, but unemployment has been more intractable. The term “jobless recovery” has become a black joke because for four straight years, whenever the unemployment rate has dipped, the good news has come not from the creation of jobs, but from people dropping out of the workforce. Twenty-four months into the official recovery, most analysts put the “real unemployment” figure north of 11 percent.
Lurking behind these near-term economic problems is a long-term crisis that would have startled people in the ’70s: The American government is going broke. Even during the stagflation of the 1970s, the government remained reasonably solvent. After piling up enormous debts during the Depression and Second World War, America gradually put its house back in order during the ’50s and ’60s, and the deficit and debt were modest during the ’70s.
But in the ’00s they ballooned under both Bush and Obama, and the ratio of debt to gross domestic product exploded. Even worse, the commitments of Social Security and Medicare will relentlessly exacerbate the problem. As the blue-ribbon committees like to say, something must be done. And eventually it will be; the country won’t file for Chapter 11. Instead, it will, at some point, be forced to impose a great deal of pain on the citizenry. Or at least one cohort of it. As bad as the economics of the 1970s were, we almost certainly have it worse today.
You could probably say the same about our politics. Both eras featured a string of presidencies that can be reasonably categorized as failures. Nixon’s near-impeachment and resignation were blows to the body politic. But in a way, at least that debacle unfolded in a manner consistent with traditional politics. The wild political swings during the ’00s—triumph for Republicans in 2002 and 2004, landslides for the Democrats in 2006 and 2008, a counter-realignment for the Republicans in 2010—suggest an electorate lurching from one party to the other in search of competent political management.
Nothing illustrates the failure of the political order better than the passage of Obamacare. Objectively speaking, it’s remarkable how little support there was for the Affordable Care Act when it was passed. No one—left, right, or center—much liked the bill. (This goes for voters as well as lawmakers.) Republicans hated it. Democrats, for the most part, were wary of it. It was passed as an act of political vanity because President Obama wanted his name on an omnibus health care reform plan, irrespective of what the plan said.
Our political system is designed to prevent this sort of legislation because it’s based on a kind of Maslow’s hierarchy of needs for our elected officials. At the top of the pyramid are high-minded hopes about wisdom and judgment. And very occasionally lawmakers do buck public opinion because they believe in the rightness of a policy or action. But at the base of the pyramid, the people trust their officials to be self-interested—that is, to respect their constituents’ will, or at least not deliberately to do anything so abhorrent that they know it will cost them their jobs.
The passage of Obamacare broke this compact. The marginal Democrats who passed the law (and were subsequently defeated in 2010) violated every tier of the Maslow hierarchy: They didn’t believe in Obamacare, they didn’t like Obamacare, and they knew that voting for it would cost them their jobs because their constituents hated it, too. But they voted for it anyway. Pure partisanship overrode every other concern. It is not crazy to suspect that in the long run Obamacare may prove more damaging to political life than Nixon’s resignation.
Which leaves us, finally, with the culture. On the one hand, by many of the big statistical markers—violent crime, drug use, teen pregnancy, abortion—the aughts are clearly superior to the ’70s. By others they are unarguably worse.
Family structure, for instance, has fractured to an extent that would have shocked even the drippy hippies at Woodstock. The divorce epidemic escalated beyond the ’70s before it plateaued, so that today “only” about 45 percent of marriages fail. Even family formation has become optional. In 1970, 8 percent of women finished their childbearing years without having children and 11 percent finished with only one child. Today 18.8 percent of American women have no children, and 18.5 percent have only one—a sea change that has spawned a movement (recently celebrated in a Time magazine cover story) called the “childfree life.”
But the most pernicious change is the abandonment of children by their fathers. In 1970, fewer than 10 percent of U.S. births were to single mothers. For 2011, the figure is 41 percent. The median age at first birth is actually lower than the median age at first marriage, and 48 percent—nearly half—of first births are to unwed mothers.
For all of the acid-dropping and love-ins, this destruction of family norms would have been unfathomable in the 1970s; it would have stunned even the radicals of the Weather Underground who vowed to “smash monogamy.” As would the creation of a constitutionally protected right to gay marriage. Or the encroachments gay marriage has made on religious freedom, with bakers and florists forced by the state to take part in gay marriage ceremonies.
By the same token, as dangerous as life was in the ’70s, the modern security state would also have been unthinkable. People walking into a football game or a theme park these days are treated as if they were visiting a prison. In many office buildings, visitors are now photographed and issued picture IDs. Imagine someone time-traveling from 1970 to a modern airport, with its lines, magnetometers, X-ray machines, millimeter wave scanners, and full-body pat-downs. It would strike him as a scene from a dystopian science-fiction novel. Yet we’ve constructed and accepted this vast apparatus over the last 12 years, and we’ve done so with little public debate.
It’s possible to see the calamities of both the ’70s and ’00s as consequences of institutional failure. Yet in the two decades, it was different types of institutions that faltered.
The ’70s were caused, in no small part, by failures of public institutions. Mismanagement by generals and politicians made Vietnam costlier. Poor judgment by Richard Nixon led to his near-impeachment. Unwise policies—the sudden termination of Bretton Woods and the refusal of Federal Reserve chairman William Miller to raise interest rates—made the economic environment more painful. The law enforcement apparatus was simply unable to maintain public order.
The institutions that failed during the ’00s tended to be private. Both the tech and housing bubbles were pumped up by ratings agencies that should have known better—and probably did. Banks abandoned banking in an effort to act like hedge funds, leading to the financial crisis of 2008. And the rise of flash-trading, exotic derivatives, tranches, and the fixed-income market made Wall Street function less like an engine for capital allocation and more like an arbitrage machine. Or possibly a casino.
Even the university system seems on the brink of failure as the rationale for a college education—that it is a glide path to a middle-class life—has steadily eroded. As late as 1970, only about 11 percent of the workforce had a college degree; today it’s closer to a third. But because universities keep minting new graduates—in all manner of trivial and useless subjects—a bachelor’s degree no longer has the value it once did. In 2010, for instance, the unemployment rate for recent graduates was well above the national average, at 10.4 percent. And even that doesn’t tell the whole story. Since many of these young adults were working in McJobs, their underemployment rate was 19.8 percent.
To be sure, in the ’70s there was plenty of institutional failure outside the public sector. In the auto industry, for example, the labor movement was so unhinged that, as Fortune reported in 1970, it was not uncommon for assembly line workers to vandalize cars as they were building them: “[S]crews have been left in brake drums, tool handles welded into fender compartments (to cause mysterious, unfindable, and eternal rattles), paint scratched, and upholstery cut.” And as previously discussed, there has been plenty of failure from public institutions in the ’00s.
The point is that in both times, difficult environments were made worse by failures of the elites. And once institutions falter, it can be difficult to rebuild them.
So which period was worse? There’s a strong case to be made for each. Superficially, you could argue the ’70s, for all the obvious reasons: 58,000 Americans dead in Vietnam, Watergate, gas lines, the last helicopter leaving Saigon. But the deeper undercurrents suggest a different answer.
After all, American culture was fraying in the ’70s, but for the most part, society agreed that it was fraying and that this dissolution was problematic. In the 1970s, the country retained habits learned during almost two generations of the strongest growth in American history. A 40-year-old in 1970 had lived through the Depression and the Second World War, and his parents had seen the Great War, too. These people were made of stern stuff. And they could plausibly look at the world around them and see it as a terrible aberration. They could believe that the normal state of affairs was much better and that a return to normalcy was possible. That’s why the country responded to Reagan’s call for “morning in America.”
Our age is different. A 40-year-old in 2000 was a teenager during the maelstrom of the ’70s. He saw the bright spot of the mid-1980s and the respite from history that was the 1990s. But to him, the economic and social patterns of the ’00s look like the norm.
As for the culture, the social order of the 1950s may have been washing out to sea during the 1970s, but today it might as well be Atlantis—a world so lost that people no longer believe it ever really existed. When 1 out of every 10 births is illegitimate, it’s a societal failure. When nearly half are, it’s a new way of life.
Or think about it this way: In 1972, the X-rated movie Deep Throat became a national sensation. It played in mainstream theaters. It was talked about in polite conversation. It was noted by the New York Times, which claimed it had launched a trend of “porno chic.” In the course of all of this, Deep Throat was a flash point for a debate about propriety, pornography, morality, and culture.
In 2011, a TV show called The Secret Life of the American Teenager featured an exchange in which a teenage lesbian throws herself at a high school quarterback and begs him to give her “lots and lots of sex. Day and night. Around the clock. Twenty-four, seven. Oral sex. Sex-sex. Any kind of sex.” On the one hand, this scene was played for laughs, and both characters kept their clothes on. It was tamer than anything in Deep Throat, including the credit sequence.
On the other hand, The Secret Life of the American Teenager was written for young adolescents, and it aired in prime time on the ABC Family Channel, a cable network owned by Disney which caters expressly to, such as they are, “families.”
It’s not clear that a politician could run a “morning in America” campaign today, because people might not believe that “better” is an option. It is no accident that the last three presidential cycles have showcased politicians who, instead of promising to make America great again, pleaded for voters to support them lest the other side make things worse. (The 2008 Obama “Hope and Change” campaign never even pretended to be about bettering American life with jobs or growth or stability; the hope and change being sold was all internal, a vote for Obama being an act of spiritual enlightenment.)
And worse may be in the cards. In addition to everything else, the ’00s have featured an accelerating social stratification. Charles Murray detailed the new phenomenon in last year’s Coming Apart, and it boils down to this: Where lawyers once married their secretaries, they now marry, and stay married to, other lawyers. High school dropouts, meanwhile, have children with, but do not marry, other high school dropouts. With weak family formation and slack attachment to the workforce creeping up the socioeconomic scale, patterns that were once unthinkable start to seem inescapable.
In the spring of 2012, I had coffee with an enormously successful young Internet entrepreneur. He had no special technical genius; he’d majored in social studies at Harvard. But, he informed me, he had been “early at Facebook.” Which was his polite way of saying that as an undergraduate he had befriended a classmate named Mark Zuckerberg and, by chance, gotten richer than Croesus.
Still in his mid-20s, he worried that the American middle class was on a road to extinction. In the near future, he explained, the only people able to make real money would be elites in the tech sector, like himself. There would be, he allowed, a class of tradesmen who could make a living servicing the elites. “For example,” he said, “I like artisan pickles. So there will be a place for people in Brooklyn who make really good artisan pickles, for people like me.” But outside of the artisan-pickle-makers? Nothing. “We’re headed,” he said unhappily, “for the kind of social divide they have in Brazil.”
Whether he is wrong or right is beside the point. What made an impression on me was that this fellow—a young man of copious wealth and ambition, who was one of the few winners from the Great Recession—believed that, as bad as things were in America today, we hadn’t yet touched bottom.
Setting these two decades side by side doesn’t force us to choose one over the other, but rather helps us to appreciate a dismal truth: They’re both worse. Just in different ways.
Jonathan V. Last is a senior writer at The Weekly Standard and the author of What to Expect When No One’s Expecting: America’s Coming Demographic Disaster.