Happy times are all alike, nestled in the comfortable batting of peace, growth, and stability. Every unhappy time is unhappy in its own way.
America has been blessed because, since the end of the Great Depression, our nation has experienced only two periods of deep discontent that lasted a decade or more. The first was the 1970s. We are living through the second today. Which was worse?
The popular mind often misremembers the past. For instance, these days the 1950s are held out as a time deserving special scorn. Stories set in the Eisenhower era are often shot through with contempt for the racism, sexism, hypocrisy, and dissatisfaction of American life. But this is revisionism; by many measures—wages, unemployment, home sales, marital stability, births, savings rates, upward mobility—the ’50s were an idyll.
What’s more, the happy times of the 1950s stretched into the 1960s, so long that “The ’60s” as we remember them—Woodstock, long hair, free love—didn’t really get under way until 1967 and continued well into the 1970s. That’s one of the central insights of David Frum’s wonderful book about the ’70s, How We Got Here. His other insight is that whatever people want to believe about the ’50s and ’60s, the stretch from 1967 to 1979 was a rarely mitigated disaster.
Many people remember the headlines from the 1970s: the shooting war in Vietnam and the quiet but existential threat of the larger Cold War; a president nearly impeached; oil shocks that forced people to stand in line for gasoline. But the problems in America were both broader and deeper.
The economics of the 1970s, for example, were brutal. In 1969, the unemployment rate was 3.5 percent, the lowest it had been since the mid-1950s. (The postwar average has been about 5 percent.) By 1975 unemployment had more than doubled, to 8.5 percent. And while people were working less, their money was working less, too, as inflation ate into the value of the dollar. During the 1960s, the inflation rate rose above 2 percent only twice. Then, in 1968, it began a steady climb, reaching 11 percent in 1974, 9.1 percent in 1975, and 11.3 percent in 1979. To understand the effect this financial terror had on the national psyche, consider how often inflation fears have recurred during the last 30 years—even though inflation hasn’t topped 6 percent since 1982.
Everyday life wasn’t much better than economic life. Terrorism first came into vogue in the 1970s. Sometimes it was a thuggish hijacking, with criminals commandeering an airplane and demanding passage to Cuba. Sometimes it was deadly, like the massacre of 11 Israeli athletes at the 1972 Munich Olympics. Nobody much remembers it today, but in March 1977 Muslim radicals with machine guns and machetes marched into the B’nai B’rith headquarters in Washington, just five blocks north of the White House, and took 100 workers hostage. They herded the hostages onto the roof, where one was killed and two others were shot over the course of a standoff that lasted two days. Simultaneously, affiliated terrorists took over D.C. city hall, where future mayor Marion Barry was wounded by gunfire and a radio reporter was shot and killed.
The B’nai B’rith incident was soon lost in the wash of small-scale attacks and bombings by Islamic extremists, Black Power radicals, and student leftists that punctuated life in the ’70s—none of which seems to have left a lasting mark. One prelude to the ’70s, however, did have lasting consequences. During the “long, hot summers” of 1964-68, 329 “important” riots took place in 257 U.S. cities, according to Stephan and Abigail Thernstrom’s authoritative America in Black and White, with a toll of some 300 dead, 8,000 injured, and 60,000 arrested. The riots in Harlem, Watts, Detroit, Newark, and, after the assassination of Martin Luther King, Washington, D.C., were only the most famous. These eruptions helped drive the middle class out of urban cores in the ’70s, sending cities into decline and making the new underclass permanent.
Violent crime was almost nonexistent in the 1950s, but by 1973 it was rampant, and the Department of Justice had to create a new accounting system to keep track of it all. As Frum reports, in 1973 the FBI found that “37 million Americans—meaning one household out of every four—had suffered a rape, robbery, assault, burglary, larceny, or auto theft.” In cities, the victimization rate was 1 in 3. In 1960, there were 9,000 murders in America. By 1975, the number was in excess of 20,000. (In 2010, with America’s population 50 percent larger, there were 14,000 murders.) Perhaps the most evocative statistic concerns schoolchildren. In 1979, 1 out of every 20 public school teachers reported being physically assaulted by a student during the previous year.
Mind you, the kids had a lot to be angry about. During the 1970s their families were falling apart. Cohabitation, which only a few years before had been looked down on as “living in sin,” began migrating upward from the lower socioeconomic rungs during the 1960s. In the ’70s it became so commonplace that by the end of the decade nearly half of all couples who got married had lived together first. Of course, lots of couples never bothered to marry at all—during the ’70s the marriage rate for both men and women dropped by roughly 10 percent. And marriage was becoming an increasingly frail institution. In 1960 there were about 400,000 divorces annually. By 1979, the number was just shy of 1.2 million.
(All of this leaves aside abortion. In 1974, the year after Roe v. Wade made it every woman’s right, there were 900,000 abortions in America; five years later the number was 1.5 million, a two-thirds increase.)
The prevailing sense one gets is of a civilization unspooling. Even the environment seemed on the brink of calamity, with smog descending on Los Angeles and Cleveland’s Cuyahoga River catching fire, not to mention the toxic waste scandal at Love Canal or the scare at the Three Mile Island nuclear power plant. This witch’s brew conjured the return of neo-Malthusian thinking about the dangers of “overpopulation,” which came to dominate both public discourse and public policy. (More on this in a moment.)
If people weren’t worrying about overpopulation, it was something else; a constant cloud of eschatological alarm loomed over the decade. A new Ice Age was coming to end our way of life—that is, if the comet Kohoutek or the killer bees that were en route from Mexico didn’t wipe us out first. On the New York Times op-ed page, editorial board member William Shannon wrote about “a new spirit of nihilism” and observed—with only a slight flourish—that “there are fleeting moments when the public scene recalls the Weimar Republic of 1932-33.”
Yet the most disconcerting aspect of the ’70s was the degree to which elite thinking suggested that the world had temporarily lost its mind. In 1972, for instance, the Supreme Court heard a case in which the Sierra Club sought to prevent the Forest Service from allowing development of a valley near Sequoia National Park. The case hinged on the technical question of whether the plaintiffs had standing to sue. Having liberalized standing in several earlier cases, the High Court finally drew a line and denied it to the Sierra Club.
In a dissenting opinion, William O. Douglas offered not only that the Sierra Club should have standing, but that legal standing ought to be granted to inanimate objects, too:
The river as plaintiff speaks for the ecological unit of life that is part of it. The people who have a meaningful relation to that body of water . . . must be able to speak for the values which the river represents and which are threatened with destruction.
This was legal reasoning by a justice of the Supreme Court in the 1970s.
For another example, consider the sudden stardom of Paul Ehrlich. In 1968, Ehrlich, a Stanford University entomologist specializing in butterflies, published The Population Bomb, a book claiming that overpopulation would cause worldwide cataclysm, the deaths of “hundreds of millions,” and the end of human civilization—all within a few years.
Ehrlich pitched all sorts of public policy ideas. He worked with legislators in the California state assembly to engineer a bill outlawing the internal combustion engine. (This bill actually passed the assembly before the senate killed it.) He proposed eliminating Monday holidays, so as to discourage people from traveling on long weekends and conserve precious resources. He said that the “freedom to breed is intolerable” and advocated coercive government measures to prevent people from having children. Some of these measures were soft—he proposed a luxury tax on diapers, bottles, and other baby paraphernalia. Others were firmer—he suggested the government should quietly add antifertility drugs to the water supply to prevent couples from conceiving.
Paul Ehrlich wasn’t some rogue crackpot; he was a notable personage. His book sold millions of copies, and he appeared on The Tonight Show with Johnny Carson at least 20 times. He was routinely courted by politicians and had the ear of three presidents, Johnson, Nixon, and Carter—all of whom crafted policies around his ideas.
In the 1970s, the public square became indistinguishable from an asylum.
Against that litany of horribles, you might think we have it pretty good these days. Whatever else there is to be said about the ’00s, at least some degree of sanity returned to public life. Granted, we have our oddballs, too. But by the time Al Gore revealed himself definitively to be a crank, he was out of office and relegated to film festivals and a minor cable TV channel. Ron Paul never found a following beyond the fringe. Belief in conspiracy theories—about Israel or vaccines or black helicopters—is now automatically disqualifying. There are standards for public life these days, and anyone who suggested outlawing the internal combustion engine or tacking a luxury tax on diapers would be ignored by the White House. Three years before being elected president of the United States, Jimmy Carter filed an official UFO report. Imagine how that might go over with voters today.
In other ways, however, we are less well off. If the ’70s started in 1967, the current age began on September 11, 2001. It has been an era marked, first and foremost, by large-scale terrorism. The Cold War was terrifying, and we should not discount the fears of the 1970s just because we know how the story ended. But the abstract threat of a global nuclear holocaust is different from the possibility that a jetliner will be flown into your office building, or that the passenger sitting next to you on your flight will try to set off a bomb in his shoe, or that a coworker will yell “Allahu Akbar” before opening fire in your workplace.
The Cold War and the (unfortunately misnamed) war on terror are not entirely dissimilar. Both involve a long, subterranean, ideological struggle pockmarked by shooting wars. What’s more, the two ideological struggles center on the same question: Does liberalism have the cultural confidence to insist on its own survival? The weak horse/strong horse paradigm is broadly applicable to both conflicts.
Economic life during the ’00s hasn’t been much fun, either. The attacks of 9/11 were followed by one long downturn. Strictly speaking, it has not been a single recession, but the periods of relief we have seen were undergirded not by real growth but by bubbles—the tech bubble, which burst in 2001-02, and the housing bubble, which burst in 2007-08. Inflation was kept under control in the aughts, but unemployment has been more intractable. The term “jobless recovery” has become a black joke: for four straight years, whenever the unemployment rate dipped, the good news came not from the creation of jobs but from people dropping out of the workforce. Twenty-four months into the official recovery, most analysts put the “real unemployment” figure north of 11 percent.
Lurking behind these near-term economic problems is a long-term crisis that would have startled people in the ’70s: The American government is going broke. Even during the stagflation of the 1970s, the government remained reasonably solvent. After piling up enormous debts during the Depression and Second World War, America gradually put its house back in order during the ’50s and ’60s, and the deficit and debt were modest during the ’70s.
But in the ’00s they ballooned under both Bush and Obama, and the ratio of debt to gross domestic product exploded. Even worse, the commitments of Social Security and Medicare will relentlessly exacerbate the problem. As the blue-ribbon committees like to say, something must be done. And eventually it will be; the country won’t file for Chapter 11. Instead, it will, at some point, be forced to impose a great deal of pain on the citizenry. Or at least one cohort of it. As bad as the economics of the 1970s were, we almost certainly have it worse today.
You could probably say the same about our politics. Both eras featured a string of presidencies that can be reasonably categorized as failures. Nixon’s near-impeachment and resignation were blows to the body politic. But in a way, at least that debacle unfolded in a manner consistent with traditional politics. The wild political swings during the ’00s—triumph for Republicans in 2002 and 2004, landslides for the Democrats in 2006 and 2008, a counter-realignment for the Republicans in 2010—suggest an electorate lurching from one party to the other in search of competent political management.
Nothing illustrates the failure of the political order better than the passage of Obamacare. Objectively speaking, it’s remarkable how little support there was for the Affordable Care Act when it was passed. No one—left, right, or center—much liked the bill. (This goes for voters as well as lawmakers.) Republicans hated it. Democrats, for the most part, were wary of it. It was passed as an act of political vanity because President Obama wanted his name on an omnibus health care reform plan, irrespective of what the plan said.
Our political system is designed to prevent this sort of legislation because it’s based on a kind of Maslow’s hierarchy of needs for our elected officials. At the top of the pyramid are high-minded hopes about wisdom and judgment. And very occasionally lawmakers do buck public opinion because they believe in the rightness of a policy or action. But at the base of the pyramid, the people trust their officials to be self-interested—that is, to respect their constituents’ will, or at least not deliberately to do anything so abhorrent that they know it will cost them their jobs.
The passage of Obamacare broke this compact. The marginal Democrats who passed the law (and were subsequently defeated in 2010) violated every tier of the Maslow hierarchy: They didn’t believe in Obamacare, they didn’t like Obamacare, and they knew that voting for it would cost them their jobs because their constituents hated it, too. But they voted for it anyway. Pure partisanship overrode every other concern. It is not crazy to suspect that in the long run Obamacare may prove more damaging to political life than Nixon’s resignation.
Which leaves us, finally, with the culture. On the one hand, by many of the big statistical markers—violent crime, drug use, teen pregnancy, abortion—the aughts are clearly superior to the seventies. By others they are unarguably worse.
Family structure, for instance, has fractured to an extent that would have shocked even the drippy hippies at Woodstock. The divorce epidemic continued to escalate after the ’70s before plateauing, so that today “only” about 45 percent of marriages fail. Even family formation has become optional. In 1970, 8 percent of women finished their childbearing years without having children and 11 percent finished with only one child. Today 18.8 percent of American women have no children, and 18.5 percent have only one—a sea change that has spawned a movement—recently celebrated in a Time magazine cover story—called the “childfree life.”
But the most pernicious change is the abandonment of children by their fathers. In 1970, fewer than 10 percent of U.S. births were to single mothers. For 2011, the figure is 41 percent. The median age at first birth is actually lower than the median age at first marriage, and 48 percent—nearly half—of first births are to unwed mothers.
For all of the acid-dropping and love-ins, this destruction of family norms would have been unfathomable in the 1970s; it would have stunned even the radicals of the Weather Underground who vowed to “smash monogamy.” As would the creation of a constitutionally protected right to gay marriage. Or the encroachments gay marriage has made on religious freedom, with bakers and florists forced by the state to take part in gay marriage ceremonies.
By the same token, as dangerous as life was in the ’70s, the modern security state would also have been unthinkable. People walking into a football game or a theme park these days are treated as if they were visiting a prison. In many office buildings, visitors are now photographed and issued picture IDs. Imagine someone time-traveling from 1970 to a modern airport, with its lines, magnetometers, X-ray machines, millimeter wave scanners, and full-body pat-downs. It would strike him as a scene from a dystopian science-fiction novel. Yet we’ve constructed and accepted this vast apparatus over the last 12 years, and we’ve done so with little public debate.
It’s possible to see the calamities of both the ’70s and the ’00s as consequences of institutional failure. Yet in the two eras, different types of institutions faltered.
The miseries of the ’70s were caused, in no small part, by failures of public institutions. Mismanagement by generals and politicians made Vietnam costlier. Poor judgment by Richard Nixon led to his near-impeachment. Unwise policies—the sudden termination of Bretton Woods and the refusal of Federal Reserve chairman G. William Miller to raise interest rates—made the economic environment more painful. The law enforcement apparatus was simply unable to maintain public order.
The institutions that failed during the ’00s tended to be private. Both the tech and housing bubbles were pumped up by ratings agencies that should have known better—and probably did. Banks abandoned banking in an effort to act like hedge funds, leading to the financial crisis of 2008. And the rise of flash-trading, exotic derivatives, tranches, and the fixed-income market made Wall Street function less like an engine for capital allocation and more like an arbitrage machine. Or possibly a casino.
Even the university system seems on the brink of failure as the rationale for a college education—that it is a glide path to a middle-class life—has steadily eroded. Until 1970, only about 11 percent of the workforce had a college degree; today the figure is closer to a third. But because universities keep minting new graduates—in all manner of trivial and useless subjects—a bachelor’s degree no longer has the value it once did. In 2010, for instance, the unemployment rate for recent graduates was 10.4 percent, well over the national average. And even that doesn’t tell the whole story: count the many young graduates stuck in McJobs, and their underemployment rate was 19.8 percent.
To be sure, in the ’70s there was plenty of institutional failure outside the public sector. In the auto industry, for example, the labor movement was so unhinged that, as Fortune reported in 1970, it was not uncommon for assembly line workers to vandalize cars as they were building them: “[S]crews have been left in brake drums, tool handles welded into fender compartments (to cause mysterious, unfindable, and eternal rattles), paint scratched, and upholstery cut.” And as previously discussed, there has been plenty of failure from public institutions in the ’00s.
The point is that in both times, difficult environments were made worse by failures of the elites. And once institutions falter, it can be difficult to rebuild them.
So which period was worse? There’s a strong case to be made for each. Superficially, you could argue the ’70s, for all the obvious reasons: 58,000 Americans dead in Vietnam, Watergate, gas lines, the last helicopter leaving Saigon. But the deeper undercurrents suggest a different answer.
After all, American culture was fraying in the ’70s, but for the most part, society agreed that it was fraying and that this dissolution was problematic. In the 1970s, the country retained habits learned during almost two generations of the strongest growth in American history. A 40-year-old in 1970 had lived through the Depression and the Second World War, and his parents had seen the Great War, too. These people were made of stern stuff. And they could plausibly look at the world around them and see it as a terrible aberration. They could believe that the normal state of affairs was much better and that a return to normalcy was possible. That’s why the country responded to Reagan’s call for “morning in America.”
Our age is different. A 40-year-old in 2000 was a teenager during the maelstrom of the ’70s. He saw the bright spot of the mid-1980s and the respite from history that was the 1990s. But to him, the economic and social patterns of the ’00s have come to look like the norm.
As for the culture, the social order of the 1950s may have been washing out to sea during the 1970s, but today it might as well be Atlantis—a world so lost that people no longer believe it ever really existed. When 1 out of every 10 births is illegitimate, it’s a societal failure. When nearly half are, it’s a new way of life.
Or think about it this way: In 1972, the X-rated movie Deep Throat became a national sensation. It played in mainstream theaters. It was talked about in polite conversation. It was noted by the New York Times, which claimed it had launched a trend of “porno chic.” In the course of all of this, Deep Throat was a flash point for a debate about propriety, pornography, morality, and culture.
In 2011, a TV show called The Secret Life of the American Teenager featured an exchange in which a teenage lesbian throws herself at a high school quarterback and begs him to give her “lots and lots of sex. Day and night. Around the clock. Twenty-four, seven. Oral sex. Sex-sex. Any kind of sex.” On the one hand, this scene was played for laughs, and both characters kept their clothes on. It was tamer than anything in Deep Throat, including the credit sequence.
On the other hand, The Secret Life of the American Teenager was written for young adolescents, and it aired in prime time on the ABC Family Channel, a cable network owned by Disney that caters expressly to “families,” such as they are.
It’s not clear that a politician could run a “morning in America” campaign today, because people might not believe that “better” is an option. It is no accident that the last three presidential cycles have showcased politicians who, instead of promising to make America great again, pleaded for voters to support them lest the other side make things worse. (The 2008 Obama “Hope and Change” campaign never even pretended to be about bettering American life with jobs or growth or stability; the hope and change being sold was all internal, a vote for Obama being an act of spiritual enlightenment.)
And worse may be in the cards. In addition to everything else, the ’00s have featured an accelerating social stratification. Charles Murray detailed the new phenomenon in last year’s Coming Apart, and it boils down to this: Where lawyers once married their secretaries, they now marry, and stay married to, other lawyers. High school dropouts, meanwhile, have children with, but do not marry, other high school dropouts. With weak family formation and slack attachment to the workforce creeping up the socioeconomic scale, patterns that were once unthinkable start to seem inescapable.
In the spring of 2012, I had coffee with an enormously successful young Internet entrepreneur. He had no special technical genius; he’d majored in social studies at Harvard. But, he informed me, he had been “early at Facebook.” Which was his polite way of saying that as an undergraduate he had befriended a classmate named Mark Zuckerberg and, by chance, gotten richer than Croesus.
Still in his mid-20s, he worried that the American middle class was on a road to extinction. In the near future, he explained, the only people able to make real money would be elites in the tech sector, like himself. There would be, he allowed, a class of tradesmen who could make a living servicing the elites. “For example,” he said, “I like artisan pickles. So there will be a place for people in Brooklyn who make really good artisan pickles, for people like me.” But outside of the artisan-pickle-makers? Nothing. “We’re headed,” he said unhappily, “for the kind of social divide they have in Brazil.”
Whether he is wrong or right is beside the point. What made an impression on me was that this fellow—a young man of copious wealth and ambition, who was one of the few winners from the Great Recession—believed that, as bad as things were in America today, we hadn’t yet touched bottom.
Setting these two decades side by side doesn’t force us to choose one over the other, but rather helps us to appreciate a dismal truth: They’re both worse. Just in different ways.
Jonathan V. Last is a senior writer at The Weekly Standard and the author of What to Expect When No One’s Expecting: America’s Coming Demographic Disaster.