The Price of Power
The benefits of U.S. defense spending far outweigh the costs
Jan 24, 2011, Vol. 16, No. 18 • By ROBERT KAGAN
The looming battle over the defense budget could produce a useful national discussion about American foreign and defense policy. But we would need to begin by dispensing with the most commonly repeated fallacy: that cutting defense is essential to restoring the nation’s fiscal health. People can be forgiven for believing this myth, given how often they hear it. Typical is a recent Foreign Affairs article claiming that the United States faces “a watershed moment” and “must decide whether to increase its already massive debt in order to continue being the world’s sheriff or restrain its military missions and focus on economic recovery.”
[Photo: F-22 Raptors over Southwest Asia]
This is nonsense. No serious budget analyst or economist believes that cutting the defense budget will aid economic recovery in the near term—federal spending on defense is just as much a job-producing stimulus as federal spending on infrastructure. Nor, more importantly, do they believe that cutting defense spending will have more than the most marginal effect on reducing the runaway deficits projected for the coming years. The simple fact is, as my Brookings colleague and former budget czar Alice Rivlin recently observed, the scary projections of future deficits are not “caused by rising defense spending,” and even if one assumes that defense spending continues to increase with the rate of inflation, this is “not what’s driving the future spending.” The engine of our growing debt is entitlements.
So why are the various commissions, including the Rivlin-Domenici commission, as well as members of Congress, calling for defense cuts at all? The answer boils down to one of fairness, and politics. It is not that cutting defense is necessary to save the economy. But if the American people are going to be asked to accept cuts in their domestic entitlements, the assumption runs, they’re going to want to see the pain shared across the board, including by defense.
This “fair share” argument is at least more sober than phony “cut defense or kill the economy” sensationalism, and it has the appearance of reasonableness. But it is still based on a fallacy. Distributing cuts equally is not an intrinsically good thing. If you wanted to reduce the gas consumption of your gas-guzzling car by 10 percent, you wouldn’t remove 10 percent of your front and rear bumpers so that all parts of the car shared the pain. The same goes for the federal budget. Not all cuts have equal effect on the national well-being. Few would propose cutting spending on airport security, for instance. At a time of elevated risk of terrorist attack, we don’t need to show the American people that airport security is contributing its “fair share” to budget reduction.
Today the international situation is also one of high risk.
• The terrorists who would like to kill Americans on U.S. soil constantly search for safe havens from which to plan and carry out their attacks. American military actions in Afghanistan, Pakistan, Iraq, Yemen, and elsewhere make it harder for them to strike and are a large part of the reason why for almost a decade there has been no repetition of September 11. To the degree that we limit our ability to deny them safe haven, we increase the chances they will succeed.
• American forces deployed in East Asia and the Western Pacific have for decades prevented the outbreak of major war, provided stability, and kept open international trading routes, making possible an unprecedented era of growth and prosperity for Asians and Americans alike. Now the United States faces a new challenge and potential threat from a rising China which seeks eventually to push the U.S. military’s area of operations back to Hawaii and exercise hegemony over the world’s most rapidly growing economies. Meanwhile, a nuclear-armed North Korea threatens war with South Korea and fires ballistic missiles over Japan that will someday be capable of reaching the west coast of the United States. Democratic nations in the region, worried that the United States may be losing influence, turn to Washington for reassurance that the U.S. security guarantee remains firm. If the United States cannot provide that assurance because it is cutting back its military capabilities, they will have to choose between accepting Chinese dominance and striking out on their own, possibly by building nuclear weapons.
• In the Middle East, Iran seeks to build its own nuclear arsenal, supports armed radical Islamic groups in Lebanon and Palestine, and has linked up with anti-American dictatorships in the Western Hemisphere. The prospects of new instability in the region grow every day as a decrepit regime in Egypt clings to power, crushes all moderate opposition, and drives the Muslim Brotherhood into the streets. A nuclear-armed Pakistan seems to be ever on the brink of collapse into anarchy and radicalism. Turkey, once an ally, now seems bent on an increasingly anti-American Islamist course. The prospect of war between Hezbollah and Israel grows, and with it the possibility of war between Israel and Syria and possibly Iran. There, too, nations in the region increasingly look to Washington for reassurance, and if they decide the United States cannot be relied upon they will have to decide whether to succumb to Iranian influence or build their own nuclear weapons to resist it.
In the 1990s, after the Soviet Union had collapsed and the biggest problem in the world seemed to be ethnic conflict in the Balkans, it was at least plausible to talk about cutting back on American military capabilities. In the present, increasingly dangerous international environment, in which terrorism and great power rivalry vie as the greatest threat to American security and interests, cutting military capacities is simply reckless. Would we increase the risk of strategic failure in an already risky world, despite the near irrelevance of the defense budget to American fiscal health, just so we could tell American voters that their military had suffered its “fair share” of the pain?
The nature of the risk becomes plain when one considers the nature of the cuts that would have to be made to have even a marginal effect on the U.S. fiscal crisis. Many are under the illusion, for instance, that if the United States simply withdrew from Iraq and Afghanistan and didn’t intervene anywhere else for a while, this would have a significant impact on future deficits. But, in fact, projections of future massive deficits already assume the winding down of these interventions. Withdrawal from the two wars would scarcely make a dent in the fiscal crisis. Nor can meaningful reductions be achieved by cutting back on waste at the Pentagon—which Secretary of Defense Gates has already begun to do and which has also been factored into deficit projections. If the United States withdrew from Iraq and Afghanistan tomorrow, cut all the waste Gates can find, and even eliminated a few weapons programs—all this together would still not produce a 10 percent decrease in overall defense spending.
In fact, the only way to get significant savings from the defense budget—and by “significant,” we are still talking about a tiny fraction of the cuts needed to bring down future deficits—is to cut force structure: fewer troops on the ground; fewer airplanes in the skies; fewer ships in the water; fewer soldiers, pilots, and sailors to feed and clothe and provide benefits for. To cut the size of the force, however, requires reducing or eliminating the missions those forces have been performing. Of course, there are any number of think tank experts who insist U.S. forces can be cut by a quarter or third or even by half and still perform those missions. But this is snake oil. Over the past two decades, the force has already been cut by a third. Yet no administration has reduced the missions that the larger force structures of the past were designed to meet. To fulfill existing security commitments, to remain the “world’s power balancer of choice,” as Leslie Gelb puts it, to act as “the only regional balancer against China in Asia, Russia in eastern Europe, and Iran in the Middle East” requires at least the current force structure, and almost certainly more than current force levels. Those who recommend doing the same with less are only proposing a policy of insufficiency, where the United States makes commitments it cannot meet except at high risk of failure.
The only way to find substantial savings in the defense budget, therefore, is to change American strategy fundamentally. The Simpson-Bowles commission suggests as much, by calling for a reexamination of America’s “21st century role,” although it doesn’t begin to define what that new role might be.
Others have. For decades “realist” analysts have called for a strategy of “offshore balancing.” Instead of the United States providing security in East Asia and the Persian Gulf, it would withdraw its forces from Japan, South Korea, and the Middle East and let the nations in those regions balance one another. If the balance broke down and war erupted, the United States would then intervene militarily until balance was restored. In the Middle East and Persian Gulf, for instance, Christopher Layne has long proposed “passing the mantle of regional stabilizer” to a consortium of “Russia, China, Iran, and India.” In East Asia offshore balancing would mean letting China, Japan, South Korea, Australia, and others manage their own problems, without U.S. involvement—again, until the balance broke down and war erupted, at which point the United States would provide assistance to restore the balance and then, if necessary, intervene with its own forces to restore peace and stability.
Before examining whether this would be a wise strategy, it is important to understand that this really is the only genuine alternative to the one the United States has pursued for the past 65 years. To their credit, Layne and others who support the concept of offshore balancing have eschewed halfway measures and airy assurances that we can do more with less, which are likely recipes for disaster. They recognize that either the United States is actively involved in providing security and stability in regions beyond the Western Hemisphere, which means maintaining a robust presence in those regions, or it is not. Layne and others are frank in calling for an end to the global security strategy developed in the aftermath of World War II, perpetuated through the Cold War, and continued by four successive post-Cold War administrations.
At the same time, it is not surprising that none of those administrations embraced offshore balancing as a strategy. The idea of relying on Russia, China, and Iran to jointly “stabilize” the Middle East and Persian Gulf will not strike many as an attractive proposition. Nor is U.S. withdrawal from East Asia and the Pacific likely to have a stabilizing effect on that region. The prospects of a war on the Korean Peninsula would increase. Japan and other nations in the region would face the choice of succumbing to Chinese hegemony or taking unilateral steps for self-defense, which in Japan’s case would mean the rapid creation of a formidable nuclear arsenal.
Layne and other offshore balancing enthusiasts, like John Mearsheimer, point to two notable occasions when the United States allegedly practiced this strategy. One was the Iran-Iraq war, where the United States supported Iraq for years against Iran in the hope that the two would balance and weaken each other. The other was American policy in the 1920s and 1930s, when the United States allowed the great European powers to balance one another, occasionally providing economic or military aid, as in the Lend-Lease program of assistance to Great Britain once war broke out. Whether this was really American strategy in that era is open for debate—most would argue the United States in this era was trying to stay out of war not as part of a considered strategic judgment but as an end in itself. Even if the United States had been pursuing offshore balancing in the first decades of the 20th century, however, would we really call that strategy a success? The United States wound up intervening with millions of troops, first in Europe, and then in Asia and Europe simultaneously, in the two most dreadful wars in human history.
It was with the memory of those two wars in mind, and in the belief that American strategy in those interwar years had been mistaken, that American statesmen during and after World War II determined on the new global strategy that the United States has pursued ever since. Under Franklin Roosevelt, and then under the leadership of Harry Truman and Dean Acheson, American leaders determined that the safest course was to build “situations of strength” (Acheson’s phrase) in strategic locations around the world, to build a “preponderance of power,” and to create an international system with American power at its center. They left substantial numbers of troops in East Asia and in Europe and built a globe-girdling system of naval and air bases to enable the rapid projection of force to strategically important parts of the world. They did not do this on a lark or out of a yearning for global dominion. They simply rejected the offshore balancing strategy, and they did so because they believed it had led to great, destructive wars in the past and would likely do so again. They believed their new global strategy was more likely to deter major war and therefore be less destructive and less expensive in the long run. Subsequent administrations, from both parties and with often differing perspectives on the proper course in many areas of foreign policy, have all agreed on this core strategic approach.
From the beginning this strategy was assailed as too ambitious and too expensive. At the dawn of the Cold War, Walter Lippmann railed against Truman’s containment strategy as suffering from an unsustainable gap between ends and means that would bankrupt the United States and exhaust its power. Decades later, in the waning years of the Cold War, Paul Kennedy warned of “imperial overstretch,” arguing that American decline was inevitable “if the trends in national indebtedness, low productivity increases, [etc.]” were allowed to continue at the same time as “massive American commitments of men, money and materials are made in different parts of the globe.” Today, we are once again being told that this global strategy needs to give way to a more restrained and modest approach, even though the indebtedness crisis that we face in coming years is not caused by the present, largely successful global strategy.
Of course it is precisely the success of that strategy that is taken for granted. The enormous benefits that this strategy has provided, including the financial benefits, somehow never appear on the ledger. They should. We might begin by asking about the global security order that the United States has sustained since World War II—the prevention of major war, the support of an open trading system, and promotion of the liberal principles of free markets and free government. How much is that order worth? What would be the cost of its collapse or transformation into another type of order?
Whatever the nature of the current economic difficulties, the past six decades have seen a greater increase in global prosperity than any time in human history. Hundreds of millions have been lifted out of poverty. Once-backward nations have become economic dynamos. And the American economy, though suffering ups and downs throughout this period, has on the whole benefited immensely from this international order. One price of this success has been maintaining a sufficient military capacity to provide the essential security underpinnings of this order. But has the price not been worth it? In the first half of the 20th century, the United States found itself engaged in two world wars. In the second half, this global American strategy helped produce a peaceful end to the great-power struggle of the Cold War and then 20 more years of great-power peace. Looked at coldly, simply in terms of dollars and cents, the benefits of that strategy far outweigh the costs.
The danger, as always, is that we don’t even realize the benefits our strategic choices have provided. Many assume that the world has simply become more peaceful, that great-power conflict has become impossible, that nations have learned that military force has little utility, that economic power is what counts. This belief in progress and the perfectibility of humankind and the institutions of international order is always alluring to Americans and Europeans and other children of the Enlightenment. It was the prevalent belief in the decade before World War I, in the first years after World War II, and in those heady days after the Cold War when people spoke of the “end of history.” It is always tempting to believe that the international order the United States built and sustained with its power can exist in the absence of that power, or at least with much less of it. This is the hidden assumption of those who call for a change in American strategy: that the United States can stop playing its role and yet all the benefits that came from that role will keep pouring in. This is a great if recurring illusion, the idea that you can pull a leg out from under a table and the table will not fall over.
Much of the present debate, it should be acknowledged, is not about the defense budget or the fiscal crisis at all. It is only the latest round in a long-running debate over the nature and purposes of American foreign policy. At the tactical level, some use the fiscal crisis as a justification for a different approach to, say, Afghanistan. Richard Haass, for instance, who has long favored a change of strategy from “counterinsurgency” to “counterterrorism,” now uses the budget crisis to bolster his case—although he leaves unclear how much money would be saved by such a shift in strategy.
At the broader level of grand strategy, the current debate, though revived by the budget crisis, can be traced back a century or more, but its most recent expression came with the end of the Cold War. In the early 1990s, some critics, often calling themselves “realists,” expressed their unhappiness with a foreign policy—first under George H.W. Bush and then under Bill Clinton—that cast the United States as leader of a “new world order,” the “indispensable nation.” As early as 1992, Robert W. Tucker and David C. Hendrickson assailed President Bush for launching the first Persian Gulf war in response to Saddam Hussein’s invasion and occupation of Kuwait. They charged him with pursuing “a new world role . . . required neither by security need nor by traditional conceptions of the nation’s purpose,” a role that gave “military force” an “excessive and disproportionate . . . position in our statecraft.”
Tucker and Hendrickson were frank enough to acknowledge that, pace Paul Kennedy, the “peril” was not actually “to the nation’s purse” or even to “our interests” but to the nation’s “soul.” This has always been the core critique of expansive American foreign policy doctrines, from the time of the Founders to the present—not that a policy of extensive global involvement is necessarily impractical but that it is immoral and contrary to the nation’s true ideals.
Today this alleged profligacy in the use of force is variously attributed to the influence of “neoconservatives” or to those Mearsheimer calls the “liberal imperialists” of the Clinton administration, who have presumably now taken hold of the Obama administration as well. But the critics share a common premise: that if only the United States would return to a more “normal” approach to the world, intervening abroad far less frequently and eschewing efforts at “nation-building,” then this would allow the United States to cut back on the resources it expends on foreign policy.
Thanks to Haass’s clever formulation, there has been a great deal of talk lately about “wars of choice” as opposed to “wars of necessity.” Haass labels both the war in Iraq and the war in Afghanistan “wars of choice.” Today, many ask whether the United States can simply avoid such allegedly optional interventions in the future, as well as the occupations and exercises in “nation-building” that often seem to follow.
Although the idea of eliminating “wars of choice” appears sensible, the historical record suggests it will not be as simple as many think. The problem is, almost every war or intervention the United States has engaged in throughout its history has been optional—and not just the Bosnias, Haitis, Somalias, or Vietnams, but the Korean War, the Spanish-American War, World War I, and even World War II (at least the war in Europe), not to mention the many armed interventions throughout Latin America and the Caribbean over the course of the past century, from Cuba in 1898 to Panama in 1989. A case can be made, and has been made by serious historians, that every one of these wars and interventions was avoidable and unnecessary. To note that our most recent wars have also been wars of choice, therefore, is not as useful as it seems.
In theory, the United States could refrain from intervening abroad. But, in practice, will it? Many assume today that the American public has had it with interventions, and Alice Rivlin certainly reflects a strong current of opinion when she says that “much of the public does not believe that we need to go in and take over other people’s countries.” That sentiment has often been heard after interventions, especially those with mixed or dubious results. It was heard after the four-year-long war in the Philippines, which cost 4,000 American lives and untold Filipino casualties. It was heard after Korea and after Vietnam. It was heard after Somalia. Yet the reality has been that after each intervention, the sentiment against foreign involvement has faded, and the United States has intervened again.
Depending on how one chooses to count, the United States has undertaken roughly 25 overseas interventions since 1898.
That is one intervention every 4.5 years on average. Overall, the United States has intervened or been engaged in combat somewhere in 52 out of the last 112 years, or roughly 47 percent of the time. Since the end of the Cold War, it is true, the rate of U.S. interventions has increased, with an intervention roughly once every 2.5 years and American troops intervening or engaged in combat in 16 out of 22 years, or over 70 percent of the time, since the fall of the Berlin Wall.
The argument for returning to “normal” begs the question: What is normal for the United States? The historical record of the last century suggests that it is not a policy of nonintervention. This record ought to raise doubts about the theory that American behavior these past two decades is the product of certain unique ideological or doctrinal movements, whether “liberal imperialism” or “neoconservatism.” Allegedly “realist” presidents in this era have been just as likely to order interventions as their more idealistic colleagues. George H.W. Bush was as profligate an intervener as Bill Clinton. He invaded Panama in 1989 and intervened in Somalia in 1992—both on primarily idealistic and humanitarian grounds—which along with the first Persian Gulf war in 1991 made for three interventions in a single four-year term. Since 1898 the list of presidents who ordered armed interventions abroad has included William McKinley, Theodore Roosevelt, William Howard Taft, Woodrow Wilson, Franklin Roosevelt, Harry Truman, Dwight Eisenhower, John F. Kennedy, Ronald Reagan, George H.W. Bush, Bill Clinton, and George W. Bush. One would be hard-pressed to find a common ideological or doctrinal thread among them—unless it is the doctrine and ideology of a mainstream American foreign policy that leans more toward intervention than many imagine or would care to admit.
Many don’t want to admit it, and the only thing as consistent as this pattern of American behavior has been the claim by contemporary critics that it is abnormal and a departure from American traditions. The anti-imperialists of the late 1890s, the isolationists of the 1920s and 1930s, the critics of Korea and Vietnam, and the critics of the first Persian Gulf war, the interventions in the Balkans, and the more recent wars of the Bush years have all insisted that the nation had in those instances behaved unusually or irrationally. And yet the behavior has continued.
To note this consistency is not the same as justifying it. The United States may have been wrong for much of the past 112 years. Some critics would endorse the sentiment expressed by the historian Howard K. Beale in the 1950s, that “the men of 1900” had steered the United States onto a disastrous course of world power which for the subsequent half-century had done the United States and the world no end of harm. But whether one lauds or condemns this past century of American foreign policy—and one can find reasons to do both—the fact of this consistency remains. It would require not just a modest reshaping of American foreign policy priorities but a sharp departure from this tradition to bring about the kinds of changes that would allow the United States to make do with a substantially smaller force structure.
Is such a sharp departure in the offing? It is no doubt true that many Americans are unhappy with the ongoing warfare in Afghanistan and to a lesser extent in Iraq, and that, if asked, a majority would say the United States should intervene less frequently in foreign nations, or perhaps not at all. It may also be true that the effect of long military involvements in Iraq and Afghanistan may cause Americans and their leaders to shun further interventions at least for a few years—as they did for nine years after World War I, five years after World War II, and a decade after Vietnam. This may be further reinforced by the difficult economic times through which Americans are currently suffering. The longest period of nonintervention in the past century was during the 1930s, when unhappy memories of World War I combined with the economic catastrophe of the Great Depression to constrain American interventionism to an unusual degree and produce the first and perhaps only genuinely isolationist period in American history.
So are we back to the mentality of the 1930s? It wouldn’t appear so. There is no great wave of isolationism sweeping the country. There is not even the equivalent of a Patrick Buchanan, who received 3 million votes in the 1992 Republican primaries. Any isolationist tendencies that might exist are severely tempered by continuing fears of terrorist attacks that might be launched from overseas. Nor are the vast majority of Americans suffering from economic calamity to nearly the degree that they did in the Great Depression.
Even if we were to repeat the policies of the 1930s, however, it is worth recalling that the unusual restraint of those years was not sufficient to keep the United States out of war. On the contrary, the United States took actions which ultimately led to the greatest and most costly foreign intervention in its history. Even the most determined and in those years powerful isolationists could not prevent it.
Today there are a number of obvious possible contingencies that might lead the United States to substantial interventions overseas, notwithstanding the preference of the public and its political leaders to avoid them. Few Americans want a war with Iran, for instance. But it is not implausible that a president—indeed, this president—might find himself in a situation where military conflict at some level is hard to avoid. The continued success of the international sanctions regime that the Obama administration has so skillfully put into place, for instance, might eventually cause the Iranian government to lash out in some way—perhaps by attempting to close the Strait of Hormuz. Recall that Japan launched its attack on Pearl Harbor in no small part as a response to oil sanctions imposed by a Roosevelt administration that had not the slightest interest or intention of fighting a war against Japan but was merely expressing moral outrage at Japanese behavior on the Chinese mainland. Perhaps in an Iranian contingency, the military actions would stay limited. But perhaps, too, they would escalate. One could well imagine an American public, now so eager to avoid intervention, suddenly demanding that their president retaliate. Then there is the possibility that a military exchange between Israel and Iran, initiated by Israel, could drag the United States into conflict with Iran. Are such scenarios so farfetched that they can be ruled out by Pentagon planners?
Other possible contingencies include a war on the Korean Peninsula, where the United States is bound by treaty to come to the aid of its South Korean ally; and possible interventions in Yemen or Somalia, should those states fail even more than they already have and become even more fertile ground for al Qaeda and other terrorist groups. And what about those “humanitarian” interventions that are first on everyone’s list to be avoided? Should another earthquake or some other natural or man-made catastrophe strike, say, Haiti and present the looming prospect of mass starvation and disease and political anarchy just a few hundred miles off U.S. shores, with the possibility of thousands if not hundreds of thousands of refugees, can anyone be confident that an American president will not feel compelled to send an intervention force to help?
Some may hope that a smaller U.S. military, compelled by the necessity of budget constraints, would prevent a president from intervening. More likely, however, it would simply prevent a president from intervening effectively. This, after all, was the experience of the Bush administration in Iraq and Afghanistan. Both because of constraints and as a conscious strategic choice, the Bush administration sent too few troops to both countries. The results were lengthy, unsuccessful conflicts, burgeoning counterinsurgencies, and loss of confidence in American will and capacity, as well as large annual expenditures. Would it not have been better, and also cheaper, to have sent larger numbers of forces initially to both places and brought about a more rapid conclusion to the fighting? The point is, it may prove cheaper in the long run to have larger forces that can fight wars quickly and conclusively, as Colin Powell long ago suggested, than to have smaller forces that can’t. Would a defense planner trying to anticipate future American actions be wise to base planned force structure on the assumption that the United States is out of the intervention business? Or would that be the kind of penny-wise, pound-foolish calculation that, in matters of national security, can prove so unfortunate?
The debates over whether and how the United States should respond to the world’s strategic challenges will and should continue. Armed interventions overseas should be weighed carefully, as always, with an eye to whether the risk of inaction is greater than the risks of action. And as always, these judgments will be merely that: judgments, made with inadequate information and intelligence and no certainty about the outcomes. No foreign policy doctrine can avoid errors of omission and commission. But history has provided some lessons, and for the United States the lesson has been fairly clear: The world is better off, and the United States is better off, in the kind of international system that American power has built and defended.
As Haass and Roger C. Altman have correctly noted, “it is not reckless American activity in the world that jeopardizes American solvency but American profligacy at home.” The United States may be in peril because of its spiraling deficits and mounting debt, but it will be in even greater peril if, out of some misguided sense that our national security budgets must “share the pain,” we weaken ourselves even further.
Robert Kagan is a contributing editor to The Weekly Standard and a senior fellow in foreign policy at the Brookings Institution.