Yesterday’s confirmation hearing for Lt. Gen. James Clapper, who is poised to take over as the new Director of National Intelligence, highlighted a fundamental challenge facing America’s intelligence community (IC): How much, and in what ways, should the sprawling intelligence bureaucracy be streamlined? Or, conversely, should it remain decentralized, with built-in redundancies?
This question affects at least three dimensions of intelligence work: data storage and sharing, analysis, and intelligence collection.
In his written testimony prepared for the Senate Select Committee on Intelligence, Clapper had some interesting things to say about the first dimension. He wrote (PDF):
[The National Counterterrorism Center] has gained greater access to data since 12/25 and has accelerated efforts to integrate terrorism data, making solid progress in consolidating information and applying tools to streamline searches and correlate data. However, an integrated repository of terrorism data, capable of ingesting terrorism-related information from outside sources, remains necessary to establish a foundation from which a variety of sophisticated technology tools can be applied. These capabilities can help automate the display of links and alerts, as well as provide a mechanism for visualizing complex relationships.
Clapper’s mention of “12/25” is, of course, a reference to the attempted bombing of Northwest Airlines Flight 253 on December 25, 2009. We’ve learned since then that the IC knew who Umar Farouk Abdulmutallab was beforehand and should have been able to connect the dots on him with the information that was available, but didn’t.
One of the chief reasons for the IC’s failure was its inability to share critical information about the al Qaeda recruit. Had it done so, it is possible (if not likely) that Abdulmutallab would have been placed on a watch list and prevented from boarding Flight 253 in the first place. As it stands, America relied on luck (Abdulmutallab failed to detonate his bomb) and the vigilance of the flight’s passengers.
The IC, with its multi-billion dollar budget, should be able to do better. Nearly nine years after the September 11 attacks, the IC still doesn’t have an “integrated repository of terrorism data” that can be easily shared across multiple layers of its vast bureaucracy.
Clapper fired back at the IC’s critics with respect to the second dimension: analysis. Perfectly timed for Clapper’s confirmation hearing, the Washington Post released its buzz-generating “Top Secret America” series this week. One of the Post’s chief criticisms is that redundancies in the IC are inefficient and problematic. That is certainly reasonable with respect to databases and data access, which Clapper concedes need to be streamlined.
But Clapper does not think the Post got it right with respect to intelligence analysis. “One man’s duplication is another man’s competitive analysis,” Clapper said.
Clapper’s argument is that the IC needs multiple analyses of the most important topics (Iran’s nuclear program, for instance) in order to generate a competitive dynamic that leads to better overall quality. Recent experience certainly highlights the problems with streamlined, consensus analytic products. The 2002 National Intelligence Estimate (NIE) on Saddam’s weapons of mass destruction programs was far off the mark, as was the 2007 NIE on Iran’s nuclear program. To be sure, the independent analyses performed by some intelligence bureaucracies on those two topics were wrong as well. But without competing analyses, the IC is more likely to get it entirely wrong more often than not.
Therefore, the analysis dimension cuts both ways. Clapper is right that competition among analysts is needed. But the executive branch also needs streamlined analysis that can be easily digested. Reading through Clapper’s prepared testimony illustrates just how daunting a task that can be. Others reportedly turned down the DNI post before Clapper was picked because of the sheer number of bureaucratic headaches the job entails. At some point, a bureaucracy becomes too big to provide accurate, timely, and actionable intelligence products. Chances are the IC crossed that threshold long ago.
The third dimension does not get the attention it deserves, and was not highlighted yesterday as it should have been. Much of the discussion about the IC centers on bureaucratic questions, as opposed to what the IC is really supposed to be about: acquiring sensitive intelligence on America’s enemies.
The technical collection aspect is complicated in practice but conceptually straightforward. In most cases (though perhaps not all), we probably don’t need two bureaucracies running technical collection (e.g., satellites) against the same targets. We don’t even know whether this is a problem, however. The main technical issues involve collating the enormous amount of data collected and then making sure it is analyzed in a timely manner. In other words, it is a problem properly addressed in the first two dimensions discussed above.
Human intelligence (HUMINT) collection is a different story. If we’ve learned anything over the last decade it is that the Central Intelligence Agency fails, time and again, to run spy networks behind enemy lines. Multiple investigations have revealed that the CIA has sparse HUMINT inside WMD hotspots such as Iran and North Korea.
The U.S. military has had to contend with the CIA’s lack of HUMINT inside war zones. In both Iraq and Afghanistan, the CIA failed to acquire the types of basic HUMINT the military needs. Here, redundancies and contractors are vitally important for filling the gaps. On the other hand, this probably means the CIA’s bureaucracy could use some trimming, since it is not fulfilling its core mission.
The IC’s bureaucracy is bloated and undoubtedly inefficient in important ways. But redundancies are not necessarily a bad thing.
Thomas Joscelyn is a senior fellow at the Foundation for Defense of Democracies.