On Friday, I attended a talk by Professor Robert Jervis of Columbia University, sponsored by the International Studies Colloquium of the Political Science Department at the University of Washington. Jervis, recently ranked as one of the top 10 most influential international relations scholars by Foreign Policy, gave a talk entitled "Reports, Politics, and Intelligence Failures on WMD: The Case of Iraq," in which he analyzed and assessed what caused the US and international intelligence communities to miss the absence of WMD in Iraq, and what can be done to improve intelligence efforts in the future. Professor Jervis has worked with the CIA on intelligence reform and performed an external review of the Iraq WMD failure; much of that review is classified, but it formed the basis of the talk.
The general problems experienced by the CIA in Iraq, while perhaps more serious due to their nature, are not really different, according to Jervis, from the problems experienced by many other large organizations. Organizations have pathologies all their own that make it difficult for them to learn from their mistakes and to adapt. The specific problems in Iraq were compounded by the difficulties in penetrating the country and in understanding the erratic and confusing behavior of Saddam Hussein. Unfortunately, all these problems are present in other scenarios, such as North Korea or Iran.
The intelligence failure itself can be explained by some combination of three possibilities: wrong assessments; not having the information that good intelligence should have had (collection failure); and drawing unreasonable inferences from the available information (analysis failure). Having looked at all of the information and spoken with the analysts, Jervis concluded that a combination of collection and analysis failure was at work, causing analysts to make errors, but not ones caused by political pressure or willful ignorance/wishful thinking. This, however, leads to an interesting good news/bad news paradox: the good news is that the analysts did a pretty good job of processing the available information; the bad news is that it will be exceedingly difficult to avoid similar problems in the future.
The main problem of collection failure is a basic one: How much intelligence is needed to prove the absence of WMD? As former SecDef Donald Rumsfeld once noted, "absence of evidence is not evidence of absence." And of course this can be true: intelligence grossly understated the state of Iraq's WMD programs before the first Gulf War (Operation Desert Storm) as well as the nuclear program of Iran before confirming evidence could be gathered. There is no easy fix for this problem, and it is likely to be even worse in the future: the US had tons of intelligence on Iraq and will likely have much less in, say, confrontations with Iran or North Korea. One thing that can be improved, according to Jervis, is to create closer links between the collectors and the analysts. More reliance on alternative explanations could help as well.
The problem of analytic failure is just as serious and just as hard to fix. Alternative explanations of Saddam's behavior seemed to be missing from the analysis -- no one bothered to ask why Hussein would effectively commit suicide. Furthermore, while analysts were making reasonable assessments based on the available information -- remember, every country, even those opposed to the invasion, agreed that Iraq had large stocks of CBW and was moving towards a nuclear program -- they did tend to overstate their confidence in their assessments.
Unfortunately, Jervis notes that the overall failure would have been very difficult to prevent. Jervis' assessment was, as noted, that given the available information and intelligence, the inference that WMD were present was the best and most reasonable one, even with the benefit of hindsight, although the confidence levels were perhaps overstated. Saddam's behavior was extremely puzzling under any alternative explanation, as he was clearly behaving, even while UN inspectors were crawling all over the country, as if he were trying to hide a WMD program. Why would Hussein behave this way and risk invasion and the destruction of his regime if he did in fact have nothing to hide? Furthermore, the plausibility of this inference was supported by precedent: the inability of the intelligence community to correctly assess Iraq's WMD program before Desert Storm, and the fact that Hussein had actually used WMD in both the Iran-Iraq War and against the Kurds. UN inspectors were only able to examine 35% of the identified potential WMD sites, and Hussein's pattern of delays and misdirection seemed to support an argument that WMD material was being moved around to deceive the inspectors.
Jervis was able to identify several fundamental problems that can be addressed, although with great difficulty. First, the analysts in general did not do a good job of questioning their base assumptions. The analysts also displayed a lack of understanding of how to deal with negative evidence. For example, there was no systematic tallying or analysis of reports asserting the absence of WMD. As any good scientist knows, a theory must be able to account for negative evidence. Unfortunately, these are some of the hardest problems to deal with, and Jervis was unable to provide any concrete methods for solving them.
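The point about tallying negative evidence can be made concrete with a toy Bayesian sketch (the numbers below are purely hypothetical illustrations, not figures from Jervis' review): if each search of a suspected site has some chance of detecting a weapons program when one exists, then a run of searches that turn up nothing should steadily lower the estimated probability that the program is there -- which is exactly the updating that an unsystematic treatment of negative reports fails to do.

```python
def update_on_negative(prior, p_find_if_present):
    """Posterior P(program exists) after one search that finds nothing.

    p_find_if_present: hypothetical chance a single search would detect
    the program if it actually existed. If the program is absent, a
    negative result is certain.
    """
    p_neg_given_present = 1.0 - p_find_if_present
    # Total probability of a negative result (law of total probability).
    p_neg = p_neg_given_present * prior + 1.0 * (1.0 - prior)
    # Bayes' rule.
    return p_neg_given_present * prior / p_neg

p = 0.8  # hypothetical prior belief that a program exists
for _ in range(10):  # ten independent searches, all negative
    p = update_on_negative(p, p_find_if_present=0.2)
print(round(p, 3))  # belief drops from 0.8 to roughly 0.3
```

Even with a fairly weak per-search detection chance, the negative reports accumulate: "absence of evidence" is weak evidence of absence, and a systematic tally would have made that weight visible.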
He did present a solid argument that political pressure was not responsible for the intelligence failure, claiming that no reasonable inference to the contrary could have been drawn from the available intelligence, and that while the confidence levels could have been lower, that likely would not have been sufficient to avoid conflict. Furthermore, if the Bush Administration were systematically pressuring the CIA to produce skewed analysis, that wouldn't explain why even the countries opposed to the war, such as France or Germany, could not produce any solid evidence or arguments to the contrary. Finally, Jervis noted that the CIA did challenge the Bush Administration on both the difficulties of occupation/reconstruction and the possibility of an Iraq-al Qaeda link. If political pressure were to blame for the intelligence failure, you would not likely have seen disagreement in those areas either.
So, what can be done? Unfortunately, the problems that produced the Iraqi WMD intelligence failure are some of the hardest to solve. There were no glaring, obvious, easy-to-fix causes, meaning that there will be no easy solutions. Jervis presented three main reform proposals: increased use of red teaming (using a group of people to think solely about alternative explanations and to challenge the logic of the "main" group); more product evaluation (which is very hard to do for a variety of reasons); and greater connection of the intelligence community to outside experts for new, fresh opinions.
Ultimately, Jervis' talk was exceedingly important, if frustrating. Frustrating because the problems in intelligence analysis are not likely to be solved. The international community is facing two major WMD problems in Iran and North Korea, and good intel is critically important in dealing with each. However, the US is likely to have even less reliable intel on either country than it did on Iraq. So, how to proceed? Recognizing that there is a problem is the first step in solving it, and merely being cognizant of the reasons for failure in Iraq can help prevent a repeat. Analysts and policy makers alike must be more aware of how to deal with negative evidence, and must be more realistic when producing confidence levels. Iraq also reveals the limitations of relying on previous experience and outcomes; lessons drawn from history can guide present policy, but cannot be used as hard and fast rules. With Iran, for example, many similar problems exist. Iran has engaged in a long practice of deceit and denial about its nuclear program, causing international observers to grossly underestimate the extent of the program until recently. Analysts must be careful not to fall back too heavily on historical examples to determine the plausibility of present scenarios. Each must be considered independently, and all assumptions must be challenged. This, of course, raises the serious possibility of historical ignorance and the creation of blinders.
The business of intelligence collection and analysis is necessarily a messy one, but one that is nonetheless vital to the security of this country and the larger international community. We will inevitably have to make decisions based on incomplete and unsatisfactory information. The lesson of the WMD failure in Iraq cannot simply be that the analysis was wrong and is therefore suspect. Jervis presents some useful ways to think about improving analytic assessments, but we must be prepared for more failures along the lines of Iraq. Policy makers must make decisions for a collective good based on the best information they have, and they will often be wrong. It's important to remember that the intelligence failure is a separate issue from the debacles of the occupation and reconstruction. Becoming too cautious as a result could have consequences just as bad, if not worse.