“Psychology of Intelligence Analysis” Notes
Psychology of Intelligence Analysis by Richards J. Heuer, Jr. is a useful resource for improving your intelligence analysis skills through better thinking and a self-aware effort to counter your own cognitive biases. The book is a collection of articles originally used in the CIA Directorate of Intelligence.
I read this because I’ve seen it recommended for cyber threat intelligence analysts.
You can download the free PDF, or purchase it in paperback, hardcover, or audiobook.
This page contains one or more affiliate links. As an Amazon Associate, I earn from qualifying purchases.
My notes follow.
Foreword
… information and expertise are a necessary but not sufficient means of making intelligence analysis the special product that it needs to be. A comparable effort has to be devoted to the science of analysis. This effort has to start with a clear understanding of the inherent strengths and weaknesses of the primary analytic mechanism - the human mind - and the way it processes information.
Dick Heuer makes clear that the pitfalls the human mental process sets for analysts cannot be eliminated; they are part of us. What can be done is to train people how to look for and recognize these mental obstacles, and how to develop procedures designed to offset them.
Introduction
Intelligence analysts should be self-conscious about their reasoning processes. They should think about how they make judgments and reach conclusions, not just about the judgments and conclusions themselves.
Contributors to quality of analysis: Sherman Kent, Robert “Bob” Gates, Douglas MacEachin, Richards “Dick” Heuer.
Don’t reject the possibility of deception because you don’t see evidence of it; you won’t see evidence of properly-executed deception.
Perception: Why Can’t We See What Is There To Be Seen?
“Initial exposure to blurred or ambiguous stimuli interferes with accurate perception even after more and better information becomes available.”
Memory: How Do We Remember What We Know?
“Hardening of the categories”: If people don’t have an appropriate category for something, they’re unlikely to perceive it or be able to remember it later. If categories are drawn incorrectly, people are likely to perceive and remember things inaccurately.
Evidence is diagnostic when it influences an analyst’s judgment on the relative likelihood of various hypotheses. If an item seems inconsistent with all hypotheses, it may have no diagnostic value. Without a complete set of hypotheses, it’s impossible to evaluate the “diagnosticity” of the evidence.
A hypothesis can’t be proved even by a large body of evidence consistent with it, because that same body of evidence may be consistent with other hypotheses. A hypothesis can be disproved by a single item of evidence that’s incompatible with it.
Do You Really Need More Information?
Once an experienced analyst has the minimum info necessary to make an informed judgment, additional info generally doesn’t improve the accuracy of estimates. However, additional info leads the analyst to become more confident in the judgment (to the point of overconfidence).
Keeping an Open Mind
Questioning Assumptions: see how sensitive the judgment is to changes in the major variables; try to disprove assumptions; get alternative interpretations from those who disagree with you; don’t assume the other side thinks the same way you do (mirror-imaging).
Seeing Different Perspectives: imagine yourself in the future, explaining how the event could’ve happened; explain how your assumptions could be wrong; mentally put yourself in someone else’s place; find a “devil’s advocate” to critique your views.
Creative thinking techniques
- Deferred Judgment: generate all ideas first, then evaluate them
- Quantity Leads to Quality: quantity of ideas eventually leads to quality; the first ideas are usually the most common or conventional ones
- No Self-Imposed Constraints: generate ideas without self-imposed constraints
- Cross-Fertilization of Ideas: combine ideas and interact with other analysts
Structuring Analytical Problems
Multiattribute Utility Analysis
- List attributes you want to maximize
- Quantify relative importance of each attribute, to add up to 100%
- For each option you’re considering, rate it on each attribute
- Calculate which option best fits your preferences (see the sketch below)
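To make the arithmetic concrete, here’s a minimal Python sketch; the attributes, weights, ratings, and option names are hypothetical placeholders, not anything from the book.

```python
# Multiattribute utility analysis as a weighted sum.
# Attribute names, weights, and ratings below are made-up examples.

weights = {"cost": 0.40, "reliability": 0.35, "speed": 0.25}  # relative importance; sums to 100%
assert abs(sum(weights.values()) - 1.0) < 1e-9

# Rate each option on each attribute (here, a 0-10 scale).
options = {
    "Option A": {"cost": 7, "reliability": 5, "speed": 8},
    "Option B": {"cost": 4, "reliability": 9, "speed": 6},
}

# An option's score is the sum of weight * rating over all attributes.
scores = {
    name: sum(weights[attr] * rating for attr, rating in ratings.items())
    for name, ratings in options.items()
}

best = max(scores, key=scores.get)
print(scores)             # roughly {'Option A': 6.55, 'Option B': 6.25}
print("Best fit:", best)  # Option A
```

The point isn’t precision in the numbers; it’s that the structure forces you to state your preferences and ratings explicitly instead of leaving them implicit.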
Analysis of Competing Hypotheses
Analysis of competing hypotheses (ACH) requires an analyst to explicitly identify all the reasonable alternatives and have them compete against each other for the analyst’s favor, rather than evaluating their plausibility one at a time.
ACH steps
- Identify the possible hypotheses to be considered. Use a group of analysts with different perspectives to brainstorm the possibilities.
- Make a list of significant evidence and arguments for and against each hypothesis.
- Prepare a matrix with hypotheses across the top and evidence down the side. Analyze the “diagnosticity” of the evidence and arguments; that is, identify which items are most helpful in judging the relative likelihood of the hypotheses. (A sketch of this matrix step follows these steps.)
- Refine the matrix. Reconsider the hypotheses and delete evidence and arguments that have no diagnostic value.
- Draw tentative conclusions about the relative likelihood of each hypothesis. Proceed by trying to disprove the hypotheses rather than prove them.
- Analyze how sensitive your conclusion is to a few critical items of evidence. Consider the consequences for your analysis if that evidence were wrong, misleading, or subject to a different interpretation.
- Report conclusions. Discuss the relative likelihood of all the hypotheses, not just the most likely one.
- Identify milestones for future observation that may indicate events are taking a different course than expected.
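Here’s a minimal sketch of the matrix and refinement steps in Python, using hypothetical hypotheses, evidence items, and consistency ratings:

```python
# ACH matrix: hypotheses across the top, evidence down the side.
# "C" = consistent, "I" = inconsistent, "N" = neutral / not applicable.
# Hypotheses, evidence items, and ratings are hypothetical placeholders.

hypotheses = ["H1", "H2", "H3"]
matrix = {
    "Evidence 1": {"H1": "C", "H2": "C", "H3": "C"},  # consistent with everything: not diagnostic
    "Evidence 2": {"H1": "I", "H2": "C", "H3": "N"},
    "Evidence 3": {"H1": "I", "H2": "I", "H3": "C"},
}

# Refine: drop items with no diagnostic value (same rating for every hypothesis).
diagnostic = {item: row for item, row in matrix.items() if len(set(row.values())) > 1}

# Tentative conclusion: try to disprove, so count the evidence against each
# hypothesis; the one with the least evidence against it is the most likely.
against = {h: sum(1 for row in diagnostic.values() if row[h] == "I") for h in hypotheses}
for h, count in sorted(against.items(), key=lambda kv: kv[1]):
    print(h, "items of evidence against:", count)
```

A real ACH matrix fits comfortably in a spreadsheet; the value is the audit trail, which lets someone who disagrees with you point at the exact cell where you part ways.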
An unproven hypothesis has no evidence that it’s correct. A disproved hypothesis has positive evidence that it’s wrong.
When you’re tempted to write, “There’s no evidence that … ,” ask yourself, “If this hypothesis is true, can I realistically expect to see evidence of it?”
This procedure leads you through a rational, systematic process that avoids some common analytical pitfalls. It increases the odds of getting the right answer, and it leaves an audit trail showing the evidence used in your analysis and how this evidence was interpreted. If others disagree with your judgment, the matrix can be used to highlight the precise area of disagreement. Subsequent discussion can then focus productively on the ultimate source of the differences.
What Are Cognitive Biases?
Cognitive biases are mental errors caused by subconscious mental procedures for processing info. They’re not caused by emotional or intellectual predisposition toward a certain judgment, unlike cultural bias, organizational bias, or bias from one’s self-interest.
Biases in Evaluation of Evidence
The Vividness Criterion: Give little weight to anecdotes and personal case histories, unless they’re known to be typical. Give them no weight if aggregate data based on a more valid sample is available.
Biases in Perception of Cause and Effect
People overestimate the extent to which other countries are pursuing a coherent, coordinated, rational plan, and thus also overestimate their own ability to predict future events in those nations. People also tend to assume that causes are similar to their effects, in the sense that important or large effects must have large causes.
When inferring the causes of behavior, too much weight is accorded to personal qualities and dispositions of the actor and not enough to situational determinants of the actor’s behavior. People also overestimate their own importance as both a cause and a target of the behavior of others. Finally, people often perceive relationships that do not in fact exist, because they do not have an intuitive understanding of the kinds and amount of information needed to prove a relationship.
Bias in Favor of Causal Explanations: Random events often look patterned.
Bias Favoring Perception of Centralized Direction: A country’s inconsistent policies may be the result of weak leadership, vacillation, or bargaining among bureaucratic or political interests, rather than duplicity or Machiavellian maneuvers.
Similarity of Cause and Effect: Major effects may be the result of mistakes, accidents, or aberrant behavior of an individual, rather than major causes.
Internal vs. External Causes of Behavior
- Don’t overestimate the effect of a person’s or government’s internal personality or disposition on their behavior, and don’t underestimate the effect of their response to external situational constraints.
- Don’t overestimate the effect of situational constraints on your own behavior, and don’t underestimate the effect of your own personality or disposition.
Overestimating Our Own Importance: Don’t overestimate the likelihood that actions that hurt you were intentionally directed at you, and don’t underestimate the likelihood that those actions were the unintended consequences of decisions not related to you.
Illusory Correlation
- To judge whether two factors are related (a prerequisite for any causal claim), you need data for all four cells of a 2 x 2 contingency table: A with B, A without B, B without A, and neither. Looking only at cases where both are present isn’t enough (see the sketch after this list).
- There’s not enough data to say there’s a relationship between deception and high-stakes situations.
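A minimal sketch of that 2 x 2 table in Python, with made-up counts, showing how the cells where a factor is absent change the picture:

```python
# Full 2 x 2 contingency table for factors A and B (counts are hypothetical).
# An illusory correlation comes from noticing only the A-and-B cases;
# judging a real relationship requires all four cells.

a_and_b     = 20   # A present, B present
a_not_b     = 5    # A present, B absent
not_a_b     = 40   # A absent,  B present
not_a_not_b = 10   # A absent,  B absent

p_b_given_a     = a_and_b / (a_and_b + a_not_b)          # 20 / 25 = 0.8
p_b_given_not_a = not_a_b / (not_a_b + not_a_not_b)      # 40 / 50 = 0.8

# B is just as likely without A as with it, so despite the many A-and-B
# cases, this data shows no relationship between the two factors.
print(p_b_given_a, p_b_given_not_a)
```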
Biases in Estimating Probabilities
Anchoring: The final estimate lands close to the initial estimate. To combat it, consciously avoid using prior judgments as a starting point, or use formal statistical procedures.
Expression of Uncertainty: After vague expressions (“possible,” “probable,” “unlikely,” “may,” “could,” etc.), put the estimated odds or percentage range in parentheses, e.g., “unlikely (less than a 25% chance).”
Assessing Probability of a Scenario: multiply the probabilities of each individual event.
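A quick worked example (the probabilities are hypothetical and the events are assumed independent): even when each step looks probable, the scenario as a whole may not be.

```python
# Probability that a scenario in which every event must occur actually plays out.
event_probabilities = [0.8, 0.7, 0.7]   # hypothetical, assumed independent

scenario_probability = 1.0
for p in event_probabilities:
    scenario_probability *= p

print(round(scenario_probability, 3))   # 0.392: under 40%, even though every event is "probable"
```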
Hindsight Biases in Evaluation of Intelligence Reporting
Hindsight biases: Analysts normally overestimate the accuracy of their past judgments. Postmortems normally judge that events were more foreseeable than they were.
To overcome hindsight biases, remind yourself of the uncertainty prior to a situation by asking yourself, “If the opposite outcome had occurred, would I have been surprised? If this report had told me the opposite, would I have believed it? If the opposite outcome had occurred, would it have been predictable given the info available at the time?”
Improving Intelligence Analysis
Analytical process
- Defining the problem: be sure to ask the right questions
- Generating hypotheses: identify all plausible hypotheses, then reduce them to a workable number of reasonable hypotheses
- Collecting information: collect info to evaluate all reasonable hypotheses
- Evaluating hypotheses: look for evidence to disprove hypotheses; consider using ACH
- Selecting the most likely hypothesis: choose the hypothesis with the least evidence against it; list other hypotheses and why they were rejected
- Ongoing monitoring of new information: specify criteria that would require reevaluation of hypotheses