Top Ten Biases in How We See Reality in Conflict

It is critical in conflict that we understand the typical biases we face in making decisions and handling the conflict. This posting draws heavily on the work of Daniel Kahneman and Amos Tversky on biases and heuristics, and also on the work of Howard Raiffa on conflict and Max Bazerman on decision making. Kahneman won the Nobel Prize in Economics for this work; Tversky died before he could share in it.
Bias Check 1: Ease of Recall

Individuals judge events that are more easily recalled from memory, based on vividness or recency, to be more numerous than events of equal frequency whose instances are less easily recalled. In other words, our assessments of how often something happens are biased by how our memory structures affect the search process. In conflict situations, this can result in distortions in how we see the situation: both sides can use memorable examples to prove their point and sway our sense of how frequently particular issues actually arise. So try to step away from the situation by collecting factual data uninfluenced by ease of recall. But also be wary of this effect in business and other organizations, where the organizational structure has a profound impact on how data is collected and analyzed and how conclusions are drawn.

Bias Check 2: Presumed Associations

Individuals tend to overestimate the probability of two events occurring together based on the number of similar associations they can easily recall, whether from experience or social influence. This may allow a negotiation partner to brush off evidence that they are cheating on a deal as mere coincidence. It may be, but the important thing is to ground yourself in a realistic sense of the probability of the coincidence and use more objective probability calculations to avoid this bias.
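
One way to ground that sense is to tabulate actual co-occurrences and compare them with what pure chance would predict. Here is a minimal sketch in Python; the association_check helper and all the counts are hypothetical, purely for illustration:

```python
# A minimal sketch (hypothetical counts): checking whether two events
# really co-occur more often than chance, rather than trusting recalled
# associations. Replace the counts with your own records.

def association_check(n_both: int, n_a_only: int, n_b_only: int, n_neither: int) -> None:
    total = n_both + n_a_only + n_b_only + n_neither
    p_a = (n_both + n_a_only) / total          # P(A)
    p_b = (n_both + n_b_only) / total          # P(B)
    p_both = n_both / total                    # observed P(A and B)
    expected = p_a * p_b                       # P(A and B) if independent
    print(f"Observed co-occurrence: {p_both:.3f}")
    print(f"Expected under independence: {expected:.3f}")
    # For a real decision, follow up with a significance test (e.g. chi-square).
    print("Association looks real" if p_both > expected else "Could be coincidence")

# Hypothetical example: 6 deals where a clause was waived AND terms later
# slipped, out of 40 deals total.
association_check(n_both=6, n_a_only=4, n_b_only=10, n_neither=20)
```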

Bias Check 3: Insensitivity to Base Rates or Sample Size

When assessing the likelihood of events, individuals tend to ignore base rates if any other descriptive information is provided, even if it is irrelevant. The famous example asks whether a woman, given a vivid personal description, is more likely to be a librarian or a feminist librarian. Even though many librarians may be feminist, some are not; so by definition she is more likely to be a librarian than a feminist librarian. Be aware that your negotiation partner may exploit this bias with vivid anecdotes that override the underlying base rate: the frequency with which an occurrence actually happens in the world. When assessing the reliability of sample information, individuals frequently fail to appreciate the role of sample size. Equally, individuals expect that a sequence of data generated by a random process will look random, even when the sequence is too short for those expectations to be statistically valid. Always try to use real frequency rates to establish real probabilities.
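
To see how strongly the base rate dominates vivid descriptive detail, it helps to run the numbers through Bayes' rule. A minimal sketch, with all the probabilities assumed purely for illustration:

```python
# Made-up numbers: suppose 2% of some population are librarians (base
# rate), a vivid description "fits" a librarian 90% of the time, but it
# also fits 30% of non-librarians. Bayes' rule gives the posterior.

def posterior(base_rate: float, p_fit_given_yes: float, p_fit_given_no: float) -> float:
    numerator = p_fit_given_yes * base_rate
    denominator = numerator + p_fit_given_no * (1 - base_rate)
    return numerator / denominator

p = posterior(base_rate=0.02, p_fit_given_yes=0.90, p_fit_given_no=0.30)
print(f"P(librarian | fits description) = {p:.2f}")  # about 0.06 -- still unlikely
```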

Bias Check 4: Regression to the Mean

Individuals tend to ignore the fact that extreme events tend to regress to the mean on subsequent trials. This may be one reason that bosses tend to punish failures but not reward successes: failures are often followed by a return to average performance, so the punishment seems to work, while successes are often followed by a return to average performance, so the reward doesn't seem to work. Timing is everything, and many parties to a negotiation may time the start of discussions to take advantage of an uptick in the data that favors their argument. So use a statistically significant run of data to judge the average tendency. Try different start and end dates for averages to see what effect they have, especially when dealing with politicians! And avoid using a single extreme deviation from what is statistically reasonable as the basis for decisions.
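
A quick simulation makes the point. If performance is stable skill plus random luck (an assumed toy model, not data from this posting), extreme results are followed by ordinary ones regardless of any praise or punishment in between:

```python
# Assumed model: score = stable skill + random noise. After an extreme
# score, the next score tends to be closer to average on its own.

import random

random.seed(42)
skill = 50.0        # everyone's true underlying performance level
noise = 15.0        # luck component on any single trial

def score() -> float:
    return skill + random.gauss(0, noise)

pairs = [(score(), score()) for _ in range(100_000)]
after_bad = [b for a, b in pairs if a < 30]    # trials following a "failure"
after_good = [b for a, b in pairs if a > 70]   # trials following a "success"

print(f"Mean score after a failure: {sum(after_bad)/len(after_bad):.1f}")
print(f"Mean score after a success: {sum(after_good)/len(after_good):.1f}")
# Both come out near 50: the extreme trials regress toward the mean,
# whether or not anyone was punished or rewarded in between.
```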

Bias Check 5: The Conjunction Fallacy

Individuals falsely judge that conjunctions (two events co-occurring) are more probable than the more global set of occurrences of which the conjunction is a subset. This is the classic error behind the librarian example above, and we need to make sure we really think through whether what we are looking at is such a subset.
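
A small sketch, using an assumed synthetic population, shows why the fallacy is a fallacy: the conjunction is a subset, and a subset can never be more probable than its superset:

```python
# Synthetic population with assumed, independent trait frequencies,
# purely for illustration of the subset relationship.

import random

random.seed(0)
population = [
    {"librarian": random.random() < 0.10, "feminist": random.random() < 0.60}
    for _ in range(100_000)
]

p_librarian = sum(p["librarian"] for p in population) / len(population)
p_both = sum(p["librarian"] and p["feminist"] for p in population) / len(population)

print(f"P(librarian)              = {p_librarian:.3f}")
print(f"P(feminist AND librarian) = {p_both:.3f}")
assert p_both <= p_librarian  # a subset can never outnumber its superset
```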

Bias Check 6: Insufficient Anchor Adjustment

Individuals make estimates of values based upon an initial value (derived from past events, random assignment, the opening offer of one side in a conflict, or whatever information is available) and typically make insufficient adjustments from that anchor when establishing a final value. Adjustments are usually insufficient to negate the effect of the anchor, so answers are biased toward it even if it is irrelevant. Different starting points yield different answers, and even knowing that the initial value was randomly generated does not remove the effect. First-impression syndrome anchors our view of someone in the same way. This is also why it is very difficult to change your decision-making strategies as a result of what you read here: the heuristics identified in this posting are currently serving as your cognitive anchors and are central to your judgement processes, so any new cognitive strategy must be presented and understood in a manner that will force you to break those existing anchors.
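
A toy model of anchoring-and-adjustment makes the mechanism concrete; the adjustment rate here is an assumption for illustration, not an empirical figure:

```python
# Toy model (assumed adjustment rate, not from the source): estimates
# start at the anchor and move only part of the way toward the true
# value, so different anchors yield different final answers.

def anchored_estimate(anchor: float, true_value: float, adjust_rate: float = 0.5) -> float:
    # Insufficient adjustment: we close only part of the gap to the truth.
    return anchor + adjust_rate * (true_value - anchor)

true_value = 100.0
for anchor in (20.0, 100.0, 500.0):
    print(f"anchor={anchor:6.1f} -> estimate={anchored_estimate(anchor, true_value):6.1f}")
# anchor=20 gives 60, anchor=500 gives 300: the starting point, even an
# arbitrary one, drags the final answer toward itself.
```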

This is a major source of problems in negotiations and is probably the most common tactic used by skilled, but not necessarily very ethical, negotiators. It is hard to defend against because so much of our cognitive processing feeds off instant impressions, which become cognitive anchors about how things are; these are central to our judgement processes and based on our existing short-cut heuristics. So the key is to be aware of the phenomenon and perhaps to rehearse alternative anchors or first impressions before the negotiation even starts, using these to counter-prime the effect of anchoring. This may be one positive use of positional thinking, provided we are then willing to move beyond it.

Bias Check 7: Conjunctive and Disjunctive Events Bias

Individuals exhibit a bias toward overestimating the probability of conjunctive events and underestimating the probability of disjunctive events. Thus when multiple events all need to occur, we overestimate the likelihood that they all will, while when only one of many events needs to occur, we underestimate the likelihood that any will. The probability of a single event occurring provides a natural anchor for the judgement of the total probability. A complex system fails if any of its essential components fails, so even when the likelihood of failure in any one component is slight, the probability of an overall failure can be high if many components are involved. Society underestimates the potential for disaster because of this judgemental failure to recognize the multitude of things that can go wrong in incredibly complex and interactive systems such as nuclear reactors.
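
The arithmetic is worth seeing once. With illustrative figures assumed here (95% per-step success, 1% per-component failure):

```python
# With many steps that must ALL succeed (conjunctive), overall success is
# much lower than each step suggests; with many components that can EACH
# fail (disjunctive), some failure is much more likely than any single
# component suggests. The 0.95 and 0.01 figures are assumptions.

n = 10
p_step_success = 0.95      # each of 10 steps succeeds 95% of the time
p_component_fail = 0.01    # each of 10 components fails 1% of the time

p_all_succeed = p_step_success ** n             # conjunctive event
p_any_fail = 1 - (1 - p_component_fail) ** n    # disjunctive event

print(f"P(all 10 steps succeed)   = {p_all_succeed:.2f}")   # ~0.60, not ~0.95
print(f"P(at least 1 of 10 fails) = {p_any_fail:.2f}")      # ~0.10, not ~0.01
```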

Bias Check 8: Overconfidence

Individuals tend to be overconfident in the infallibility of their judgements when answering moderately to extremely difficult questions. Most of us are overconfident in our estimation ability and do not acknowledge true uncertainty, so we don't then seek accurate data to make up for our ignorance. We are most overconfident on questions of moderate to extreme difficulty: as a person's knowledge of a subject decreases, they don't reduce their confidence sufficiently. Groups make better judgements, especially where there is broad diversity or variance in individual knowledge and specialist skills and a relatively high degree of disagreement, if this can be leveraged in good dialogue; but even diverse groups may be just as prone to overestimating their ability. Overconfidence is partly caused by anchoring: when people set a confidence range around their answer, the initial estimate serves as an anchor, which biases their estimation of the confidence interval in both directions. Adjustments from the anchor are usually insufficient, resulting in an overly narrow confidence band. This can be overcome by giving feedback on overconfident judgements or asking people to explain why their answers might be wrong or far off the mark: it leads them to recognize contradictions in their judgement.
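
A simple way to give that feedback is a calibration check: score how often people's "90% confident" ranges actually contain the true value. A minimal sketch with made-up judgements:

```python
# Collect "90% confident" (low, high) ranges together with the
# later-revealed true values, then score how often truth fell inside.
# A well-calibrated judge scores about 0.9; overconfident judges score
# much lower. All numbers below are hypothetical.

judgements = [
    # (low, high, true_value)
    (40, 60, 55),
    (10, 20, 35),
    (100, 120, 118),
    (5, 8, 12),
    (70, 90, 60),
]

hits = sum(low <= true <= high for low, high, true in judgements)
print(f"Hit rate: {hits / len(judgements):.0%} (target for 90% ranges: 90%)")
# 40% here -- the ranges were far too narrow, the signature of overconfidence.
```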

Bias Check 9: The Confirmation Trap

Individuals tend to seek confirmatory information for what they think is true and fail to search for disconfirmatory evidence. This is strongly reinforced by our bias towards self-interest: we seek out data that is in line with our self-interest, which might sound like a good idea, but it can lead to delusions about the reality we face. And of course a manipulative opponent has their own agenda and will happily supply confirmatory data to support it. So a disciplined approach of seeking data contrary to our own prejudices, as well as to our opponent's, is a powerful way to get real.

People seek confirmatory information rather than disconfirming information, even when the latter is more powerful and important. If you think all swans are white, it is more useful to go looking for black swans in places where they have been reported sighted than to simply add more and more white swans to your data. Most of us search for data to confirm our decision, when it may be more important to find negative data that would contradict our position than confirmatory data that supports it. And in consulting we reward or appreciate people who back us up; few of us hire a Devil's advocate consultant who tells us ten ways a project won't work, whether to improve the project by anticipating and correcting the issues or to make a different decision.
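
Wason's classic 2-4-6 task shows the same thing computationally. In this sketch (the rules and test triples are assumptions for illustration), confirmatory tests all pass and teach nothing, while one disconfirming test falsifies the hypothesis immediately:

```python
# Your hypothesis: "numbers ascending by 2". The hidden rule: "any
# ascending numbers". Confirmatory tests can never tell the two apart;
# a single disconfirming test settles it.

def hidden_rule(a: int, b: int, c: int) -> bool:
    return a < b < c                      # the real rule: simply ascending

def my_hypothesis(a: int, b: int, c: int) -> bool:
    return b - a == 2 and c - b == 2      # my narrower guess

confirming_tests = [(2, 4, 6), (10, 12, 14), (1, 3, 5)]   # all fit my guess
disconfirming_test = (1, 2, 50)                           # breaks my guess

for t in confirming_tests:
    print(t, "hidden rule says:", hidden_rule(*t))  # True every time -- learn nothing

t = disconfirming_test
print(t, "hidden rule:", hidden_rule(*t), "-- my hypothesis:", my_hypothesis(*t))
# Hidden rule: True, my hypothesis: False -- one disconfirming test falsifies it.
```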

Bias Check 10: Hindsight and the Curse of Knowledge

After finding out whether or not an event occurred, individuals tend to overestimate the degree to which they would have predicted the correct outcome. Furthermore, they fail to ignore information they possess that others do not when predicting others' behavior. In the Cuban Missile Crisis, the US ignored the fact that the Cubans didn't know the US had no plan to invade Cuba, and so couldn't understand why the Cubans wanted nuclear missiles. People are not very good at recalling or reconstructing the way an uncertain situation appeared to them before finding out the results of the decision, so they tend to overestimate what they knew beforehand based on what they later learned. Knowledge of an outcome increases an individual's belief about the degree to which he or she would have predicted that outcome without the benefit of that knowledge.

Anchoring is often used to explain the hindsight bias: knowledge of an event's outcome becomes an anchor by which individuals interpret their prior judgements of the event's likelihood, because the known outcome is more cognitively salient and more available to memory. Hindsight bias reduces our ability to learn from the past and to evaluate decisions objectively. Individuals should be judged by the process and logic of their decisions, not by their results, because results are affected by a variety of factors outside their control. If we rely on the hindsight offered by results, we will inappropriately evaluate the decision maker's logic, judging their outcomes rather than their methods.

The curse of knowledge occurs when, in assessing others' knowledge, people are unable to ignore knowledge they have that the others do not. This is a reversion to some typical childhood errors and a deficiency in our theory of other minds: putting ourselves in their shoes, in this case their ignorance of what we know. Knowledge that is psychologically available is hard to set aside when imagining how much others know, and we harbor the false belief that people understand our ambiguous messages. This is absolutely critical to conflict work.

Footnote: this posting is dedicated to Shelley at SEE, who gives me better spectacles to see the world and avoid bias.

