Modern political discourse in the US seems to have become detached from reality in so many ways, in the current 2016 Presidential Election campaign and before. So I thought it a good time to post one of our Top Ten Conflict Tips, this time on how we can become more modestly realistic in conflict. Realism about the conflict landscape we face is one of the core principles of this blog. So here goes:
- We need to see that, as a saying traditionally attributed to the Talmud puts it: “We don’t see the world as it is, but as we are.”
- Accordingly, we need to become aware of our biases in how we perceive the conflict situation. Daniel Kahneman and Amos Tversky revolutionized our understanding of our tendency to use mental shortcuts that misrepresent the world: our tendency to ignore base rates or probability and prefer emotionally vivid but improbable stories, our mistakes in framing reality or anchoring on irrelevant reference points, and the other heuristics we reach for unreflectively. (Daniel Kahneman’s fine book “Thinking, Fast and Slow” covers this well.)
- One of our most prevalent shortcuts is to distort the world through the lens of self-interest, portraying it in self-serving ways. Our forecasting, for example, often tends to be biased toward what we want to happen, rather than what is likely to happen based on trends and data.
- One way to overcome our blindness to our own biases is, in a conflict situation, to notice the biases of the other side, which we are usually quick to spot, and then to assume and uncover identical if opposing biases in ourselves. Matthew 7:3 nailed this issue pretty well: “Why do you look at the speck of sawdust in your brother’s eye and pay no attention to the plank in your own eye?”
- Once we are beginning to uncover our biases and create a less biased picture of the reality we face, we need to become aware of another major bias: confirmation bias, selectively seeking only the data that confirms our existing view and prejudices (the word prejudice coming from the Latin roots prae and judicium, or pre-judgement).
- To assist in overcoming our confirmation bias, I suggest an approach I call Reverse Bayesianism. Bayes’ Theorem tells us how much we should change our mind in response to new data. I suggest we flip this and explicitly list our beliefs about a situation we face (writing things down being another key principle of this blog) and then ask ourselves: “What data would make us change our mind about this?” And then go seek it, go look for it, with an open mind to see if it exists: the very opposite of confirmation bias, a dis-confirmation bias. This is also consistent with Karl Popper’s view that a scientific theory is always provisional and open to testing that could falsify it. In politics, think your candidate is reliable and consistent? Go seek evidence they are not. Think the other side’s candidate is not reliable and consistent? Go seek evidence they are.
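For readers who like to see the arithmetic, here is a minimal sketch in Python of the update Bayes’ Theorem describes. The numbers are purely illustrative, not from any real polling or candidate data; the point is only to show how a single piece of disconfirming evidence, sought out deliberately, should pull a strongly held belief back toward doubt.

```python
def bayes_update(prior, p_data_given_h, p_data_given_not_h):
    """Return P(H | data) via Bayes' Theorem.

    prior:              P(H), our belief before seeing the data
    p_data_given_h:     P(data | H), how likely the data is if the belief is true
    p_data_given_not_h: P(data | not H), how likely the data is if it is false
    """
    numerator = p_data_given_h * prior
    evidence = numerator + p_data_given_not_h * (1 - prior)
    return numerator / evidence

# Hypothetical belief: "my candidate is reliable", held at 90% confidence.
prior = 0.9

# Disconfirming datum we went looking for: a documented broken promise.
# Assume (for illustration) such evidence is five times likelier if the
# candidate is in fact unreliable (0.5) than if reliable (0.1).
posterior = bayes_update(prior, p_data_given_h=0.1, p_data_given_not_h=0.5)
print(round(posterior, 2))  # 0.09 / (0.09 + 0.05) -> prints 0.64
```

Reverse Bayesianism simply runs this machinery in the other direction: before any data arrives, we ask which observation would most move the posterior, and then go hunt for exactly that observation.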
- It also helps, as Isaiah Berlin suggested in his famous essay, borrowing from Archilochus: “The fox knows many things, but the hedgehog knows one big thing”, to use multiple lenses to see the world as a fox does, rather than the one blinkered ideological lens of the hedgehog. Different lenses not only for different situations, but different lenses for the same situation, to get more of a 3D take on it, more stereoscopy. For instance, look at an economic situation through the lenses of Keynes, Marshall, Adam Smith, Milton Friedman, whoever.
- The aim is to achieve what Philip Kitcher calls Modest Realism: a take on reality that is provisional, but in which we grow the confidence to act as we reduce our bias and gather a sound set of data to refine our multi-lens view of reality. Of course there will be times of emergency or immediate threat when we have to act on our best flawed take, as there is no time for much reflection. But avoid where possible the flawed syllogism: “We must do something; this is something; therefore we must do this.” Tempting though it is. Sometimes inaction is the best option, and rare is the conflict situation where a few minutes’ fast reflection is not possible.
- We should also learn from our mistakes, admitting when our take on reality was flawed and digging into the source of error to do better next time, rather than making excuses or denying our error. This is why writing down our assumptions explicitly is so useful: it is hard to fake success then. Create a conflict learning loop, perhaps using the US Military’s After Action Review: look at our assumptions about reality in the conflict; ask which assumptions worked well and why, and which didn’t work out or were shown to be flawed and why; and ask what, in the light of these insights, we would do differently next time in establishing the reality of the conflict situation.
- And in all of this I maintain my stance of absolute opposition to the postmodernist idea that, while multiple narratives exist (they do), none is to be privileged over any other. That seems to me clearly delusional: there are more accurate, more useful takes on reality (as when we drive cars on crowded roads), and there are deeply useless, implausible ones. Reality does indeed bite at some stage, as Orwell said: sometimes on the battlefield, and almost always at some stage in any form of conflict, personal, economic, political, religious or indeed military.
I hope this is of some modest help in the conflicts you are involved in and also that it may guide your consideration of how you vote. 🙂