I am reading Daniel Kahneman’s new book Thinking, Fast and Slow. It is throwing up all sorts of new insights relevant to conflict and I heartily recommend you read it. At its core is a model of the brain consisting of two systems: System 1, which is largely unconscious and automatic, and System 2, which is conscious and requires effort for us to work things through. This approach mirrors the Elephant/Rider distinction that Jon Haidt uses and I have previously referred to. It is an important element in any understanding of how we handle conflict.
But Daniel also introduces us to a range of what he calls heuristics and biases which our System 1 jumps to and our System 2 finds hard to resist. One of these is our tendency to be loss averse. There are major evolutionary advantages to paying more attention to threats than opportunities: the former can get us killed, while the latter rarely offered compensating benefits in the environment we evolved in. We are therefore typically wired to need significantly more gain to offset a given risk of loss. In the sort of experiments Daniel has undertaken, people typically need about twice as much potential gain as the loss they risk before they will accept a 50/50 gamble.
This has an important implication for conflict. When we enter conflict we often, with our self-righteous System 1’s help, jump to a position, a stance about what it is that we demand in the conflict. We are right; we are entitled to a particular outcome. The problem is that my experience and a lot of research suggest we can get a much better outcome in many conflict situations by thinking in terms of our interests: using our System 2 brain to do the hard work of figuring out our interests and the other side’s interests, and creatively finding a way to make a deal that meets both sides’ interests and even grows the size of the ‘cake’ being divided up.
Our aversion to loss can interfere with this. Our System 1 brain tends to see the position we adopt in conflict as right and no less than our entitlement. It does not ask how realistic or justified our position is, because we have evolved to believe in ourselves and our positions, the better to convince the other side they are wrong. Moreover, moving from the position we have adopted in any direction probably feels like a loss, and we need quite a bit of upside to compensate us for any risk of loss. Unfortunately, that means we dig in on a position and have to suffer real losses before we start questioning it. I found that in labor relations: my management and unions could not imagine their way through potential losses and had to experience them before they were willing to move off their positions. It is why wars happen, and why, in eventually making peace, we often finally give up our positions and think about our interests. Thinking about our interests before we experience the pain of war is often beyond us.
So I think Daniel throws a powerful light on our positional thinking, and on the need to use our System 2 brain, in any seriously consequential conflict, to unpick our real interests using the Creative Conflict Model or its equivalent.
This is Daniel: