Bertolt Brecht (1898-1956): Top Ten Conflict Tips

Having recently used one of his quotes that I found interesting, it’s time for one of our Top Ten Conflict Tips from one of the 20th Century’s most conflicted thinkers: Bertolt Brecht.

  1. Because things are the way they are, things will not stay the way they are.
  2. The law was made for one thing alone, for the exploitation of those who don’t understand it, or are prevented by naked misery from obeying it.
  3. What is the robbing of a bank compared to the founding of a bank?
  4. To live means to finesse the processes to which one is subjugated.
  5. What they could do with ’round here is a good war. What else can you expect with peace running wild all over the place? You know what the trouble with peace is? No organization.
  6. The finest plans have always been spoiled by the littleness of them that should carry them out. Even emperors can’t do it all by themselves.
  7. Don’t tell me peace has broken out.
  8. The aim of science is not to open the door to infinite wisdom, but to set a limit to infinite error.
  9. Nowadays, anyone who wishes to combat lies and ignorance and to write the truth must overcome at least five difficulties. He must have the courage to write the truth when truth is everywhere opposed; the keenness to recognize it, although it is everywhere concealed; the skill to manipulate it as a weapon; the judgment to select those in whose hands it will be effective; and the cunning to spread the truth among such persons.
  10. After the uprising of the 17th of June
    The Secretary of the Writers’ Union
    Had leaflets distributed in the Stalinallee
    Stating that the people
    Had forfeited the confidence of the government
    And could win it back only
    By redoubled efforts. Would it not be easier
    In that case for the government
    To dissolve the people
    And elect another?

See also: http://en.wikipedia.org/wiki/Bertolt_Brecht

Posted in Conflict Art, Conflict History, Conflict Humor, Conflict Poetry, Conflict Processes, Philosophy of Conflict, Top Ten Conflict Tips from Great Thinkers, Uncategorized | Tagged | 1 Comment

Moral Foundations of Conservatives and Liberals: Building on Jonathan Haidt

I have always found Jonathan Haidt’s Moral Foundations work on the differences between the moral foundations of liberals and conservatives very useful. But I also had concerns that it missed some of the liberal moral concerns around Sanctity, Authority, and Loyalty, and that it didn’t include gender attitudes or the very different takes on Caring and Justice. Haidt’s work is research-based, arising out of intensive questioning of conservatives and liberals in many countries; here is my hypothesis of what the respective moral foundations of liberals and conservatives would be if the research questions were a little different.

For the original Jonathan Haidt research see: http://www.moralfoundations.org/

Here’s my take on this:

Liberal and Conservative moral foundations completed

Posted in Academic Conflict, Conflict Processes, Philosophy of Conflict, US Political Conflict, Ways to handle conflict | Tagged , , , | Leave a comment

Seneca the Younger (4BCE-65CE) Top Ten Conflict Tips

Nassim Taleb, in his book Antifragile, is very keen on the Roman philosopher Seneca, and so I thought it time we did one of our Top Ten Conflict Tips on Seneca, the Stoic Roman philosopher.

  1. True happiness is… to enjoy the present, without anxious dependence upon the future.
  2. If one does not know to which port one is sailing, no wind is favorable.
  3. Wherever there is a human being, there is an opportunity for a kindness.
  4. Anger, if not restrained, is frequently more hurtful to us than the injury that provokes it.
  5. While we are postponing, life speeds by.
  6. Most powerful is he who has himself in his own power.
  7. A kingdom founded on injustice never lasts.
  8. That is never too often repeated, which is never sufficiently learned.
  9. Shall I tell you what the real evil is? To cringe to the things that are called evils, to surrender to them our freedom, in defiance of which we ought to face any suffering.
  10. A quarrel is quickly settled when deserted by one party; there is no battle unless there be two.

See also: http://en.wikipedia.org/wiki/Seneca_the_Younger

Posted in Conflict Processes, PERSONAL CONFLICT RESOLUTION: CREATIVE STRATEGIES, Philosophy of Conflict | Tagged , | Leave a comment

List of Cognitive Biases

Marvelously self-challenging list of cognitive biases from Wikipedia, as pointed out to me by my friend John.

Cognitive bias describes the inherent thinking errors that humans make in processing information. Some of these have been verified empirically in the field of psychology, while others are considered general categories of bias. These thinking errors prevent one from accurately understanding reality, even when confronted with all the needed data and evidence to form an accurate view. Many conflicts between science and religion are due to cognitive biases preventing people from coming to the same conclusions with the same evidence. Cognitive bias is intrinsic to human thought, and therefore any system of acquiring knowledge that attempts to describe reality must include mechanisms to control for bias or it is inherently invalid.

The best known system for vetting and limiting the consequences of cognitive bias is the scientific method, as it places the evidence and methodology behind an idea under open scrutiny. By this, many opinions and separate analyses can be used to compensate for the bias of any one individual. It is important to remember, however, that in everyday life, just knowing about these biases doesn’t necessarily free you from them.[1]


Many of these biases are studied for how they affect belief formation, business decisions, and scientific research.
Decision-making and behavioral biases

  • Bandwagon effect — the tendency to do (or believe) things because many other people do (or believe) the same. Related to groupthink, crowd psychology, herd behaviour, and manias.
  • Bias blind spot — the tendency not to compensate for one’s own cognitive biases.
  • Choice-supportive bias — the tendency to remember one’s choices as better than they actually were.
  • Confirmation bias — the tendency to search for or interpret information in a way that confirms one’s preconceptions.
  • Congruence bias — the tendency to test hypotheses exclusively through direct testing, in contrast to tests of possible alternative hypotheses.
  • Contrast effect — the enhancement or diminishment of a weight or other measurement when compared with a recently observed contrasting object.
  • Déformation professionnelle — the tendency to look at things according to the conventions of one’s own profession, forgetting any broader point of view.
  • Endowment effect — “the fact that people often demand much more to give up an object than they would be willing to pay to acquire it”.[2]
  • Exposure-suspicion bias — a knowledge of a subject’s disease in a medical study may influence the search for causes.
  • Extreme aversion — most people will go to great lengths to avoid extremes. People are more likely to choose an option if it is the intermediate choice.
  • Focusing effect — prediction bias occurring when people place too much importance on one aspect of an event; causes error in accurately predicting the utility of a future outcome.
  • Framing — drawing different conclusions from the same information, depending on how that information is presented.
  • Hyperbolic discounting — the tendency for people to have a stronger preference for more immediate payoffs relative to later payoffs, the closer to the present both payoffs are.
  • Illusion of control — the tendency for human beings to believe they can control or at least influence outcomes that they clearly cannot.
  • Impact bias — the tendency for people to overestimate the length or the intensity of the impact of future feeling states.
  • Information bias — the tendency to seek information even when it cannot affect action.
  • Irrational escalation — the tendency to make irrational decisions based upon rational decisions in the past or to justify actions already taken.
  • Loss aversion — “the disutility of giving up an object is greater than the utility associated with acquiring it”.[3] (see also sunk cost effects and Endowment effect).
  • Neglect of probability — the tendency to completely disregard probability when making a decision under uncertainty.
  • Mere exposure effect — the tendency for people to express undue liking for things merely because they are familiar with them.
  • Obsequiousness bias — the tendency to systematically alter responses in the direction one perceives to be desired by the investigator.
  • Omission bias — the tendency to judge harmful actions as worse, or less moral, than equally harmful omissions (inactions).
  • Outcome bias — the tendency to judge a decision by its eventual outcome instead of based on the quality of the decision at the time it was made.
  • Planning fallacy — the tendency to underestimate task-completion times. Also formulated as Hofstadter’s Law: “It always takes longer than you expect, even when you take into account Hofstadter’s Law.”
  • Post-purchase rationalization — the tendency to persuade oneself through rational argument that a purchase was a good value.
  • Pseudocertainty effect — the tendency to make risk-averse choices if the expected outcome is positive, but make risk-seeking choices to avoid negative outcomes.
  • Reactance — the urge to do the opposite of what someone wants you to do out of a need to resist a perceived attempt to constrain your freedom of choice.
  • Selective perception — the tendency for expectations to affect perception.
  • Status quo bias — the tendency for people to like things to stay relatively the same (see also Loss aversion and Endowment effect).[4]
  • Survivorship bias — a form of selection bias focusing on what has survived to the present and ignoring what must have been lost.
  • Unacceptability bias — questions that may embarrass or invade privacy are refused or evaded.
  • Unit bias — the tendency to want to finish a given unit of a task or an item, with strong effects on the consumption of food in particular.
  • Von Restorff effect — the tendency for an item that “stands out like a sore thumb” to be more likely to be remembered than other items.
  • Zero-risk bias — the preference for reducing a small risk to zero over a greater reduction in a larger risk. It is relevant e.g. to the allocation of public health resources and the debate about nuclear power.
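One entry in the list above, hyperbolic discounting, can be made concrete with a little arithmetic. The sketch below is purely illustrative (the function name and the discount rate `k` are my own assumptions, not part of the original list): it shows the characteristic preference reversal, where the larger-later payoff wins when both payoffs are far off, but the smaller-sooner payoff wins when both move close to the present.

```python
# Illustrative sketch of hyperbolic discounting: subjective value
# falls off as amount / (1 + k * delay). Function name and k are
# assumptions for demonstration, not a definitive model.

def hyperbolic_value(amount, delay, k=0.5):
    """Subjective present value of `amount` received after `delay` days."""
    return amount / (1 + k * delay)

# Far in the future: $100 in 10 days vs $110 in 11 days
far_small = hyperbolic_value(100, 10)   # 100 / 6   ≈ 16.67
far_large = hyperbolic_value(110, 11)   # 110 / 6.5 ≈ 16.92 -> larger-later wins

# Same one-day gap, close to the present: $100 now vs $110 tomorrow
near_small = hyperbolic_value(100, 0)   # 100.0
near_large = hyperbolic_value(110, 1)   # 110 / 1.5 ≈ 73.33 -> smaller-sooner wins

print(far_large > far_small)    # True: patient choice at a distance
print(near_small > near_large)  # True: impatient choice up close
```

The same pair of choices, separated by the same one-day gap, flips depending on how close both are to now — which is exactly the "closer to the present" clause in the definition.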

Biases in probability and belief

Many of these biases are often studied for how they affect business and economic decisions and how they affect experimental research.

  • Ambiguity effect — the avoidance of options for which missing information makes the probability seem “unknown”.
  • Anchoring — the tendency to rely too heavily, or “anchor,” on a past reference or on one trait or piece of information when making decisions.
  • Anthropic bias — the tendency for one’s evidence to be biased by observation selection effects.
  • Attentional bias — neglect of relevant data when making judgments of a correlation or association.
  • Availability heuristic — a biased prediction, due to the tendency to focus on the most salient and emotionally-charged outcome.
  • Clustering illusion — the tendency to see patterns where actually none exist.
  • Conjunction fallacy — the tendency to assume that specific conditions are more probable than general ones.
  • Frequency illusion — the phenomenon in which people who just learn or notice something start seeing it everywhere. Also known as the Baader-Meinhof Phenomenon.[5]
  • Gambler’s fallacy — the tendency to assume that individual random events are influenced by previous random events. For example, “I’ve flipped heads with this coin five times consecutively, so the chance of tails coming out on the sixth flip is much greater than heads.”
  • Hindsight bias — sometimes called the “I-knew-it-all-along” effect: the inclination to see past events as being predictable, based on knowledge of later events.
  • Hostile media effect — the tendency to perceive news coverage as biased against your position on an issue.
  • Illusory correlation — beliefs that inaccurately suppose a relationship between a certain type of action and an effect.
  • Ludic fallacy — the analysis of chance-related problems within the narrow frame of games, ignoring the complexity of reality and the non-Gaussian distribution of many things.
  • Neglect of prior base rates effect — the tendency to fail to incorporate prior known probabilities which are pertinent to the decision at hand.
  • Observer-expectancy effect — when a researcher expects a given result and therefore unconsciously manipulates an experiment or misinterprets data in order to find it (see also subject-expectancy effect).
  • Optimism bias — the systematic tendency to be over-optimistic about the outcome of planned actions. Found to be linked to the left inferior frontal gyrus; disrupting this section of the brain removes the bias.
  • Overconfidence effect — the tendency to overestimate one’s own abilities.
  • Positive outcome bias — a tendency in prediction to overestimate the probability of good things happening to oneself (see also wishful thinking, optimism bias and valence effect).
  • Primacy effect — the tendency to weigh initial events more than subsequent events.
  • Recency effect — the tendency to weigh recent events more than earlier events (see also ‘peak-end rule’).
  • Reminiscence bump — the effect that people tend to recall more personal events from adolescence and early adulthood than from other lifetime periods.
  • Rosy retrospection — the tendency to rate past events more positively than one actually rated them when they occurred.
  • Subadditivity effect — the tendency to judge probability of the whole to be less than the probabilities of the parts.
  • Telescoping effect — the effect that recent events appear to have occurred more remotely and remote events appear to have occurred more recently.
  • Texas sharpshooter fallacy — the fallacy of selecting or adjusting a hypothesis after the data are collected, making it impossible to test the hypothesis fairly.
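The gambler’s fallacy entry above can be checked numerically. Here is an illustrative Monte Carlo sketch (the function name, trial count, and random seed are my own assumptions): after five consecutive heads, the empirical frequency of heads on the very next flip is still about one half — the coin has no memory.

```python
import random

random.seed(42)  # fixed seed so the sketch is reproducible

def next_after_streak(trials=500_000, streak=5):
    """Empirical P(heads) on the flip immediately after `streak` consecutive heads."""
    heads_after = total_after = 0
    run = 0  # length of the current run of consecutive heads
    for _ in range(trials):
        flip = random.random() < 0.5  # True = heads
        if run >= streak:             # the previous `streak` flips were all heads
            total_after += 1
            heads_after += flip
        run = run + 1 if flip else 0
    return heads_after / total_after

print(round(next_after_streak(), 2))  # ≈ 0.5, not "much greater tails"
```

If previous flips really did influence the next one, this estimate would drift away from 0.5; it doesn’t, because independent events are exactly that — independent.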

Social biases

Most of these biases are labeled as attributional biases.

  • Actor-observer bias — the tendency for explanations of other individuals’ behaviors to overemphasize the influence of their personality and underemphasize the influence of their situation, coupled with the opposite tendency for the self: one’s explanations of one’s own behaviors overemphasize the situation and underemphasize personality (see also fundamental attribution error).
  • Dunning-Kruger effect — “…when people are incompetent in the strategies they adopt to achieve success and satisfaction, they suffer a dual burden: Not only do they reach erroneous conclusions and make unfortunate choices, but their incompetence robs them of the ability to realize it. Instead, …they are left with the mistaken impression that they are doing just fine.”[6] (See also the Lake Wobegon effect, and overconfidence effect).
  • Egocentric bias — occurs when people claim more responsibility for themselves for the results of a joint action than an outside observer would.
  • Forer effect (aka Barnum effect) — the tendency for people to give high accuracy ratings to descriptions of their personality that are supposedly tailored specifically for them, but are in fact vague and general enough to apply to a wide range of people. For example, horoscopes.
  • False consensus effect — the tendency for people to overestimate the degree to which others agree with them.
  • Fundamental attribution error — the tendency for people to over-emphasize personality-based explanations for behaviors observed in others while under-emphasizing the role and power of situational influences on the same behavior (see also actor-observer bias, group attribution error, positivity effect, and negativity effect).
  • Halo effect — the tendency for a person’s positive or negative traits to “spill over” from one area of their personality to another in others’ perceptions of them (see also physical attractiveness stereotype).
  • Herd instinct — a common tendency to adopt the opinions and follow the behaviors of the majority to feel safer and to avoid conflict.
  • Illusion of asymmetric insight — people perceive their knowledge of their peers to surpass their peers’ knowledge of them.
  • Illusion of transparency — people overestimate others’ ability to know them, and they also overestimate their ability to know others.
  • Ingroup bias — the tendency for people to give preferential treatment to others they perceive to be members of their own groups.
  • Just-world phenomenon — the tendency for people to believe that the world is “just” and therefore people “get what they deserve.”
  • Lake Wobegon effect — the human tendency to report flattering beliefs about oneself and believe that one is above average (see also worse-than-average effect, and overconfidence effect).
  • Notational bias — a form of cultural bias in which a notation induces the appearance of a nonexistent natural law.
  • Outgroup homogeneity bias — individuals see members of their own group as being relatively more varied than members of other groups.
  • Projection bias — the tendency to unconsciously assume that others share the same or similar thoughts, beliefs, values, or positions.
  • Self-serving bias — the tendency to claim more responsibility for successes than failures. It may also manifest itself as a tendency for people to evaluate ambiguous information in a way beneficial to their interests (see also group-serving bias).
  • Modesty bias — The tendency to blame failures on oneself while attributing successes to situational factors. Opposite of self-serving bias.
  • Self-fulfilling prophecy — the tendency to engage in behaviors that elicit results which will (consciously or subconsciously) confirm our beliefs.
  • System justification — the tendency to defend and bolster the status quo, i.e. existing social, economic, and political arrangements tend to be preferred, and alternatives disparaged sometimes even at the expense of individual and collective self-interest.
  • Trait ascription bias — the tendency for people to view themselves as relatively variable in terms of personality, behavior and mood while viewing others as much more predictable.
  • Ultimate attribution error — A sub-type of the fundamental attribution error above, the ultimate attribution error occurs when negative behavior in one’s own group is explained away as circumstantial, but negative behavior among outsiders is believed to be evidence of flaws in character.

Memory errors

  • Beneffectance — perceiving oneself as responsible for desirable outcomes but not responsible for undesirable ones. (Term coined by Greenwald (1980))
  • Consistency bias — incorrectly remembering one’s past attitudes and behaviour as resembling present attitudes and behaviour.
  • Cryptomnesia — a form of misattribution where a memory is mistaken for imagination.
  • Egocentric bias — recalling the past in a self-serving manner, e.g. remembering one’s exam grades as being better than they were, or remembering a caught fish as being bigger than it was
  • Confabulation or false memory — Remembering something that never actually happened.
  • Hindsight bias — filtering memory of past events through present knowledge, so that those events look more predictable than they actually were; also known as the ‘I-knew-it-all-along effect’.
  • Selective Memory and selective reporting
  • Suggestibility — a form of misattribution where ideas suggested by a questioner are mistaken for memory. Often a key aspect of hypnotherapy.

Common theoretical causes of some cognitive biases

  • Attribution theory, especially:
    • Salience
  • Cognitive dissonance, and related:
    • Impression management
    • Self-perception theory
  • Heuristics, including:
    • Availability heuristic
    • Representativeness heuristic
  • Adaptive Bias

Posted in Conflict Humor, Conflict Processes, Marital and Relationship Conflict, PERSONAL CONFLICT RESOLUTION: CREATIVE STRATEGIES, Ways to handle conflict | Tagged | Leave a comment

“The Myth of Sisyphus”: Albert Camus

As my hard-working friends round the world start their week on Monday morning, I offer this quote from Albert Camus’s marvelous essay “The Myth of Sisyphus” about the Greek hero Sisyphus (whom he calls the proletarian of the Greek heroes), who defied the gods and in punishment was sentenced for all eternity to roll a rock uphill, only to have it crash back down the slope so that he must repeat the task. A bit like working on an assembly line?

This quote kept me smiling in the craziness of Labor Relations 24/7.

“I leave Sisyphus at the foot of the mountain. One always finds one’s burden again. But Sisyphus teaches the higher fidelity that negates the Gods and raises rocks. He too concludes that all is well. This universe henceforth without a master seems to him neither sterile nor futile. Each atom of that stone, each mineral flake of that night-filled mountain, in itself, forms a world. The struggle itself toward the heights is enough to fill a man’s heart. One must imagine Sisyphus happy.”

Or as my friend Natalie put it in a birthday card she sent me at the time:

Coffee Mug - Far Side Just Not Reaching That Guy

Posted in Conflict Humor, PERSONAL CONFLICT RESOLUTION: CREATIVE STRATEGIES, Philosophy of Conflict, Religious Conflict, Ways to handle conflict | Tagged , , | 2 Comments

The Fraternity of Failure: Paul Krugman

A very important article in today’s New York Times by Paul Krugman, likely to be ignored by all Republicans who most need to read it: 

Jeb Bush wants to stop talking about past controversies. And you can see why. He has a lot to stop talking about. But let’s not honor his wish. You can learn a lot by studying recent history, and you can learn even more by watching how politicians respond to that history.

The big “Let’s move on” story of the past few days involved Mr. Bush’s response when asked in an interview whether, knowing what he knows now, he would have supported the 2003 invasion of Iraq. He answered that yes, he would. No W.M.D.? No stability after all the lives and money expended? No problem.

Then he tried to walk it back. He “interpreted the question wrong,” and isn’t interested in engaging “hypotheticals.” Anyway, “going back in time” is a “disservice” to those who served in the war.

Take a moment to savor the cowardice and vileness of that last remark. And, no, that’s not hyperbole. Mr. Bush is trying to hide behind the troops, pretending that any criticism of political leaders — especially, of course, his brother, the commander in chief — is an attack on the courage and patriotism of those who paid the price for their superiors’ mistakes. That’s sinking very low, and it tells us a lot more about the candidate’s character than any number of up-close-and-personal interviews.

Wait, there’s more: Incredibly, Mr. Bush resorted to the old passive-voice dodge, admitting only that “mistakes were made.” Indeed. By whom? Well, earlier this year Mr. Bush released a list of his chief advisers on foreign policy, and it was a who’s-who of mistake-makers, people who played essential roles in the Iraq disaster and other debacles.

Seriously, consider that list, which includes such luminaries as Paul Wolfowitz, who insisted that we would be welcomed as liberators and that the war would cost almost nothing, and Michael Chertoff, who as director of the Department of Homeland Security during Hurricane Katrina was unaware of the thousands of people stranded at the New Orleans convention center without food and water.

In Bushworld, in other words, playing a central role in catastrophic policy failure doesn’t disqualify you from future influence. If anything, a record of being disastrously wrong on national security issues seems to be a required credential.

Voters, even Republican primary voters, may not share that view, and the past few days have probably taken a toll on Mr. Bush’s presidential prospects. In a way, however, that’s unfair. Iraq is a special problem for the Bush family, which has a history both of never admitting mistakes and of sticking with loyal family retainers no matter how badly they perform. But refusal to learn from experience, combined with a version of political correctness in which you’re only acceptable if you have been wrong about crucial issues, is pervasive in the modern Republican Party.

What’s going on here? My best explanation is that we’re witnessing the effects of extreme tribalism. On the modern right, everything is a political litmus test. Anyone who tried to think through the pros and cons of the Iraq war was, by definition, an enemy of President George W. Bush and probably hated America; anyone who questioned whether the Federal Reserve was really debasing the currency was surely an enemy of capitalism and freedom.

It doesn’t matter that the skeptics have been proved right. Simply raising questions about the orthodoxies of the moment leads to excommunication, from which there is no coming back. So the only “experts” left standing are those who made all the approved mistakes. It’s kind of a fraternity of failure: men and women united by a shared history of getting everything wrong, and refusing to admit it. Will they get the chance to add more chapters to their reign of error?

Posted in Conflict History, Conflict Processes, US Political Conflict | Tagged , , | Leave a comment

Good Boys Are Hard To Find

I mentor a lot of young people, almost exclusively young women. And they often complain to me that “good boys are hard to find.” Based on my experience and research, and on talking to my female mentees over the years, here’s what I have found. Clearly this is not all guys, but many or most:

1) Guys when lost don’t ask the way, so get even more lost
2) Guys lack what I call “emotional conservation”: the ability to continue to feel what they felt on Tuesday evening on Wednesday morning
3) Guys don’t really get in touch with their feelings (hence their feelings’ short life span), which they regard a bit like a cold to be shucked off asap.
4) Testosterone is not big in the neuro-chemistry of empathy or listening
5) Guys lack good role models aka their role models suck.
6) As discriminatory barriers to women’s education and career drop, many women are doing better than men and men don’t like this in a partner. The historical repressive misogyny clearly served a purpose…for men.
7) Too many women find self-absorbed, brooding, “difficult,” narcissistic Heathcliffs (what my “sisters” and I at Ford used to call “bolt-neck fuckheads”) to be fascinating fixer-uppers to take on, thereby only encouraging their propagation, when their foundations are fundamentally unstable.

Sorry about this. No male takers for the local college’s male emotional literacy classes, I hear. :( And my ratio of female to male mentees over the last 30 years is about 25 to 1, not through lack of trying, though the male mentees are usually referred to me by a female in their life. And they last on average two sessions. :)

And yes I am sure there are magnificent exceptions, but their rarity value means they are snapped up pretty damn quickly….And it took my wife over 30 years to fix me up…

cartoon5219

'I don't mind you earning more money than I do, Gretchen, or driving a more expensive car, but do you have to bench press more than I do, too?'

Posted in Conflict Humor, Conflict Processes, Marital and Relationship Conflict | Tagged , | Leave a comment