Archive for December 13th, 2010
“From the children’s point of view it was hard to tell a neighbor from a relative. She is like a sister to me was said in all sincerity. Door-to-door living over long periods of time made these people true kin to each other. The only difference between neighbors and relatives was that the neighbors went home to sleep; the relatives could climb into bed with you.” (Sam Levenson, Everything but Money).
The fact that neighbors went home to sleep while relatives could climb into your bed was information that helped a small child tell relatives from neighborhood friends in the crowded, confusing world of the East Harlem tenements of the early 1900s. We are always searching for more information to help us make sense of our world, to interpret the events that surround us, and to make better decisions. Yet we are often selective about which pieces we will view as credible, accepting some bits while seemingly at random rejecting others.
There is an old tale out of the Middle East that is often told in terms of conflict resolution, but it seems to have more to do with information, or the lack thereof. It goes something like this. A nomad sensed he was nearing the end of his life and called his three sons together. He spoke to them: “I want to tell you how I plan on bequeathing the family’s 17 camels. To my oldest son I give half of my camels. To my middle son I give a third of my camels, and to my youngest son I give one ninth of my camels.” A week later the old nomad passed away, and the three sons took to fighting over how to split the herd between them. They went to the wise woman of the tribe, who mediated disputes, and described the situation. She said to them, “I don’t know how to resolve your dispute, but here, I have one camel; take it and see if it makes you happy.” So now the three sons had 18 camels to divvy up. The oldest took half of the camels, or 9 of them. The middle son took a third of the camels, or 6 of them, and the youngest took one ninth of the camels, or 2 of them. Well, 9 plus 6 plus 2 equals 17. So they had one camel left over, which they gave back to the wise woman. It is a fun math exercise and makes you stop and think. What is the missing piece of information that helps you understand the story? The old nomad did not bequeath all of his camels in the first place, only 17/18ths of them (1/2 + 1/3 + 1/9 = 9/18 + 6/18 + 2/18 = 17/18), which of course was impossible to divide cleanly if you only had 17 camels to start with.
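For readers who want to check the arithmetic, here is a minimal sketch of the puzzle using Python's standard `fractions` module (the variable names are my own, not part of the original tale):

```python
from fractions import Fraction

# The nomad's bequest adds up to only 17/18 of the herd, not the whole herd.
bequest = Fraction(1, 2) + Fraction(1, 3) + Fraction(1, 9)
print(bequest)  # 17/18

# With the wise woman's borrowed camel, the herd is 18 and every share
# comes out to a whole number of camels.
herd = 18
shares = [herd * Fraction(1, 2), herd * Fraction(1, 3), herd * Fraction(1, 9)]
print(shares)        # [Fraction(9, 1), Fraction(6, 1), Fraction(2, 1)]
print(sum(shares))   # 17 -- one camel is left to return to the wise woman
```

The trick works because 18 is the smallest herd size divisible by 2, 3, and 9, and the shares deliberately sum to less than the whole.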
The conflict resolution part of this comes from an outside observer, the wise woman, who, being somewhat removed from the situation, is able to see a way forward from the impasse: how to divide up the 17 camels according to the nomad’s wishes. The information part of this comes from the understanding that what was originally specified was not mathematically possible. But what is possible and what needs to get done anyway are not always in alignment. We often need to think beyond what conventional wisdom says is possible and figure out ways to accomplish our goals, and mankind, in spite of our inherent flaws, is pretty good at that. Information helps; it can help a lot. But sometimes information, even compelling information, is not only rejected but triggers an effort to get everyone else to reject it as well.
Some messages carry more effective information than others. Information that is unexpected or surprising tends to have more impact. Sean Carroll, a noted physicist, writes in From Eternity to Here, “If I tell you that the Sun is going to rise in the East tomorrow morning, I’m not actually conveying much information, because you already expected that was going to happen. But if I tell you the peak temperature tomorrow is going to be exactly 25 degrees Celsius, the message contains more information, because without the message you wouldn’t have known precisely what temperature to expect…. Roughly speaking, then, the information content of a message goes up as the probability of a given message taking that form goes down.” So out of the world of physics comes the notion that a piece of information, a message, that is unique, unexpected, or novel inherently carries more content, and more important content, than often-repeated or completely expected information and messaging.
David Brooks produces a column summarizing notable social and psychological research, and in his December 7th column he wrote, “Classic research has suggested that the more people doubt their own beliefs the more, paradoxically, they are inclined to proselytize in favor of them. David Gal and Derek Rucker published a study in Psychological Science, called ‘When in Doubt, Shout,’ in which they presented some research subjects with evidence that undermined their core convictions. The subjects who were forced to confront the counterevidence went on to more forcefully advocate their original beliefs, thus confirming the earlier findings.” (NY Times, 12/07/10). This coping process was originally proposed by Festinger, the father of cognitive dissonance theory, which holds that when people’s thoughts and behavior are incongruent (say, I advocate one thing verbally but do not actually behave according to those beliefs), a dissonance sets in that must be resolved by changing either the beliefs or the behavior. So here is a notion that appears to go against the world as physicists know it. In the human mind, or at least among some of us, when strongly held beliefs are challenged to the core, we do not simply say, “Oh well, I now have better information; it was very meaningful precisely because it was unexpected, going against my core beliefs, and now I can make a better, more informed decision.” Instead there is a tendency not only to hang on to those core, now-challenged beliefs but to actively try to get others to sign on to them as well, beliefs that the person proselytizing may no longer fully hold him or herself. Getting others to embrace the shaky belief shores up one’s own doubts, and the dissonance that exists can be resolved.
Think of the implications of this in the business world. For instance, say I was selling lousy, junk mortgages. I am presented with information that says, “If you proceed on this path you will put at risk not only your own company but the entire economy.” My reaction could be, rather than stopping my behavior, to try to get others to emulate my risk taking in order to resolve any dissonance that has set up within myself.
Think of the survey that was just conducted on the Don’t Ask, Don’t Tell policy in the US military. Each time the evidence suggests that the vast majority of those in the military feel that repeal of the legislation would have no effect on battle readiness, there are members of Congress who raise additional barriers and continue to try to persuade others, to proselytize others, to their point of view. When faced with clear evidence that threatens their core beliefs, rather than accepting that evidence and changing, they simply try to be more convincing to others in order to resolve potentially dissonant feelings. As an aside, I took a look at the survey itself that was used to collect the information on feelings toward Don’t Ask, Don’t Tell, and my professional judgment is that the survey took a very conservative approach, asking questions in a way that would lead to the least favorable result possible. Not that the research team was trying to bias the results; on the contrary, it appeared that they were setting the bar very high so that when the data did come in the results would be incontestable. But contested the results are regardless.
Let’s say you are working in a company where an executive has an idea about how the future of the company should unfold. He or she has a lot of skin in the game regarding that idea or concept. That executive is presented with incontrovertible evidence that the idea is a dud. Given what we have just reviewed, what might be the executive’s course of action? And more importantly, how can organizations of any type overcome the bias that might arise?
Some of the techniques that can be used in these instances to overcome the inherent bias include:
- Oversight – of the individual by others within the organization who can pass informed judgments on the concept or idea.
- Checks and balances – on the absoluteness of power. Rather than one person having the authority to send the organization off in a new direction, a board-type approach can be of benefit, especially for big decisions and especially if the board solicits from its members…
- Independently arrived at judgments – one method to derive better decisions from a group is to have each member of the group develop independently arrived at judgments prior to comparing notes.
- And, independent assessment of the concept or decision by an outside group without a special interest in the outcome. Using a party perceived as unbiased, one that can pass professional judgment on the concept, can lend additional credence to the conclusions drawn.
Even with these techniques, and even with the best of intentions, you will still have some people who are unable to let go of their cherished beliefs and notions even when the facts indicate that those beliefs are clearly in error. The behavior of some will be to dig in their heels and do their utmost to convince others of the correctness of their unsubstantiated beliefs as they struggle to come to grips with the information they have received.
© 2010 by Jeffrey M. Saltzman. All rights reserved.
Visit OV: www.orgvitality.com