How our motivations remain hidden even from ourselves
I have been reading David Brooks’s masterpiece, The Social Animal, and now see many things with newly opened eyes. Brooks has synthesized a huge amount of recent neuropsychology research and chosen to communicate it in an extremely readable “story” format. This device, it seems to me, is an act of pure genius.
We all know the limitations of “do as I say” as a means of teaching life lessons. Every parent has tried and failed to help children learn something the easy way. Simply put, telling others the lessons we have learned does not work — it does not prevent them from learning things the hard way just as we did.
That does not mean, however, that we cannot pass on our accumulated wisdom to others. To do so effectively, instead of telling them the conclusion, we must share the whole story. Narratives immerse us in the full experience that resulted in learning. A good story lets us feel the same emotions that the participants did, and that emotional context is what made the lasting impression on their memory.
This is why we enjoy and can learn from others via novels, dramas, and even campfire stories. We go through a vicarious, foreshortened version of the storyteller’s experience. It lets us empathize deeply with people unlike us and with experiences we will never have. The story need not even be a long one for this to work.
For example, this morning I was reading a newspaper article about something happening at Arlington National Cemetery containing this phrase: “At one grave was a baby's sonogram.” Did you experience the same visceral reaction I did? I instantly filled in the story: some military man’s widow is pregnant, and this is the only way she can share the news with him. I doubt you have to be a former military spouse, as I am, to get a jolt over the life-altering sacrifices going on around us, largely unremarked, every day. This is the power of story.
The point of the book: we don’t really know ourselves
Brooks cites utterly convincing evidence that the majority of our feelings and reactions remain unconscious. We do things for reasons we do not see or understand, and our conscious, rationalizing mind tries to explain them after the fact.
Complex and sophisticated decision-making goes on all the time below the level of our awareness. We constantly sift through the data from our senses for important clues to guide our behavior. Where do I need to be in a second or two to catch that ball? Is that person’s smile genuine or devious? Does the speed and trajectory of that truck give me time to cross the intersection safely? Is that approaching dog a threat?
How we assess and prioritize all this data depends upon our lifetime of previous experiences. The trained athlete can calculate and predict the ball’s behavior. Our vast store of human interaction data helps us judge the authenticity of a smile. A practiced driver knows how quickly that truck will close the distance. Our past acquaintance with dogs informs our understanding and expectations of them. But all these sophisticated assessments are unconscious. We are impelled to act in certain ways without realizing why.
This explains so much. Why we impulsively do things that we know are not good for us. Why we cannot simply resolve, consciously, to change our ways and stick to that decision. Why some folks are confirmed cynics or persistent conspiracy theorists, certain that no one else can be trusted. Why others are naive and overly trusting. What we have experienced, especially repeatedly, is what we expect to happen.
One interesting implication re conflict of interest
Then I read a New York Times op-ed on ethical blind spots by Max Bazerman and Ann Tenbrunsel, business professors at Harvard and Notre Dame and authors of the new book “Blind Spots: Why We Fail to Do What’s Right and What to Do About It.” They note how hidden influences on our behavior let us act unethically without realizing it.
From Bernie Madoff on down, many perpetrators of financial fraud have had a disconcerting tendency to rationalize and excuse the most egregiously wrong behavior. Madoff told The Times that banks and hedge funds were “complicit” in his massive Ponzi scheme, that they “had to know” that something wasn’t right. He was deluding himself that he was not responsible, yet at the same time he was (correctly) pointing out that his fraud was obvious but was ignored by others who benefited from it.
The professors cite research that substantiates this idea. Many experiments reveal that a focus on group goals produces an “ethical fading”: we overlook transgressions and conflicts of interest when it is in our interest to do so. For example, attorneys and auditors who have invested in long-term relationships with their clients are no longer objective in their advice. They unconsciously tell the clients what they want to hear.
The really interesting data reveals that “sanctions, like fines and penalties, can have the perverse effect of increasing the undesirable behaviors they are designed to discourage.” When people face fines for wrongdoing, they tend to cheat more, because they see the situation as a financial rather than ethical dilemma. When there is no fine, they are more conscious of their ethical responsibilities and behave better.
If fines don’t work, how about transparency? In recent discussions about how to deal with conflicts of interest, I have held the opinion that the important thing is to disclose them: as long as they are not hidden, others can judge our actions and motivations. But other research “found that disclosure can exacerbate such conflicts by causing people to feel absolved of their duty to be objective.” Yikes.
Think of the implications here. Fining BP may not only fail to prevent a repeat of the behavior that led to the Gulf oil disaster; it may actually encourage it. [I’m not ignoring the absolute benefit of fines to pay for the damages and clean-up.] Having public officials declare their conflicts of interest before a vote may actually set them free to vote with bias. Even, dare I say, financial penalties for schools that fail to achieve high enough test scores may actually encourage cheating.
Bazerman and Tenbrunsel believe we need safeguards that prevent misbehavior rather than threaten to punish it. In their field, they recommend strict division of responsibilities to minimize ethical conflicts. Auditors should only audit. Credit-rating agencies should not be financially intertwined with those they rate. And I’m leaning toward conflict of interest policies that require more than disclosure. Perhaps office-holders should not be allowed to have some kinds of relationships — or should abstain from voting when they do.