Reinforcing our cognitive biases: telling ourselves what we want to hear


Two days ago I had an interesting conversation over lunch with an acquaintance who works as a “big data” analyst at a company in Tokyo. Much of her firm’s work is essentially consulting for other companies. She told me that it is typically the lower-level people from a client company who interact directly with her firm, and that they tend to come in asking, “What evidence can you find in support of X?” rather than “What can we learn from the data?”

Of course my acquaintance and her colleagues provide the kind of information that their paying customers want. She said that when they find results that contradict a client’s expectations, they usually include those in the report, but they emphasize the results that agree with the client’s expectations.

Cognitive Biases

This made me think of Daniel Kahneman’s fascinating book, Thinking, Fast and Slow, which I’m most of the way through listening to as an audiobook. I very much enjoy his examples and in-depth discussions of various aspects of cognitive bias.

We humans tend to interpret our observations in ways consistent with our preconceived notions. I see this in science, where I’ve personally found out how difficult it is to persuade even professional scientists to give up a familiar, and quite comfortable, explanation in favor of a newer one that agrees better with the observations. I tried that with this paper in 2009, and again with this one in 2014, both addressing (in new ways, with newer data) a problem that others had recognized over 30 years before. As best I can tell, we are having only modest success so far. Even in the natural sciences, where observational evidence is supposed to be the “gold standard” (as Richard Feynman put it), mere evidence is only so effective at changing minds.

Although Kahneman himself is not so optimistic that humans can overcome their cognitive biases, he does make some suggestions for doing so. He mentions that, based on his own experience teaching, anecdotal evidence (specific examples, told as stories) is much more convincing to students than statistical evidence. This relates to his broader point that we humans are, in general, not very good at thinking statistically.

Incentives to tell people what they want to hear

Thomas Sowell, in his masterful book Knowledge and Decisions, takes a more systemic view of how human beings use knowledge and make decisions, while also considering the motivations of the individual actors. He makes the point quite strongly that human beings tend to act on the incentives presented to them, rather than to serve the stated goals of their employers or any other organization to which they belong.

This has some not-so-encouraging implications, of course. The obvious one here is that consultants have an incentive to tell their clients what they want to hear, even when other interpretations are more consistent with the available evidence. Similarly, the lower-level employees of the client companies have an incentive to obtain from the consultants evidence supporting their superiors’ preconceived notions, in order to get good job evaluations.

Towards the end of the interview in the video below, Kahneman himself mentions the example of Malcolm Gladwell’s best-selling book Blink. Kahneman points out that even though Gladwell himself does not believe that intuition is magical, much of what he wrote has been interpreted that way by a great many people. This reinforcement of a widely held cognitive bias has most likely been quite beneficial to Gladwell, whether or not he planned it that way.

There are far worse implications, though, of the primary incentive facing politicians and government employees: to protect their own jobs. Given that, it’s not at all surprising that government officials tell the rest of us what we want to hear, regardless of the truth or of the likelihood that their promises could ever be fulfilled.

Can decentralized decision-making mitigate the ill effects of cognitive biases?

Whether in private companies or in government, centralized decision-making gives a great deal of power to a few individuals, which means that their cognitive biases can have adverse consequences for a great many other people. Here’s a great speech by Patrick Byrne, CEO of Overstock.com, in which he discusses the system he has implemented to allow information to flow within his company of about 2,000 employees. It was from Byrne that I learned about the ideas of Friedrich Hayek and Thomas Sowell concerning how knowledge is actually transmitted and used to make decisions.

Another angle on distributed knowledge and distributed intelligence is presented in Too Big to Know. I haven’t read it yet, but it’s on my list, given what I’ve heard and read about it. I’d appreciate comments from anyone who has read it, as well as suggestions for further reading.

S. Lan Smith

Kamakura, Japan

August 30, 2016


Thanks for the post!
I'm including it in my TOP5 Lucky Find Psychology articles for today. :)

Thank you!
Glad you liked it.
