
These 3 Cognitive Biases Can Kill Innovation

Guest contributor Greg Satell - July 25, 2019


Probably the biggest myth about innovation is that it’s about ideas. It’s not. It’s about solving problems. The truth is that nobody cares about what ideas you have, they care about the problems you can solve for them. So don’t worry about coming up with a brilliant idea. If you find a meaningful problem, the ideas will come.

The problem with ideas is that so many of them are bad. Remember New Coke? It seemed like a great idea at first. The new formula tested well among consumers and even had some initial success in the market. Yet what the marketers missed is that many consumers had an emotional attachment to the old formula, and the change created a huge backlash.

Our minds tend to play tricks on us. We think we’ve done our homework and that we base our ideas on solid insights, but often that’s not the case. We see what we want to see and then protect our ideas by ignoring or explaining away facts that don’t fit the pattern. In particular, we need to learn to identify and avoid these three cognitive biases that kill innovation.

1. Availability Bias

It’s easy to see where the marketers at Coke went wrong. They had done extensive market testing and the results came back wildly positive. People consistently preferred the new Coke formula over the old one. The emotional ties that people had to the old formula, however, were harder to see.

Psychologists call these types of errors availability bias. We tend to base our judgments on the information that is most easily available, such as market testing, and neglect other factors, such as emotional bonds. Often the most important factors are the ones that you don’t see and therefore don’t figure into your decision making.

The way to limit availability bias is to push yourself to get uncomfortable facts in front of you. In his book Farsighted, Steven Johnson notes two techniques that can help. The first, called a pre-mortem, asks you to imagine that the project has failed and figure out why it happened. The second, called red teaming, sets up an independent team to find holes in the idea.

Amazon’s innovation process is specifically set up to overcome availability bias. Project managers are required to write a six-page memo at the start of every project, which includes a mock press release anticipating both positive and negative reactions. Through a series of meetings, other stakeholders do their best to poke holes in the idea. None of this guarantees success, but Amazon’s track record is exceptionally good.

2. Confirmation Bias

Availability bias isn’t the only way we come to believe things that aren’t true. The machinery in our brains is naturally geared towards making quick judgments. We tend to lock onto the first information we see (called priming), and that affects how we see subsequent data (framing). Sometimes, we just get bad information from a seemingly trustworthy, but unreliable, source.

In any case, once we come to believe something, we will tend to look for information that confirms it and discount contrary evidence. We will also interpret new information differently according to our preexisting beliefs. When presented with a relatively ambiguous set of facts, we are likely to see them as supporting our position.

This dynamic plays out in groups as well. We tend to want to form an easy consensus with those around us. Dissent and conflict are uncomfortable. In one study that asked participants to solve a murder mystery, the more diverse teams came up with better answers, but reported doubt and discomfort. The more homogenous teams performed worse, but were more confident.

Imagine yourself sitting in a New Coke planning meeting. How much courage would it have taken to challenge the consensus view? How much confidence would you have in your dissent? What repercussions would you be willing to risk? We’d all like to think that we’d speak up, but would we?

3. The Semmelweis Effect

In 1847, a young doctor named Ignaz Semmelweis had a major breakthrough. Working in a maternity ward, he discovered that a regimen of hand washing could dramatically lower the incidence of childbed fever. Unfortunately, instead of being lauded for his accomplishment, he was castigated and considered a quack. The germ theory of disease didn’t take hold until decades later.

The phenomenon is now known as the Semmelweis effect, the tendency for professionals in a particular field to reject new knowledge that contradicts established beliefs. The Semmelweis effect is, essentially, confirmation bias on a massive scale. It is simply very hard for people to discard ideas that they feel have served them well.

However, look deeper into the Semmelweis story and you will find a second effect that is just as damaging. When the young doctor found that his discovery met initial resistance, he railed against the establishment instead of collecting more evidence and presenting his data more clearly. He thought it just should have been obvious.

Compare that to the story of Jim Allison, who pioneered cancer immunotherapy. At first, pharmaceutical companies refused to invest in Jim’s idea. Yet unlike Semmelweis, he kept working to gather more data and convince others that his idea could work. Unlike Semmelweis, who ended up dying in an asylum, Jim won the Nobel Prize.

We all have a tendency to reject those who reject our ideas. Truly great innovators like Jim Allison, however, simply treat that as another problem to solve.

Don’t Believe Everything You Think

When I’m in the late stages of writing a book, I always start sending out sections to be fact-checked by experts and others who have firsthand knowledge of events. In some cases, these are people I have interviewed extensively, but in others, sending out the fact checks is my first contact with them.

I’m always amazed at how generous people are with their time, willing in some cases to go through material thoroughly just to help me get the story straight. Nevertheless, whenever something comes back wrong, I always feel defensive. I know I shouldn’t, but I do. When told that I’m wrong, I just have the urge to push back.

But I don’t. I fight that urge because I know how dangerous it is to believe everything you think, which is why I go to so much effort to send out the fact checks in the first place. That’s why, instead of publishing work that’s riddled with errors and misinterpretations, my books have held up even after being read thousands of times. I’d rather feel embarrassed at my desk than in the real world.

The truth is that our most fervently held beliefs are often wrong. That’s why we need to make the effort to overcome the flawed machinery in our minds. Whether that is through a formal process like pre-mortems and red teams, or simply seeking out a fresh pair of eyes, we need to avoid believing everything we think.

That’s much easier said than done, but if you want to innovate consistently, that’s what it takes.

This article was first published on www.digitaltonto.com.

Guest contributor Greg Satell

Greg Satell is an author, speaker, and advisor.
