1. Introduction
YOUR GREAT-GREAT-GREAT-GREAT-GREAT-GREAT- (YOU GET THE POINT) grandfather and his friend are out walking in the jungle one bright, fine day. They hear a rustling in the bushes. The friend goes out to investigate. He finds himself cornered by a tiger and (pardon the horror) meets his demise as a delicious side of man-kabobs for the hungry predator. Your ancestor runs the other direction, and so lives to see another day. His takeaway? What you can't see might kill you. So it's safe to assume the unknown might just be another tiger looking for a meal. Bottom line: stay away from the unknown.
Fast-forward a few hundred generations. The same kind of reaction toward the unknown occurs again and again: those who took quick, decisive action to avoid novelty were more likely to stick around long enough to pass on their DNA. We are the descendants of the savants of risk aversion. We come by this impulse honestly. Our bodies and minds are still structured for the needs of millennia past. Just as the real threat to our bodies now isn't caloric scarcity but caloric overabundance (despite bodies still wired to avoid scarcity), our minds remain wired to avoid existential threats and other now-rare problems. Many of our instincts are still prepared to deal with a reality that no longer exists.
We learn from the field of Behavioral Decision Making, which draws on Psychology and Behavioral Economics, that this is just one of the many Cognitive Biases humans have developed: Negativity Bias. Cognitive Biases are a collection of mental shortcuts that have evolved in our brains over time, shaping our judgment of the world. And social scientists confirm that all humans have them, across age groups and across cultures. To be human is to have Cognitive Biases.
Cognitive Biases were once useful tools: they kept us safe from threats (like hungry tigers) and aided quick decision making, both essential when survival was what most aspired to. But in the world of innovation, Negativity Bias and other Cognitive Biases can stymie even the most adept thinkers. You've seen it time and time again. A manager who's afraid to try a new product development process because it's just too unproven. Or a researcher who nonconsciously discounts suspect data to confirm what he thinks he already knows. Or a group of like-minded colleagues who are eager to enact real change, but still encounter roadblocks, often within themselves, without entirely understanding why.
Cognitive Biases make innovation difficult. From the momentum-killing lead shoes of Negativity Bias to the limiting blinders of Availability Bias; from the "stay-in-bounds" electric fence of Conformity Bias to the revisionist history we get from the Curse of Knowledge, we see these biases every day, unnecessarily making the challenge of innovation trickier. And we're far from immune; we see it in ourselves daily.
But we have a secret. With the right tools and techniques it's possible to transform biases from roadblocks that hinder innovation into exciting opportunities to think differently. We believe that with awareness of these Cognitive Biases, we can not only overcome them, but also increase our understanding of all the people involved in the effort to go from concept to market: our customers, our partners, and ourselves.
It's why we were compelled to write this book. We think the best place to start is awareness. So let's start at the beginning and focus on the star of this show: your brain.
System 1 and System 2 Thinking
Your brain is one of the hardest-working organs in your body, consuming about 20 percent of your energy while representing only 2 percent of your body mass. As such, it has developed ingenious ways to conserve energy, scaling its processing power up and down based on the demands of each task.
In his book Thinking, Fast and Slow, Daniel Kahneman describes how our brains have two distinct modes of thinking to help us make the most of our resources.
System 1 (Fast) is the "easy" type of thinking that our mind defaults to unless there's a compelling reason to take on harder thinking. It's also referred to as intuitive thinking. It's nonconscious and automatic. It takes almost no effort for System 1 thinking to kick in and direct the show, so it's also pretty energy-efficient. Behind this efficiency is a very sophisticated pattern-recognition system. It seems simple to us because lots of hard work has already been devoted to programming System 1 for "automatic pilot" by dealing with complex subtleties and inferring consistencies, all outside our awareness. For example, you might remember how much attention and concentration it took to learn how to drive when you were a teenager. You had to track hundreds of details: the sides of the road, the car ahead of you, the car behind you (the one you saw in that little rearview mirror), and when to put on the blinker before making a left-hand turn at a busy intersection. After many years on the road, the process has become automatic, and we don't always know when to switch back from "automatic pilot" to more careful attention.
In the same manner, our brains shift most of our habitual decision making to automatic pilot, and for many daily tasks and decisions, System 1 thinking does the heavy lifting. You have undoubtedly driven all the way home from work without any conscious memory of going through the motions of driving the car. The same process takes place when we decide what we like and don't like, or what we think will and won't work. We don't take the time to evaluate every variable and look for new or subtle sources of information. We react automatically and direct our own behavior accordingly. We couldn't get through the day without the efficiencies of System 1.
One of the places we see System 1 thinking most frequently in our business is during the idea-generation phase. This phase is usually fast paced and lots of fun. Our brains are making lots of nonconscious and intuitive connections. As facilitators, we're often directing participants in drawing pictures and playing games to stimulate new thinking and get past the habitual, logical, and limited. For most people, it's an engaging and enjoyable process.
But the System 1 thinking that makes all that efficiency possible is also home to our nonconscious Cognitive Biases. These mental shortcuts are always influencing our thinking, and potentially limiting our range of ideas, even when it feels like ideas are flowing freely.
For example, these Cognitive Biases often influence our decision making when it's time to choose which ideas to move forward. Because a large share of processing happens outside of awareness, we can be blind to the cognitive errors that ensue. And to make matters worse, System 1 can also lead us to be unjustifiably confident, like Steve Carell's character, Michael Scott, in the US version of The Office, so when we rely solely on this intuitive thinking, we can make errors and omissions, often unknowingly. It's like the old adage: "It is when you think you are the most right that you are at risk of being the most wrong." Anyone who has ever been married can vouch for this assertion. We need to be aware of our various System 1 instincts, which both make our lives easier and can trip us up at important moments.
System 2 (Slow) is thinking that requires more effort, more focus, and more conscious thought. It's often called conscious thinking, or reflective thinking. It's the more deliberate and deliberative part of our thinking process. Think of Rodin's statue "The Thinker." A hard cognitive task just might require us to sit down and ruminate for a bit, perhaps with our chin resting on our hand. System 2 requires a lot more energy. To conserve our mental "bandwidth," our brains don't like to be bothered unnecessarily if System 1 thinking can handle the task. While both modes of thinking are happening all the time, System 2 is generally relegated to monitoring and ratifying the decisions of System 1. System 2 is only activated in response to particular circumstances: when the stakes are high, when an obvious error is detected, or when careful reasoning is required.
Robust innovation and creative problem-solving processes require some serious System 2 thinking. While there's plenty of System 1 thinking involved along the way, if you neglect System 2 thinking when it's needed, you will miss out on some really good ideas; you might even make some bad judgment calls that could have been avoided if you had engaged System 2 more.
We frequently see our clients trying to avoid engaging in System 2 thinking immediately after Idea Generation, when it's time to select which ideas will move forward. That's when we have to take a deliberate look at all the great ideas we've generated and narrow them down to a manageable set. Suddenly, it all becomes . . . A lot. Less. Fun.
At this point, it's pretty typical for teams to start avoiding System 2 thinking (even though they're not consciously aware of their resistance), and it's our job as Facilitators to counter the objections and ensure that the needed deliberate thinking happens. The objections will be couched in seemingly rational arguments. For example, people will say, "It takes too long to review all the ideas. We don't have time," or "Let's just have everyone champion a few ideas instead of reviewing all of them. The ones we remember are probably the best ones anyway." But don't be fooled by these clever excuses. It's merely a group of brains trying to conserve energy.
But to be clear, System 1 thinking isn't just random associations. It has some real benefits that we can use to our advantage for fast decisions when the stakes aren't too high:
- System 1 works well when the choice between Option A and Option B simply isn't a big deal or is a habitual choice. Think about different brands of pasta sauce, or weekend video-on-demand options. These choices aren't ones you're going to take too much time on . . . unless you're a really discriminating marinara connoisseur or have a deep-seated love for John Cusack movies.
- System 1 is especially helpful when we have developed enough expertise in a given area that we can rely on our well-honed powers of pattern recognition to guide us quickly without having to think very hard. This is the brilliant observation behind Malcolm Gladwell's best-selling book Blink. Experts have ways of knowing (System 1) that they can't even explain (with System 2). Gladwell points out that an experienced art appraiser can spot a forgery but may not be able to say exactly what's wrong with the piece. You and I have our "gut reactions" too, but we wouldn't want to spend a lot of money without consulting an expert, whose intuitions have been trained by facing such decisions so many times that the patterns have become automatic.
Bounded Rationality
A slightly different take on this process is the concept of Bounded Rationality. Classical economics has at its root the idea of the Rational Actor, who is assumed to have:
- Unlimited time.
- Unfettered access to information.
- Unlimited capacity to process it.
Under these mythical conditions, the hypothetical "Rational Actor" will always make the most logical, value-maximizing decisions in their own best interest. And while computers can do the calculations to determine how the "Rational Actor" would behave, real people never make decisions that way. The fact is, time and access to information are always limited, or at least bounded by practical constraints and daily pressures. Herbert Simon, who first proposed this concept, suggested that we more often use rules of thumb, or "heuristics," rather than rigid rules of optimization, because we have other important decisions to make elsewhere, and our brains aren't designed to make these kinds of calculations at all. Life is short. The modern condition of "so much to do, so little time" demands ever more efficiency and multitasking. Luckily, a lot of our decisions don't really require much attention, so it's often adaptive to make use of our own heuristics. And they serve us pretty well, as long as we don't mind "thinking inside the box."
Behavioral Economics Moves Forward
We've recently seen the key insights from Behavioral Economics (BE) move into other fields of human endeavor, such as Behavioral Ethics, Behavioral Law, and Behavioral Health Policy. One of the leading BE researchers, Professor Cass Sunstein, even served as Administrator of the Office of Information and Regulatory Affairs in the Obama administration. One of his primary contributions was to incorporate key BE insights into communicating and incentivizing regulatory compliance, an approach that has also been embraced by British Prime Minister David Cameron's team. BE principles transcend political parties; they speak to what it really means to be human, on both sides of the aisle.
In a field so dependent on applying a better understanding of human behavior, it's time for Innovation to embrace Behavioral Innovation™. We know by now that we are all "predictably irrational" (as Dan Ariely observed), hardly the Rational Actor of classical economics whose limits Simon exposed back in the 1950s. Through the lens of Behavioral Innovation, many conundrums that "innovationistas" have wrestled with for decades are becoming more understandable. We now know better than ever how to recognize and overcome some of the more vexing obstacles in our own cognition, and to facilitate more creative thinking in the stakeholders we serve, including customers, channel partners, and regulators.
The Cognitive Biases of Innovation
So, now that we know we are fighting against our own prehistoric cognitive wiring, what can we do? We need to work constantly and consciously to compensate for these "bugs" in our cognition and reduce their innovation-inhibiting force. The first step is to recognize the role of specific Cognitive Biases when they are operating nonconsciously in System 1 so we can strategically call upon System 2 to take us to another level of innovation. While it's helpful to be aware of the longer list of Cognitive Biases, there are eight biases that we see impeding innovation on a regular basis:
- Negativity Bias: "Bad is stronger than good." Since the Cognitive Biases were developed to keep us safe from perceived threats, "bad" is always more salient than "good." We fear loss more than we appreciate gain. As such, negativity has a more powerful effect on our thinking and behavior than positivity does. Metaphor: lead shoes unnecessarily slowing you down when you're trying to move forward.
- Availability Bias: "What you see is all there is." When making decisions, we tend to go straight to what we can recall most quickly and easily, and often miss things, such as valuable information and a more realistic sense of likely outcomes, that could lead to better decisions. More vivid memories (including more negative ones) stand out and skew our judgment. Metaphor: horse blinders focusing you only on what's right in front of you.
- Curse of Knowledge: "Well, it's just obvious that . . ." Once we become experts in something, we have a hard time placing ourselves back in the position of someone who lacks that knowledge, no matter how much we believe we can. We assume waaay too much prior knowledge on the part of others, and have trouble taking their perspective and explaining our ideas in a way that they understand. As difficult as it can be to know something, it is also difficult to "unknow" it. Metaphor: revisionist history that comes to be accepted as conventional wisdom, but isn't quite the whole story.
- Status Quo Bias: "The bird in the hand." In every proposed course of action, the automatic default/safest position is to keep things as they are and to ratify previous decisions. We believe that we can't be criticized for making a bad decision if we merely endorse the Status Quo; incumbency gives it power that it often doesn't deserve. Metaphor: balloon ballast keeping you from getting very far off the well-trodden ground.
- Confabulation: "Of course that's why I did that!" We often make decisions emotionally, and not as systematically as we believe, and then pull together a plausible-enough justification for these decisions. We usually believe that our manufactured rationale is true. We're not lying; we're just not fully aware of why we chose what we chose. And besides, it sounds so plausible. Metaphor: unreliable eyewitnesses who fervently believe they saw what they describe, but in reality saw only a fraction of what happened, and even that was influenced by the emotion of the moment.
- Conformity Bias: "Play along to get along." This is the need for agreement in a group that keeps us from exploring alternative perspectives. By putting more focus on agreement than on the quality of the decision itself, we end up making worse decisions. Metaphor: a pet's "electric fence" collar keeping us well within the safest boundaries, usually to excess.
- Confirmation Bias: "Just as I thought." This is the tendency to seek out evidence that supports the position we've already embraced, regardless of whether the information is true, and to ignore anything that contradicts our preconceived notions. We nonconsciously skew how we evaluate new evidence depending on whether it supports our previous decisions. Metaphor: the royal courtiers in "The Emperor's New Clothes" who strain to maintain the agreed-upon reality.
- Framing: "Like a fish in water." This is how individuals, groups, and societies organize, perceive, and communicate about reality. Our frame is the mental picture we have of our world; it's the paradigm through which we perceive reality.