Critical Thinking for Marketers, Volume II
by David Dwight, Terry Grapentine, and David Soorholtz

About this book

This second volume of Critical Thinking for Marketers expands your background knowledge of other areas of critical thinking that are making major contributions to both marketing as a social science and marketing as an applied science. Section I, Think Better, provides introductory discussions of marketing as a science; the difference between correlation and causation; what a "concept" is and why it is critical for marketers to develop good concept definitions (e.g., "What is customer satisfaction?"); why the 18th-century Scottish philosopher David Hume is relevant to marketers today; and the impact that behavioral economics is having on how marketers do their job. Section II, Cognitive Biases and Their Importance, discusses recent discoveries in cognitive psychology and neuroscience that are relevant to marketers. You'll learn that marketers need to be aware of their own cognitive biases and irrational thinking processes, which often lead to bad decisions, and that the retail and business customers we market to are not as rational as we may think and hope they are. Finally, Section III, Conclusions, draws on both Volumes I and II to summarize the book's primary messages, with helpful hints on applying your new tools and making better marketing decisions.

SECTION II
Cognitive Biases and Their Importance
CHAPTER 9
Introduction
Cognitive bias is a “systematic error in judgment and decision making common to all human beings, which can be due to cognitive limitations, motivational factors, and/or adaptations to natural environments.”1 Some of the logical fallacies discussed earlier in this book can be a source of cognitive bias, such as interpreting correlation as causation in the Affirming the Consequent fallacy. Other sources are covered in some of the Think Better sections (see the discussion on Behavioral Economics). But could another source be the very evolutionary processes that created us Homo sapiens?
A comprehensive understanding of all these sources of cognitive bias helps us develop “epistemic humility,” which means being humble about the knowledge we think we possess, and which is a prerequisite for being a good critical thinker.
The remainder of this section is excerpted from Kevin deLaplante’s discussion of cognitive biases, which can be found on his Critical Thinking Academy website.2 We include it in our book because it is an excellent and relevant discussion of this subject matter.
Additionally, at various points in this section, we include shaded sidebar discussions focusing on how deLaplante’s article relates to critical thinking in marketing.
* * *
CHAPTER 10
What They Are and Why They’re Important
Everyone agrees that logic and argumentation are important for critical thinking. (And an important component of improving one’s critical thinking skills is) background knowledge ....
There are different types of background knowledge that are relevant to critical thinking in different ways. One of the most important types of background knowledge is knowledge of how our minds actually work—how human beings actually think and reason, how we actually form beliefs, how we actually make decisions.
There are a lot of different scientific fields that study how our minds actually work. These include behavioral psychology, social psychology, cognitive psychology, cognitive neuroscience, and other fields. Over the past 40 years we’ve learned an awful lot about human reasoning and decision making.
A lot of this research was stimulated by the work of two important researchers, Daniel Kahneman and Amos Tversky, going back to the early 1970s. They laid the foundations for what is now called the “biases and heuristics” tradition in psychology.1
Normative Versus Descriptive Theories of Human Reasoning
To get a feel for the importance of this research, let’s back up a bit. When studying human reasoning you can ask two sorts of questions. One is a purely descriptive question: how do human beings in fact reason? The other is a prescriptive or normative question: how should human beings reason? What’s the difference between good reasoning and bad reasoning?
When we study logic and argumentation, we’re learning a set of rules and concepts that permit us to answer this second question—how should we reason ....
... [O]ver time, we’ve developed a number of different theories of rationality that give us norms for correct reasoning in different domains.
This is great, of course, (as) these are very powerful and useful tools. (Some of which are the focus of this book).
Now, when it comes to the study of how human reasoning actually works, before Kahneman and Tversky’s work in the 1970s, there was a widely shared view that, more often than not, the mind, or the brain, processes information in ways that mimic the formal models of reasoning and decision making that were familiar from our normative models of reasoning, from formal logic, probability theory, and decision theory.
This “widely shared view” has influenced the methods marketing researchers often use. For example, three commonly used methods for modeling brand choice are (1) conjoint analysis, (2) regression analysis, and (3) combining brand attribute performance ratings with attribute “importance” ratings. Findings from behavioral economics suggest that these kinds of models do not tell the complete story (in fact, they may weave a false story) of how consumers select brands. In short, consumers are not as “rational” as we think they are.
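To make method (3) concrete, here is a minimal sketch of the kind of compensatory, weighted-sum model it implies. The brands, attributes, weights, and ratings are hypothetical, invented purely for illustration; they are not drawn from this book or from any study.

```python
# A hypothetical sketch of method (3): scoring brands by combining attribute
# performance ratings with stated "importance" weights. All brands, attributes,
# and numbers are invented for illustration; none come from the book.

importance = {"price": 0.40, "quality": 0.35, "brand_image": 0.25}

performance = {  # attribute performance ratings on a 1-10 scale
    "Brand A": {"price": 8, "quality": 6, "brand_image": 5},
    "Brand B": {"price": 5, "quality": 9, "brand_image": 7},
}

def preference_score(ratings, weights):
    """Compensatory (weighted-sum) preference score for one brand."""
    return sum(weights[attr] * rating for attr, rating in ratings.items())

for brand, ratings in performance.items():
    print(brand, round(preference_score(ratings, importance), 2))

# Such models assume consumers weigh attributes deliberately and consistently.
# The behavioral-economics findings discussed in this book suggest that real
# brand choices often depart from this tidy picture.
```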
What Kahneman and Tversky showed is that, more often than not, this is NOT the way our minds work—they showed that there’s a gap between how our normative theories say we should reason and how we in fact reason.
This gap can manifest itself in different ways, and there’s no one single explanation for it. One reason, for example, is that in real-world situations, the reasoning processes prescribed by our normative theories of rationality can be computationally very intensive. Our brains would need to process an awful lot of information to implement our best normative theories of reasoning. But that kind of information processing takes time, and in the real world we often need to make decisions much more quickly, sometimes in milliseconds. You can imagine this time pressure being even more intense if you think about the situations facing our Homo sapiens ancestors; if there’s a big animal charging you and you wait too long to figure out what to do, you’re dead.
This is an important point for marketers. Often, marketing managers need to make relatively quick decisions based on either too little or too much information. This frequently leads us to fall back on “rules of thumb” to save time. These decision-making shortcuts are called “heuristics”: practical, time-saving processes used to make quick decisions that are not necessarily optimal or perfect. Sometimes these heuristics employ one or more of the 60 logical fallacies discussed in this book, and when they do, you can be certain that the likelihood of a bad decision increases. Note that a “heuristic” can also refer to ways “for thinking about phenomena or questions in a way that might give you new insights and ideas,”2 which can be used in argument development. Our Think Better piece on The Five Whys is an example.
Biases and Heuristics (Rules of Thumb)
So, the speculation is that our brains have evolved various short-cut mechanisms for making decisions, especially when the problems we’re facing are complex, we have incomplete information, and there’s risk involved. In these situations we sample the information available to us, we focus on just those bits that are most relevant to our decision task, and we make a decision based on a rule of thumb, or a shortcut, that does the job.
These rules of thumb are the “heuristics” in the “biases and heuristics” literature.
Two important things to note: One is that we’re usually not consciously aware of the heuristics that we’re using, or the information that we’re focusing on. Most of this is going on below the surface.
The second thing to note is these heuristics aren’t designed to give us the best solutions to our decision problems, all things considered. What they’re designed to do is give us solutions that are “good enough” for our immediate purposes.
But “good enough” might mean “good enough in our ancestral environments where these cognitive mechanisms evolved.” In contexts that are more removed from those ancestral environments (say in a marketing committee), we can end up making systematically bad choices or errors in reasoning, because we’re automatically, subconsciously invoking the heuristic in a situation where that heuristic isn’t necessarily the best rule to follow.
So, the term bias in this context refers to this systematic gap between how we’re actually disposed to behave or reason, and how we ought to behave or reason, by the standards of some normative theory of reasoning or decision-making. The “heuristic” is the rule of thumb that we’re using to make the decision or the judgment; the “bias” is the predictable effect of using that rule of thumb in situations where it doesn’t give an optimal result.
Some examples of heuristics in marketing decision-making we’ve observed are as follows:
  • “What’s happened in the past will happen tomorrow”: For example, making financial projections based simply on historical trends and not taking into account other factors that can affect future financial performance, such as projected GNP growth or the prime interest rate.
  • Argument to moderation: “Splitting the difference” between alternative marketing projections (e.g., a sales forecast) because one does not have sufficient time to develop a better forecast supported by good evidence and logic.
  • “Shooting from the hip”: Simply guessing what action to take because one does not have enough time to properly investigate a particular issue.
  • Alleged certainty: Relying on conventional wisdom—“Everyone knows that ...”—to make a decision as opposed to thinking through a problem in greater detail.
An Example: The Anchoring Effect
This is all pretty general, so let me give you an example of a cognitive bias and its related heuristic. This is known as the anchoring heuristic, or the anchoring effect.
Kahneman and Tversky did a famous experiment in the early 1970s in which they asked a group of subjects to estimate the percentage of countries in Africa that are members of the United Nations. Of course, most people aren’t going to know this; for most of us it’s just going to be a guess.
But for one group of subjects, they asked the question “Is this percentage more or less than 10 percent?” For another group of subjects, they asked the question “Is it more or less than 65 percent?”
The average of the answers of the two groups differed significantly. In the first group, the average answer was 25 percent. In the second group, the average answer was 45 percent. The second group estimated higher than the first group.
Why? Well, this is what seems to be going on. When subjects are exposed to a higher number, their estimates are “anchored” to that number. Give them a high number, they estimate higher; give them a lower number, they estimate lower.
So, the idea behind this anchoring heuristic is that when people are asked to estimate a probability or an uncertain number, rather than try to perform a complex calculation in their heads, they start with an implicitly suggested reference point—the anchor—and make adjustments from that reference point to reach their estimate. This is a shortcut; it’s a rule of thumb.
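As a rough illustration of this anchor-and-adjust idea, here is a toy simulation (our addition, not part of deLaplante’s article). Estimates start at the anchor and adjust only partway toward an independent guess; the adjustment rule and every number in it are assumptions made for illustration, not Kahneman and Tversky’s data or procedure.

```python
# Toy simulation of anchor-and-adjust estimation. The "insufficient adjustment"
# rule and all numbers below are assumptions for illustration only.
import random

UNANCHORED_RANGE = (10, 60)  # what an unanchored guess might plausibly span


def anchored_estimate(anchor, adjustment_strength=0.5):
    """Start at the anchor, then adjust partway toward an independent guess."""
    independent_guess = random.uniform(*UNANCHORED_RANGE)
    return anchor + adjustment_strength * (independent_guess - anchor)


random.seed(0)
low_group = [anchored_estimate(10) for _ in range(1000)]
high_group = [anchored_estimate(65) for _ in range(1000)]

print(f"low-anchor group mean estimate:  {sum(low_group) / len(low_group):.1f}%")
print(f"high-anchor group mean estimate: {sum(high_group) / len(high_group):.1f}%")

# Because adjustment away from the anchor is incomplete, the two groups' averages
# stay far apart, mirroring the low-versus-high pattern described in the text.
```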
Now, you might think in this case, it’s not just the number; it’s the way the question is phrased that biases the estimates. The subjects are assuming that the researchers know the answer and that the reference number is therefore related in some way to the actual answer. But researchers have redone this experiment many times in different ways.
In one version, for example, the subjects are asked the same question, to estimate the percentage of African nations in the United Nations, but before they answer, the researchers spin a roulette wheel in front of the group, wait for it to land on a number that everyone can see, and then ask the subjects whether the percentage of African nations is larger or smaller than the number on the wheel.
The results are the same. If the number is high people estimate high, if the number is low people estimate low. And in this case the subjects couldn’t possibly assume that the number on the roulette wheel had any relation to the actual percentage of African nations in the United Nations. But their estimates were anchored to this number anyway.
Results like these have proven to be really important for understanding how human beings process information and make judgments on the basis of information. The anchoring effect shows up in strategic negotiation behavior, consumer shopping behavior, in the behavior of stock and real estate markets—it shows up everywhere; it’s a very widespread and robust effect.
Note, also, that this behavior is, by the standards of our normative theories of correct reasoning, systematically irrational.
Linda Henman, a Missouri-based business consultant, provides an example of when the anchoring effect can have a negative outcome. If a team leader asks his subordinates whether marketing efforts in a particular region should be increased by, say, 20 percent, his employees will use that number as a cue. Perhaps the figure should be even higher, Henman notes, or perhaps the company needs to eliminate marketing efforts in that region altogether. By using the anchor of 20 percent, the boss has already planted a seed in his subordinates’ minds, one that will be difficult to erase.3
Why This Is Important
So this is an example of a cognitive bias. Now, this would be interesting but not deeply significant if the anchoring effect were the only cognitive bias that we’ve discovered. But if you go to the Wikipedia entry under “list of cognitive biases,” you’ll find a page that lists just over a hundred of these biases, and the list is not exhaustive. I encourage everyone to check it out.
So what’s the upshot of all this for us as critical thinkers?
At the very least, we all need to acquire a certain level of cognitive bias “literacy.” We don’t need to become experts, but we should all be able to recognize the most important and most discussed cognitive biases. We should all know what “confirmation bias” is, what “base rate bias” is, what the “gambler’s fallacy” is, and so on. These are just as important as understanding ...

Table of contents

  1. Cover
  2. Title
  3. Copyright
  4. Acknowledgments
  5. Section I: Think Better
  6. Section II: Cognitive Biases and Their Importance
  7. Section III: Conclusions
  8. Notes
  9. References
  10. Index
  11. Adpage